For inner product spaces V and W and a linear map α: V → W, the adjoint of α is the linear map α*: W → V satisfying
<α(v), w> = <v, α*(w)> for all v ∈ V and w ∈ W, where < , > denotes the inner product (notation varies).
Whilst linear maps between finite dimensional spaces correspond to matrices, the notation for adjoints is slightly complicated. Firstly, there is a source of confusion with the so-called adjoint matrix. Briefly, this, denoted adj(A) and often described instead as the adjugate matrix of A, is the transpose of the cofactor matrix of A, built from determinants of sub-matrices. It is not the matrix representation of the adjoint map.
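To make the distinction concrete, here is a minimal numpy sketch (an illustrative example, not part of the original definitions) comparing the two notions for a small invertible matrix; the adjugate is computed via the identity adj(A) = det(A)·inv(A), which holds whenever A is invertible:

    import numpy as np

    A = np.array([[1 + 2j, 3],
                  [0, 4 - 1j]])

    # Adjugate (the "classical adjoint"): for invertible A, adj(A) = det(A) * inv(A).
    adjugate = np.linalg.det(A) * np.linalg.inv(A)

    # Matrix of the adjoint map: the Hermitian conjugate (conjugate transpose) of A.
    adjoint_matrix = A.conj().T

    print(adjugate)        # transpose of the cofactor matrix of A
    print(adjoint_matrix)  # a different matrix entirely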
In line with the notation for linear maps, A* may be used to indicate the matrix representing the adjoint of the map represented by A. However, this notation is often also used to denote the (entrywise) complex conjugation of a matrix. Since the concept of an adjoint requires conjugation, and HTML can't supply the usual bar notation, it makes sense to reserve * for that operation and use the alternative notation A^H for the adjoint. Strictly, A^H denotes the Hermitian conjugate of A (that is, in our notation, (A*)^T), but this is appropriate since, for a finite dimensional inner product space, the adjoint of a map α represented by A in an orthonormal basis is represented by precisely A^H.
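As a sanity check, the following numpy sketch (assuming the standard inner product on complex coordinate space, which np.vdot computes with conjugation in its first argument) verifies the defining identity <α(v), w> = <v, α*(w)> numerically when α* is taken to be the map represented by A^H:

    import numpy as np

    rng = np.random.default_rng(0)
    n, m = 3, 4

    # A represents a map alpha: C^n -> C^m in the standard (orthonormal) bases.
    A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
    v = rng.standard_normal(n) + 1j * rng.standard_normal(n)
    w = rng.standard_normal(m) + 1j * rng.standard_normal(m)

    A_H = A.conj().T   # Hermitian conjugate: candidate matrix for the adjoint map

    lhs = np.vdot(A @ v, w)    # <alpha(v), w>
    rhs = np.vdot(v, A_H @ w)  # <v, alpha*(w)>

    print(np.isclose(lhs, rhs))  # True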
Whilst the finite dimensional case always yields an adjoint (via the map represented by the matrix A^H), an adjoint cannot always be constructed for a map on an infinite dimensional space. However, if the adjoint does exist, it is unique:
Proof: Returning to maps instead of matrices, consider α with adjoint α* and a rival adjoint α'. Then, by the definition of the adjoint:
<v, α'(w)> = <α(v), w> = <v, α*(w)> ∀ v, w
Then <v, α'(w) - α*(w)> = 0 ∀ v, w
So, taking v = α'(w) - α*(w), positive definiteness of the inner product forces α'(w) - α*(w) = 0 ∀ w
So α' = α*
Any claimed rival α' for the adjoint is in fact the original adjoint α*, so the adjoint map is unique.
By simple manipulation of the axioms for inner products and the adjoint definition, the following properties hold (a numerical sanity check follows the list):
Let α:V→W, β:W→Z, γ:V→W all have corresponding adjoints α*, β*, γ*. Then
- (α + γ)* = α* + γ*
- (λα)* = λ*α* (where λ is a scalar from the field and λ* is its complex conjugate)
- (β∘α)* = α*∘β*
- (α*)* = α
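These identities can be spot-checked numerically; the sketch below (assuming, as above, that the adjoint of a matrix is its Hermitian conjugate under the standard inner product) verifies each one for random complex matrices:

    import numpy as np

    rng = np.random.default_rng(1)
    cplx = lambda shape: rng.standard_normal(shape) + 1j * rng.standard_normal(shape)
    H = lambda M: M.conj().T   # adjoint of a matrix = Hermitian conjugate

    A = cplx((4, 3))   # alpha: V -> W
    G = cplx((4, 3))   # gamma: V -> W
    B = cplx((2, 4))   # beta:  W -> Z
    lam = 2 - 3j

    print(np.allclose(H(A + G), H(A) + H(G)))            # (alpha + gamma)* = alpha* + gamma*
    print(np.allclose(H(lam * A), np.conj(lam) * H(A)))  # (lambda.alpha)* = lambda*.alpha*
    print(np.allclose(H(B @ A), H(A) @ H(B)))            # (beta o alpha)* = alpha* o beta*
    print(np.allclose(H(H(A)), A))                       # (alpha*)* = alpha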
Of particular interest are so-called self-adjoint maps: those which are their own adjoint. In terms of matrices (with respect to an orthonormal basis), this means the representing matrix equals its own Hermitian conjugate, and hence is by definition a Hermitian matrix (in the real case, this simply means it is symmetric). Such maps are useful because they allow the spectral theorem to be applied: the eigenvectors of such a map can be chosen to form an orthonormal basis for the vector space, with all the corresponding eigenvalues being real. This allows the map to be diagonalised.
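To illustrate, here is a brief numpy sketch (an illustrative example; np.linalg.eigh is intended for Hermitian matrices): it builds a self-adjoint matrix and confirms that its eigenvalues are real, its eigenvectors form an orthonormal basis, and it is diagonal in that basis:

    import numpy as np

    rng = np.random.default_rng(2)
    M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
    A = M + M.conj().T   # A equals its own Hermitian conjugate, i.e. it is self-adjoint

    eigenvalues, Q = np.linalg.eigh(A)   # columns of Q are eigenvectors of A

    print(eigenvalues)                                            # all real
    print(np.allclose(Q.conj().T @ Q, np.eye(4)))                 # orthonormal eigenvectors
    print(np.allclose(Q.conj().T @ A @ Q, np.diag(eigenvalues)))  # diagonalisation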