The complex analog of an orthogonal map is a unitary map. As discussed previously, the vectors of a complex inner product space have length. Unitary maps are linear maps that retain this length. We will show that results about orthogonal maps have analogs for unitary maps.
Let #V# and #W# be complex inner product spaces.
A map #L :V\rightarrow W# is called an isometry if #\norm{L(\vec{x})-L(\vec{y})}=\norm{\vec{x}-\vec{y}}# for all #\vec{x}#, #\vec{y}# in #V#.
As a consequence, a linear map #L :V\rightarrow W# is an isometry if and only if #\norm{L(\vec{x})}=\norm{\vec{x}}# for all #\vec{x}\in V#.
A linear isometry #L:V\to V# is called a unitary map.
If #V# is a real inner product space, then #V_{\mathbb{C}} = V\oplus \ii V#, the extension of #V# to a complex vector space, is a complex inner product space with inner product determined by #\norm{\vec{u}+\ii\vec{v}}^2=\norm{\vec{u}}^2+\norm{\vec{v}}^2# for vectors #\vec{u},\vec{v}# of #V#.
An orthogonal map #L:V\to V# can be extended uniquely to a linear map #L_{\mathbb{C}} :V_{\mathbb{C}}\to V_{\mathbb{C}}#, namely, by \[L_{\mathbb{C}}(\vec{u}+\ii\vec{v}) =L(\vec{u})+\ii L(\vec{v})\] This map is unitary. The fact that #L_{\mathbb{C}}# preserves the length on #V_{\mathbb{C}}# follows from the following calculation for vectors #\vec{u},\vec{v}# of #V#: \[\norm{L_{\mathbb{C}}(\vec{u}+\ii\vec{v})}^2=\norm{L(\vec{u})+\ii L(\vec{v})}^2 =\norm{L(\vec{u})}^2+\norm{L(\vec{v})}^2=\norm{\vec{u}}^2+\norm{\vec{v}}^2=\norm{\vec{u}+\ii\vec{v}}^2\]
As in the real case, a linear map #L:V\to W# between complex inner product spaces is an isometry if and only if it preserves the inner product. The proof, which we do not give explicitly, uses the complex analog of the polarization formula (not covered in the theory) and is otherwise analogous to the proof of the real case.
Consider the inner product space #\mathbb{C}# with the standard inner product #\dotprod{x}{y} = x\cdot\overline{y}#. Each linear mapping #\mathbb{C}\to\mathbb{C}# is the multiplication by a complex number #\lambda#. This multiplication is unitary if and only if #|\lambda|=1#.
After all, by definition this multiplication is unitary if and only if #\norm{\lambda\cdot x} = \norm{x}# for all complex numbers #x#. Here the norm equals the absolute value, so unitarity is equivalent to #| \lambda\cdot x| = | x| # for all complex numbers #x#. Because #| \lambda\cdot x| = |\lambda|\cdot| x| #, this is precisely the case if #|\lambda| = 1#.
The restriction #\left.\mathbb{C}\right|_{\mathbb{R}}# of #\mathbb{C}# is obtained by restricting scalars to #\mathbb{R}#. It is a #2#-dimensional real inner product space with orthonormal basis #\basis{1,\ii}#. The corresponding inner product is the real part of the inner product of #\mathbb{C}#. Multiplication by #\lambda# then has the following matrix with respect to the basis mentioned above:
\[\matrix{\Re{\lambda}& \Re{(\lambda\cdot\ii)}\\ \Im{\lambda}&\Im{(\lambda\cdot \ii)}} =\matrix{\Re{\lambda}& -\Im{\lambda}\\ \Im{\lambda}&\Re{\lambda}}\]
This confirms that multiplication by #\lambda#, viewed as a linear map on the real inner product space #\left.\mathbb{C}\right|_{\mathbb{R}}#, is orthogonal if and only if multiplication by #\lambda# on the complex vector space #\mathbb{C}# is unitary.
Let #V# be a complex inner product space. A unitary reflection #S_{\vec{a},\lambda}: V\to V# is a unitary map having eigenvector #\vec{a}# with eigenvalue #\lambda# satisfying #|\lambda|= 1# and fixing every vector that is perpendicular to #\vec{a}#.
A mapping rule for #S_{\vec{a},\lambda}# is given by
\[S_{\vec{a},\lambda}(\vec{x}) = \vec{x}-(1-\lambda)\dfrac{\dotprod{\vec{x}}{\vec{a}}}{\dotprod{\vec{a}}{\vec{a}}}\vec{a}\]
If #V = \mathbb{C}#, then each unitary map #V\to V# is a unitary reflection, as is clear from the one-dimensional example of multiplication by #\lambda# above.
In general, for each pair of vectors #\vec{u}#, #\vec{v}# in #V# of the same length, there is a unitary reflection #S_{\vec{a},\lambda}# which sends #\vec{u}# to #\vec{v}#. A proof of this statement is given in an exercise.
Just as in the real case, translations (maps #T_{\vec{a}}# for a vector #\vec{a}# with rule #T_{\vec{a}}(\vec{x}) =\vec{x}+\vec{a}#) are isometries. Each isometry #V\to W# between complex inner product spaces #V# and #W# is the composition of a translation and a linear isometry. This statement and its proof are similar to those in the real case.
The proof of the characterization of a linear isometry with the aid of the norm is the same as in the real case.
The following properties of isometries of complex inner product spaces correspond to the real case.
Let #U#, #V#, #W# be complex inner product spaces.
- If #L:V\rightarrow W# and #M :U\rightarrow V# are linear isometries, then the composition #L\,M:U\rightarrow W# is a linear isometry.
- If #L:V\rightarrow W# is a linear isometry, then #L# is injective.
- If #L:V\rightarrow V# is unitary and #V# is finite dimensional, then #L# is invertible and #L^{-1}# is also unitary.
- The map #L:V\to V# is unitary if and only if, for every orthonormal system #\vec{a}_1,\ldots ,\vec{a}_n# in #V#, the system #L(\vec{a}_1),\ldots , L(\vec{a}_n)# in #V# is also orthonormal.
- If #\vec{a}_1,\ldots ,\vec{a}_n# is an orthonormal basis for #V#, then the map #L:V\to V# is unitary if and only if #L(\vec{a}_1),\ldots , L(\vec{a}_n)# is an orthonormal basis in #V# as well.
The proof is analogous to the proof for a real inner product space.
Also, the correspondence with matrices runs parallel to the real case.
A complex #(n\times n)#-matrix #A# is called unitary if the columns of #A# form an orthonormal system in #\mathbb{C}^n#.
If #A# is a complex #(n\times n)#-matrix, then we denote by #A^\star# the matrix #{\overline A}^\top#, where #{\overline A}# is the #(n\times n)#-matrix each entry of which is the complex conjugate of the corresponding entry of #A#.
Let #V# be a finite-dimensional complex inner product space. If #\alpha# is a basis for #V# composed of a vector #\vec{a}# and a basis for #\vec{a}^{\perp}# and if #\lambda# is a complex scalar of absolute value #1#, then the matrix of the unitary reflection #S_{\vec{a},\lambda}# with respect to #\alpha# equals
\[\left(S_{\vec{a},\lambda}\right)_\alpha = \matrix{\lambda&0&0&\cdots&0\\ 0&1&0&\cdots&0\\ 0&0&1&\ddots&0\\ \vdots&\vdots&\vdots&\ddots&\vdots\\ 0&0&0&\cdots&1}\] This matrix is unitary.
As we will see below, in general, the matrix of a unitary map with respect to an orthonormal basis is unitary. The basis #\alpha# above is not necessarily orthonormal.
If #A = \matrix{1&-\ii\\ \ii & 1}#, then #\overline{A}= \matrix{1&\ii\\ -\ii & 1}# and #A^\star= \matrix{1&-\ii\\ \ii & 1}#. Accordingly, #A^\star \, A =2\cdot \matrix{1&-\ii\\ \ii &1}#. The #(i,j)#-entry of the last matrix is equal to #\dotprod{\vec{a}_j}{\vec{a}_i}#, where #\vec{a}_i# is the #i#-th column of #A#. Because the #(1,2)#-entry is distinct from #0#, the matrix #A# is not unitary. Each #(2\times2)#-matrix of the form #\matrix{a&b\\ -\overline{b}&\overline{a}}# is unitary if and only if #|a|^2+|b|^2 =1 #.
In physics, #A^\star# is often denoted by #A^\dagger#.
By use of the matrix with respect to an orthonormal basis, it can be verified whether a linear map on a finite-dimensional complex inner product space is unitary or not:
Let #L: \mathbb{C}^n \rightarrow \mathbb{C}^n# be a linear map with matrix #A#. Then the following statements are equivalent:
- The linear map #L# is unitary.
- The matrix #A# is unitary.
- #A^{\star}\, A=I_n#.
- The columns of #A# form an orthonormal system.
- The rows of #A# form an orthonormal system.
A real square matrix #A# is unitary if and only if it is orthogonal. After all, if #A# is real, then #A^{\star}# is equal to #A^\top# because the complex conjugate of a real number is the real number itself.
The proofs run analogously to the proofs in the real context.
Let #V# be a complex inner product space. A unitary reflection on #V# is a unitary map #{S_{\vec{a},\lambda}}:V\to V# having an eigenvector #\vec{a}# with eigenvalue #{\lambda}# distinct from #1# such that #S_{\vec{a},\lambda}# fixes each vector in #\vec{a}^{\perp}#.
Now consider the case where #V = \mathbb{C}^3# and \[\vec{a} = \left[ 1 , \complexi , \complexi+1 \right] \phantom{xxx}\text{ and }\phantom{xxx} \lambda = -1\] Determine the matrix of the reflection #S_{\vec{a},\lambda}#.
#{ \matrix{{{1}\over{2}} & {{\complexi}\over{2}} & {{\complexi-1}\over{2}} \\ -{{\complexi}\over{2}} & {{1}\over{2}} & {{-\complexi-1}\over{2}} \\ {{-\complexi-1}\over{2}} & {{\complexi-1}\over{2}} & 0 \\ }}#
We use the mapping rule
\[S_{\vec{a},\lambda}(\vec{x}) = \vec{x} - (1-\lambda)\frac{\dotprod{\vec{x}}{\vec{a}}}{\dotprod{\vec{a}}{\vec{a}}}\cdot \vec{a}\] This formula holds because the map it defines fixes each vector of the hyperplane #\vec{a}^{\perp}# and has #\vec{a}# as an eigenvector with eigenvalue #\lambda#.
Now we substitute the given eigenvector #\vec{a}=\left[ 1 , \complexi , \complexi+1 \right] # and corresponding eigenvalue #\lambda=-1# in the above mapping rule:
\[\begin{array}{rcl}S_{\vec{a},\lambda}(\vec{x})& =&\displaystyle \rv{x_1,x_2,x_3} - 2 \frac{x_{1}-\complexi\cdot x_{2}+\left(1-\complexi\right)\cdot x_{3}}{4}\cdot \left[ 1 , \complexi , \complexi+1 \right] \\ &=& \left[ {{x_{1}+\complexi\cdot x_{2}+\left(\complexi-1\right)\cdot x_{3}}\over{2}} , {{-\complexi\cdot x_{1}+x_{2}-\left(\complexi+1\right)\cdot x_{3}}\over{2}} , {{\left(-\complexi-1\right)\cdot x_{1}+\left(\complexi-1\right)\cdot x_{2}}\over{2}} \right] \end{array}\] Therefore,
\[\begin{array}{rcl}S_{\vec{a},\lambda}(\rv{1,0,0})& =&\displaystyle \left[ {{1}\over{2}} , -{{\complexi}\over{2}} , {{-\complexi-1}\over{2}} \right] \\
S_{\vec{a},\lambda}(\rv{0,1,0})& =&\displaystyle \left[ {{\complexi}\over{2}} , {{1}\over{2}} , {{\complexi-1}\over{2}} \right] \\
S_{\vec{a},\lambda}(\rv{0,0,1})& =&\displaystyle \left[ {{\complexi-1}\over{2}} , {{-\complexi-1}\over{2}} , 0 \right]
\end{array}\]We conclude that the matrix of \(S_{\vec{a},\lambda}\) equals \[ \matrix{{{1}\over{2}} & {{\complexi}\over{2}} & {{\complexi-1}\over{2}} \\ -{{\complexi}\over{2}} & {{1}\over{2}} & {{-\complexi-1}\over{2}} \\ {{-\complexi-1}\over{2}} & {{\complexi-1}\over{2}} & 0 \\ } \]