In addition to orthogonal maps, symmetric maps form a second important class of linear maps on a real inner product space. Here we introduce these maps.
Let #V# be a real inner product space. A linear map #L:V\rightarrow V# is called symmetric if \[\dotprod{L(\vec{x})}{\vec{y}}=\dotprod{\vec{x}}{L(\vec{y})}\] holds for all #\vec{x}, \vec{y}# in #V#.
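For readers who want to experiment, here is a minimal numerical sketch of what the defining identity says, assuming Python with NumPy and taking the standard dot product on #\mathbb{R}^n# as the inner product; the function name `is_symmetric_map` is only an illustrative choice, not notation used elsewhere in this text.

```python
import numpy as np

def is_symmetric_map(A, trials=100, seed=0):
    """Numerically test <A x, y> == <x, A y> for many random x, y,
    where the inner product is the standard dot product on R^n."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    for _ in range(trials):
        x, y = rng.standard_normal(n), rng.standard_normal(n)
        if not np.isclose(np.dot(A @ x, y), np.dot(x, A @ y)):
            return False
    return True

A = np.array([[2.0, 1.0], [1.0, 3.0]])   # a symmetric matrix
B = np.array([[2.0, 1.0], [0.0, 3.0]])   # not symmetric
print(is_symmetric_map(A))  # True
print(is_symmetric_map(B))  # False
```

Such a random test can of course only give numerical evidence, not a proof.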
The orthogonal projection # P_\ell:V\to V# onto a straight line #\ell# through the origin of an inner product space #V# is symmetric. To see why, we choose a vector #\vec{a}# in #V# with #{\left\lVert \vec{a} \right\rVert} =1# such that #\ell=\linspan{\vec{a}}#. Then the orthogonal projection onto #\ell# is determined by \[{P_\ell}(\vec{x})=(\dotprod{\vec{x}}{\vec{a}})\,\vec{a}\] For all #\vec{x}# and #\vec{y}# in #V# we now have
\[\begin{array}{rcl} \dotprod{( P_\ell(\vec{x}))}{\vec{y}}&=&\dotprod{\left((\dotprod{\vec{x}}{\vec{a}})\,\vec{a}\right)}{ \vec{y}}\\ &&\phantom{xx}\color{blue}{\text{mapping rule }P_\ell}\\ &=&(\dotprod{\vec{x}}{\vec{a}})\cdot (\dotprod{\vec{a} }{\vec{y}})\\&&\phantom{xx}\color{blue}{\text{bilinearity of inner product}}\\ &=&(\dotprod{\vec{x}}{\vec{a}})\cdot (\dotprod{\vec{y} }{\vec{a}})\\&&\phantom{xx}\color{blue}{\text{symmetry of inner product}}\\ &=&\dotprod{\vec{x}}{\left((\dotprod{\vec{y}}{\vec{a}})\,\vec{a}\right)}\\&&\phantom{xx}\color{blue}{\text{bilinearity of inner product}}\\ &=& \dotprod{\vec{x}}{({P_\ell}(\vec{y}))}\\ &&\phantom{xx}\color{blue}{\text{mapping rule }P_\ell}\end{array}
\] so #\dotprod{( P_\ell(\vec{x}))}{\vec{y}}=\dotprod{\vec{x}}{( P_\ell(\vec{y}))}#. This shows that #P_\ell# satisfies the definition of a symmetric map.
The above line #\ell# is nothing but a #1#-dimensional linear subspace of #V#. In a similar way it can be shown that the orthogonal projection onto any linear subspace #W# of #V# is a symmetric linear mapping.
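As a quick sanity check of these claims, the following sketch (again assuming NumPy and the standard dot product) builds the matrix of #P_\ell# for a unit vector #\vec{a}# and, more generally, of the projection onto a subspace spanned by orthonormal columns, and verifies that both matrices equal their own transpose.

```python
import numpy as np

rng = np.random.default_rng(1)

# Projection onto the line ell spanned by a unit vector a:
a = np.array([1.0, 2.0, 2.0])
a = a / np.linalg.norm(a)            # normalize so that ||a|| = 1
P_line = np.outer(a, a)              # matrix of P_ell(x) = (x . a) a

# Projection onto a 2-dimensional subspace W spanned by the orthonormal
# columns of Q (obtained here from a QR decomposition of a random matrix):
Q, _ = np.linalg.qr(rng.standard_normal((4, 2)))
P_W = Q @ Q.T

print(np.allclose(P_line, P_line.T))  # True: projection onto a line is symmetric
print(np.allclose(P_W, P_W.T))        # True: projection onto a subspace is symmetric
```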
Let #L# be the linear map #{\mathbb{R}^2\to\mathbb{R}^2}# determined by the matrix \[ A=\matrix{a&b\\ c&d}\] Then #L# is symmetric if and only if the matrix #A# is symmetric; i.e., if and only if #b=c#. Later we will prove a more general statement for all finite dimensions, but here we show that if #L# is symmetric, then #b=c# must hold:
\[\begin{array}{rcl} b &=& \dotprod{\rv{1,0}}{\rv{b,d}} \\&&\phantom{xx}\color{blue}{\text{inner product calculation}}\\&=& \dotprod{\rv{1,0}}{L(\rv{0,1})}\\&&\phantom{xx}\color{blue}{\text{image is second column vector of }A}\\&=& \dotprod{L(\rv{1,0})}{\rv{0,1}}\\&&\phantom{xx}\color{blue}{\text{symmetry of }L}\\&=& \dotprod{\rv{a,c}}{\rv{0,1}}\\&&\phantom{xx}\color{blue}{\text{image is first column vector of }A}\\&=&c\\&&\phantom{xx}\color{blue}{\text{calculation of inner product}}\end{array}\]
Let #V# be an inner product space with basis #\basis{\vec{e}_1,\vec{e}_2}# (so #V# is #2#-dimensional) and let #L:V\to V# be a linear map. Then #L# is symmetric if and only if \[\dotprod{(L\vec{e}_1)}{\vec{e}_2} =\dotprod{\vec{e}_1}{(L\vec{e}_2)} \]
This fact shows how the verification of symmetry reduces to a calculation for a single pair of basis vectors. As we will see later, in higher dimensions more calculations are needed, but still only for a finite number of pairs of vectors.
The statement is a consequence of the following chain of equivalent statements.
\[\begin{array}{rcl}L \text{ is symmetric}& \Leftrightarrow &\text{ For all vectors }\vec{x},\vec{y}\text{ we have }\dotprod{(L\vec{x})}{\vec{y}} =\dotprod{\vec{x}}{(L\vec{y})}\\&&\phantom{xxx}\color{blue}{\text{definition of symmetry}}\\&\Leftrightarrow&\text{For all scalars }a,b,c,d\text{ we have }\\&&\phantom{xxxxx}\dotprod{(L(a\vec{e}_1+b\vec{e}_2))}{(c\vec{e}_1+d\vec{e}_2)} =\dotprod{(a\vec{e}_1+b\vec{e}_2)}{(L(c\vec{e}_1+d\vec{e}_2))}\\&&\phantom{xxx}\color{blue}{\vec{x} =a\vec{e}_1+b\vec{e}_2\text{ and }\vec{y}=c\vec{e}_1+d\vec{e}_2\text{ substituted}}\\ &\Leftrightarrow&\text{For all scalars }a,b,c,d\text{ we have }\\&&\phantom{xxxxx}\dotprod{(aL\vec{e}_1+bL\vec{e}_2)}{(c\vec{e}_1+d\vec{e}_2)} =\dotprod{(a\vec{e}_1+b\vec{e}_2)}{(cL\vec{e}_1+dL\vec{e}_2)}\\&&\phantom{xxx}\color{blue}{\text{ linearity of }L}\\&\Leftrightarrow&\text{For all scalars }a,b,c,d\text{ we have }\\&&\phantom{xxxxx}ac\,(\dotprod{L\vec{e}_1}{\vec{e}_1})+ad\,(\dotprod{L\vec{e}_1}{\vec{e}_2})+bc\,(\dotprod{L\vec{e}_2}{\vec{e}_1})+bd\,(\dotprod{L\vec{e}_2}{\vec{e}_2})\\&&\phantom{xxxxx}=ac\,(\dotprod{\vec{e}_1}{L\vec{e}_1})+ad\,(\dotprod{\vec{e}_1}{L\vec{e}_2})+bc\,(\dotprod{\vec{e}_2}{L\vec{e}_1})+bd\,(\dotprod{\vec{e}_2}{L\vec{e}_2})\\&&\phantom{xxx}\color{blue}{\text{linearity of the inner product }}\\&\Leftrightarrow&\text{For all scalars }a,b,c,d\text{ we have }\\&&\phantom{xxxxx}ad\,(\dotprod{L\vec{e}_1}{\vec{e}_2})+bc\,(\dotprod{L\vec{e}_2}{\vec{e}_1})=ad\,(\dotprod{\vec{e}_1}{L\vec{e}_2})+bc\,(\dotprod{\vec{e}_2}{L\vec{e}_1})\\&&\phantom{xxx}\color{blue}{\text{terms cancel since }\dotprod{L\vec{e}_j}{\vec{e}_j}=\dotprod{\vec{e}_j}{L\vec{e}_j}\text{ by symmetry of the inner product}}\\&\Leftrightarrow&\text{For all scalars }a,b,c,d\text{ we have }\\&&\phantom{xxxxx}ad\,(\dotprod{L\vec{e}_1}{\vec{e}_2})+bc\,(\dotprod{L\vec{e}_2}{\vec{e}_1})=ad\,(\dotprod{L\vec{e}_2}{\vec{e}_1})+bc\,(\dotprod{L\vec{e}_1}{\vec{e}_2})\\&&\phantom{xxx}\color{blue}{\dotprod{L\vec{e}_i}{\vec{e}_j}=\dotprod{\vec{e}_j}{L\vec{e}_i}\text{ by symmetry of the inner product}}\\ &\Leftrightarrow&\dotprod{L\vec{e}_1}{\vec{e}_2}=\dotprod{L\vec{e}_2}{\vec{e}_1}\\&&\phantom{xxx}\color{blue}{\text{the special case }a=d=1\text{ and }b=c=0\text{ suffices}} \end{array}\]
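The criterion can also be tried out numerically. The sketch below (assuming NumPy and the standard dot product on #\mathbb{R}^2#; the helper `criterion` is an illustrative name) checks the single equation #\dotprod{(L\vec{e}_1)}{\vec{e}_2} =\dotprod{\vec{e}_1}{(L\vec{e}_2)}# for a basis that is neither orthogonal nor normalized, and compares the outcome with symmetry of the matrix.

```python
import numpy as np

def criterion(A, e1, e2):
    """The single-pair check of the 2D criterion: <A e1, e2> == <e1, A e2>."""
    return np.isclose(np.dot(A @ e1, e2), np.dot(e1, A @ e2))

# A basis of R^2 that is neither orthogonal nor normalized:
e1 = np.array([1.0, 1.0])
e2 = np.array([1.0, 3.0])

A_sym  = np.array([[2.0, 5.0], [5.0, -1.0]])   # b == c, hence symmetric
A_nsym = np.array([[2.0, 5.0], [4.0, -1.0]])   # b != c, hence not symmetric

print(criterion(A_sym,  e1, e2))   # True
print(criterion(A_nsym, e1, e2))   # False
```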
Let #V# be a finite-dimensional real inner product space and #L:V\rightarrow V# a linear map. Then there is a unique linear map #L^\top:V\rightarrow V#, the adjoint of #L#, determined by \[ \dotprod{L(\vec{x})}{\vec{y}}=\dotprod{\vec{x}}{L^\top(\vec{y})}\] for all #\vec{x}, \vec{y}# in #V#.
Here is a proof of this statement. Let #\vec{y}# be a vector of #V#. Then the map sending #\vec{x}# to the inner product #\dotprod{L(\vec{x})}{\vec{y}}# is a linear map #V\to\mathbb{R}#. Because #V# is finite-dimensional, the theorem The linear space of linear maps implies that there is a unique vector #\vec{a}# such that #\dotprod{L(\vec{x})}{\vec{y}}=\dotprod{\vec{a}}{\vec{x}}# for all #\vec{x}#. The fact that #\vec{a}# is uniquely determined by #\vec{y}# means that there is a unique map #L^\top: V\to V# such that #{L^\top}(\vec{y})=\vec{a}#. Since the inner product is symmetric, this gives #\dotprod{L(\vec{x})}{\vec{y}}=\dotprod{\vec{x}}{L^\top(\vec{y})}# for all #\vec{x}#.
It remains to be proven that #L^\top# is linear. For arbitrary scalars #\lambda#, #\mu# and vectors #\vec{y}#, #\vec{z}# we have
\[\begin{array}{rcl}\dotprod{\vec{x}}{L^\top(\lambda\vec{y}+\mu\vec{z})}&=& \dotprod{(L(\vec{x}))}{(\lambda\vec{y}+\mu\vec{z})}\\&&\phantom{xx}\color{blue}{\text{definition of }L^\top}\\&=&\lambda \dotprod{(L(\vec{x}))}{\vec{y}}+\mu\dotprod{(L(\vec{x}))}{\vec{z}}\\&&\phantom{xx}\color{blue}{\text{bilinearity of inner product}}\\ &=&\lambda \dotprod{\vec{x}}{(L^\top(\vec{y}))}+\mu\dotprod{\vec{x}}{(L^\top(\vec{z}))}\\&&\phantom{xx}\color{blue}{\text{definition of }L^\top}\\&=&\dotprod{\vec{x}}{(\lambda L^\top(\vec{y})+\mu L^\top(\vec{z}))}\\&&\phantom{xx}\color{blue}{\text{bilinearity of inner product}}\\\end{array}\]
This implies #L^\top(\lambda\vec{y}+\mu\vec{z})=\lambda L^\top(\vec{y})+\mu L^\top(\vec{z})#, because the above equality shows that the difference of the two sides is perpendicular to every vector #\vec{x}# of #V#, and the zero vector is the only vector with this property. This establishes the linearity of #L^\top#.
The linear map #L# is symmetric if and only if #L=L^\top#. For, if #L=L^\top#, then the defining property of #L^\top# gives #\dotprod{L(\vec{x})}{\vec{y}}=\dotprod{\vec{x}}{L(\vec{y})}# for all #\vec{x},\vec{y}#, so #L# is symmetric; and if #L# is symmetric, then #L# satisfies the defining property of #L^\top#, so the uniqueness of the adjoint gives #L^\top=L#.
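With respect to the standard dot product on #\mathbb{R}^n#, the adjoint of the map #\vec{x}\mapsto A\vec{x}# is the map #\vec{y}\mapsto A^\top\vec{y}#, which explains the notation #L^\top#. The following sketch (assuming NumPy) checks the defining identity of the adjoint numerically and illustrates the statement that #L# is symmetric exactly when #L=L^\top#.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))   # matrix of L with respect to the standard basis

# The defining identity of the adjoint: <A x, y> == <x, A^T y> for all x, y.
for _ in range(5):
    x, y = rng.standard_normal(n), rng.standard_normal(n)
    assert np.isclose(np.dot(A @ x, y), np.dot(x, A.T @ y))

# L is symmetric exactly when it equals its adjoint, i.e. when A == A^T:
print(np.allclose(A, A.T))                 # False for a generic random A
print(np.allclose(A + A.T, (A + A.T).T))   # True: A + A^T equals its adjoint
```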
Here are some properties of symmetric maps.
Let #V# be an inner product space, let #L# and #M# be symmetric linear maps #V\to V#, and let #\lambda# be a scalar.
- #L+M# is symmetric.
- #\lambda\, L# is symmetric.
- If #L# and #M# commute (that is, #L\, M = M\, L#), then #L\, M# is symmetric.
- If #L# is invertible, then #L^{-1}# is also symmetric.
1. #L+M# is symmetric: If #\vec{x}# and #\vec{y}# are vectors of #V#, then \[\begin{array}{rcl}\dotprod{((L+M)(\vec{x}))}{\vec{y}}&=&\dotprod{(L(\vec{x})+M(\vec{x}))}{\vec{y}}\\ &=&\dotprod{(L(\vec{x}))}{\vec{y}}+\dotprod{(M(\vec{x}))}{\vec{y}}\\&=&\dotprod{\vec{x}}{(L(\vec{y}))}+\dotprod{\vec{x}}{(M(\vec{y}))}\\&=&\dotprod{\vec{x}}{(L(\vec{y})+M(\vec{y}))}\\ &=&\dotprod{\vec{x}}{((L+M)\,(\vec{y}))}\end{array}\]
2. #\lambda\, L# is symmetric: If #\vec{x}# and #\vec{y}# are vectors of #V#, then \[\begin{array}{rcl}\dotprod{((\lambda\, L)(\vec{x}))}{\vec{y}}&=&\dotprod{(\lambda\,(L(\vec{x})))}{\vec{y}}\\&=&\lambda\cdot\dotprod{(L(\vec{x}))}{\vec{y}}\\&=&\lambda\cdot\dotprod{\vec{x}}{(L(\vec{y}))}\\&=&\dotprod{\vec{x}}{(\lambda\,(L(\vec{y})))}\\&=&\dotprod{\vec{x}}{((\lambda\,L)\,(\vec{y}))}\end{array}\]
3. If #L# and #M# commute, then #L\, M# is symmetric: If #\vec{x}# and #\vec{y}# are vectors of #V#, then \[\begin{array}{rcl}\dotprod{(L\,M(\vec{x}))}{\vec{y}} &=& \dotprod{(M(\vec{x}))}{(L(\vec{y}))}\\ &&\phantom{xx}\color{blue}{L\text{ is symmetric}}\\ &=& \dotprod{\vec{x}}{(M\,L(\vec{y}))}\\ &&\phantom{xx}\color{blue}{M\text{ is symmetric}}\\ &=& \dotprod{\vec{x}}{(L\,M(\vec{y}))}\\ &&\phantom{xx}\color{blue}{L\,M = M\, L}\end{array}\]
4. If #L# is invertible, then #L^{-1}# is also symmetric: If #\vec{x}# and #\vec{y}# are vectors of #V#, write #\vec{v} = L^{-1}(\vec{x})# and #\vec{w} = L^{-1}(\vec{y})#. Then we have \(\dotprod{(L^{-1}(\vec{x}))}{\vec{y}}=\dotprod{\vec{v}}{(L(\vec{w}))}=\dotprod{(L(\vec{v}))}{\vec{w}}=\dotprod{\vec{x}}{(L^{-1}(\vec{y}))}\).
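The four statements above can also be observed numerically. The sketch below (assuming NumPy and the standard dot product, with the illustrative helpers `random_symmetric` and `is_sym`) draws random symmetric matrices and checks the sum, a scalar multiple, a product of commuting symmetric matrices, and the inverse.

```python
import numpy as np

rng = np.random.default_rng(3)

def random_symmetric(n):
    B = rng.standard_normal((n, n))
    return B + B.T                   # B + B^T is always symmetric

def is_sym(A):
    return np.allclose(A, A.T)

L = random_symmetric(3)
M = random_symmetric(3)
M2 = L @ L                           # L and L^2 are symmetric and commute

print(is_sym(L + M))                 # 1. sum of symmetric maps
print(is_sym(2.5 * L))               # 2. scalar multiple of a symmetric map
print(is_sym(L @ M2))                # 3. product of commuting symmetric maps
print(is_sym(np.linalg.inv(L)))      # 4. inverse (L is invertible here)
```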
Let #\vec{a}# be a vector distinct from the zero vector of an inner product space #V# and write #\ell=\linspan{\vec{a}}#. Then the reflection #S_{\vec{a}}# about the hyperplane #\linspan{\vec{a}}^\perp# can be written as
\[ S_{\vec{a}} = I_V -2 P_{\ell}\]
The identity map #I_V# is clearly symmetric from the definition and #P_{\ell}# is symmetric as we saw under the tab Orthogonal projection above. Therefore, statements 1 and 2 show that the reflection #S_{\vec{a}} # is also symmetric.
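A small numerical check of this formula, assuming NumPy and the standard dot product on #\mathbb{R}^3#: the matrix of #S_{\vec{a}}# is the identity minus twice the projection matrix, it equals its own transpose, and applying it twice gives the identity.

```python
import numpy as np

a = np.array([1.0, -2.0, 2.0])
a = a / np.linalg.norm(a)             # unit vector spanning ell
P = np.outer(a, a)                    # orthogonal projection onto ell
S = np.eye(3) - 2 * P                 # reflection S_a = I - 2 P_ell

print(np.allclose(S, S.T))            # True: the reflection is symmetric
print(np.allclose(S @ S, np.eye(3)))  # True: reflecting twice gives the identity
```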
The condition of statement 3 that #L# and #M# commute is necessary: Take #V = \mathbb{R}^2#, let #L# be determined by the matrix #\matrix{0&1\\ 1&1}# and #M# by the matrix #\matrix{0&2\\ 2&1}#. Then #L\, M = \matrix{2&1\\ 2&3}# so \[ \dotprod{(L\,M\,\rv{1,0})}{\rv{0,1}} =2\ne1= \dotprod{\rv{1,0}}{(L\,M\,\rv{0,1})} \]
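The counterexample is easy to reproduce numerically (assuming NumPy and the standard dot product):

```python
import numpy as np

L = np.array([[0.0, 1.0], [1.0, 1.0]])
M = np.array([[0.0, 2.0], [2.0, 1.0]])
LM = L @ M

print(np.allclose(L @ M, M @ L))       # False: L and M do not commute
print(LM)                              # [[2. 1.] [2. 3.]], not a symmetric matrix
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
print(np.dot(LM @ e1, e2), np.dot(e1, LM @ e2))   # 2.0 versus 1.0
```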
The symmetric linear maps #V\to V# form a linear subspace of the vector space of all linear maps #V\to V#. After all, the zero map is symmetric, and the first two statements show that the set of symmetric linear maps is closed under vector addition and scalar multiplication.
Let #P_1# be the inner product space of all polynomials in #x# of degree at most #1# with orthonormal basis #\basis{1, x}#. Consider the linear map #L:P_1\to P_1# given by \[\eqs{ L(1) &=& -5 +9 x\\ L(x) &=& a-8 x}\] For what integer #a# is #L# symmetric?
#a = 9#
One of the requirements for #L# to be symmetric is \(\dotprod{(L({1}))}{{x}}= \dotprod{{1}}{(L({x}))}\). We work out this equation as follows:
\[\begin{array}{rcl} \dotprod{(-5 + 9 x )}{{x}} &=& \dotprod{{1}}{(a-8 x)}\\&&\phantom{xx}\color{blue}{\text{mapping rule of }L\text{ used}}\\ -5 \cdot (\dotprod{1}{{x}})+9 \cdot(\dotprod{x }{x}) &=& a\cdot(\dotprod{1}{1})-8\cdot(\dotprod{{1}}{ x})\\&&\phantom{xx}\color{blue}{\text{bilinearity of inner product}}\\
9 &=&a\\&&\phantom{xx}\color{blue}{\dotprod{1}{1}=1,\ \dotprod{1}{x}=0\text{ and }\dotprod{x}{x}=1\text{ by orthonormality of the basis}}\\
\end{array}\] We conclude that #a = 9# is the only possibility. Verifying that \[\dotprod{(L(\alpha +\beta x))}{(\gamma+\delta x)} = \dotprod{(\alpha +\beta x)}{(L(\gamma+\delta x))} \] holds for arbitrary scalars #\alpha#, #\beta#, #\gamma#, #\delta#, or applying the 2D criterion for symmetric maps stated above, shows that #a= 9# indeed suffices for symmetry. Therefore, the answer is #a = 9#.
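Because the basis #\basis{1,x}# is orthonormal, the map #L# is symmetric precisely when its coordinate matrix with respect to this basis is a symmetric matrix. A short sketch (assuming NumPy; the helper `matrix_of_L` is an illustrative name) confirms that #a=9# is the only integer in a small range for which this happens.

```python
import numpy as np

def matrix_of_L(a):
    # Columns hold the coordinates of L(1) = -5 + 9x and L(x) = a - 8x
    # with respect to the orthonormal basis {1, x} of P_1.
    return np.array([[-5.0, float(a)],
                     [ 9.0, -8.0]])

# Since the basis is orthonormal, L is symmetric exactly when its
# coordinate matrix equals its transpose:
for a in range(5, 13):
    A = matrix_of_L(a)
    print(a, np.allclose(A, A.T))   # True only for a = 9
```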