Section 37.3 Concepts
Subsection 37.3.1 Orthogonal complements
We have already encountered the concept of complement subspace in Subsection 28.6.3 (see Proposition 28.6.6). In our first explorations of Discovery guide 37.1, we considered a very specific kind of complement \orthogcmp{U}\text{,} made up of all vectors orthogonal to a given subspace U\text{,} called the orthogonal complement of U\text{.} And we have already considered this concept geometrically in the cases of \R^2 and \R^3 in Chapter 14. As we reminded ourselves in Discovery 37.1, the orthogonal complement of a plane (through the origin) in space is just the normal line to the plane, and symmetrically the orthogonal complement of a line (through the origin) in space is just the plane to which the line is normal. This symmetric relationship between planes and their normal lines suggests a general pattern of \orthogcmp{(\orthogcmp{U})} = U\text{,} which we will confirm in Corollary 37.5.20.

An orthogonal complement is always a subspace of the inner product space. Note that this fact does not depend on the initial collection U of vectors actually being a subspace: for every collection X of vectors in an inner product space, the collection \orthogcmp{X} of all vectors that are orthogonal to every vector in X will form a subspace as well. (See Proposition 37.5.13.) But in the case that U is a subspace, the ``complement'' part of ``orthogonal complement'' is intentional: the pair of subspaces U, \orthogcmp{U} will always form a complete set of independent subspaces in a finite-dimensional inner product space. (See Corollary 37.5.19.)
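To make the membership condition concrete, here is a minimal numerical sketch in Python, assuming \R^3 with the standard dot product; the function names are illustrative, not from the text. A vector belongs to \orthogcmp{U} exactly when it is orthogonal to every vector in a basis for U\text{.}

```python
# Numerical sketch in R^3 with the standard dot product.
# A vector w lies in the orthogonal complement of U exactly when
# it is orthogonal to every vector in a basis for U.

def dot(u, v):
    """Standard inner product on R^n."""
    return sum(ui * vi for ui, vi in zip(u, v))

def in_orthogonal_complement(w, basis_for_U, tol=1e-12):
    """Check w for orthogonality against each basis vector of U."""
    return all(abs(dot(w, u)) <= tol for u in basis_for_U)

# U is the plane spanned by (1, 0, 0) and (0, 1, 0);
# its orthogonal complement is the z-axis.
basis_U = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
print(in_orthogonal_complement((0.0, 0.0, 5.0), basis_U))  # True
print(in_orthogonal_complement((1.0, 0.0, 1.0), basis_U))  # False
```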
As we explored in Discovery 37.2, the Linearity of inner products implies that when U is a subspace, it is enough to check inclusion in \orthogcmp{U} by checking orthogonality against each vector in some basis for U\text{.} We saw in Discovery 37.3 how this fact can be used to set up a homogeneous system that will lead from a basis for U to a basis for \orthogcmp{U}\text{.} (Also see the examples in Subsection 37.4.1.)

Subsection 37.3.2 Expansion relative to an orthogonal basis
An orthogonal basis for an inner product space is one where each vector in the basis is orthogonal to every other vector in the basis. We are already familiar with the concept of orthogonal basis from our experience in \R^n: the standard basis is always an orthogonal basis (in fact, an orthonormal basis) when using the standard inner product. A basis affords a unique expression (or ``expansion'') for each vector in a vector space as a linear combination of basis vectors (Theorem 19.5.3). In Discovery 37.5, we found that the zero result of an inner product for orthogonal vectors, along with the Linearity of inner products, leads to a pattern for the coefficients in the expansion for a vector relative to an orthogonal basis
\begin{equation*}
\basisfont{B} = \{ \uvec{e}_1, \uvec{e}_2, \dotsc, \uvec{e}_n \}
\end{equation*}
in a finite-dimensional inner product space:
\begin{equation*}
\uvec{v}
= \frac{\inprod{\uvec{v}}{\uvec{e}_1}}{\norm{\uvec{e}_1}^2} \, \uvec{e}_1
+ \frac{\inprod{\uvec{v}}{\uvec{e}_2}}{\norm{\uvec{e}_2}^2} \, \uvec{e}_2
+ \dotsb
+ \frac{\inprod{\uvec{v}}{\uvec{e}_n}}{\norm{\uvec{e}_n}^2} \, \uvec{e}_n\text{.}
\end{equation*}
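As a quick numerical illustration of this expansion formula, here is a Python sketch in \R^2 with the standard dot product; the helper names and the particular basis are illustrative only.

```python
# Numerical sketch: expand v relative to an orthogonal basis of R^2,
# using coefficients <v, e_j> / ||e_j||^2 (standard dot product).

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def expansion_coefficients(v, orthogonal_basis):
    return [dot(v, e) / dot(e, e) for e in orthogonal_basis]

# An orthogonal (but not orthonormal) basis of R^2.
B = [(1.0, 1.0), (1.0, -1.0)]
v = (3.0, 5.0)
coeffs = expansion_coefficients(v, B)
# Reconstruct v from the expansion: sum of coeff_j * e_j.
reconstructed = tuple(sum(c * e[i] for c, e in zip(coeffs, B)) for i in range(2))
print(coeffs)          # [4.0, -1.0]
print(reconstructed)   # (3.0, 5.0)
```

No row reduction is needed: each coefficient comes from one inner product and one norm.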
Warning 37.3.1.
Because of the conjugate-symmetry of complex inner products (Axiom CIP 1), the order \inprod{\uvec{v}}{\uvec{e}_j} in each scalar numerator in the general expansion above is important in order to maintain consistency between the real and complex contexts.
In the special case that the basis \basisfont{B} is orthonormal, each \norm{\uvec{e}_j} = 1\text{,} and the expansion simplifies to
\begin{equation*}
\uvec{v}
= \inprod{\uvec{v}}{\uvec{e}_1} \, \uvec{e}_1
+ \inprod{\uvec{v}}{\uvec{e}_2} \, \uvec{e}_2
+ \dotsb
+ \inprod{\uvec{v}}{\uvec{e}_n} \, \uvec{e}_n\text{.}
\end{equation*}
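The importance of the argument order can be seen numerically. The following Python sketch (illustrative code, not part of the text) uses the standard complex inner product on pairs of complex numbers, \inprod{\uvec{u}}{\uvec{v}} = \sum_j u_j \bar{v}_j: with an orthonormal basis, the coefficients \inprod{\uvec{v}}{\uvec{e}_j} recover \uvec{v}\text{,} while reversing the arguments conjugates the coefficients and fails.

```python
# Numerical sketch in C^2 with the standard complex inner product
# <u, v> = sum_j u_j * conjugate(v_j). For an orthonormal basis, the
# correct coefficient is <v, e_j>; reversing the order conjugates it.

def cinprod(u, v):
    return sum(ui * vi.conjugate() for ui, vi in zip(u, v))

# Standard (orthonormal) basis of C^2.
e1, e2 = (1 + 0j, 0j), (0j, 1 + 0j)
v = (2 + 3j, 1 - 1j)

# Coefficients <v, e_j> recover v.
expansion = tuple(cinprod(v, e1) * e1[i] + cinprod(v, e2) * e2[i] for i in range(2))
print(expansion == v)  # True

# Reversed arguments give conjugated coefficients, so v is not recovered.
wrong = tuple(cinprod(e1, v) * e1[i] + cinprod(e2, v) * e2[i] for i in range(2))
print(wrong == v)      # False
```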
Subsection 37.3.3 The Gram-Schmidt orthogonalization process
The point of a basis for a vector space is to have a means to uniquely describe all of the infinitely many vectors in the space as linear combinations of a finite number of basis vectors. We just saw in Subsection 37.3.2 that, relative to an orthogonal basis for a finite-dimensional inner product space, determining that linear combination for a given vector is particularly easy: no row reducing necessary, just compute some inner products and norms. So being able to produce an orthogonal basis for an inner product space is of particular value. In Discovery 37.6, we attempted to reinvent a procedure by which we can convert a basis into an orthogonal one. (And then, if desired, into an orthonormal one by normalizing each orthogonal basis vector.)

Procedure 37.3.2. Gram-Schmidt orthogonalization.
Given a basis
\begin{equation*}
\basisfont{B}_0 = \{ \uvec{v}_1, \uvec{v}_2, \dotsc, \uvec{v}_n \}
\end{equation*}
of a finite-dimensional inner product space, to construct an orthogonal basis
\begin{equation*}
\basisfont{B} = \{ \uvec{e}_1, \uvec{e}_2, \dotsc, \uvec{e}_n \}
\end{equation*}
for that space, calculate
\begin{align*}
\uvec{e}_1 \amp = \uvec{v}_1 \text{,} \\
\uvec{e}_2 \amp = \uvec{v}_2 - \frac{\inprod{\uvec{v}_2}{\uvec{e}_1}}{\norm{\uvec{e}_1}^2} \, \uvec{e}_1 \text{,} \\
\uvec{e}_3 \amp = \uvec{v}_3 - \left(
\frac{\inprod{\uvec{v}_3}{\uvec{e}_1}}{\norm{\uvec{e}_1}^2} \, \uvec{e}_1
+ \frac{\inprod{\uvec{v}_3}{\uvec{e}_2}}{\norm{\uvec{e}_2}^2} \, \uvec{e}_2
\right)
\text{,}\\
\amp \vdots \\
\uvec{e}_n \amp = \uvec{v}_n - \left(
\frac{\inprod{\uvec{v}_n}{\uvec{e}_1}}{\norm{\uvec{e}_1}^2} \, \uvec{e}_1
+ \dotsb
+ \frac{\inprod{\uvec{v}_n}{\uvec{e}_{n-1}}}{\norm{\uvec{e}_{n-1}}^2} \, \uvec{e}_{n-1}
\right)\text{.}
\end{align*}
If an orthonormal basis is desired, each of the \uvec{e}_j can be normalized to unit vectors.
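The procedure translates directly into code. Here is an illustrative Python implementation for \R^n with the standard dot product; it is a sketch of the computation above, with names of our own choosing.

```python
# Illustrative Gram-Schmidt orthogonalization in R^n with the standard
# dot product, following the procedure above: each e_j is v_j minus its
# projections onto the previously constructed e_1, ..., e_{j-1}.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def gram_schmidt(vectors):
    """Convert a basis (list of tuples) into an orthogonal basis."""
    orthogonal = []
    for v in vectors:
        w = list(v)
        for e in orthogonal:
            # Coefficient <v, e> / ||e||^2 from the procedure.
            coeff = dot(v, e) / dot(e, e)
            w = [wi - coeff * ei for wi, ei in zip(w, e)]
        orthogonal.append(tuple(w))
    return orthogonal

B0 = [(1.0, 1.0, 0.0), (1.0, 0.0, 1.0), (0.0, 1.0, 1.0)]
B = gram_schmidt(B0)
# Every pair of distinct vectors in B should be orthogonal.
print(all(abs(dot(B[i], B[j])) < 1e-12
          for i in range(3) for j in range(i + 1, 3)))  # True
```

If an orthonormal basis is wanted, each output vector can then be divided by its norm.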
Remark 37.3.3.
If aiming to produce an orthonormal basis, the \uvec{e}_j can be normalized at any point in the process, either all at the end or one at a time as they are produced. Both choices have pros and cons.

The procedure also has a geometric interpretation. At the second step, the scalar coefficient applied to \uvec{e}_1 is
\begin{equation*}
\frac{\inprod{\uvec{v}_2}{\uvec{e}_1}}{\norm{\uvec{e}_1}^2} \text{.}
\end{equation*}
In other words, the portion of \uvec{v}_2 parallel to \uvec{e}_1 is
\begin{equation*}
\widetilde{\uvec{v}}_2 = \frac{\inprod{\uvec{v}_2}{\uvec{e}_1}}{\norm{\uvec{e}_1}^2} \, \uvec{e}_1 \text{.}
\end{equation*}
Compare with the formula for orthogonal projection of a vector \uvec{u} onto the line spanned by a nonzero vector \uvec{a} in \R^n:
\begin{equation*}
\proj_{\uvec{a}} \uvec{u} = \frac{\udotprod{u}{a}}{\unorm{a}^2} \, \uvec{a} \text{.}
\end{equation*}
And from our experience with orthogonal projection onto a line, the vector running from the head of \widetilde{\uvec{v}}_2 to the head of \uvec{v}_2 should be both a difference vector
\begin{equation*}
\uvec{v}_2 - \widetilde{\uvec{v}}_2
\end{equation*}
and a normal vector to the line spanned by \uvec{e}_1\text{.}
Similarly, at the third step of the process, the orthogonal vectors \uvec{e}_1 and \uvec{e}_2 span a plane, and the vector
\begin{equation*}
\widetilde{\uvec{v}}_3
= \frac{\inprod{\uvec{v}_3}{\uvec{e}_1}}{\norm{\uvec{e}_1}^2} \, \uvec{e}_1
+ \frac{\inprod{\uvec{v}_3}{\uvec{e}_2}}{\norm{\uvec{e}_2}^2} \, \uvec{e}_2
\end{equation*}
represents the portion of \uvec{v}_3 that lies parallel to that plane.
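A quick numerical check of this geometric picture, as a Python sketch in \R^3 with the standard dot product (the specific vectors are just an example): subtracting from \uvec{v}_3 its projections onto \uvec{e}_1 and \uvec{e}_2 leaves a vector normal to the plane they span.

```python
# Numerical check in R^3 (standard dot product): subtracting the
# projections of v3 onto orthogonal vectors e1 and e2 leaves a vector
# normal to the plane they span.

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

e1 = (1.0, 1.0, 0.0)
e2 = (1.0, -1.0, 0.0)   # orthogonal to e1; together they span the xy-plane
v3 = (2.0, 3.0, 4.0)

# The portion of v3 parallel to the plane spanned by e1, e2.
proj = tuple(
    dot(v3, e1) / dot(e1, e1) * e1[i] + dot(v3, e2) / dot(e2, e2) * e2[i]
    for i in range(3)
)
normal = tuple(v3[i] - proj[i] for i in range(3))
print(normal)  # (0.0, 0.0, 4.0): orthogonal to both e1 and e2
```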
Subsection 37.3.4 Orthogonal complements from an orthogonal basis
In Discovery 37.8.a, we explored applying the Gram-Schmidt orthogonalization process using the basis of a subspace as the starting point, instead of a basis for the whole inner product space. Suppose U is a subspace of an inner product space, with basis \basisfont{B}_U = \{ \uvec{u}_1, \dotsc, \uvec{u}_m \} \text{.} When we apply the orthogonalization process to just this basis, we start by setting
\begin{equation*}
\uvec{e}_1 = \uvec{u}_1 \text{,}
\end{equation*}
so \uvec{e}_1 is in U\text{.} Then we set
\begin{equation*}
\uvec{e}_2 = \uvec{u}_2 - \frac{\inprod{\uvec{u}_2}{\uvec{e}_1}}{\norm{\uvec{e}_1}^2} \, \uvec{e}_1 \text{,}
\end{equation*}
a linear combination of vectors in U\text{.} So \uvec{e}_2 is also in U\text{.} And so on: at each step the next vector \uvec{e}_j is a linear combination of a vector from \basisfont{B}_U and the previous vectors \uvec{e}_1,\dotsc,\uvec{e}_{j-1}\text{.} If each of those previous vectors is in U\text{,} then so will the next be.
So the end result will be an orthogonal set of vectors in U which contains the same number of vectors as \basisfont{B}_U\text{.} Since orthogonal vectors are independent, this is enough to conclude that the Gram-Schmidt process, applied to a basis for a subspace, will produce an orthogonal basis for that subspace.
As in Discovery 37.8.b, let's take this one step further. Write V for the whole inner product space. The basis \basisfont{B}_U for U can be enlarged to a basis
\begin{equation*}
\basisfont{B}_V = \{ \uvec{u}_1, \dotsc, \uvec{u}_m, \uvec{v}_1, \dotsc, \uvec{v}_\ell \}
\end{equation*}
for V (Statement 1 of Proposition 20.5.8). As we apply the Gram-Schmidt process to \basisfont{B}_V\text{,} the first m steps will not involve the \uvec{v}_j\text{,} and we will effectively be applying the process to \basisfont{B}_U to obtain an orthogonal basis
\begin{equation*}
\basisfont{B}_U' = \{ \uvec{e}_1, \dotsc, \uvec{e}_m \}
\end{equation*}
of U\text{.} We then continue the process to obtain an orthogonal basis
\begin{equation*}
\basisfont{B}_V' = \{ \uvec{e}_1, \dotsc, \uvec{e}_m, \uvec{f}_1, \dotsc, \uvec{f}_\ell \}
\end{equation*}
for V\text{.} But then each of these new \uvec{f}_j vectors is orthogonal to each of the previous \uvec{e}_i\text{.} Using our thinking from Discovery 37.2, this means that each \uvec{f}_j is in \orthogcmp{U}\text{.} And in fact, the collection of the \uvec{f}_j will form an orthogonal basis for \orthogcmp{U}\text{.}
We can say that every orthogonal complement pair U,\orthogcmp{U} in a finite-dimensional inner product space occurs this way. That is, we can create an orthogonal complement pair U,\orthogcmp{U} by taking an orthogonal basis
\begin{equation*}
\basisfont{B} = \{ \uvec{e}_1, \uvec{e}_2, \dotsc, \uvec{e}_n \}
\end{equation*}
for the whole space V and choosing an index \ell at which to split it in two:
\begin{align*}
\basisfont{B}_U \amp = \{ \uvec{e}_1, \dotsc, \uvec{e}_\ell \} \text{,} \amp
\basisfont{B}_{\orthogcmp{U}} \amp = \{ \uvec{e}_{\ell + 1}, \dotsc, \uvec{e}_n \} \text{.}
\end{align*}
As indicated by the subscripts, the second ``half'' of the original set of basis vectors will form an orthogonal basis for the orthogonal complement of the subspace spanned by the first ``half'' of the original basis vectors:
\begin{align*}
U \amp = \Span \{ \uvec{e}_1, \dotsc, \uvec{e}_\ell \} \text{,} \amp
\orthogcmp{U} \amp = \Span \{ \uvec{e}_{\ell + 1}, \dotsc, \uvec{e}_n \} \text{.}
\end{align*}
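Finally, this splitting can be checked numerically. The following Python sketch uses the standard dot product on \R^3; the orthogonal basis and the split index are illustrative choices of ours.

```python
# Sketch: splitting an orthogonal basis of R^3 at an index l gives an
# orthogonal basis for a subspace U (the first l vectors) and an
# orthogonal basis for its orthogonal complement (the rest).

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

# An orthogonal basis for R^3 (not the standard one).
B = [(1.0, 1.0, 0.0), (1.0, -1.0, 0.0), (0.0, 0.0, 2.0)]
l = 2
B_U, B_Uperp = B[:l], B[l:]

# Every basis vector of U-perp is orthogonal to every basis vector of U.
print(all(abs(dot(e, f)) < 1e-12 for e in B_U for f in B_Uperp))  # True
```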