Section 37.4 Examples
Subsection 37.4.1 Orthogonal complements
Example 37.4.1. Determining a basis for an orthogonal complement.
Let's carry out the example explored in Discovery 37.3, where we considered the space \(V = \matrixring_{2 \times 2}(\R)\) equipped with the inner product \(\inprod{A}{B} = \trace(\utrans{B} A) \text{,}\) and the subspace \(U\) of all upper triangular matrices whose upper-right entry is equal to its trace.
A typical element of \(U\) can be described parametrically as
The parameters \(x,y\) have no further dependence relation between them, so they each have an associated basis vector:
To be an element of \(\orthogcmp{U}\text{,}\) an arbitrary element
of \(\matrixring_{2 \times 2} (\R)\) must be orthogonal to each of the basis vectors \(A_1,A_2\) for \(U\) above. We have
For orthogonality, we need both of these results to be zero, leading to homogeneous system
Since parameter \(b\) appears in both equations, we choose to leave that free, and then have
Parameter \(c\) does not appear in the system, hence must be free as well. So a typical element of \(\orthogcmp{U}\) can be described parametrically as
leading to
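As a numerical check on this example, the following Python sketch rebuilds the computation from the definitions stated above. The matrices `A1`, `A2` are derived here from the description of \(U\) (upper triangular, upper-right entry equal to the trace, so a typical element is \(\left[\begin{smallmatrix} x \amp x + y \\ 0 \amp y \end{smallmatrix}\right]\)); the candidate complement basis `B1`, `B2` comes from solving the resulting homogeneous system, and should be treated as a reconstruction rather than a quotation of the text's displayed equations.

```python
import numpy as np

# Basis matrices for U, derived from the parametric description
# [[x, x+y], [0, y]] of a typical element (reconstruction, not quoted).
A1 = np.array([[1.0, 1.0], [0.0, 0.0]])  # from parameter x
A2 = np.array([[0.0, 1.0], [0.0, 1.0]])  # from parameter y

def inprod(A, B):
    # The inner product <A, B> = trace(B^T A) from the example.
    return np.trace(B.T @ A)

# For X = [[a, b], [c, d]], orthogonality gives <X, A1> = a + b = 0 and
# <X, A2> = b + d = 0 with c free, yielding this candidate basis for
# the orthogonal complement of U:
B1 = np.array([[-1.0, 1.0], [0.0, -1.0]])  # from free parameter b
B2 = np.array([[0.0, 0.0], [1.0, 0.0]])    # from free parameter c

# Every complement basis vector is orthogonal to every U basis vector.
for W_vec in (B1, B2):
    for U_vec in (A1, A2):
        assert abs(inprod(W_vec, U_vec)) < 1e-12
print("complement basis is orthogonal to U")
```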
Example 37.4.2. Determining a basis for an orthogonal complement in a complex space.
Consider space \(\C^4\) with the complex dot product, and subspace
Let \(\uvec{w}_1,\uvec{w}_2\) represent the two spanning vectors above.
To be in \(\orthogcmp{W}\text{,}\) an arbitrary vector \(\uvec{z} = (z_1,z_2,z_3,z_4)\) in \(\C^4\) must be orthogonal to each of \(\uvec{w}_1,\uvec{w}_2\text{.}\) So compute
Setting the above expressions equal to zero for orthogonality leads to homogeneous system
So \(z_3,z_4\) are free, \(z_2 = 0\text{,}\) and
so that a typical vector in \(\orthogcmp{W}\) can be described parametrically as
Finally, the above parametric expression for elements of \(\orthogcmp{W}\) leads to
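The same kind of computation can be sketched numerically. Since the displayed spanning vectors \(\uvec{w}_1,\uvec{w}_2\) are not reproduced here, the vectors below are hypothetical stand-ins; the point of the sketch is the method: in the complex dot product, \(\inprod{\uvec{z}}{\uvec{w}} = 0\) means \(\overline{\uvec{w}}\) dotted against \(\uvec{z}\) vanishes, so \(\orthogcmp{W}\) is the null space of the matrix whose rows are the conjugated spanning vectors.

```python
import numpy as np

# Hypothetical spanning vectors for W (the text's w1, w2 are not
# reproduced here).
w1 = np.array([1, 1j, 0, 0])
w2 = np.array([0, 1, 1j, 1])

# <z, w> = 0 means conj(w) . z = 0, so W-perp is the null space of the
# matrix whose rows are the conjugated spanning vectors.
A = np.conj(np.vstack([w1, w2]))
_, s, Vh = np.linalg.svd(A)
rank = int(np.sum(s > 1e-12))
perp_basis = np.conj(Vh[rank:])   # rows form a basis for W-perp

# Each basis vector of W-perp is orthogonal to w1 and w2:
# np.vdot(w, z) computes conj(w) . z, i.e. <z, w>.
for z in perp_basis:
    for w in (w1, w2):
        assert abs(np.vdot(w, z)) < 1e-10
print(f"dim W-perp = {len(perp_basis)}")  # 4 - dim W
```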
Subsection 37.4.2 Expansion relative to an orthogonal basis
Here we will provide an example of using the pattern of Discovery 37.5 (discussed in Subsection 37.3.2) to express a vector as a linear combination of orthogonal basis vectors.
Example 37.4.3.
Consider \(V = \poly_2(\R)\) equipped with inner product
The collection
forms an orthogonal basis \(\basisfont{B}\) for \(V\text{.}\)
What is the coordinate vector \(\rmatrixOf{q}{B}\) for \(q(x) = 3 x^2 + 3 x + 3 \text{?}\)
Compute
We also need the norms
Putting these together, we have
so that
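The expansion pattern itself — each coordinate is \(\inprod{q}{e_i} / \norm{e_i}^2\) — can be illustrated in a short symbolic sketch. Since this excerpt does not reproduce the text's inner product or basis \(\basisfont{B}\text{,}\) the sketch assumes the inner product \(\inprod{p}{q} = \int_{-1}^{1} p(x) q(x) \, dx\) and uses scaled Legendre polynomials, which are orthogonal under that assumed inner product; the coordinates it produces are for that setup, not necessarily those of the example above.

```python
import sympy as sp

x = sp.symbols('x')

# Assumed inner product (the text's is not reproduced here):
# <p, q> = integral of p(x) q(x) over [-1, 1].
def inprod(p, q):
    return sp.integrate(p * q, (x, -1, 1))

# An orthogonal basis for P2(R) under this assumed inner product
# (scaled Legendre polynomials), standing in for the text's basis B.
B = [sp.Integer(1), x, 3*x**2 - 1]
q = 3*x**2 + 3*x + 3

# Expansion relative to an orthogonal basis: c_i = <q, e_i> / ||e_i||^2.
coords = [sp.simplify(inprod(q, e) / inprod(e, e)) for e in B]
print(coords)  # [4, 3, 1]

# The coordinates really do reassemble q.
assert sp.expand(sum(c * e for c, e in zip(coords, B)) - q) == 0
```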
Subsection 37.4.3 Using the Gram-Schmidt orthogonalization process
Example 37.4.4. Producing an orthogonal basis.
Consider \(V = \poly_2(\R)\) equipped with inner product
Begin with the standard basis
Following the Gram-Schmidt orthogonalization process, set
In order to compute \(e_2(x)\text{,}\) first compute
Then we set
Computing \(e_3(x)\) will require quite a few more calculations:
Using these results, compute
We now have orthogonal basis
If we would like an orthonormal basis, we need one further calculation:
Using \(\sqrt{12} = 2\sqrt{3}\) and \(\sqrt{180} = 6\sqrt{5}\text{,}\) we can normalize to
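The Gram-Schmidt recursion used above can be sketched symbolically. The text's inner product on \(\poly_2(\R)\) (and hence its particular norms \(\sqrt{12}\) and \(\sqrt{180}\)) is not reproduced in this excerpt, so the sketch assumes \(\inprod{p}{q} = \int_{-1}^{1} p(x) q(x) \, dx\) instead; starting from the standard basis \(\{1, x, x^2\}\text{,}\) it carries out the same process of subtracting projections onto the earlier orthogonal vectors.

```python
import sympy as sp

x = sp.symbols('x')

# Assumed inner product (the text's is not reproduced here):
# <p, q> = integral of p(x) q(x) over [-1, 1].
def inprod(p, q):
    return sp.integrate(p * q, (x, -1, 1))

def gram_schmidt(vectors):
    # Gram-Schmidt: subtract from each vector its projections onto the
    # earlier orthogonal vectors.
    ortho = []
    for v in vectors:
        e = v - sum(inprod(v, u) / inprod(u, u) * u for u in ortho)
        ortho.append(sp.expand(e))
    return ortho

basis = gram_schmidt([sp.Integer(1), x, x**2])
print(basis)  # [1, x, x**2 - 1/3]

# Verify pairwise orthogonality of the result.
for i in range(3):
    for j in range(i + 1, 3):
        assert inprod(basis[i], basis[j]) == 0
```

As in the example above, clearing fractions (here, replacing \(x^2 - 1/3\) by \(3x^2 - 1\)) does not affect orthogonality.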
Example 37.4.5. Producing an orthogonal basis for a complex inner product space.
Let's apply the Gram-Schmidt orthogonalization process to \(\C^4\text{,}\) but starting with the basis
which is made up of the basis vectors for \(W\) and \(\orthogcmp{W}\) from Example 37.4.2.
Let \(\uvec{w}_1,\uvec{w}_2,\uvec{w}_3,\uvec{w}_4\) represent the four initial basis vectors above. Set \(\uvec{e}_1 = \uvec{w}_1\text{,}\) and compute
Normally, we would now set \(\uvec{e}_2\) to be
However, scalar multiples do not affect orthogonality, so let's clear the fractions by multiplying by \(2\text{,}\) giving
From here, we already know from Example 37.4.2 that \(\uvec{w}_3,\uvec{w}_4\) are in \(\orthogcmp{W}\) so they will produce inner products of \(0\) against both \(\uvec{e}_1,\uvec{e}_2\text{.}\) This means the formulas for \(\uvec{e}_3,\uvec{e}_4\) in the Gram-Schmidt orthogonalization process will reduce to
So using \(\uvec{e}_3 = \uvec{w}_3\text{,}\) calculate
from which we would normally set \(\uvec{e}_4\) to be
but again we will clear fractions to obtain
Putting all four vectors together gives us orthogonal basis
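The complex version of the process can be sketched numerically as well. The starting vectors below are hypothetical (the text's \(\uvec{w}_1,\dotsc,\uvec{w}_4\) are not reproduced here); the point is that in a complex inner product space the projection coefficients must use the conjugate-aware inner product, which `np.vdot` provides.

```python
import numpy as np

# np.vdot(u, w) computes conj(u) . w, i.e. <w, u> in the
# complex-dot-product convention, so the projection of w onto u is
# (np.vdot(u, w) / np.vdot(u, u)) * u.
def gram_schmidt(vectors):
    ortho = []
    for w in vectors:
        e = w.astype(complex)
        for u in ortho:
            e = e - (np.vdot(u, w) / np.vdot(u, u)) * u
        ortho.append(e)
    return ortho

# Hypothetical linearly independent starting vectors in C^4.
ws = [np.array(v) for v in ([1, 1j, 0, 0], [0, 1, 1j, 1],
                            [1, 0, 0, 0], [0, 0, 0, 1])]
es = gram_schmidt(ws)

# The resulting vectors are pairwise orthogonal.
for i in range(4):
    for j in range(i + 1, 4):
        assert abs(np.vdot(es[i], es[j])) < 1e-10
print("orthogonal basis for C^4 produced")
```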
Subsection 37.4.4 Obtaining an orthogonal complement using the Gram-Schmidt process
As discussed in Subsection 37.3.4, orthogonal bases tell us about subspaces and their complements. Since the Gram-Schmidt process is our main tool for obtaining orthogonal bases, we can use the process to determine orthogonal complements.
Example 37.4.6.
Let's revisit Example 37.4.1, where we considered subspace
of \(V = \matrixring_{2 \times 2}(\R)\text{.}\) (We equip \(V\) with the standard inner product \(\inprod{A}{B} = \trace (\utrans{B} A)\text{.}\))
The basis for \(U\) can be enlarged into a basis for \(V\) by including a couple of standard basis vectors:
We won't go through all the calculations this time, but applying the Gram-Schmidt orthogonalization process to \(\basisfont{B}_0\) (and clearing fractions along the way) results in orthogonal basis
As \(\dim U\) was \(2\) in the first place, the first two vectors in \(\basisfont{B}\) form a basis for \(U\text{.}\) And as the entire basis \(\basisfont{B}\) is an orthogonal set, the last two vectors in \(\basisfont{B}\) must form a basis for \(\orthogcmp{U}\text{.}\) That is, we can split \(\basisfont{B}\) in two to obtain an orthogonal basis for each of \(U,\orthogcmp{U}\text{:}\)
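The enlarge-then-split strategy can be checked numerically. With the trace inner product, \(\inprod{A}{B} = \trace(\utrans{B} A)\) is just the entrywise dot product, so \(2 \times 2\) matrices can be flattened to vectors in \(\R^4\text{.}\) The matrices `A1`, `A2` below are reconstructed from the parametric description of \(U\) in Example 37.4.1, and `E1`, `E2` are one possible (assumed) choice of standard basis matrices that enlarges them to a basis of \(V\text{.}\)

```python
import numpy as np

# Flattened 2x2 matrices; the trace inner product becomes the ordinary
# dot product on R^4.
A1 = np.array([1.0, 1.0, 0.0, 0.0])  # [[1,1],[0,0]], from U (reconstructed)
A2 = np.array([0.0, 1.0, 0.0, 1.0])  # [[0,1],[0,1]], from U (reconstructed)
E1 = np.array([1.0, 0.0, 0.0, 0.0])  # standard basis matrix (assumed choice)
E2 = np.array([0.0, 0.0, 1.0, 0.0])  # standard basis matrix (assumed choice)

def gram_schmidt(vectors):
    ortho = []
    for v in vectors:
        e = v.copy()
        for u in ortho:
            e = e - (u @ v) / (u @ u) * u
        ortho.append(e)
    return ortho

B = gram_schmidt([A1, A2, E1, E2])

# Since dim U = 2, the first two vectors span U and the last two span
# U-perp; check the split really is orthogonal to U.
U_basis, Uperp_basis = B[:2], B[2:]
for w in Uperp_basis:
    for a in (A1, A2):
        assert abs(w @ a) < 1e-12
print("last two Gram-Schmidt vectors lie in U-perp")
```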
Example 37.4.7.
Now we'll revisit Example 37.4.2, where we considered the subspace
of \(\C^4\text{.}\)
In Example 37.4.5, we created an orthogonal basis for \(\C^4\) by applying the Gram-Schmidt process to an initial basis formed by joining a basis for \(W\) with a basis for \(\orthogcmp{W}\text{.}\) So we could obtain orthogonal bases for both \(W,\orthogcmp{W}\) by splitting that orthogonal \(\C^4\) basis apart. However, in this example we will proceed as if we did not initially know any basis for \(\orthogcmp{W}\text{.}\)
The above basis for \(W\) can be enlarged into a basis for \(\C^4\) by including two of the standard basis vectors:
Applying the Gram-Schmidt orthogonalization process to \(\basisfont{B}_0\) (and clearing fractions along the way) results in orthogonal basis
As \(\dim W\) was \(2\) in the first place, similarly to Example 37.4.6 we can split \(\basisfont{B}\) in two to obtain an orthogonal basis for each of \(W,\orthogcmp{W}\text{:}\)
Subsection 37.4.5 An infinite orthogonal set
In Example 36.5.7, we showed that the sine and cosine functions are orthogonal with respect to a certain inner product on a space of continuous functions. We can extend that orthogonal pair into an infinite orthogonal set by varying the frequency of the sine and cosine functions.
Example 37.4.8. An infinite orthogonal set of sines and cosines.
Consider the space \(V = C[0,1]\) of all continuous functions defined on the interval \(0 \le x \le 1\text{,}\) equipped with the inner product
Then the infinite set
is an infinite orthogonal set in \(V\text{.}\)
To verify this, assume \(m,n\) are integers with \(m \neq n\text{.}\) Setting \(k = m - n\) and \(K = m + n\) for convenience, we calculate
The first two expressions evaluate to zero because sine evaluates to zero at every integer multiple of \(2 \pi\text{.}\) On the other hand, cosine evaluates to one at every integer multiple of \(2 \pi\text{,}\) so the numerator of the third expression becomes
and the numerator of the fourth expression also obviously evaluates to zero.
Since all possible combinations of different functions from \(S\) evaluate to zero in the inner product, \(S\) is an orthogonal set.
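The orthogonality of these sines and cosines can also be checked numerically. This excerpt does not reproduce the displayed inner product or the set \(S\text{,}\) so the sketch assumes the inner product is \(\inprod{f}{g} = \int_0^1 f(x) g(x) \, dx\) and that \(S\) contains the functions \(\sin(2 \pi n x)\) and \(\cos(2 \pi n x)\) for positive integers \(n\text{;}\) under those assumptions, every pair of distinct functions integrates to (numerically) zero.

```python
import numpy as np
from scipy.integrate import quad

# Assumed inner product on C[0,1]: <f, g> = integral of f(x) g(x) dx
# over [0, 1] (the text's displayed formula is not reproduced here).
def inprod(f, g):
    val, _ = quad(lambda x: f(x) * g(x), 0.0, 1.0)
    return val

# Assumed members of S: sin(2*pi*n*x) and cos(2*pi*n*x), n = 1, 2, 3.
funcs = []
for n in range(1, 4):
    funcs.append(lambda x, n=n: np.sin(2 * np.pi * n * x))
    funcs.append(lambda x, n=n: np.cos(2 * np.pi * n * x))

# All distinct pairs are numerically orthogonal.
for i in range(len(funcs)):
    for j in range(i + 1, len(funcs)):
        assert abs(inprod(funcs[i], funcs[j])) < 1e-10
print("all distinct pairs orthogonal")
```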