Section 38.3 Examples
Subsection 38.3.1 Orthogonal projection
Example 38.3.1. Using the Expansion theorem to compute an orthogonal decomposition.
Equip the space \(V = \matrixring_{2 \times 2}(\R)\) with the standard inner product \(\inprod{A}{B} = \trace (\utrans{B} A)\text{,}\) and consider the subspace \(U\) of \(V\) consisting of upper-triangular matrices with \((1,2)\) entry equal to the trace of the matrix.
We explored this subspace in Discovery 37.3 and in Example 37.4.1, and we produced an orthogonal basis for \(U\) in Example 37.4.6. Write \(A_1\) and \(A_2\) for these orthogonal basis vectors, respectively.
Using this orthogonal basis, we can (for example) compute \(\proj_U I\text{,}\) where \(I\) is the \(2 \times 2\) identity matrix.
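Relative to the orthogonal basis \(\{A_1, A_2\}\) of \(U\text{,}\) the Expansion theorem lets us compute the projection as
\begin{equation*}
\proj_U I = \frac{\inprod{I}{A_1}}{\norm{A_1}^2} \, A_1 + \frac{\inprod{I}{A_2}}{\norm{A_2}^2} \, A_2 \text{.}
\end{equation*}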
Let's separately compute:
With these values in hand, we can compute
By inspection we can see that
and a quick calculation confirms
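If you would like to check the arithmetic numerically, here is a short Python sketch (not part of the original example; it uses NumPy). It builds an orthogonal basis for \(U\) by applying Gram-Schmidt to an obvious spanning set of upper-triangular matrices whose \((1,2)\) entry equals the trace, then computes \(\proj_U I\) with the trace inner product. The basis it produces may differ from \(A_1, A_2\) in Example 37.4.6 by scaling, but the projection is the same.

import numpy as np

def ip(A, B):
    # Standard inner product <A, B> = trace(B^T A).
    return np.trace(B.T @ A)

# Spanning set for U: upper-triangular with (1,2) entry equal to the trace.
# A general element is [[a, a+d], [0, d]] = a*M1 + d*M2.
M1 = np.array([[1.0, 1.0], [0.0, 0.0]])
M2 = np.array([[0.0, 1.0], [0.0, 1.0]])

# Gram-Schmidt: orthogonalize M2 against M1 (may differ from A1, A2 by scaling).
B1 = M1
B2 = M2 - (ip(M2, B1) / ip(B1, B1)) * B1

I = np.eye(2)

# Expansion-theorem projection of I onto U.
projU_I = (ip(I, B1) / ip(B1, B1)) * B1 + (ip(I, B2) / ip(B2, B2)) * B2
print(projU_I)       # should print (approximately) [[1/3, 2/3], [0, 1/3]]
print(I - projU_I)   # the component of I orthogonal to U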
Subsection 38.3.2 Best approximation
Example 38.3.2. Approximating sine with a quadratic polynomial.
Consider the horizontally compressed sine function
What quadratic polynomial best approximates \(f(x)\) over the interval \(0 \le x \le 1\text{?}\)
Looking at the graph of \(f(x)\text{,}\) one “naive” guess would be to choose the parabola that passes through the points
We can determine this parabola using the methods of Subsection 3.2.4 to be
Let's plot \(f(x)\) and \(q(x)\) on the same set of axes to compare.
Let's see what kind of result we can get using best approximation. Our domain of interest is \(0 \le x \le 1\text{,}\) so we are working in the vector space \(C[0,1]\text{.}\) But we can constrain our problem to the finite-dimensional subspace
And to focus on our domain of interest, let's use inner product
Our best approximation will be \(\proj_U f\text{,}\) where \(U\) is the subspace \(\poly_2(\R)\) of \(V\text{.}\) Conveniently, we have already used the Gram-Schmidt orthogonalization process to produce an orthogonal basis of \(U\) in Example 37.4.4. As in that example, write \(e_1(x),e_2(x),e_3(x)\) for these three basis polynomials, respectively.
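Relative to this orthogonal basis, the Expansion theorem again gives
\begin{equation*}
\proj_U f
= \frac{\inprod{f}{e_1}}{\norm{e_1}^2} \, e_1
+ \frac{\inprod{f}{e_2}}{\norm{e_2}^2} \, e_2
+ \frac{\inprod{f}{e_3}}{\norm{e_3}^2} \, e_3 \text{.}
\end{equation*}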
To compute \(\proj_U f\text{,}\) first separately calculate
Note that we skipped calculating \(\norm{e_2}^2\) since \(\inprod{f}{e_2} = 0\text{.}\)
Now compute
Again, let's plot \(f(x)\) and \((\proj_U f)(x)\) on the same set of axes to compare.
You can see that while this parabola does not hit the same peak as the sine graph, it stays much closer to the curve over our domain than the previous “naive” approximation.
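To reproduce this computation numerically, the following Python sketch (not part of the original example; it uses SymPy) carries out the same projection. It assumes the inner product \(\inprod{p}{q} = \int_0^1 p(x) q(x) \, dx\) and, as a stand-in for the compressed sine function, takes \(f(x) = \sin(\pi x)\text{;}\) if the example's \(f\) or inner product differ, substitute them accordingly. Note that this choice of \(f\) is consistent with the observation above that \(\inprod{f}{e_2} = 0\text{.}\)

import sympy as sp

x = sp.symbols('x')

def ip(p, q):
    # Assumed inner product on C[0,1]: integrate the product over [0, 1].
    return sp.integrate(p * q, (x, 0, 1))

# Stand-in for the compressed sine function from the example.
f = sp.sin(sp.pi * x)

# Gram-Schmidt on the standard basis 1, x, x^2 of P_2(R).
basis = [sp.Integer(1), x, x**2]
ortho = []
for p in basis:
    for e in ortho:
        p = p - (ip(p, e) / ip(e, e)) * e
    ortho.append(sp.expand(p))

# Expansion-theorem projection of f onto P_2(R).
proj = sum((ip(f, e) / ip(e, e)) * e for e in ortho)
print(sp.simplify(sp.expand(proj)))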
Subsection 38.3.3 Least-squares solutions to an inconsistent system
Example 38.3.3.
The system
is inconsistent. What values of \(x,y,z\) are closest to being a solution?
We have coefficient matrix and vector of constants
The inconsistent system
has associated normal system
where
Row reduce:
Setting the free variable \(z\) to be a parameter, we have the general solution
So there is a whole line of approximate solutions, and each \(\uvec{x}\) on this line satisfies
Let's compare by calculating \(\proj_U \uvec{b}\text{,}\) where \(U\) is the column space of \(A\text{.}\) By inspection, we can see that the third column of \(A\) is the sum of the first two, while the first two columns of \(A\) are already orthogonal. So the first two columns \(\uvec{a}_1,\uvec{a}_2\) form an orthogonal basis for the column space of \(A\text{.}\) If we compute
we then have
as expected.
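The same workflow is easy to verify numerically. Since the example's \(A\) and \(\uvec{b}\) are not repeated here, the Python sketch below (using NumPy) substitutes a stand-in matrix with the structure described above (third column equal to the sum of the first two, first two columns orthogonal) and an arbitrary \(\uvec{b}\) outside the column space. It forms the normal system \(\utrans{A} A \uvec{x} = \utrans{A} \uvec{b}\text{,}\) obtains one least-squares solution, and checks that \(A \uvec{x}\) agrees with \(\proj_U \uvec{b}\) computed from the orthogonal columns \(\uvec{a}_1, \uvec{a}_2\text{.}\)

import numpy as np

# Stand-in data (NOT the system from the example): a1 and a2 are orthogonal,
# and the third column is their sum, so the normal system has a free variable.
a1 = np.array([1.0, 1.0, 0.0])
a2 = np.array([1.0, -1.0, 0.0])
A = np.column_stack([a1, a2, a1 + a2])
b = np.array([1.0, 2.0, 3.0])   # arbitrary; chosen to lie outside col(A)

# Normal system: A^T A x = A^T b.
AtA = A.T @ A
Atb = A.T @ b

# lstsq returns one particular least-squares solution (the minimum-norm one).
x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(AtA @ x_hat, Atb))    # True: x_hat solves the normal system

# Projection of b onto U = col(A) via the orthogonal basis {a1, a2}.
proj_b = (b @ a1) / (a1 @ a1) * a1 + (b @ a2) / (a2 @ a2) * a2
print(np.allclose(A @ x_hat, proj_b))   # True: A x_hat = proj_U b, as in the example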