Section 21.3 Motivation
We have seen that, for a specific matrix \(A\text{,}\) looking for patterns in the process of computing matrix-times-column-vector products helps us to understand the matrix. In turn, this helps us understand all of the various systems \(A\uvec{x}=\uvec{b}\) with common coefficient matrix \(A\text{,}\) since the left-hand side of the matrix version of such a system has exactly that matrix-times-column-vector form.
When we compute \(A\uvec{e}_j\) for a standard basis vector \(\uvec{e}_j\text{,}\) the result is the \(\nth[j]\) column of \(A\text{.}\) So if we computed each of \(A\uvec{e}_1,A\uvec{e}_2,\dotsc,A\uvec{e}_n\text{,}\) the results would be all of the columns of \(A\text{,}\) which together contain all of the data in \(A\text{.}\) These computations certainly determine the matrix \(A\text{,}\) but they don’t necessarily help us understand what \(A\) is really like as a matrix. In short, the standard basis for \(\R^n\) is a great basis for understanding the vector space \(\R^n\text{,}\) but it is not so great for helping understand matrix products \(A\uvec{x}\) for a particular matrix \(A\text{.}\)
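For example, using an arbitrary \(2\times 2\) matrix for illustration (the entries are our own choosing), computing \(A\uvec{e}_2\) simply picks out the second column:
\begin{align*}
A\uvec{e}_2
\amp= \begin{bmatrix} 1 \amp 2 \\ 3 \amp 4 \end{bmatrix}
\begin{bmatrix} 0 \\ 1 \end{bmatrix}
= \begin{bmatrix} 1(0) + 2(1) \\ 3(0) + 4(1) \end{bmatrix}
= \begin{bmatrix} 2 \\ 4 \end{bmatrix}.
\end{align*}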
In Discovery 21.1, we discovered that for an \(n\times n\) matrix \(A\text{,}\) if we can build a basis for \(\R^n\) consisting of eigenvectors of \(A\text{,}\) then every matrix product \(A\uvec{x}\) becomes simple to compute once \(\uvec{x}\) is expressed as a linear combination of these basis vectors. Indeed, if \(\{\uvec{u}_1,\uvec{u}_2,\dotsc,\uvec{u}_n\}\) is a basis for \(\R^n\text{,}\) and we have
\begin{align*}
A\uvec{u}_1 \amp= \lambda_1\uvec{u}_1, \amp
A\uvec{u}_2 \amp= \lambda_2\uvec{u}_2, \amp
\amp\dotsc, \amp
A\uvec{u}_n \amp= \lambda_n\uvec{u}_n,
\end{align*}
then multiplication by \(A\) can be carried out using only scalar multiplication and vector addition:
\begin{align*}
\amp\amp \uvec{x} \amp= k_1\uvec{u}_1 + k_2\uvec{u}_2 + \dotsb + k_n\uvec{u}_n\\
\\
\amp\implies \amp
A\uvec{x} \amp= k_1A\uvec{u}_1 + k_2A\uvec{u}_2 + \dotsb + k_nA\uvec{u}_n\\
\amp\amp\amp= k_1\lambda_1\uvec{u}_1 + k_2\lambda_2\uvec{u}_2 + \dotsb + k_n\lambda_n\uvec{u}_n.
\end{align*}
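Here is a small worked example of this shortcut, with a matrix and eigenvectors of our own choosing (not taken from Discovery 21.1). For
\begin{align*}
A \amp= \begin{bmatrix} 3 \amp 1 \\ 1 \amp 3 \end{bmatrix}, \amp
\uvec{u}_1 \amp= \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \amp
\uvec{u}_2 \amp= \begin{bmatrix} 1 \\ -1 \end{bmatrix},
\end{align*}
direct multiplication confirms that \(A\uvec{u}_1 = 4\uvec{u}_1\) and \(A\uvec{u}_2 = 2\uvec{u}_2\text{.}\) To compute \(A\uvec{x}\) for
\begin{align*}
\uvec{x} \amp= \begin{bmatrix} 3 \\ 1 \end{bmatrix} = 2\uvec{u}_1 + \uvec{u}_2,
\end{align*}
we need only scale and add:
\begin{align*}
A\uvec{x}
\amp= 2A\uvec{u}_1 + A\uvec{u}_2
= 2(4)\uvec{u}_1 + 2\uvec{u}_2
= \begin{bmatrix} 8 \\ 8 \end{bmatrix} + \begin{bmatrix} 2 \\ -2 \end{bmatrix}
= \begin{bmatrix} 10 \\ 6 \end{bmatrix},
\end{align*}
with no matrix-times-column-vector computation required.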
A complete study of how the concepts of eigenvalues and eigenvectors unlock all the mysteries of a matrix is too involved to carry out at this point, but we will get a glimpse of how it all works for a certain kind of square matrix in the next chapter. For the remainder of this chapter, we will be more concerned with how to calculate eigenvalues and eigenvectors.