Section 24.3 Motivation
We have seen that when considering a specific matrix \(A\text{,}\) looking for patterns in the process of computing matrix-times-column-vector products helps us to understand the matrix. In turn, this helps us understand all of the various systems \(A\uvec{x}=\uvec{b}\) with common coefficient matrix \(A\text{,}\) since the left-hand side of the matrix version of such a system is precisely a matrix-times-column-vector product.
When we compute \(A\uvec{e}_j\) for a standard basis vector \(\uvec{e}_j\text{,}\) the result is the \(\nth[j]\) column of \(A\text{.}\) So if we computed each of \(A\uvec{e}_1,A\uvec{e}_2,\dotsc,A\uvec{e}_n\text{,}\) the results would be all of the columns of \(A\text{,}\) which together contain all of the data in \(A\text{.}\) These computations certainly let us know the matrix \(A\text{,}\) but they don't necessarily help us understand what \(A\) is really like as a matrix. In short, the standard basis for \(\R^n\) is a great basis for understanding the vector space \(\R^n\text{,}\) but it is not so great for helping us understand matrix products \(A\uvec{x}\) for a particular matrix \(A\text{.}\)
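For example, for the particular \(2\times 2\) matrix below (an illustrative example, not one from the discovery guide), we have
\begin{equation*}
A = \begin{bmatrix} 1 \amp 2 \\ 3 \amp 4 \end{bmatrix}, \qquad
A\uvec{e}_1 = \begin{bmatrix} 1 \amp 2 \\ 3 \amp 4 \end{bmatrix}\begin{bmatrix} 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 1 \\ 3 \end{bmatrix}, \qquad
A\uvec{e}_2 = \begin{bmatrix} 1 \amp 2 \\ 3 \amp 4 \end{bmatrix}\begin{bmatrix} 0 \\ 1 \end{bmatrix} = \begin{bmatrix} 2 \\ 4 \end{bmatrix},
\end{equation*}
recovering the two columns of \(A\) but revealing nothing special about how \(A\) acts on other vectors.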
In Discovery 24.1, we discovered that for an \(n\times n\) matrix \(A\text{,}\) if we can build a basis for \(\R^n\) consisting of eigenvectors of \(A\text{,}\) then every matrix product \(A\uvec{x}\) becomes simple to compute once \(\uvec{x}\) is decomposed as a linear combination of these basis vectors. Indeed, if \(\{\uvec{u}_1,\uvec{u}_2,\dotsc,\uvec{u}_n\}\) is a basis for \(\R^n\) consisting of eigenvectors of \(A\text{,}\) with \(A\uvec{u}_j = \lambda_j\uvec{u}_j\) for each index \(j\text{,}\) and we have
\begin{equation*}
\uvec{x} = k_1\uvec{u}_1 + k_2\uvec{u}_2 + \dotsb + k_n\uvec{u}_n\text{,}
\end{equation*}
then multiplication by \(A\) can be achieved by scalar multiplication:
\begin{equation*}
A\uvec{x}
= k_1 A\uvec{u}_1 + k_2 A\uvec{u}_2 + \dotsb + k_n A\uvec{u}_n
= k_1\lambda_1\uvec{u}_1 + k_2\lambda_2\uvec{u}_2 + \dotsb + k_n\lambda_n\uvec{u}_n\text{.}
\end{equation*}
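For a concrete illustration (an example chosen here, not one from the discovery guide), consider
\begin{equation*}
A = \begin{bmatrix} 2 \amp 1 \\ 1 \amp 2 \end{bmatrix}\text{,}
\end{equation*}
which has eigenvectors \(\uvec{u}_1 = (1,1)\) and \(\uvec{u}_2 = (1,-1)\) with corresponding eigenvalues \(\lambda_1 = 3\) and \(\lambda_2 = 1\text{.}\) The vector \(\uvec{x} = (3,1)\) decomposes as \(\uvec{x} = 2\uvec{u}_1 + \uvec{u}_2\text{,}\) so
\begin{equation*}
A\uvec{x}
= 2 A\uvec{u}_1 + A\uvec{u}_2
= 2\cdot 3\,\uvec{u}_1 + 1\cdot\uvec{u}_2
= (6,6) + (1,-1)
= (7,5)\text{,}
\end{equation*}
in agreement with computing \(A\uvec{x}\) directly, but obtained using only scalar multiplication and vector addition.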
A complete study of how the concepts of eigenvalues and eigenvectors unlock all the mysteries of a matrix is too involved to carry out in full at this point, but we will get a glimpse of how it all works for a certain kind of square matrix in the next chapter. For the remainder of this chapter, we will be more concerned with how to calculate eigenvalues and eigenvectors.