
Section 21.3 Motivation

We have seen that when considering a specific matrix $A$, looking for patterns in the process of computing matrix-times-column-vector products helps us to understand the matrix. In turn, this helps us understand all of the various systems $A\mathbf{x} = \mathbf{b}$ with common coefficient matrix $A$, since the left-hand side of the matrix version of such a system has matrix-times-column-vector form.
When we compute $A\mathbf{e}_j$ for a standard basis vector $\mathbf{e}_j$, the result is the $j$th column of $A$. So if we computed each of $A\mathbf{e}_1, A\mathbf{e}_2, \dotsc, A\mathbf{e}_n$, the results would be all of the columns of $A$, which together contain all of the data in $A$. These computations certainly let us know the matrix $A$, but they don't necessarily help us understand what $A$ is really like as a matrix. In short, the standard basis for $\mathbb{R}^n$ is a great basis for understanding the vector space $\mathbb{R}^n$, but it is not so great for helping understand matrix products $A\mathbf{x}$ for a particular matrix $A$.
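For instance, with the $2 \times 2$ matrix below (an arbitrary example for illustration),
$$ A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}, \qquad A\mathbf{e}_2 = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \begin{bmatrix} 0 \\ 1 \end{bmatrix} = \begin{bmatrix} 2 \\ 4 \end{bmatrix}, $$
which is precisely the second column of $A$.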
In Discovery 21.1, we discovered that for an $n \times n$ matrix $A$, if we can build a basis for $\mathbb{R}^n$ consisting of eigenvectors of $A$, then every matrix product $A\mathbf{x}$ becomes simple to compute once $\mathbf{x}$ is decomposed as a linear combination of these basis vectors. Indeed, if $\{\mathbf{u}_1, \mathbf{u}_2, \dotsc, \mathbf{u}_n\}$ is such a basis for $\mathbb{R}^n$, and we have
$$ A\mathbf{u}_1 = \lambda_1 \mathbf{u}_1, \quad A\mathbf{u}_2 = \lambda_2 \mathbf{u}_2, \quad \dotsc, \quad A\mathbf{u}_n = \lambda_n \mathbf{u}_n, $$
then multiplication by $A$ can be achieved by scalar multiplication:
\begin{align*}
\mathbf{x} = k_1 \mathbf{u}_1 + k_2 \mathbf{u}_2 + \dotsb + k_n \mathbf{u}_n
\quad\implies\quad A\mathbf{x} &= k_1 A\mathbf{u}_1 + k_2 A\mathbf{u}_2 + \dotsb + k_n A\mathbf{u}_n \\
&= k_1 \lambda_1 \mathbf{u}_1 + k_2 \lambda_2 \mathbf{u}_2 + \dotsb + k_n \lambda_n \mathbf{u}_n.
\end{align*}
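To make this concrete, here is a minimal numerical sketch, assuming NumPy is available; the matrix $A$, its eigenpairs, and the vector $\mathbf{x}$ below are an invented illustration, not from the text. Once the coordinates $k_1, \dotsc, k_n$ of $\mathbf{x}$ relative to the eigenvector basis are known, computing $A\mathbf{x}$ requires only scaling each coordinate by its eigenvalue.

```python
import numpy as np

# Invented 2x2 example: A has eigenvalue 5 with eigenvector u1 = (1, 1)
# and eigenvalue 2 with eigenvector u2 = (1, -2), so {u1, u2} is a basis of R^2.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
U = np.array([[1.0,  1.0],
              [1.0, -2.0]])   # columns of U are the eigenvectors u1, u2
lam = np.array([5.0, 2.0])    # corresponding eigenvalues

x = np.array([3.0, 0.0])

# Decompose x as x = k1*u1 + k2*u2 by solving U @ k = x for the coordinates k.
k = np.linalg.solve(U, x)

# Multiplication by A now reduces to scalar multiplication in each coordinate:
# A x = k1*lambda1*u1 + k2*lambda2*u2, i.e. U @ (lam * k).
Ax_via_eigenbasis = U @ (lam * k)

print(Ax_via_eigenbasis)   # [12.  6.]
print(A @ x)               # direct product agrees: [12.  6.]
```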
A complete study of how the concepts of eigenvalues and eigenvectors unlock the mysteries of a matrix is too involved to carry out at this point, but we will get a glimpse of how it all works for a certain kind of square matrix in the next chapter. For the remainder of this chapter, we will be more concerned with how to calculate eigenvalues and eigenvectors.