25.1 Discovery guide

A diagonal matrix is one of the simplest kinds of matrix. In this discovery guide, we will attempt to make a given square matrix similar to a diagonal one.

Recall.

When \(A\uvec{x} = \lambda \uvec{x}\text{,}\) column vector \(\uvec{x}\) is called an eigenvector of \(A\) and scalar \(\lambda\) is called the corresponding eigenvalue of \(A\text{.}\)

Discovery 25.1.

Suppose \(3\times 3\) matrices \(A,P,D\) are related by \(\inv{P}AP = D\text{.}\) (Remember, order matters in matrix multiplication, so in general \(\inv{P}AP \neq A\text{.}\))

As an example, consider

\begin{equation*} D = \left[\begin{array}{rrr} 3 \amp 0 \amp 0 \\ 0 \amp 3 \amp 0 \\ 0 \amp 0 \amp -1 \end{array}\right]. \end{equation*}

We will leave \(A\) and \(P\) unspecified for now, but think of \(P\) as a collection of column vectors:

\begin{equation*} P = \begin{bmatrix} | \amp | \amp | \\ \uvec{p}_1 \amp \uvec{p}_2 \amp \uvec{p}_3 \\ | \amp | \amp | \end{bmatrix}. \end{equation*}

Multiplying both sides of \(\inv{P}AP = D\) on the left by \(P\text{,}\) we could instead write \(AP = PD\text{.}\)
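The equivalence of \(\inv{P}AP = D\) and \(AP = PD\) can be spot-checked numerically. Here is a small sketch using sympy, with the diagonal matrix \(D\) from the example above and an arbitrarily chosen invertible \(P\) (my choice for illustration, not one from the text):

```python
from sympy import Matrix, diag

# The diagonal matrix D from the example above.
D = diag(3, 3, -1)

# An arbitrarily chosen invertible P (any invertible matrix would do here).
P = Matrix([[1, 1, 0],
            [0, 1, 1],
            [1, 0, 1]])
assert P.det() != 0  # P must be invertible for P^{-1} to exist

# Define A so that P^{-1} A P = D holds by construction.
A = P * D * P.inv()

# Multiplying P^{-1} A P = D on the left by P gives A P = P D.
print(A * P == P * D)        # True
print(P.inv() * A * P == D)  # True
```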

(a)

Do you remember how we defined matrix multiplication, one column at a time?

\begin{equation*} AP = \begin{bmatrix} | \amp | \amp | \\ \boxed{\phantom{XX}} \amp \boxed{\phantom{XX}} \amp \boxed{\phantom{XX}} \\ | \amp | \amp | \end{bmatrix} \end{equation*}
(b)

Do you remember how multiplication on the right by a diagonal matrix affects a matrix of columns?

\begin{equation*} PD = \begin{bmatrix} | \amp | \amp | \\ \boxed{\phantom{XX}} \amp \boxed{\phantom{XX}} \amp \boxed{\phantom{XX}} \\ | \amp | \amp | \end{bmatrix} \end{equation*}
(c)

Compare your patterns for the products \(AP\) and \(PD\) from Task a and Task b. For \(AP = PD\) to be true, each column of \(P\) must be an __________.

Hint

Reread the introduction to this discovery guide above.

(d)

In Task c, we have identified a condition for \(AP = PD\) to be true. But to get from \(AP = PD\) back to the original equation \(\inv{P}AP = D\text{,}\) we also need \(P\) to be invertible, so we need the columns of \(P\) to be __________.

Discovery 25.2.

Let's try out what we learned in Discovery 25.1 for matrix

\begin{equation*} A = \left[\begin{array}{rrr} -1 \amp 9 \amp 0\\ 0 \amp 2 \amp 0\\ 0 \amp -3 \amp -1 \end{array}\right]. \end{equation*}
(a)

Compute the eigenvalues of \(A\) by solving the characteristic equation \(\det (\lambda I - A) = 0\text{.}\)
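After working through the determinant by hand, you can check your characteristic polynomial and eigenvalues with a computer algebra system. A sketch using sympy:

```python
from sympy import Matrix, symbols, eye, factor

lam = symbols('lam')
A = Matrix([[-1, 9, 0],
            [0, 2, 0],
            [0, -3, -1]])

# Characteristic polynomial det(lam*I - A), factored to expose the roots.
p = factor((lam * eye(3) - A).det())
print(p)  # (lam + 1)**2 * (lam - 2), up to ordering of the factors

# sympy can also report eigenvalues with their multiplicities directly:
# eigenvalue -1 with multiplicity 2, eigenvalue 2 with multiplicity 1.
print(A.eigenvals())
```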

(b)

For each eigenvalue of \(A\text{,}\) determine a basis for the corresponding eigenspace. That is, determine a basis for the null space of \(\lambda I - A\) by row reducing.
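Once you have row reduced by hand, a quick way to check the dimension of each eigenspace is to ask sympy for the null space of \(\lambda I - A\) directly (a verification sketch, not a substitute for the row reduction):

```python
from sympy import Matrix, eye

A = Matrix([[-1, 9, 0],
            [0, 2, 0],
            [0, -3, -1]])

# For each eigenvalue lam, the eigenspace is the null space of (lam*I - A).
basis_neg1 = (-1 * eye(3) - A).nullspace()  # eigenvalue -1
basis_2 = (2 * eye(3) - A).nullspace()      # eigenvalue 2

print(len(basis_neg1))  # 2: two basis vectors for eigenvalue -1
print(len(basis_2))     # 1: one basis vector for eigenvalue 2
```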

(c)

Attempt to form a matrix \(P\) whose columns satisfy both of the conditions identified in Discovery 25.1 for \(\inv{P}AP\) to be diagonal.

(d)

If you succeeded in meeting both conditions in the previous step, then \(\inv{P}AP\) will be a diagonal matrix. Is it possible to know what diagonal matrix \(\inv{P}AP\) will be without actually computing \(\inv{P}\) and multiplying out \(\inv{P}AP\text{?}\)
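Once you have an answer, one way to confirm it is to carry out the multiplication with a computer algebra system. The particular eigenvector columns below come from one possible hand computation for this matrix; yours may differ by scaling or ordering:

```python
from sympy import Matrix, diag

A = Matrix([[-1, 9, 0],
            [0, 2, 0],
            [0, -3, -1]])

# Columns: two eigenvectors for eigenvalue -1, then one for eigenvalue 2
# (these particular vectors come from one possible row reduction).
P = Matrix([[1, 0, 3],
            [0, 0, 1],
            [0, 1, -1]])
assert P.det() != 0  # columns are linearly independent, so P is invertible

# The diagonal entries appear in the same order as the eigenvector columns.
print(P.inv() * A * P == diag(-1, -1, 2))  # True
```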

Hint

Look back at how the diagonal entries of matrix \(D\) fit in the pattern between \(AP\) and \(PD\) that you identified in Discovery 25.1.c.

Discovery 25.3.

Summarize the patterns you've determined in the first two activities of this discovery guide by completing the following statements in the case that \(D = \inv{P}AP\) is diagonal.

(a)

The diagonal entries of \(D\) are precisely the __________ of \(A\text{.}\)

(b)

The number of times a value is repeated down the diagonal of \(D\) corresponds to __________.

(c)

The order of the entries down the diagonal of \(D\) corresponds to the __________ of \(P\text{.}\)

Discovery 25.4.

Repeat the procedure of Discovery 25.2 for

\begin{equation*} A = \left[\begin{array}{rrr} -1 \amp 1 \amp 0 \\ 0 \amp -1 \amp 0 \\ 0 \amp 0 \amp 2 \end{array}\right]. \end{equation*}
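After you have carried out the procedure by hand, you can check what happens for this matrix with a sketch like the following (it reveals the outcome, so try the hand computation first):

```python
from sympy import Matrix, eye

A = Matrix([[-1, 1, 0],
            [0, -1, 0],
            [0, 0, 2]])

# Eigenvalue -1 is repeated twice as a root; eigenvalue 2 appears once.
print(A.eigenvals())

# But the eigenspace for -1 is only one-dimensional...
print(len((-1 * eye(3) - A).nullspace()))  # 1
# ...so there are not enough independent eigenvectors to fill out P.
print(A.is_diagonalizable())  # False
```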
Discovery 25.5.

Compare the results of Discovery 25.2 and Discovery 25.4 by filling in the table below. We will call the number of times an eigenvalue is “repeated” as a root of the characteristic polynomial its algebraic multiplicity, and we will call the dimension of the eigenspace corresponding to an eigenvalue its geometric multiplicity.

After completing the table, discuss: What can you conclude about algebraic and geometric multiplicities of eigenvalues with respect to attempting to find a suitable \(P\) to make \(\inv{P}AP\) diagonal?

Table 25.1.1. Comparison of examples in this discovery guide.

                           | Discovery 25.2 | Discovery 25.4
---------------------------+----------------+---------------
eigenvalues                |                |
algebraic multiplicities   |                |
geometric multiplicities   |                |
suitable \(P\) exists?     |                |

When we attempt to form a transition matrix \(P\) to make \(\inv{P}AP\) diagonal, we need its columns to satisfy both conditions identified in Task c and Task d of Discovery 25.1.

In particular, consider the second of these two conditions. When you determine a basis for a particular eigenspace, these vectors are automatically linearly independent from each other (since they form a basis for a subspace of \(\R^n\)). However, unless \(A\) has only a single eigenvalue, you will need to include eigenvectors from different eigenvalues together in filling out the columns of \(P\). How can we be sure that the collection of all columns of \(P\) will satisfy the condition identified in Discovery 25.1.d?

The next discovery activity will help you with this potential problem.

Discovery 25.6.

Suppose \(\{\uvec{v}_1,\uvec{v}_2,\uvec{v}_3\}\) is a linearly independent set of eigenvectors of \(A\text{,}\) corresponding to eigenvalue \(\lambda_1\text{,}\) and suppose \(\uvec{w}\) is an eigenvector of \(A\) corresponding to a different eigenvalue \(\lambda_2\text{.}\) (So \(\lambda_2\neq\lambda_1\text{.}\))

(a)

Set up the vector equation to begin the test for independence for the set \(\{\uvec{v}_1,\uvec{v}_2,\uvec{v}_3,\uvec{w}\}\text{.}\) Call this equation (1).

(b)

Multiply both sides of equation (1) by \(A\text{,}\) then use the definition of eigenvalue/eigenvector to “simplify.” Call the result equation (2).

(c)

Multiply equation (1) by \(\lambda_1\) — call this equation (3).

(d)

Subtract equation (3) from equation (2). What can you conclude?

(e)

Use your conclusion from Task d to simplify equation (1). Then finish the test for independence.
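The conclusion of this activity can be spot-checked numerically. Using the matrix from Discovery 25.2 as a concrete example (my choice for illustration), the two basis eigenvectors for eigenvalue \(-1\) together with an eigenvector for eigenvalue \(2\) do indeed form a linearly independent set:

```python
from sympy import Matrix

# Eigenvectors of the Discovery 25.2 matrix:
# v1, v2 for eigenvalue -1, and w for eigenvalue 2.
v1 = Matrix([1, 0, 0])
v2 = Matrix([0, 0, 1])
w = Matrix([3, 1, -1])

# Stack the vectors as columns; full rank means linear independence.
M = Matrix.hstack(v1, v2, w)
print(M.rank())  # 3: the combined set {v1, v2, w} is linearly independent
```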