If \(P\) is a square matrix, write \(\uvec{p}_1, \uvec{p}_2, \dotsc, \uvec{p}_n\) for the columns of \(P\text{,}\) so that \(P = \begin{bmatrix} \uvec{p}_1 \amp \uvec{p}_2 \amp \cdots \amp \uvec{p}_n \end{bmatrix} \text{.}\)
The similarity relation \(\inv{P} A P = B\) holds if and only if each column of \(B\) consists of the coefficients for expressing the corresponding transformed vector \(A \uvec{p}_j\) as a linear combination of the columns of \(P\) (Subsection 26.3.2).
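In symbols: if the \(j\)th column of \(B\) has entries \(b_{1j}, b_{2j}, \dotsc, b_{nj}\text{,}\) then this condition says that
\begin{equation*}
A \uvec{p}_j = b_{1j} \uvec{p}_1 + b_{2j} \uvec{p}_2 + \dotsb + b_{nj} \uvec{p}_n \text{.}
\end{equation*}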
Use the characterization of similarity in the introduction of this discovery guide to write down conditions on \(\uvec{p}_1, \uvec{p}_2, \uvec{p}_3, \uvec{p}_4\) for \(\inv{P}AP = B\) to be true.
We want to turn the conditions from Task a into a general pattern for achieving block-diagonal form. So the actual numbers in your conditions are irrelevant; what’s important is the pattern.
Columns \(\uvec{p}_1\) and \(\uvec{p}_2\) must satisfy a similar condition relative to \(A\text{.}\) Assuming they do, do you think combinations of these two columns will satisfy a similar condition? Test it with combination \(2 \uvec{p}_1 - 3 \uvec{p}_2\text{.}\)
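(Keep in mind that matrix multiplication respects linear combinations, so that \(A(2 \uvec{p}_1 - 3 \uvec{p}_2) = 2 A \uvec{p}_1 - 3 A \uvec{p}_2\text{.}\))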
Given an \(n \times n\) matrix \(A\text{,}\) a subspace of \(\R^n\) (or \(\C^n\)) is called \(A\)-invariant if the following is true: for each vector \(\uvec{u}\) in the subspace, the transformed vector \(A\uvec{u}\) is back in the subspace.
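For example, if \(\uvec{u}\) is an eigenvector of \(A\) with eigenvalue \(\lambda\text{,}\) then the one-dimensional subspace \(\Span \{ \uvec{u} \}\) is \(A\)-invariant: for each scalar multiple \(c \uvec{u}\) we have \(A (c \uvec{u}) = c \lambda \uvec{u}\text{,}\) again a multiple of \(\uvec{u}\text{.}\)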
Suppose that \(A\) is a \(3 \times 3\) matrix. Recall that multiplication by \(A\) can be regarded as geometrically transforming vectors in \(\R^3\text{.}\)
In each of the following, decide whether any proper, nontrivial subspaces of \(\R^3\) are invariant under the described geometric transformation. That is, determine whether any subspaces have the property that the described transformation returns vectors from the subspace back into the subspace.
Projection onto some arbitrary plane (assume the plane passes through the origin); i.e., given a vector \(\uvec{u}\) in \(\R^3\text{,}\) drop a perpendicular from the head of \(\uvec{u}\) onto the plane, and the point where that perpendicular lands on the plane is considered the “transformed image” of \(\uvec{u}\text{.}\)
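If an arbitrary plane seems difficult to think about, you might first experiment with the concrete special case of projection onto the \(xy\)-plane, which is carried out by the matrix
\begin{equation*}
A =
\begin{bmatrix}
1 \amp 0 \amp 0 \\
0 \amp 1 \amp 0 \\
0 \amp 0 \amp 0
\end{bmatrix}\text{,}
\end{equation*}
transforming \(\uvec{u} = (x, y, z)\) into \((x, y, 0)\text{.}\)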
Hopefully by now you’ve discovered that for \(\inv{P}AP\) to be in block-diagonal form, it must be possible to partition the columns of \(P\) into subcollections, where the columns in a particular subcollection come from a particular \(A\)-invariant subspace of \(\R^n\text{.}\) But when we try to put a matrix into block-diagonal form, we won’t know the transition matrix ahead of time, so we’ll be looking for \(A\)-invariant subspaces from which to take vectors to form the columns of \(P\text{.}\)
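For instance, in the \(4 \times 4\) example above, if \(\uvec{p}_1, \uvec{p}_2\) are taken from one \(A\)-invariant subspace and \(\uvec{p}_3, \uvec{p}_4\) from another, then each of \(A \uvec{p}_1, A \uvec{p}_2\) is a linear combination of \(\uvec{p}_1, \uvec{p}_2\) alone, and each of \(A \uvec{p}_3, A \uvec{p}_4\) is a linear combination of \(\uvec{p}_3, \uvec{p}_4\) alone, and this is exactly what forces the zero blocks in \(\inv{P} A P\text{.}\)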
But what about the additional requirement that the columns of \(P\) form a basis of \(\R^n\text{?}\) When we determine some \(A\)-invariant subspaces, what relationship to each other will they need to satisfy for this extra condition on the columns of \(P\) to be satisfied?
Every null space vector of \(B\) can be split into a sum of a null space vector of the upper-left block and a null space vector of the lower-right block, each suitably embedded from \(\R^2\) into \(\R^4\text{.}\)
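For example, labelling the diagonal blocks as \(B_1\) and \(B_2\) so that
\begin{equation*}
B = \begin{bmatrix} B_1 \amp 0 \\ 0 \amp B_2 \end{bmatrix}\text{,}
\end{equation*}
then for \(\uvec{x}\) in the null space of the upper-left block \(B_1\) we have
\begin{equation*}
B \begin{bmatrix} \uvec{x} \\ \zerovec \end{bmatrix}
= \begin{bmatrix} B_1 \uvec{x} \\ \zerovec \end{bmatrix}
= \zerovec \text{,}
\end{equation*}
so that the embedded vector lies in the null space of \(B\text{,}\) and similarly for the lower-right block.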
Will the same sort of pattern as for null space vectors of \(B\) in Task 28.5.b also work here for eigenvectors? That is, is an eigenvector for \(B\) somehow the sum of two (nonzero) vectors, each of which corresponds to an eigenvector of one of the blocks of \(B\text{?}\)
Consider how you created column vectors in Task a of Discovery 28.5 and Task a of Discovery 28.6, and the pattern that emerged when you multiplied block-diagonal \(B\) by your constructed column vectors. Use this experience to compute the product \(MN\) for the following two block-diagonal \(3 \times 3\) matrices.
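As a check on your answer: writing \(M_1, M_2\) and \(N_1, N_2\) for diagonal blocks of matching sizes, block-diagonal multiplication should follow the pattern
\begin{equation*}
\begin{bmatrix} M_1 \amp 0 \\ 0 \amp M_2 \end{bmatrix}
\begin{bmatrix} N_1 \amp 0 \\ 0 \amp N_2 \end{bmatrix}
=
\begin{bmatrix} M_1 N_1 \amp 0 \\ 0 \amp M_2 N_2 \end{bmatrix}\text{.}
\end{equation*}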
By restricting ourselves to row operations that only swap or combine rows involving a single block, we can reduce a block-diagonal matrix by reducing the blocks. For example,
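consider a matrix with a \(2 \times 2\) block and a \(1 \times 1\) block (one small possibility):
\begin{equation*}
\begin{bmatrix}
1 \amp 2 \amp 0 \\
2 \amp 4 \amp 0 \\
0 \amp 0 \amp 3
\end{bmatrix}
\qquad \longrightarrow \qquad
\begin{bmatrix}
1 \amp 2 \amp 0 \\
0 \amp 0 \amp 0 \\
0 \amp 0 \amp 1
\end{bmatrix}\text{,}
\end{equation*}
where each diagonal block has been row reduced on its own.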
The reduced matrix is not technically in RREF, because there is a row of zeros that is not at the bottom. But it still allows us to see the pattern of leading variables versus free variables: \(x_1\) and \(x_3\) are leading, while \(x_2\) is free, and this determination could have been made block-by-block.
The rank of a block-diagonal matrix is the sum of the ranks of the blocks.
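In symbols, if the diagonal blocks of \(B\) are \(B_1, B_2, \dotsc, B_k\text{,}\) then
\begin{equation*}
\rank B = \rank B_1 + \rank B_2 + \dotsb + \rank B_k \text{.}
\end{equation*}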
Combine Task a with Task b of Discovery 28.9 to determine the relationship between the characteristic polynomial of a block-diagonal matrix and the characteristic polynomials of its blocks.
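It may help to notice that for block-diagonal \(B\) with diagonal blocks labelled \(B_1\) and \(B_2\text{,}\) the matrix \(\lambda I - B\) is itself block-diagonal, with diagonal blocks of the same sizes:
\begin{equation*}
\lambda I - \begin{bmatrix} B_1 \amp 0 \\ 0 \amp B_2 \end{bmatrix}
= \begin{bmatrix} \lambda I - B_1 \amp 0 \\ 0 \amp \lambda I - B_2 \end{bmatrix}
\end{equation*}
(where each \(I\) is the appropriately sized identity matrix).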
Use Task b to determine the relationship between the eigenvalues of a block-diagonal matrix and the eigenvalues of its blocks. What about the algebraic multiplicities of eigenvalues?