
Section 45.3 Concepts

Subsection 45.3.1 Attaching a matrix to a transformation

As usual, we will deal with the real case, as the complex case is identical, just with different scalars.

If \(V\) and \(W\) are finite-dimensional real vector spaces, then they are isomorphic to \(\R^n\) and \(\R^m\text{,}\) respectively, where \(n = \dim V\) and \(m = \dim W\) (Statement 1 of Corollary 44.5.16). Isomorphisms effectively create an identification between vectors in two spaces, so that the two vector spaces can be viewed as essentially the same space. So if we have a linear transformation \(\funcdef{T}{V}{W}\text{,}\) with \(V \iso \R^n\) and \(W \iso \R^m\text{,}\) then effectively we have a linear transformation \(\funcdef{\widetilde{T}}{\R^n}{\R^m}\text{.}\)

Diagram of transformations \(\R^n \to V \to W \to \R^m\)

How can we realize the induced transformation \(\funcdef{\widetilde{T}}{\R^n}{\R^m}\text{?}\) We just need to decide how to traverse the two vertical arrows in the diagram above, by choosing specific isomorphisms \(\funcdef{S_1}{V}{\R^n}\) and \(\funcdef{S_2}{W}{\R^m}\text{.}\) Then we can take \(\widetilde{T}\) to be the composition \(S_2 T \inv{S}_1\text{.}\)

However, we know how to realize isomorphisms \(V \to \R^n\) and \(W \to \R^m\text{:}\) through coordinate maps! (See Corollary 44.5.17.) So all we have to do is choose a basis \(\basisfont{B}\) for \(V\) and a basis \(\basisfont{B}'\) for \(W\text{,}\) and take

\begin{equation*} \widetilde{T} = \coordmap{B'} T \invcoordmap{B} \text{.} \end{equation*}

Diagram of using coordinate maps to extend a transformation \(V \to W\) to a transformation \(\R^n \to V \to W \to \R^m\)
We know that every linear transformation \(\R^n \to \R^m\) is a matrix transformation by

\begin{equation*} \widetilde{T}(\uvec{x}) = \stdmatrixOf{\widetilde{T}} \uvec{x} \text{,} \end{equation*}

where \(\stdmatrixOf{\widetilde{T}}\) is the standard matrix of \(\funcdef{\widetilde{T}}{\R^n}{\R^m}\text{,}\) as usual. So we define the matrix of \(T\) relative to bases \(\basisfont{B}\) of \(V\) and \(\basisfont{B}'\) of \(W\) to be the matrix \(\stdmatrixOf{\widetilde{T}}\):

\begin{equation*} \matrixOf{T}{B'B} = \stdmatrixOf{\widetilde{T}} = \stdmatrixOf{\coordmap{B'} T \invcoordmap{B}} \text{.} \end{equation*}

What use is this matrix? By reversing the relationship between \(T\) and \(\widetilde{T}\) to

\begin{equation*} T = \invcoordmap{B'} \widetilde{T} \coordmap{B} \text{,} \end{equation*}

we can use matrix multiplication to compute outputs for \(T\text{.}\)

Diagram of realizing a transformation through its matrix
Computationally, it is more instructive to split the four transformations into two pairs.
Diagram of tracing a transformation from domain space to coordinate space of its image
Equating the two legs in this diagram, we have

\begin{equation*} \coordmap{B'} T = \widetilde{T} \coordmap{B} \text{.} \end{equation*}

Consider the results of starting with a vector \(\uvec{v}\) in \(V\) and “chasing” it through each leg of the diagram:

\begin{align*} \coordmap{B'} T (\uvec{v}) \amp = \coordmap{B'} \bigl(T(\uvec{v})\bigr) \amp \widetilde{T} \coordmap{B} (\uvec{v}) \amp = \widetilde{T} \bigl(\matrixOf{\uvec{v}}{B}\bigr)\\ \amp = \matrixOf{T(\uvec{v})}{B'} \text{,} \amp \amp = \stdmatrixOf{\widetilde{T}} \matrixOf{\uvec{v}}{B}\\ \amp \amp \amp = \matrixOf{T}{B'B} \matrixOf{\uvec{v}}{B}\text{.} \end{align*}

Since these two results must be equal, we obtain the main computational pattern regarding the use of the matrix of a linear transformation:

\begin{equation*} \matrixOf{T(\uvec{v})}{B'} = \matrixOf{T}{B'B} \matrixOf{\uvec{v}}{B} \text{.} \end{equation*}

In other words, to obtain the coordinate vector of image vector \(T(\uvec{v})\) relative to codomain space basis \(\basisfont{B}'\text{,}\) multiply the coordinate vector of \(\uvec{v}\) relative to domain space basis \(\basisfont{B}\) by the matrix \(\matrixOf{T}{B'B}\).

This input-output pattern for coordinate vectors explains the reason for writing \(\basisfont{B'B}\) instead of \(\basisfont{BB'}\) in the notation \(\matrixOf{T}{B'B}\text{:}\) the \(\basisfont{B}\) on the right puts it on the inside of the product \(\matrixOf{T}{B'B} \matrixOf{\uvec{v}}{B}\text{,}\) closest to the \(\basisfont{B}\)-coordinate vector \(\matrixOf{\uvec{v}}{B}\text{,}\) and the \(\basisfont{B}'\) on the left puts it closest to the resulting \(\basisfont{B}'\)-coordinate vector \(\matrixOf{T(\uvec{v})}{B'}\text{.}\)

Finally, consider the case for a linear operator \(\funcdef{T}{V}{V}\text{.}\) In this case we may (but do not have to) choose the same basis \(\basisfont{B}\) for both the domain and codomain spaces, as these are the same space. With this choice we simply write \(\matrixOf{T}{B}\) for the matrix of \(T\text{,}\) instead of the redundant \(\matrixOf{T}{BB}\text{.}\)

Diagram of realizing an operator through its matrix
With this choice, our input-output relationship between coordinate vectors becomes

\begin{equation*} \matrixOf{T(\uvec{v})}{B} = \matrixOf{T}{B} \matrixOf{\uvec{v}}{B} \text{.} \end{equation*}

Subsection 45.3.2 Computing the matrix of a transformation

Recall that the columns of the standard matrix of a transformation \(\R^n \to \R^m\) are precisely the image vectors of the standard basis vectors for \(\R^n\text{.}\) (See (\(\dagger\)) in Subsection 42.3.4.) So for transformation \(\funcdef{T}{V}{W}\) and bases \(\basisfont{B},\basisfont{B}'\) of the domain space \(V\) and codomain space \(W\text{,}\) respectively, we can compute the matrix

\begin{equation*} \matrixOf{T}{B'B} = \stdmatrixOf{\widetilde{T}} = \stdmatrixOf{\coordmap{B'} T \invcoordmap{B}} \end{equation*}

by determining the images under \(\widetilde{T}\) of the standard basis vectors for \(\R^n\text{,}\) where \(n = \dim V\text{.}\)

This means that to obtain the \(\nth[j]\) column of \(\matrixOf{T}{B'B}\text{,}\) we will be calculating

\begin{equation*} \widetilde{T}(\uvec{e}_j) = \stdmatrixOf{\widetilde{T}} \uvec{e}_j = \matrixOf{T}{B'B} \uvec{e}_j \text{,} \end{equation*}

where \(\uvec{e}_j\) is the \(\nth[j]\) standard basis vector. Compare the above with our input-output pattern involving \(\matrixOf{T}{B'B}\text{:}\)

\begin{equation*} \matrixOf{T(\uvec{v})}{B'} = \matrixOf{T}{B'B} \matrixOf{\uvec{v}}{B} \text{.} \end{equation*}

For what \(\uvec{v}\) in \(V\) will

\begin{equation*} \matrixOf{\uvec{v}}{B} = \uvec{e}_j \text{?} \end{equation*}

Remember that the components in a coordinate vector are coefficients to use in a linear combination of the basis vectors that will recreate the original vector. But the only nonzero component in \(\uvec{e}_j\) is the \(\nth[j]\) component, which is a \(1\text{,}\) so \(\uvec{e}_j\) corresponds to the linear combination

\begin{equation*} 0 \uvec{v}_1 + \dotsb + 0 \uvec{v}_{j-1} + 1 \uvec{v}_j + 0 \uvec{v}_{j+1} + \dotsb + 0 \uvec{v}_n = \uvec{v}_j\text{,} \end{equation*}

where the \(\uvec{v}_j\) are the domain space vectors in \(\basisfont{B}\text{.}\) In other words,

\begin{equation*} \uvec{e}_j = \matrixOf{\uvec{v}_j}{B} \text{.} \end{equation*}

So the \(\nth[j]\) column of \(\matrixOf{T}{B'B}\) is

\begin{equation*} \matrixOf{T}{B'B} \uvec{e}_j = \matrixOf{T}{B'B} \matrixOf{\uvec{v}_j}{B} = \matrixOf{T(\uvec{v}_j)}{B'}\text{.} \end{equation*}

In words, the pattern is that the \(\nth[j]\) column of \(\matrixOf{T}{B'B}\) is the coordinate vector, relative to the codomain space basis \(\basisfont{B}'\text{,}\) of the image vector \(T(\uvec{v}_j)\) of the \(\nth[j]\) vector \(\uvec{v}_j\) in the domain space basis \(\basisfont{B}\). If we write \(\basisfont{B} = \{ \uvec{v}_1, \uvec{v}_2, \dotsc, \uvec{v}_n \}\text{,}\) then

\begin{align} \matrixOf{T}{B'B} = \begin{bmatrix} | \amp | \amp \amp | \\ \matrixOf{T(\uvec{v}_1)}{B'} \amp \matrixOf{T(\uvec{v}_2)}{B'} \amp \cdots \amp \matrixOf{T(\uvec{v}_n)}{B'} \\ | \amp | \amp \amp | \end{bmatrix}\text{.}\label{equation-lintrans-matrix-concepts-cols}\tag{\(\star\)} \end{align}
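
To illustrate (\(\star\)) with a small example, consider the linear operator \(\funcdef{T}{\R^2}{\R^2}\) defined by \(T(x,y) = (y,x)\) (reflection in the line \(y = x\)), relative to the single basis \(\basisfont{B} = \{ \uvec{v}_1, \uvec{v}_2 \}\) with \(\uvec{v}_1 = (1,1)\) and \(\uvec{v}_2 = (1,-1)\text{.}\) We have \(T(\uvec{v}_1) = (1,1) = 1 \uvec{v}_1 + 0 \uvec{v}_2\) and \(T(\uvec{v}_2) = (-1,1) = 0 \uvec{v}_1 + (-1) \uvec{v}_2\text{,}\) so

\begin{equation*} \matrixOf{T}{B} = \begin{bmatrix} 1 \amp 0 \\ 0 \amp -1 \end{bmatrix} \text{.} \end{equation*}

We can check the input-output pattern against this matrix: the vector \(\uvec{v} = (3,1) = 2 \uvec{v}_1 + 1 \uvec{v}_2\) has image \(T(\uvec{v}) = (1,3) = 2 \uvec{v}_1 + (-1) \uvec{v}_2\text{,}\) and indeed

\begin{equation*} \matrixOf{T}{B} \matrixOf{\uvec{v}}{B} = \begin{bmatrix} 1 \amp 0 \\ 0 \amp -1 \end{bmatrix} \begin{bmatrix} 2 \\ 1 \end{bmatrix} = \begin{bmatrix} 2 \\ -1 \end{bmatrix} = \matrixOf{T(\uvec{v})}{B} \text{.} \end{equation*}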

Subsection 45.3.3 Important examples

The standard matrix of a transformation \(\R^n \to \R^m\).

Compare Procedure 45.3.1 above with Procedure 42.3.3 for computing the standard matrix of a transformation \(\R^n \to \R^m\text{.}\) Suppose we apply Procedure 45.3.1 to a transformation \(\funcdef{T}{\R^n}{\R^m}\text{,}\) taking both \(\basisfont{B},\basisfont{B}'\) to be standard bases. Since every \(\R^m\)-vector is equal to its own coordinate vector relative to the standard basis, the columns of \(\matrixOf{T}{B'B}\) will simply be the image vectors \(T(\uvec{e}_j)\) of the standard basis vectors of \(\R^n\text{.}\) So the result of applying Procedure 45.3.1 will be precisely as in Procedure 42.3.3.

Warning 45.3.2.

If a nonstandard basis is used for either \(\R^n\) or \(\R^m\) (or both), then the matrix \(\matrixOf{T}{B'B}\) will be different from the standard matrix \(\stdmatrixOf{T}\text{.}\) In Chapter 46, we will investigate how \(\matrixOf{T}{B}\) relates to \(\stdmatrixOf{T}\) in the case of a linear operator \(\funcdef{T}{\R^n}{\R^n}\) (and, more generally, how matrices for a linear operator \(\funcdef{T}{V}{V}\) relative to different bases for \(V\) relate to one another).

Matrices for a zero transformation.

Consider the zero transformation \(\funcdef{\zerovec_{V,W}}{V}{W}\) relative to any choice of a pair of bases \(\basisfont{B},\basisfont{B}'\) for spaces \(V,W\text{,}\) respectively. To compute \(\matrixOf{\zerovec_{V,W}}{B'B}\text{,}\) we follow Procedure 45.3.1 and compute the coordinate vectors for image vectors

\begin{equation*} \zerovec_{V,W}(\uvec{v}_j) \text{,} \end{equation*}

where the \(\uvec{v}_j\) are the domain space basis vectors in \(\basisfont{B}\text{.}\) But each

\begin{equation*} \zerovec_{V,W}(\uvec{v}_j) = \zerovec_W \text{,} \end{equation*}

which has coordinate vector

\begin{equation*} \matrixOf{\zerovec_W}{B'} = \zerovec_m \text{,} \end{equation*}

where \(\zerovec_m\) indicates the zero vector in \(\R^m\) for \(m = \dim W\text{.}\) Therefore, we always have

\begin{equation*} \matrixOf{\zerovec_{V,W}}{B'B} = \zerovec_{m \times n} \text{,} \end{equation*}

the \(m \times n\) zero matrix (for \(n = \dim V\)).

Matrix for an identity operator relative to a single basis.

Consider the identity operator \(\funcdef{I_V}{V}{V}\) relative to a choice of a single basis \(\basisfont{B}\) for space \(V\text{.}\) To compute \(\matrixOf{I_V}{B}\text{,}\) we follow Procedure 45.3.1 and compute the coordinate vectors for image vectors

\begin{equation*} I_V(\uvec{v}_j) \text{,} \end{equation*}

where the \(\uvec{v}_j\) are the basis vectors in \(\basisfont{B}\text{.}\) But each

\begin{equation*} I_V(\uvec{v}_j) = \uvec{v}_j = 0 \uvec{v}_1 + \dotsb + 0 \uvec{v}_{j-1} + 1 \uvec{v}_j + 0 \uvec{v}_{j+1} + \dotsb + 0 \uvec{v}_n \text{,} \end{equation*}

where \(n = \dim V\text{.}\) So each \(I_V(\uvec{v}_j)\) has coordinate vector

\begin{equation*} \matrixOf{I_V(\uvec{v}_j)}{B} = \uvec{e}_j \text{,} \end{equation*}

the \(\nth[j]\) standard basis vector of \(\R^n\text{.}\) Therefore, we always have

\begin{equation*} \matrixOf{I_V}{B} = I_n \text{,} \end{equation*}

the \(n \times n\) identity matrix.

Matrix for an identity operator relative to two bases.

Again, consider the identity operator \(\funcdef{I_V}{V}{V}\text{,}\) but suppose we choose two different bases for \(V\text{:}\) one basis \(\basisfont{B}\) to be considered as the domain space basis, and another basis \(\basisfont{B}'\) to be considered as the codomain space basis.

To compute \(\matrixOf{I_V}{B'B}\text{,}\) we follow Procedure 45.3.1 and compute the coordinate vectors for image vectors

\begin{equation*} I_V(\uvec{v}_j) = \uvec{v}_j \text{,} \end{equation*}

where the \(\uvec{v}_j\) are the basis vectors in \(\basisfont{B}\text{.}\) But now we will be computing these coordinate vectors relative to the second basis \(\basisfont{B}'\text{:}\)

\begin{equation*} \matrixOf{I_V(\uvec{v}_j)}{B'} = \matrixOf{\uvec{v}_j}{B'} \text{.} \end{equation*}

Inserting this pattern into (\(\star\)), we obtain

\begin{align*} \matrixOf{I_V}{B'B} \amp = \begin{bmatrix} | \amp | \amp \amp | \\ \matrixOf{I_V(\uvec{v}_1)}{B'} \amp \matrixOf{I_V(\uvec{v}_2)}{B'} \amp \cdots \amp \matrixOf{I_V(\uvec{v}_n)}{B'} \\ | \amp | \amp \amp | \end{bmatrix}\\ \amp = \begin{bmatrix} | \amp | \amp \amp | \\ \matrixOf{\uvec{v}_1}{B'} \amp \matrixOf{\uvec{v}_2}{B'} \amp \cdots \amp \matrixOf{\uvec{v}_n}{B'} \\ | \amp | \amp \amp | \end{bmatrix}\text{.} \end{align*}

We've seen this pattern before! It is precisely the pattern of the transition matrix \(\ucobmtrx{B}{B'}\) (see (\(\dagger\dagger\dagger\)) in Subsection 22.3.3). That is, a transition matrix is the matrix of the identity operator relative to the two different bases of the space:

\begin{equation*} \ucobmtrx{B}{B'} = \matrixOf{I_V}{B'B} \text{.} \end{equation*}

This makes sense, as the purpose of a transition matrix is to convert coordinate vectors relative to one basis into coordinate vectors relative to another basis, but without “transforming” the underlying vector in the process — whether we consider \(\matrixOf{\uvec{v}}{B}\) or \(\matrixOf{\uvec{v}}{B'}\text{,}\) it's the same vector \(\uvec{v}\text{,}\) just different linear combinations (of the different basis vectors) to realize that vector.
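
As a quick example of this connection, take \(V = \R^2\) with \(\basisfont{B} = \{ (1,1), (1,-1) \}\) and \(\basisfont{B}'\) the standard basis. Each coordinate vector \(\matrixOf{\uvec{v}_j}{B'}\) relative to the standard basis is just \(\uvec{v}_j\) itself, so

\begin{equation*} \ucobmtrx{B}{B'} = \matrixOf{I_V}{B'B} = \begin{bmatrix} 1 \amp 1 \\ 1 \amp -1 \end{bmatrix} \text{.} \end{equation*}

The vector \(\uvec{v} = (3,1) = 2(1,1) + 1(1,-1)\) has \(\matrixOf{\uvec{v}}{B} = (2,1)\text{,}\) and multiplying by the transition matrix recovers the standard coordinates of the same vector: \(\ucobmtrx{B}{B'} \matrixOf{\uvec{v}}{B} = (3,1) = \matrixOf{\uvec{v}}{B'}\text{.}\)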

Matrix for a linear transformation relative to kernel and image bases.

Consider linear transformation \(\funcdef{T}{V}{W}\) between finite-dimensional spaces \(V,W\text{.}\) Suppose we carry out Procedure 43.3.1 to obtain a basis for \(\im T\text{.}\) We begin by determining a basis \(\basisfont{K}\) for \(\ker T\text{,}\) and then we extend that to a basis for \(V\) by obtaining additional linearly independent vectors \(\basisfont{K}'\text{.}\) But here, let's reverse the order of these vectors in constructing a basis for \(V\text{:}\) write

\begin{equation*} \basisfont{B} = \{ \uvec{v}_1, \dotsc, \uvec{v}_r, \uvec{u}_1, \dotsc, \uvec{u}_t \} \text{,} \end{equation*}

where the \(\uvec{v}_j\) are the vectors from \(\basisfont{K}'\) and the \(\uvec{u}_j\) are the vectors from \(\basisfont{K}\text{.}\)

According to Theorem 43.5.3, the collection of image vectors

\begin{equation*} \{ T(\uvec{v}_1), \dotsc, T(\uvec{v}_r) \} \end{equation*}

will be a basis for \(\im T\) in \(W\text{.}\) Now extend this linearly independent set of vectors to a basis for all of \(W\) by including some additional linearly independent collection of vectors, giving us basis

\begin{equation*} \basisfont{B}' = \{ T(\uvec{v}_1), \dotsc, T(\uvec{v}_r), \uvec{w}_1, \dotsc, \uvec{w}_s \} \end{equation*}

for \(W\text{.}\)

According to Procedure 45.3.1, to compute \(\matrixOf{T}{B'B}\) we should express each of the image vectors in the collection

\begin{align*} T(\basisfont{B}) \amp = \{ T(\uvec{v}_1), \dotsc, T(\uvec{v}_r), T(\uvec{u}_1), \dotsc, T(\uvec{u}_t) \} \\ \amp = \{ T(\uvec{v}_1), \dotsc, T(\uvec{v}_r), \zerovec_W, \dotsc, \zerovec_W \} \end{align*}

as linear combinations of the vectors in \(\basisfont{B}'\text{.}\) But this is straightforward, as we have

\begin{align*} T(\uvec{v}_1) \amp = 1 \, T(\uvec{v}_1) + \dotsb + 0 \, T(\uvec{v}_r) + 0 \, \uvec{w}_1 + \dotsb + 0 \, \uvec{w}_s \text{,} \\ \amp \vdots \\ T(\uvec{v}_r) \amp = 0 \, T(\uvec{v}_1) + \dotsb + 1 \, T(\uvec{v}_r) + 0 \, \uvec{w}_1 + \dotsb + 0 \, \uvec{w}_s \text{,} \\ \\ T(\uvec{u}_1) \amp = 0 \, T(\uvec{v}_1) + \dotsb + 0 \, T(\uvec{v}_r) + 0 \, \uvec{w}_1 + \dotsb + 0 \, \uvec{w}_s \text{,} \\ \amp \vdots \\ T(\uvec{u}_t) \amp = 0 \, T(\uvec{v}_1) + \dotsb + 0 \, T(\uvec{v}_r) + 0 \, \uvec{w}_1 + \dotsb + 0 \, \uvec{w}_s \text{.} \end{align*}

Thus for this particular choice of bases, the matrix \(\matrixOf{T}{B'B}\) has block form

\begin{equation*} \matrixOf{T}{B'B} = \begin{bmatrix} I_r \amp \zerovec_{r \times t} \\ \zerovec_{s \times r} \amp \zerovec_{s \times t} \end{bmatrix} \text{,} \end{equation*}

where

\begin{align*} r \amp = \rank T \text{,} \amp s \amp = \dim W - \rank T \text{,} \amp t \amp = \nullity T \text{,} \end{align*}

and \(I_r\) is the \(r \times r\) identity matrix, while each \(\zerovec\) block is a zero matrix of the indicated size.
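
To see this block pattern in a small concrete case, consider \(\funcdef{T}{\R^3}{\R^2}\) defined by \(T(x,y,z) = (x,y)\text{.}\) The kernel is spanned by \(\basisfont{K} = \{ (0,0,1) \}\text{,}\) which extends to a basis of \(\R^3\) using \(\basisfont{K}' = \{ (1,0,0), (0,1,0) \}\text{,}\) so (listing the \(\basisfont{K}'\) vectors first) take \(\basisfont{B} = \{ (1,0,0), (0,1,0), (0,0,1) \}\text{.}\) The image vectors \(T(1,0,0) = (1,0)\) and \(T(0,1,0) = (0,1)\) already form a basis for \(\im T = \R^2\text{,}\) so no additional vectors \(\uvec{w}_j\) are required and we may take \(\basisfont{B}' = \{ (1,0), (0,1) \}\text{.}\) The result is

\begin{equation*} \matrixOf{T}{B'B} = \begin{bmatrix} 1 \amp 0 \amp 0 \\ 0 \amp 1 \amp 0 \end{bmatrix} \text{,} \end{equation*}

exhibiting the block form with \(r = \rank T = 2\text{,}\) \(t = \nullity T = 1\text{,}\) and \(s = \dim \R^2 - \rank T = 0\) (so the bottom row of zero blocks is empty).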

Subsection 45.3.4 Matrices of compositions and inverses

The matrix of a composition.

In Discovery 45.3, we explored the relationship of the matrix of a composition to the matrices of the constituent transformations. Suppose

\begin{align*} \amp\funcdef{T}{U}{V} \text{,} \amp \amp\funcdef{S}{V}{W} \end{align*}

are linear, and we have chosen bases \(\basisfont{B},\basisfont{B}',\basisfont{B}''\) for spaces \(U,V,W\text{,}\) respectively. Then we can create compositions both of \(S,T\) and of the induced transformations \(\R^\ell \to \R^n\text{,}\) \(\R^n \to \R^m\text{,}\) where \(\ell = \dim U\text{,}\) \(n = \dim V\text{,}\) and \(m = \dim W\text{.}\)

Diagram of a composition of transformations and the corresponding composition of matrix transformations
Recall that the matrices \(\matrixOf{T}{B'B}\) and \(\matrixOf{S}{B''B'}\) are the standard matrices of the induced transformations

\begin{align*} \amp\funcdef{\widetilde{T} = \coordmap{B'} T \invcoordmap{B}}{\R^\ell}{\R^n} \text{,} \amp \amp\funcdef{\widetilde{S} = \coordmap{B''} S \invcoordmap{B'}}{\R^n}{\R^m}\text{.} \end{align*}

The induced map of the composition \(S T\) satisfies

\begin{align*} \widetilde{ST} \amp = \coordmap{B''} (S T) \invcoordmap{B} \\ \amp = \coordmap{B''} S \invcoordmap{B'} \coordmap{B'} T \invcoordmap{B} \\ \amp = \widetilde{S} \widetilde{T} \text{,} \end{align*}

(in the middle step we have inserted \(\invcoordmap{B'} \coordmap{B'} = I_V\) between \(S\) and \(T\)), and so is the composition of the two induced transformations \(\widetilde{S},\widetilde{T}\text{.}\) We know that the standard matrix of a composition is the product of the standard matrices, which yields:

\begin{equation*} \matrixOf{ST}{B''B} = \stdmatrixOf{\widetilde{ST}} = \stdmatrixOf{\widetilde{S} \widetilde{T}} = \stdmatrixOf{\widetilde{S}} \stdmatrixOf{\widetilde{T}} = \matrixOf{S}{B''B'} \matrixOf{T}{B' B}\text{.} \end{equation*}

So the matrix of the composition of general linear transformations satisfies the same pattern as standard matrices do: the matrix of a composition is the product of the matrices, as long as the same intermediate basis \(\basisfont{B}'\) is used for both constituent transformations.

Remark 45.3.3.

Notice how the notation acts as a guide to the correct composition matrix. An \(m \times n\) matrix and an \(n \times \ell\) matrix can be multiplied because the dimension on the inside matches, and the result will correspond to the outside dimensions, so that the size of the matrix product will be \(m \times \ell\text{.}\) Similarly, you can think of the two matrices \(\matrixOf{S}{B''B'}\) and \(\matrixOf{T}{B'B}\) as being compatible for multiplication because the two intermediate \(\basisfont{B}'\) match, and then the result is \(\matrixOf{ST}{B''B}\text{,}\) a matrix relative to what were the two “outside” bases.
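
For a small numerical check of this pattern, take \(\funcdef{T}{\R^3}{\R^2}\text{,}\) \(T(x,y,z) = (x,y)\text{,}\) and \(\funcdef{S}{\R^2}{\R^2}\text{,}\) \(S(x,y) = (y,x)\text{,}\) with all bases standard. Then

\begin{equation*} \matrixOf{ST}{B''B} = \matrixOf{S}{B''B'} \matrixOf{T}{B'B} = \begin{bmatrix} 0 \amp 1 \\ 1 \amp 0 \end{bmatrix} \begin{bmatrix} 1 \amp 0 \amp 0 \\ 0 \amp 1 \amp 0 \end{bmatrix} = \begin{bmatrix} 0 \amp 1 \amp 0 \\ 1 \amp 0 \amp 0 \end{bmatrix} \text{,} \end{equation*}

a \(2 \times 3\) matrix, matching both the composition \(ST(x,y,z) = (y,x)\) and the size pattern described in the remark above.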

The matrix of an operator composed with itself.

Applying the pattern of the matrix of a composition to the case of a linear operator \(\funcdef{T}{V}{V}\) composed with itself, relative to the choice of a single basis \(\basisfont{B}\) for the space \(V\text{,}\) we find that

\begin{equation*} \matrixOf{TT}{B} = \matrixOf{T}{B} \matrixOf{T}{B} = (\matrixOf{T}{B})^2 \text{.} \end{equation*}

In analogy with this squaring of matrices, we write \(T^2\) instead of \(TT\) for the composition of \(T\) with itself.

Extending this pattern, writing \(T^k\) for the composition of \(k\) copies of operator \(T\text{,}\) we always have

\begin{gather} \matrixOf{T^k}{B} = (\matrixOf{T}{B})^k\text{.}\label{equation-lintrans-matrix-concepts-matrix-of-power}\tag{\(\dagger\)} \end{gather}
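
For example, for the reflection operator \(T(x,y) = (y,x)\) considered earlier, with \(\basisfont{B} = \{ (1,1), (1,-1) \}\text{,}\) we found \(\matrixOf{T}{B} = \begin{bmatrix} 1 \amp 0 \\ 0 \amp -1 \end{bmatrix}\text{,}\) so

\begin{equation*} \matrixOf{T^k}{B} = (\matrixOf{T}{B})^k = \begin{bmatrix} 1 \amp 0 \\ 0 \amp (-1)^k \end{bmatrix} \text{.} \end{equation*}

In particular, \(\matrixOf{T^2}{B} = I_2\text{,}\) agreeing with the fact that reflecting twice returns every vector to itself (that is, \(T^2 = I_{\R^2}\)).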
The matrix of an inverse.

In Discovery 45.5, we explored matrices of isomorphisms and the relationship of the matrix of the inverse transformation to the matrix of the original isomorphism. Suppose \(\funcdef{T}{V}{W}\) is an isomorphism, in which case spaces \(V,W\) must have the same dimension, say \(n\text{.}\) Also suppose we have chosen bases \(\basisfont{B},\basisfont{B}'\) of spaces \(V,W\text{,}\) respectively. Then we can compose \(T\) with its inverse to get \(\inv{T} T = I_V\text{,}\) the identity operator on \(V\text{.}\)

Diagram of a composition of an isomorphism and its inverse
Using our pattern of compositions from above, we have

\begin{equation*} \matrixOf{I_V}{B} = \matrixOf{\inv{T} T}{BB} = \matrixOf{\inv{T}}{BB'} \matrixOf{T}{B'B} \text{.} \end{equation*}

However, the matrix of the identity operator relative to a single basis is always the identity matrix, so the above becomes

\begin{equation*} \matrixOf{\inv{T}}{BB'} \matrixOf{T}{B'B} = I \text{,} \end{equation*}

from which we can conclude that these two matrices are inverses of each other (Proposition 6.5.4); i.e.

\begin{equation*} \matrixOf{\inv{T}}{BB'} = \invmatrixOf{T}{B'B} \text{.} \end{equation*}

That is, the matrix of an inverse is the inverse of the matrix, as long as the same bases (in reverse order) are used to create both matrices.
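
For instance, applying this pattern to the identity operator \(I_V\) (which is its own inverse) relates the two transition matrices between a pair of bases:

\begin{equation*} \ucobmtrx{B'}{B} = \matrixOf{\inv{I_V}}{BB'} = \invmatrixOf{I_V}{B'B} = \inv{\left(\ucobmtrx{B}{B'}\right)} \text{,} \end{equation*}

so that transition matrices in opposite directions are always inverses of each other.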

For a linear operator \(\funcdef{T}{V}{V}\) that is an isomorphism of \(V\) onto itself, with basis choice \(\basisfont{B}' = \basisfont{B}\text{,}\) we can use the above pattern to extend (\(\dagger\)):

\begin{equation*} \matrixOf{T^k}{B} = (\matrixOf{T}{B})^k \end{equation*}

holds for all integer exponents \(k\text{.}\)

Subsection 45.3.5 Properties of a transformation reflected in its matrix

As we began to explore in Discovery 45.5, we can tell a lot about a transformation from its matrix relative to any choice of bases for the domain and codomain space.

For example, since the only vector in a space that can have \(\zerovec\) as its coordinate vector is the zero vector, we can determine kernel vectors from the null space of the matrix of the transformation. And since linearity allows us to tie injectivity to the kernel (Theorem 44.5.5), we can tell if a transformation is injective by determining if its matrix has trivial null space.

Similarly, we can connect image vectors of the transformation to the column space of its matrix. In particular, we can tie the rank of the transformation to the rank of the matrix, and thereby determine surjectivity.

Combining these two considerations, for a transformation between spaces of the same dimension (and, in particular, for linear operators), we can determine whether the transformation is an isomorphism by considering the invertibility of its matrix.
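
For instance, the projection \(\funcdef{T}{\R^3}{\R^2}\text{,}\) \(T(x,y,z) = (x,y)\text{,}\) considered earlier has matrix of rank \(2\) with nontrivial null space (spanned by \(\uvec{e}_3\)), so \(T\) is surjective but not injective. On the other hand, the reflection operator on \(\R^2\) has invertible matrix \(\begin{bmatrix} 1 \amp 0 \\ 0 \amp -1 \end{bmatrix}\) relative to basis \(\basisfont{B} = \{ (1,1), (1,-1) \}\text{,}\) confirming that operator is an isomorphism.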

See the Theory section for this chapter for formal statements of these connections between the properties of a transformation and the properties of its matrix.