Section 45.5 Theory
Subsection 45.5.1 The matrix of a linear transformation
Once a choice of bases for the domain and codomain is made, there is a single matrix that represents the transformation.
Theorem 45.5.1. Transformation matrices are unique relative to a choice of bases.
Suppose \(\funcdef{T}{V}{W}\) is a linear transformation between finite-dimensional vector spaces \(V,W\text{,}\) with \(\dim V = n\) and \(\dim W = m\text{.}\) Further suppose we have chosen bases \(\basisfont{B},\basisfont{B}'\) of \(V\) and \(W\text{,}\) respectively.
Then there exists one unique \(m \times n\) matrix \(\matrixOf{T}{B'B}\) so that
\begin{equation*} \matrixOf{T}{B'B} \matrixOf{\uvec{v}}{B} = \matrixOf{T(\uvec{v})}{B'} \tag{$\star$} \end{equation*}
for all vectors \(\uvec{v}\) in the domain space \(V\text{.}\)
Proof idea.
We already discussed this in Subsection 45.3.1. The matrix of \(T\) relative to \(\basisfont{B},\basisfont{B}'\) is defined to be the standard matrix of the composition \(\coordmap{B'} T \invcoordmap{B}\text{,}\) where \(\coordmap{B},\coordmap{B'}\) are the coordinate maps on \(V,W\) relative to \(\basisfont{B},\basisfont{B}'\text{,}\) respectively. And we know that standard matrices are unique from Corollary 42.5.4.
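As a computational illustration, here is a Python/NumPy sketch of building \(\matrixOf{T}{B'B}\) column by column from coordinate vectors. The example transformation (the differentiation map on small polynomial spaces) and the helper `T_coords` are our own choices for illustration, not objects defined in the text.

```python
import numpy as np

# Sketch (example ours): the matrix of T = d/dx from polynomials of degree <= 2
# to polynomials of degree <= 1, relative to the monomial bases B = {1, x, x^2}
# and B' = {1, x}, so p(x) = a + b x + c x^2 has B-coordinates (a, b, c).

def T_coords(v):
    """B'-coordinate vector of T(p), where p has B-coordinates v = (a, b, c):
    d/dx (a + b x + c x^2) = b + 2c x, with B'-coordinates (b, 2c)."""
    a, b, c = v
    return np.array([b, 2.0 * c])

# The columns of [T]_{B',B} are the B'-coordinate vectors of T applied to the
# basis vectors of B in turn (the rows of the identity are those coordinates).
M = np.column_stack([T_coords(e) for e in np.eye(3)])
print(M)  # [[0. 1. 0.]
          #  [0. 0. 2.]]

# Spot-check of (*) on p(x) = 3 + 5x + 7x^2:
v = np.array([3.0, 5.0, 7.0])
assert np.allclose(M @ v, T_coords(v))
```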
If the bases for the domain and codomain are chosen in an informed manner, the matrix of the transformation will be particularly simple.
Theorem 45.5.2. Block form relative to bases for kernel and image.
For linear transformation \(\funcdef{T}{V}{W}\) between finite-dimensional vector spaces \(V,W\text{,}\) there exist bases \(\basisfont{B},\basisfont{B}'\) of \(V,W\text{,}\) respectively, such that
\begin{equation*} \matrixOf{T}{B'B} = \begin{bmatrix} I_r & \zerovec_{r \times (n - r)} \\ \zerovec_{(m - r) \times r} & \zerovec_{(m - r) \times (n - r)} \end{bmatrix} \text{,} \end{equation*}
where
\begin{equation*} r = \dim (\im T) \text{,} \qquad n = \dim V \text{,} \qquad m = \dim W \text{,} \end{equation*}
and \(I_r\) and \(\zerovec_{s \times t}\) are the \(r \times r\) identity matrix and the \(s \times t\) zero matrix, respectively.
Proof idea.
We already discussed this in Subsection 45.3.3. The matrix \(\matrixOf{T}{B'B}\) will have this form when we choose \(\basisfont{B}\) to be an extension of a basis for \(\ker T\) to a basis for \(V\) (with the kernel basis vectors appearing after all the non-kernel vectors), and we choose \(\basisfont{B}'\) to be an extension of the basis for \(\im T\) afforded by the nonzero image vectors in \(T(\basisfont{B})\) to a basis for \(W\) (with the image basis vectors appearing before all the non-image vectors).
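The following SymPy sketch carries out this recipe for a small matrix transformation of our own choosing (the matrix `A` and the extension vectors are arbitrary examples, not from the text), and recovers the block form.

```python
import sympy as sp

# Sketch (example ours): realize the block form for T(x) = A x, A as below.
A = sp.Matrix([[1, 2, 1],
               [2, 4, 0]])            # rank 2, nullity 1

ker = A.nullspace()                   # basis for ker T: [(-2, 1, 0)]
# Per the proof idea: B lists non-kernel vectors first, kernel vectors last.
# These two vectors were picked so that the combined list is a basis of R^3.
non_ker = [sp.Matrix([1, 0, 0]), sp.Matrix([0, 0, 1])]
B = non_ker + ker
P = sp.Matrix.hstack(*B)              # columns convert B-coordinates -> standard
assert P.rank() == 3                  # confirm B is a basis

# B' begins with the images of the non-kernel vectors (a basis of im T),
# extended (here trivially, since im T is all of R^2) to a basis of W.
Bp = [A * v for v in non_ker]
Q = sp.Matrix.hstack(*Bp)
assert Q.rank() == 2

# The matrix of T relative to B, B' is Q^{-1} A P: the block form [I_2 | 0].
print(Q.inv() * A * P)                # Matrix([[1, 0, 0], [0, 1, 0]])
```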
An identity operator does not transform the vectors in the domain space, but its matrix can change the basis relative to which coordinate vectors are formed. As this was already explored in Discovery 45.6 and examined in detail in Subsection 45.3.3, we state it without proof.
Proposition 45.5.3. Transition matrices represent an identity operator.
If \(\basisfont{B},\basisfont{B}'\) are bases of finite-dimensional vector space \(V\text{,}\) then the matrix of the identity operator \(\funcdef{I_V}{V}{V}\) relative to this pair of bases is the transition matrix from \(\basisfont{B}\) to \(\basisfont{B}'\text{.}\)
That is, \(\matrixOf{I_V}{B'B} = \ucobmtrx{B}{B'} \text{.}\)
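A small NumPy sketch (with two bases of \(\R^2\) chosen arbitrarily for illustration) shows the matrix of the identity converting \(\basisfont{B}\)-coordinates into \(\basisfont{B}'\)-coordinates without changing the underlying vector.

```python
import numpy as np

# Sketch (bases ours): the matrix of the identity on R^2 relative to
# B (domain basis) and B' (codomain basis) acts as a transition matrix.
PB = np.array([[1.0, 1.0],
               [0.0, 1.0]])    # columns: the vectors of B in standard coordinates
PBp = np.array([[2.0, 0.0],
                [1.0, 1.0]])   # columns: the vectors of B'

# [I]_{B',B}: realize the vector from its B-coordinates (PB @ .), then
# re-coordinatize relative to B' (inv(PBp) @ .).
transition = np.linalg.inv(PBp) @ PB

vB = np.array([3.0, -2.0])     # B-coordinates of some vector v
vBp = transition @ vB          # its B'-coordinates
# Both coordinate vectors name the same vector of R^2:
assert np.allclose(PB @ vB, PBp @ vBp)
```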
Subsection 45.5.2 Properties of a transformation from properties of its matrix
Proposition 45.5.4. Kernel and image versus null and column spaces.
Suppose that \(\funcdef{T}{V}{W}\) is a linear transformation between finite-dimensional vector spaces, and \(\basisfont{B},\basisfont{B}'\) is a choice of bases for \(V,W\text{,}\) respectively. Then the following hold.
- A vector \(\uvec{v}\) in the domain space \(V\) is in \(\ker T\) if and only if the coordinate vector \(\matrixOf{\uvec{v}}{B}\) is in the null space of \(\matrixOf{T}{B'B}\text{.}\)
- The dimension of the kernel of \(T\) is equal to the dimension of the null space of \(\matrixOf{T}{B'B}\text{.}\) That is, the nullity of \(T\) is equal to the nullity of \(\matrixOf{T}{B'B}\text{.}\)
- A vector \(\uvec{w}\) in the codomain space \(W\) is in \(\im T\) if and only if the coordinate vector \(\matrixOf{\uvec{w}}{B'}\) is in the column space of \(\matrixOf{T}{B'B}\text{.}\)
- The dimension of the image of \(T\) is equal to the dimension of the column space of \(\matrixOf{T}{B'B}\text{.}\) That is, the rank of \(T\) is equal to the rank of \(\matrixOf{T}{B'B}\text{.}\)
Proof.
- By definition, vector \(\uvec{v}\) in \(V\) is in \(\ker T\) precisely when \(T(\uvec{v}) = \zerovec_W\text{.}\) As the coordinate map is an isomorphism, the only vector in \(W\) with coordinate vector \(\zerovec_m\) (where \(m = \dim W\)) is \(\zerovec_W\text{.}\) So \(T(\uvec{v}) = \zerovec_W\) if and only if\begin{equation*} \matrixOf{T(\uvec{v})}{B'} = \zerovec_m \text{.} \end{equation*}Using (\(\star\)), we can say that \(\uvec{v}\) is in \(\ker T\) if and only if\begin{equation*} \matrixOf{T}{B'B} \matrixOf{\uvec{v}}{B} = \zerovec_m \text{,} \end{equation*}which says that \(\matrixOf{\uvec{v}}{B}\) is in the null space of \(\matrixOf{T}{B'B}\text{.}\)
- It follows from Statement 1 that the isomorphism \(\coordmap{B}\) must send a basis for \(\ker T\) to a basis for the null space of \(\matrixOf{T}{B'B}\text{.}\) Hence the dimensions of these two spaces must be equal.
- By definition, vector \(\uvec{w}\) in \(W\) is in \(\im T\) precisely when there exists at least one vector \(\uvec{v}\) in \(V\) with \(\uvec{w} = T(\uvec{v})\text{.}\) In this case, using (\(\star\)) and the fact that \(\coordmap{B'}\) is an isomorphism, we would have\begin{equation*} \matrixOf{\uvec{w}}{B'} = \matrixOf{T(\uvec{v})}{B'} = \matrixOf{T}{B'B} \matrixOf{\uvec{v}}{B} \text{.} \end{equation*}But then, using the fact that \(\coordmap{B}\) is also an isomorphism, we can say that \(\uvec{w}\) is in \(\im T\) if and only if there exists a column vector \(\uvec{x}\) with\begin{equation*} \matrixOf{\uvec{w}}{B'} = \matrixOf{T}{B'B} \uvec{x} \text{.} \end{equation*}As the column space of a matrix \(A\) consists of the results of all possible products \(A \uvec{x}\) (Subsection 21.3.1), we arrive at the statement at hand.
- Again, it follows from Statement 3 that the isomorphism \(\coordmap{B'}\) must send a basis for \(\im T\) to a basis for the column space of \(\matrixOf{T}{B'B}\text{.}\) Hence the dimensions of these two spaces must be equal.
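As a quick hedged check of the proposition, consider the differentiation map from the space of polynomials of degree at most 3 to the space of polynomials of degree at most 2, relative to monomial bases (an example of our own choosing). The nullity and rank of the matrix report the nullity and rank of the transformation.

```python
import sympy as sp

# Sketch (example ours): [T]_{B',B} for T = d/dx relative to monomial bases
# B = {1, x, x^2, x^3} and B' = {1, x, x^2}.
M = sp.Matrix([[0, 1, 0, 0],
               [0, 0, 2, 0],
               [0, 0, 0, 3]])

nullity = len(M.nullspace())  # dim(null space) = dim(ker T) = 1 (the constants)
rank = M.rank()               # dim(column space) = dim(im T) = 3 (all of P_2)
assert (nullity, rank) == (1, 3)

# The null space basis vector (1, 0, 0, 0) is the B-coordinate vector of the
# constant polynomial 1, which spans ker T.
print(M.nullspace())  # [Matrix([[1], [0], [0], [0]])]
```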
We can also characterize isomorphisms via invertibility of their matrices.
Corollary 45.5.5. Isomorphism has invertible matrix.
Suppose that \(\funcdef{T}{V}{W}\) is a linear transformation between finite-dimensional vector spaces. Then the following are equivalent.
- Transformation \(T\) is an isomorphism.
- For every choice of bases \(\basisfont{B},\basisfont{B}'\) for \(V,W\text{,}\) respectively, the matrix \(\matrixOf{T}{B'B}\) is square and invertible.
- For at least one choice of bases \(\basisfont{B},\basisfont{B}'\) for \(V,W\text{,}\) respectively, the matrix \(\matrixOf{T}{B'B}\) is square and invertible.
Proof.
Statement 1 implies Statement 2.
Assume \(T\) is an isomorphism, and suppose \(\basisfont{B},\basisfont{B}'\) is an arbitrary choice of bases for \(V,W\text{,}\) respectively. Then the dimensions of \(V\) and \(W\) must be equal (Corollary 44.5.15), hence \(\matrixOf{T}{B'B}\) must be square. Furthermore, to be an isomorphism the transformation \(T\) must be injective, hence its kernel is trivial (Theorem 44.5.5). Using Statement 1 of Proposition 45.5.4, we can conclude that the null space of \(\matrixOf{T}{B'B}\) is also trivial, in which case \(\matrixOf{T}{B'B}\) is invertible (Statement 9 of Theorem 21.5.5).
Statement 2 implies Statement 3.
This is obvious.
Statement 3 implies Statement 1.
Suppose \(\basisfont{B},\basisfont{B}'\) is a choice of bases for \(V,W\text{,}\) respectively, for which the matrix \(\matrixOf{T}{B'B}\) is square and invertible. Since the dimensions of \(\matrixOf{T}{B'B}\) reflect the dimensions of \(V\) and \(W\text{,}\) these two spaces must have the same dimension. Applying Corollary 44.5.12, it now suffices to verify that \(T\) is injective, which we can do by considering \(\ker T\) (Theorem 44.5.5). But if \(\matrixOf{T}{B'B}\) is invertible, it must have trivial null space (Statement 9 of Theorem 21.5.5), which implies that \(\ker T\) is trivial as well (Statement 1 of Proposition 45.5.4).
We have now completed the cycle of logical dependence to demonstrate that these three statements are equivalent.
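A hedged numerical illustration (example ours, not from the text): the evaluation transformation sending a polynomial \(p\) of degree at most 1 to \((p(0), p(1))\) in \(\R^2\) has a square, invertible matrix, so by the corollary it is an isomorphism.

```python
import numpy as np

# Sketch (example ours): T(p) = (p(0), p(1)), relative to the basis B = {1, x}
# of the domain and the standard basis of R^2.
# Columns of the matrix: T(1) = (1, 1) and T(x) = (0, 1).
M = np.array([[1.0, 0.0],
              [1.0, 1.0]])

# Square with nonzero determinant, hence invertible, so T is an isomorphism:
# a polynomial of degree at most 1 is determined by its values at 0 and 1.
assert M.shape[0] == M.shape[1]
assert not np.isclose(np.linalg.det(M), 0.0)

# The inverse matrix recovers the coefficients of p from its sampled values.
print(np.linalg.inv(M) @ np.array([2.0, 5.0]))  # [2. 3.] : p(x) = 2 + 3x
```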
Subsection 45.5.3 Matrices of compositions and inverses
As we have thoroughly discussed these relationships in Subsection 45.3.4, we will state them here without proof.
Proposition 45.5.6.
- Suppose \(\funcdef{T}{U}{V}\) and \(\funcdef{S}{V}{W}\) are linear transformations between finite-dimensional vector spaces, and we have chosen bases \(\basisfont{B},\basisfont{B}',\basisfont{B}''\) for spaces \(U,V,W\text{,}\) respectively. Then\begin{equation*} \matrixOf{ST}{B''B} = \matrixOf{S}{B''B'} \matrixOf{T}{B' B} \text{.} \end{equation*}
- Suppose \(\funcdef{T}{V}{W}\) is an isomorphism between finite-dimensional vector spaces, and we have chosen bases \(\basisfont{B},\basisfont{B}'\) for spaces \(V,W\text{,}\) respectively. Then\begin{equation*} \matrixOf{\inv{T}}{BB'} = \inv{(\matrixOf{T}{B'B})} \text{.} \end{equation*}
- Suppose \(\funcdef{T}{V}{V}\) is a linear operator on a finite-dimensional vector space, and we have chosen a basis \(\basisfont{B}\) for \(V\text{.}\) Then for all positive exponents \(k\text{,}\) we have\begin{equation*} \matrixOf{T^k}{B} = (\matrixOf{T}{B})^k \text{.} \end{equation*}In addition, if \(T\) is an isomorphism then we can say that the above equality holds for all integer exponents.
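These identities can be spot-checked numerically. The sketch below (random matrices of our own choosing, with standard bases on spaces \(\R^n\), so that the matrix of each transformation is just its defining matrix) verifies all three statements.

```python
import numpy as np

rng = np.random.default_rng(42)

# Sketch (examples ours): matrix transformations between R^n spaces.
M_T = rng.standard_normal((4, 3))   # T : R^3 -> R^4
M_S = rng.standard_normal((2, 4))   # S : R^4 -> R^2
x = rng.standard_normal(3)

# Statement 1: applying T then S agrees with the single product matrix.
assert np.allclose(M_S @ (M_T @ x), (M_S @ M_T) @ x)

# Statements 2 and 3, for an (almost surely invertible) operator on R^3:
M_op = rng.standard_normal((3, 3))
assert np.allclose(np.linalg.inv(M_op) @ (M_op @ x), x)     # matrix of inverse
assert np.allclose(np.linalg.matrix_power(M_op, 3) @ x,
                   M_op @ (M_op @ (M_op @ x)))              # matrix of powers
```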
Subsection 45.5.4 The space of linear transformations as a space of matrices
We now have a correspondence that associates a matrix \(\matrixOf{T}{B'B}\) to each linear transformation \(\funcdef{T}{V}{W}\) via a choice of bases \(\basisfont{B},\basisfont{B}'\) of spaces \(V,W\text{,}\) respectively, such that the properties of the transformation are reflected in the properties of the matrix, and vice versa. Essentially, the space of transformations \(L(V,W)\) comes to resemble a space of matrices \(\matrixring_{m \times n}(\R)\) (or \(\matrixring_{m \times n}(\C)\text{,}\) as appropriate).
Theorem 45.5.7. Transformations are matrices.
Suppose \(\basisfont{B},\basisfont{B}'\) are bases for finite-dimensional vector spaces \(V,W\text{,}\) respectively. Then the correspondence
\begin{equation*} T \mapsto \matrixOf{T}{B'B} \end{equation*}
is an isomorphism between \(L(V,W)\text{,}\) the space of all transformations \(V \to W\text{,}\) and \(\matrixring_{m \times n}(\R)\) (real case) or \(\matrixring_{m \times n}(\C)\) (complex case), where \(n = \dim V\) and \(m = \dim W\text{.}\)
Proof.
As usual, we treat only the real case, as the complex case is identical.
Let \(\funcdef{M}{L(V,W)}{\matrixring_{m \times n}(\R)}\) represent the function defined by
\begin{equation*} M(T) = \matrixOf{T}{B'B} \text{.} \end{equation*}
First, we must verify that \(M\) is linear.
Additivity.
Suppose \(T_1,T_2\) are transformations in \(L(V,W)\text{,}\) and let \(\uvec{v}\) be an arbitrary vector in the domain space \(V\text{.}\) Then we have
\begin{align*} \bigl( \matrixOf{T_1}{B'B} + \matrixOf{T_2}{B'B} \bigr) \matrixOf{\uvec{v}}{B} & = \matrixOf{T_1}{B'B} \matrixOf{\uvec{v}}{B} + \matrixOf{T_2}{B'B} \matrixOf{\uvec{v}}{B} \\ & = \matrixOf{T_1(\uvec{v})}{B'} + \matrixOf{T_2(\uvec{v})}{B'} \\ & = \matrixOf{T_1(\uvec{v}) + T_2(\uvec{v})}{B'} \\ & = \matrixOf{(T_1 + T_2)(\uvec{v})}{B'} \text{,} \end{align*}
with justifications (one for each equality, in order)
- matrix algebra (Rule 1.d of Proposition 4.5.1);
- equality (\(\star\)), applied to both \(T_1\) and \(T_2\text{;}\)
- linearity of the coordinate map (Theorem 22.5.1); and
- definition of the sum of linear transformations (see Subsection 42.3.6).
But \(\matrixOf{T_1 + T_2}{B'B}\) should be the one unique matrix that satisfies
\begin{equation*} \matrixOf{T_1 + T_2}{B'B} \matrixOf{\uvec{v}}{B} = \matrixOf{(T_1 + T_2)(\uvec{v})}{B'} \end{equation*}
for every vector \(\uvec{v}\) in the domain space \(V\) (Theorem 45.5.1). As the matrix \(\matrixOf{T_1}{B'B} + \matrixOf{T_2}{B'B}\) satisfies the same condition, we must have
\begin{equation*} \matrixOf{T_1 + T_2}{B'B} = \matrixOf{T_1}{B'B} + \matrixOf{T_2}{B'B} \text{.} \end{equation*}
Using the notation of \(\funcdef{M}{L(V,W)}{\matrixring_{m \times n}(\R)}\text{,}\) this says that
\begin{equation*} M(T_1 + T_2) = M(T_1) + M(T_2) \text{,} \end{equation*}
as required.
Homogeneity.
Suppose \(T\) is a transformation in \(L(V,W)\text{,}\) \(k\) is a scalar, and \(\uvec{v}\) is an arbitrary vector in the domain space \(V\text{.}\) Then we have
\begin{align*} \bigl( k \, \matrixOf{T}{B'B} \bigr) \matrixOf{\uvec{v}}{B} & = k \bigl( \matrixOf{T}{B'B} \matrixOf{\uvec{v}}{B} \bigr) \\ & = k \, \matrixOf{T(\uvec{v})}{B'} \\ & = \matrixOf{k \, T(\uvec{v})}{B'} \\ & = \matrixOf{(k T)(\uvec{v})}{B'} \text{,} \end{align*}
with justifications (one for each equality, in order)
- matrix algebra (Rule 2.c of Proposition 4.5.1);
- equality (\(\star\));
- linearity of the coordinate map (Theorem 22.5.1); and
- definition of the scalar multiple of a linear transformation (see Subsection 42.3.6).
But \(\matrixOf{k T}{B'B}\) should be the one unique matrix that satisfies
\begin{equation*} \matrixOf{k T}{B'B} \matrixOf{\uvec{v}}{B} = \matrixOf{(k T)(\uvec{v})}{B'} \end{equation*}
for every vector \(\uvec{v}\) in the domain space \(V\) (Theorem 45.5.1). As the matrix \(k \, \matrixOf{T}{B'B}\) satisfies the same condition, we must have
\begin{equation*} \matrixOf{k T}{B'B} = k \, \matrixOf{T}{B'B} \text{.} \end{equation*}
Using the notation of \(\funcdef{M}{L(V,W)}{\matrixring_{m \times n}(\R)}\text{,}\) this says that
\begin{equation*} M(k T) = k \, M(T) \text{,} \end{equation*}
as required.
Now we check that \(M\) is an isomorphism.
Injectivity.
As we have already shown that \(M\) is linear, we may check that \(M\) is injective by verifying that \(\ker M\) is trivial (Theorem 44.5.5). So suppose that transformation \(T\) in \(L(V,W)\) satisfies \(M(T) = \zerovec_{m \times n}\text{,}\) the \(m \times n\) zero matrix. Then for every \(\uvec{v}\) in \(V\text{,}\) we have
\begin{equation*} \matrixOf{T(\uvec{v})}{B'} = \matrixOf{T}{B'B} \matrixOf{\uvec{v}}{B} = \zerovec_{m \times n} \matrixOf{\uvec{v}}{B} = \zerovec_m \text{.} \end{equation*}
But the coordinate map \(\coordmap{B'}\) is an isomorphism, and \(\zerovec_W\) is the one unique vector in \(W\) satisfying
\begin{equation*} \matrixOf{\zerovec_W}{B'} = \zerovec_m \text{,} \end{equation*}
so we must have
\begin{equation*} T(\uvec{v}) = \zerovec_W \end{equation*}
for each \(\uvec{v}\) in \(V\text{.}\) In particular, this holds for each basis vector in \(\basisfont{B}\text{.}\) However, the zero transformation is the unique transformation \(V \to W\) that sends each vector in the domain space basis \(\basisfont{B}\) to \(\zerovec_W\) (Corollary 42.5.3), and so we can conclude \(T = \zerovec_{V,W}\text{,}\) as desired.
Surjectivity.
We wish to verify that, given matrix \(A\) in \(\matrixring_{m \times n}(\R)\text{,}\) there exists a transformation \(T\) in \(L(V,W)\) with
\begin{equation*} M(T) = A \text{,} \end{equation*}
i.e. with
\begin{equation*} \matrixOf{T}{B'B} = A \text{.} \end{equation*}
As in Corollary 42.5.3, we may attempt to define such a \(T\) by simply specifying the image vectors for each of the domain space basis vectors in \(\basisfont{B}\text{.}\) Writing \(\uvec{a}_1,\dotsc,\uvec{a}_n\) for the columns of \(A\) and \(\uvec{v}_1,\dotsc,\uvec{v}_n\) for the vectors in \(\basisfont{B}\text{,}\) for each index \(j\) set \(T(\uvec{v}_j)\) to be the vector in \(W\) whose coordinate vector in \(\R^m\) is \(\uvec{a}_j\text{.}\) That is, set each
\begin{equation*} T(\uvec{v}_j) = \invcoordmap{B'}(\uvec{a}_j) \text{.} \end{equation*}
Then, using the computation pattern (\(\star\)) developed in Subsection 45.3.2, we see that \(\matrixOf{T}{B'B} = A\) will be true, as desired.
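This construction can be mimicked computationally. The sketch below (our own example: standard bases on \(\R^3\) and \(\R^2\), so that the coordinate maps are identities, and an arbitrary matrix \(A\)) defines \(T\) from the columns of \(A\) and recovers \(\matrixOf{T}{B'B} = A\).

```python
import numpy as np

# Sketch (example ours): given A, define T by sending the j-th basis vector of
# the domain to the vector whose coordinate vector is the j-th column of A,
# then extending linearly.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 5.0]])

def T(v):
    # Linear extension: T(v) = sum_j v_j T(e_j) = sum_j v_j a_j.
    return sum(v[j] * A[:, j] for j in range(3))

# Recover the matrix of T column by column; it equals A, so M(T) = A.
M_of_T = np.column_stack([T(e) for e in np.eye(3)])
assert np.allclose(M_of_T, A)
```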
Remark 45.5.8.
Recall that the dual space of a vector space \(V\) is the space \(\vecdual{V} = L(V,\R^1)\) (real case) or \(\vecdual{V} = L(V,\C^1)\) (complex case). Applying the theorem to this situation, we have
\begin{equation*} \vecdual{V} \iso \matrixring_{1 \times n}(\R) \qquad \text{or} \qquad \vecdual{V} \iso \matrixring_{1 \times n}(\C) \end{equation*}
as appropriate, where
\begin{equation*} n = \dim V \text{.} \end{equation*}
This is merely another version of Corollary 44.5.16, where we realize \(\R^n\) and \(\C^n\) as spaces of \(1 \times n\) row vectors instead of \(n \times 1\) column vectors. In other words, in realizing \(V \iso \R^n\) (or \(V \iso \C^n\text{,}\) as appropriate) via coordinate maps, we naturally associate vectors in \(V\) with column vectors, whereas in realizing \(\vecdual{V} \iso \R^n\) (or \(\vecdual{V} \iso \C^n\text{,}\) as appropriate) it is actually more natural to associate vectors in \(\vecdual{V}\) with row vectors.
These associations allow us to think of evaluation of a linear functional \(f\) in \(\vecdual{V}\) on a vector \(\uvec{v}\) in \(V\) as a dot-product-like pairing
\begin{equation*} f(\uvec{v}) = \matrixOf{f}{B'B} \matrixOf{\uvec{v}}{B} \text{,} \end{equation*}
so that evaluation \(f(\uvec{v})\) is akin to multiplication of the row vector for \(f\) times the column vector for \(\uvec{v}\text{.}\)
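For instance, here is a hedged sketch with an arbitrary functional on \(\R^3\) (the particular row and column entries are our own example).

```python
import numpy as np

# Sketch (example ours): a linear functional f on R^3 represented as a 1 x 3
# row vector, so that evaluating f is row-times-column multiplication.
f_row = np.array([[2.0, -1.0, 3.0]])     # the row vector representing f
v_col = np.array([[1.0], [4.0], [0.0]])  # the coordinate column vector of v

print(f_row @ v_col)  # [[-2.]]  since f(v) = 2(1) - 1(4) + 3(0) = -2
```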
Corollary 45.5.9. Dimension of \(L(V,W)\).
The dimension of the space of transformations between two finite-dimensional vector spaces is the product of their dimensions. That is, if
\begin{equation*} n = \dim V \text{,} \qquad m = \dim W \text{,} \end{equation*}
then
\begin{equation*} \dim L(V,W) = m n \text{.} \end{equation*}