Section 43.5 Theory
Subsection 43.5.1 Kernel and image are subspaces
Theorem 43.5.1.
For a linear transformation \(\funcdef{T}{V}{W}\text{,}\) \(\ker T\) is a subspace of the domain space \(V\) and \(\im T\) is a subspace of the codomain space \(W\text{.}\)
Proof: Kernel is a subspace.
Nonempty.
By Statement 1 of Proposition 42.5.1, \(\ker T\) always contains \(\zerovec_V\text{,}\) the zero vector in the domain space \(V\text{.}\)
Closed under addition.
Suppose \(\uvec{v}_1,\uvec{v}_2\) are in \(\ker T\text{,}\) so that \(T(\uvec{v}_1) = \zerovec_W\) and \(T(\uvec{v}_2) = \zerovec_W\text{.}\) Is \(\uvec{v}_1 + \uvec{v}_2\) also in \(\ker T\text{?}\) Using the additivity of \(T\text{,}\) we have
\begin{equation*}
T(\uvec{v}_1 + \uvec{v}_2)
= T(\uvec{v}_1) + T(\uvec{v}_2)
= \zerovec_W + \zerovec_W
= \zerovec_W\text{,}
\end{equation*}
so yes, \(\uvec{v}_1 + \uvec{v}_2\) is also in \(\ker T\text{.}\)
Closed under scalar multiplication.
Suppose \(\uvec{v}\) is in \(\ker T\text{,}\) so that \(T(\uvec{v}) = \zerovec_W\text{.}\) Is \(k \uvec{v}\) also in \(\ker T\) for arbitrary scalar \(k\text{?}\) Using the homogeneity of \(T\text{,}\) we have
\begin{equation*}
T(k \uvec{v}) = k T(\uvec{v}) = k \zerovec_W = \zerovec_W \text{,}
\end{equation*}
so yes, \(k \uvec{v}\) is also in \(\ker T\text{.}\)
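As a quick concrete check of these closure arguments (with the transformation chosen purely for illustration), take \(\funcdef{T}{\mathbb{R}^3}{\mathbb{R}^2}\) defined by \(T(x,y,z) = (x - y, y - z)\text{.}\) A vector lies in the kernel exactly when \(x = y\) and \(y = z\text{,}\) so
\begin{equation*}
\ker T = \{ (t,t,t) : t \text{ in } \mathbb{R} \} = \Span \{ (1,1,1) \} \text{,}
\end{equation*}
and indeed both \((a,a,a) + (b,b,b) = (a+b,a+b,a+b)\) and \(k (a,a,a) = (ka,ka,ka)\) remain in \(\ker T\text{,}\) exactly as the two closure arguments predict.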
Proof: Image is a subspace.
Nonempty.
By Statement 1 of Proposition 42.5.1, \(\im T\) always contains \(\zerovec_W\text{,}\) the zero vector in the codomain space \(W\text{.}\)
Closed under addition.
Suppose \(\uvec{w}_1,\uvec{w}_2\) are in \(\im T\text{,}\) so that there exist \(\uvec{v}_1,\uvec{v}_2\) in \(V\) with
\begin{align*}
T(\uvec{v}_1) \amp = \uvec{w}_1 \text{,} \amp
T(\uvec{v}_2) \amp = \uvec{w}_2\text{.}
\end{align*}
Is \(\uvec{w}_1 + \uvec{w}_2\) also in \(\im T\text{?}\) Using the additivity of \(T\text{,}\) we have
\begin{equation*}
\uvec{w}_1 + \uvec{w}_2
= T(\uvec{v}_1) + T(\uvec{v}_2)
= T(\uvec{v}_1 + \uvec{v}_2)\text{,}
\end{equation*}
so yes, \(\uvec{w}_1 + \uvec{w}_2\) is also in \(\im T\text{.}\)
Closed under scalar multiplication.
Suppose \(\uvec{w}\) is in \(\im T\text{,}\) so that there exists \(\uvec{v}\) in \(V\) with
\begin{equation*}
T(\uvec{v}) = \uvec{w} \text{.}
\end{equation*}
Is \(k \uvec{w}\) also in \(\im T\) for arbitrary scalar \(k\text{?}\) Using the homogeneity of \(T\text{,}\) we have
\begin{equation*}
k \uvec{w} = k T(\uvec{v}) = T(k \uvec{v}) \text{,}
\end{equation*}
so yes, \(k \uvec{w}\) is also in \(\im T\text{.}\)
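For an illustration where the image is a proper subspace of the codomain (again with the transformation chosen purely for illustration), take \(\funcdef{T}{\mathbb{R}^2}{\mathbb{R}^3}\) defined by \(T(x,y) = (x + y, x + y, 2 x + 2 y)\text{.}\) Every image vector is a scalar multiple of \((1,1,2)\text{:}\)
\begin{equation*}
\im T = \{ (x + y) (1,1,2) : x, y \text{ in } \mathbb{R} \} = \Span \{ (1,1,2) \} \text{,}
\end{equation*}
a line through the origin in the codomain space, and sums and scalar multiples of multiples of \((1,1,2)\) are again multiples of \((1,1,2)\text{,}\) just as the closure arguments require.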
Subsection 43.5.2 Basis and dimension of kernel and image
First we formally state the pattern of Discovery 43.5, which we discussed further in Subsection 43.3.2.
Lemma 43.5.2. Image of a spanning set is a spanning set for the image.
If \(\funcdef{T}{V}{W}\) is linear and
\begin{equation*}
S = \{ \uvec{v}_1, \uvec{v}_2, \dotsc, \uvec{v}_\ell \}
\end{equation*}
is a spanning set for the domain space \(V\text{,}\) then the collection of image vectors
\begin{equation*}
T(S) = \{ T(\uvec{v}_1), T(\uvec{v}_2), \dotsc, T(\uvec{v}_\ell) \}
\end{equation*}
is a spanning set for \(\im T\text{.}\)
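To illustrate the lemma with the second transformation above (still purely illustrative), \(T(x,y) = (x + y, x + y, 2 x + 2 y)\text{:}\) the spanning set \(S = \{ (1,0), (0,1) \}\) of the domain space \(\mathbb{R}^2\) has image
\begin{equation*}
T(S) = \{ T(1,0), T(0,1) \} = \{ (1,1,2), (1,1,2) \} \text{,}
\end{equation*}
which spans \(\im T = \Span \{ (1,1,2) \}\) as the lemma promises, though clearly not independently. Extracting a basis for the image from such a spanning set is precisely the point of the next theorem.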
Now we will justify the conclusion of Procedure 43.3.1.
Theorem 43.5.3. Basis for image.
Suppose \(\funcdef{T}{V}{W}\) is linear with \(V\) finite-dimensional, and
\begin{align*}
\basisfont{K} \amp = \{ \uvec{u}_1, \dotsc, \uvec{u}_k \} \text{,} \amp
\basisfont{K}' \amp = \{ \uvec{v}_1, \dotsc, \uvec{v}_r \}
\end{align*}
are collections of vectors in \(V\) so that
- \(\basisfont{K}\) is a basis for \(\ker T\text{,}\) and
- the vectors of \(\basisfont{K},\basisfont{K}'\text{,}\) taken all together, form a basis for \(V\text{.}\)
Then the collection of image vectors
\begin{equation*}
T(\basisfont{K}') = \{ T(\uvec{v}_1), \dotsc, T(\uvec{v}_r) \}
\end{equation*}
is a basis for \(\im T\text{.}\)
Proof.
We need to establish that the collection \(T(\basisfont{K}')\) is both linearly independent and a spanning set for \(\im T\text{.}\)
Linear independence.
To apply the Test for Linear Dependence/Independence (Procedure 18.3.1), we begin with a homogeneous vector equation
\begin{equation*}
c_1 T(\uvec{v}_1) + c_2 T(\uvec{v}_2) + \dotsb + c_r T(\uvec{v}_r) = \zerovec_W \text{.}
\end{equation*}
The linearity of \(T\) can be used to collapse the linear combination on the left into a single image vector
\begin{equation*}
T(c_1 \uvec{v}_1 + c_2 \uvec{v}_2 + \dotsb + c_r \uvec{v}_r) = \zerovec_W \text{,}
\end{equation*}
which implies that the domain space linear combination
\begin{equation*}
c_1 \uvec{v}_1 + c_2 \uvec{v}_2 + \dotsb + c_r \uvec{v}_r
\end{equation*}
is in \(\ker T\text{.}\) On the other hand, this combination is built from the vectors of \(\basisfont{K}'\text{,}\) so it certainly lies in \(\Span \basisfont{K}'\text{.}\) As \(\ker T = \Span \basisfont{K}\text{,}\) the combination is therefore simultaneously in both of the subspaces \(\Span \basisfont{K}\) and \(\Span \basisfont{K}'\text{.}\) However, these two spaces are independent: we have assumed that \(\basisfont{K}\) and \(\basisfont{K}'\) together form a basis for \(V\text{,}\) so these two collections form an independent set when taken together. By Theorem 28.6.4, a pair of independent spaces can only intersect at the zero vector, so we may conclude that
\begin{equation*}
c_1 \uvec{v}_1 + c_2 \uvec{v}_2 + \dotsb + c_r \uvec{v}_r = \zerovec_V \text{.}
\end{equation*}
As the vectors in \(\basisfont{K}'\) are assumed to be independent (since they form part of a basis for \(V\)), the only way this last vector equation can hold is the trivial way:
\begin{equation*}
c_1 = c_2 = \dotsb = c_r = 0 \text{.}
\end{equation*}
In particular, since all of the \(c_j\) scalars must be zero, we can conclude that the vectors in \(T(\basisfont{K}')\) are independent, as desired.
Spans.
Write \(\basisfont{B}_V\) for the collection of all vectors from \(\basisfont{K}\) and \(\basisfont{K}'\) taken together, which we have assumed is a basis for the domain space \(V\text{.}\) By Lemma 43.5.2, the collection of image vectors
\begin{equation*}
T(\basisfont{B}_V) = \{
T(\uvec{u}_1), \dotsc, T(\uvec{u}_k),
T(\uvec{v}_1), \dotsc, T(\uvec{v}_r)
\}
\end{equation*}
is a spanning set for \(\im T\text{.}\) But since the \(\uvec{u}_j\) are in \(\ker T\text{,}\) we actually have
\begin{equation*}
T(\basisfont{B}_V) = \{
\zerovec_W, \dotsc, \zerovec_W,
T(\uvec{v}_1), \dotsc, T(\uvec{v}_r)
\}\text{,}
\end{equation*}
and clearly the collection of image vectors
\begin{equation*}
T(\basisfont{K}') = \{ T(\uvec{v}_1), \dotsc, T(\uvec{v}_r) \}
\end{equation*}
will span the same space that \(T(\basisfont{B}_V)\) does, since discarding zero vectors from a spanning set does not change the span.
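To see the statement of the theorem in a small case, return to the purely illustrative transformation \(T(x,y,z) = (x - y, y - z)\) from above, for which \(\ker T = \Span \{ (1,1,1) \}\text{.}\) Take \(\basisfont{K} = \{ (1,1,1) \}\) and enlarge with \(\basisfont{K}' = \{ (0,1,0), (0,0,1) \}\) to obtain a basis of \(\mathbb{R}^3\text{.}\) Then
\begin{equation*}
T(\basisfont{K}') = \{ T(0,1,0), T(0,0,1) \} = \{ (-1,1), (0,-1) \} \text{,}
\end{equation*}
an independent pair in \(\mathbb{R}^2\text{,}\) and so a basis for \(\im T = \mathbb{R}^2\text{,}\) just as the theorem asserts.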
Finally, we can connect the dimensions of the kernel and image of a transformation.
Theorem 43.5.4. Dimension Theorem.
Suppose \(\funcdef{T}{V}{W}\) is linear with \(V\) finite-dimensional. Then the sum of the rank and nullity of \(T\) is the dimension of the domain space \(V\text{.}\) That is,
\begin{equation*}
\dim (\ker T) + \dim (\im T) = \dim V \text{.}
\end{equation*}
Proof.
Let \(\basisfont{K}\) be a basis for \(\ker T\text{,}\) and let \(\basisfont{K}'\) be a collection of vectors in \(V\) that enlarges \(\basisfont{K}\) to a basis \(\basisfont{B}_V\) of \(V\) (Proposition 20.5.4). Then Theorem 43.5.3 says that the collection of image vectors \(T(\basisfont{K}')\) is a basis for \(\im T\text{.}\) Now, the collection \(T(\basisfont{K}')\) cannot contain any duplicates because it is linearly independent. So \(T(\basisfont{K}')\) contains exactly the same number of vectors as \(\basisfont{K}'\text{.}\)
Using the notation \(\# S\) to mean the number of vectors in collection \(S\text{,}\) we now have
\begin{gather*}
\dim (\ker T) = \# \basisfont{K} \text{,} \\
\dim (\im T) = \# T(\basisfont{K}') = \# \basisfont{K}' \text{,} \\
\dim V = \# \basisfont{K} + \# \basisfont{K}' \text{,}
\end{gather*}
and the result follows.
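Both illustrative transformations used earlier in this section confirm this count: for \(T(x,y,z) = (x - y, y - z)\) we found \(\ker T = \Span \{ (1,1,1) \}\) and \(\im T = \mathbb{R}^2\text{,}\) while for \(T(x,y) = (x + y, x + y, 2 x + 2 y)\) the kernel is \(\Span \{ (1,-1) \}\) and the image is \(\Span \{ (1,1,2) \}\text{,}\) so that, respectively,
\begin{gather*}
1 + 2 = 3 = \dim \mathbb{R}^3 \text{,} \\
1 + 1 = 2 = \dim \mathbb{R}^2 \text{,}
\end{gather*}
in agreement with the Dimension Theorem.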

