If the spaces
\(W_1,W_2,\dotsc,W_\ell\) are independent, then by definition their bases taken all together remain linearly independent. Since a subcollection of an independent set is still independent (
Statement 2 of
Proposition 18.5.3), it follows that the collection of final null space vectors taken from each cyclic basis will remain independent.
Now consider the reverse implication. We will prove this statement in the case of two cyclic spaces,
\(W_1\) and
\(W_2\text{.}\) (The proof in the case of more than two spaces is similar, but much more tedious.) The proof we give is similar to the independence argument in the proof of
Theorem 32.6.3.
Suppose
\begin{equation*}
\{ \uvec{u}, A\uvec{u}, \dotsc, A^{k-1} \uvec{u} \}
\end{equation*}
is a cyclic basis for \(W_1\) and
\begin{equation*}
\{ \uvec{v}, A\uvec{v}, \dotsc, A^{m-1} \uvec{v} \}
\end{equation*}
is a cyclic basis for \(W_2\text{.}\) Further assume that
\begin{equation*}
\{ A^{k-1} \uvec{u}, A^{m-1} \uvec{v} \}
\end{equation*}
is a linearly independent set contained in the null space of \(A\text{.}\) Note that these vectors being in the null space means that
\begin{equation*}
A^k \uvec{u} = \zerovec, \qquad A^m \uvec{v} = \zerovec.
\end{equation*}
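For concreteness, here is a small illustrative instance of this setup (our own example, not part of the argument to follow): with \(k = 2\) and \(m = 3\text{,}\) take
\begin{equation*}
A =
\begin{bmatrix}
0 \amp 0 \amp 0 \amp 0 \amp 0 \\
1 \amp 0 \amp 0 \amp 0 \amp 0 \\
0 \amp 0 \amp 0 \amp 0 \amp 0 \\
0 \amp 0 \amp 1 \amp 0 \amp 0 \\
0 \amp 0 \amp 0 \amp 1 \amp 0
\end{bmatrix},
\qquad
\uvec{u} = \uvec{e}_1, \qquad \uvec{v} = \uvec{e}_3,
\end{equation*}
where \(\uvec{e}_j\) denotes the \(j\)th standard basis vector. Then \(A \uvec{u} = \uvec{e}_2\) and \(A^2 \uvec{u} = \zerovec\text{,}\) while \(A \uvec{v} = \uvec{e}_4\text{,}\) \(A^2 \uvec{v} = \uvec{e}_5\text{,}\) and \(A^3 \uvec{v} = \zerovec\text{.}\) The cyclic bases are \(\{ \uvec{e}_1, \uvec{e}_2 \}\) and \(\{ \uvec{e}_3, \uvec{e}_4, \uvec{e}_5 \}\text{,}\) and the final null space vectors \(A \uvec{u} = \uvec{e}_2\) and \(A^2 \uvec{v} = \uvec{e}_5\) are clearly linearly independent.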
Case \(m=k\).
Assume that
\begin{equation*}
a_0 \uvec{u} + a_1 A \uvec{u} + \dotsb + a_{k-1} A^{k-1} \uvec{u}
+ b_0 \uvec{v} + b_1 A \uvec{v} + \dotsb + b_{k-1} A^{k-1} \uvec{v}
= \zerovec\text{.}
\end{equation*}
We wish to show that all of the scalars in the above linear combination are zero. First, multiply each side of the equation by \(A^{k-1}\text{.}\) Since
\begin{equation*}
A^k \uvec{u} = A^k \uvec{v} = \zerovec\text{,}
\end{equation*}
this eliminates all of the terms except the \(a_0\) and \(b_0\) terms, leaving
\begin{equation*}
a_0 A^{k-1} \uvec{u} + b_0 A^{k-1} \uvec{v} = \zerovec \text{.}
\end{equation*}
However, we have assumed that \(A^{k-1} \uvec{u}\) and \(A^{k-1} \uvec{v}\) are linearly independent. Therefore, we must have \(a_0 = b_0 = 0\text{,}\) leaving
\begin{equation*}
a_1 A \uvec{u} + \dotsb + a_{k-1} A^{k-1} \uvec{u}
+ b_1 A \uvec{v} + \dotsb + b_{k-1} A^{k-1} \uvec{v}
= \zerovec
\end{equation*}
in the original homogeneous vector equation. Multiplying both sides of this new equation by \(A^{k-2}\text{,}\) we can use the same argument to conclude that \(a_1 = b_1 = 0\text{.}\) Continuing in this fashion, we can argue that all the coefficients are zero.
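To make the pattern concrete in the smallest nontrivial case (our own illustration, with \(k = 2\)): the homogeneous equation reads
\begin{equation*}
a_0 \uvec{u} + a_1 A \uvec{u} + b_0 \uvec{v} + b_1 A \uvec{v} = \zerovec\text{.}
\end{equation*}
Multiplying both sides by \(A\) eliminates the \(a_1\) and \(b_1\) terms (since \(A^2 \uvec{u} = A^2 \uvec{v} = \zerovec\)), leaving \(a_0 A \uvec{u} + b_0 A \uvec{v} = \zerovec\text{,}\) so that \(a_0 = b_0 = 0\) by the assumed independence of \(A \uvec{u}\) and \(A \uvec{v}\text{.}\) What remains is \(a_1 A \uvec{u} + b_1 A \uvec{v} = \zerovec\text{,}\) which forces \(a_1 = b_1 = 0\) by the same independence.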
Case \(m \gt k\).
Again, assume that
\begin{equation*}
a_0 \uvec{u} + a_1 A \uvec{u} + \dotsb + a_{k-1} A^{k-1} \uvec{u}
+ b_0 \uvec{v} + b_1 A \uvec{v} + \dotsb + b_{m-1} A^{m-1} \uvec{v}
= \zerovec\text{.}
\end{equation*}
Multiply both sides of this equation by \(A^{m-1}\text{.}\) Since \(m \gt k\text{,}\) this will eliminate all of the \(\uvec{u}\) terms, and just leave \(b_0 A^{m-1} \uvec{v} = \zerovec\text{.}\) Since \(A^{m-1} \uvec{v}\) is part of a basis, it cannot be zero, hence \(b_0 = 0\text{.}\) We are then left with
\begin{equation*}
a_0 \uvec{u} + a_1 A \uvec{u} + \dotsb + a_{k-1} A^{k-1} \uvec{u}
+ b_1 A \uvec{v} + \dotsb + b_{m-1} A^{m-1} \uvec{v}
= \zerovec
\end{equation*}
in the original homogeneous vector equation. Continuing in this fashion, we can argue that \(b_0 = b_1 = \dotsb = b_{m-k-1} = 0\text{,}\) leaving
\begin{align*}
a_0 \uvec{u} \amp + a_1 A \uvec{u} + \dotsb + a_{k-1} A^{k-1} \uvec{u}\\
\amp + b_{m-k} A^{m-k} \uvec{v} + b_{m-k+1} A^{m-k+1} \uvec{v} + \dotsb + b_{m-1} A^{m-1} \uvec{v}
= \zerovec\text{.}
\end{align*}
If we set \(\uvec{v}' = A^{m-k}\uvec{v}\text{,}\) we then have
\begin{align*}
a_0 \uvec{u} \amp + a_1 A \uvec{u} + \dotsb + a_{k-1} A^{k-1} \uvec{u}\\
\amp + b_{m-k} \uvec{v}' + b_{m-k+1} A \uvec{v}' + \dotsb + b_{m-1} A^{k-1} \uvec{v}'
= \zerovec\text{,}
\end{align*}
and from here we can repeat the argument of the \(m=k\) case, since \(A^k \uvec{v}' = A^m \uvec{v} = \zerovec\) and \(A^{k-1} \uvec{v}' = A^{m-1} \uvec{v}\text{,}\) which by assumption is linearly independent from \(A^{k-1} \uvec{u}\text{.}\)
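As a small illustration of this case (our own, with \(k = 2\) and \(m = 3\)): the combination is
\begin{equation*}
a_0 \uvec{u} + a_1 A \uvec{u} + b_0 \uvec{v} + b_1 A \uvec{v} + b_2 A^2 \uvec{v} = \zerovec\text{.}
\end{equation*}
Multiplying both sides by \(A^2\) eliminates every \(\uvec{u}\) term (since \(A^2 \uvec{u} = \zerovec\)) along with the \(b_1\) and \(b_2\) terms, leaving \(b_0 A^2 \uvec{v} = \zerovec\text{,}\) so \(b_0 = 0\text{.}\) Setting \(\uvec{v}' = A \uvec{v}\) turns what remains into
\begin{equation*}
a_0 \uvec{u} + a_1 A \uvec{u} + b_1 \uvec{v}' + b_2 A \uvec{v}' = \zerovec\text{,}
\end{equation*}
which is exactly an instance of the \(m = k\) case.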
Case \(m \lt k\).
Argue exactly as in the
\(m \gt k\) case with the roles of
\(m\) and
\(k\) reversed.