Section 18.5 Theory

Subsection 18.5.1 Basic facts about linear dependence/independence

First we'll formally record our test, but we will let our discussion in Subsection 18.3.2 serve as a proof.

We will further explore the connection between linear dependence/independence and spanning sets in the next subsection below, but for now recall that we introduced these new concepts to help us determine when a spanning set could be reduced. The next statement reflects the fact that the zero vector does not help span anything other than itself, so it is not useful as a member of a spanning set.

Suppose \(S\) is a set of vectors containing the zero vector. We'll break into cases depending on what else is in \(S\) besides \(\zerovec\text{.}\)

\(S\) contains at least one nonzero vector \(\uvec{v}\).

But then \(\zerovec\) can be expressed as the linear combination \(\zerovec = 0\uvec{v}\text{.}\) Since we have found a vector in \(S\) that can be expressed as a linear combination of another vector in \(S\text{,}\) the set of vectors is linearly dependent.

\(S\) contains no nonzero vectors.

In this case \(S\) consists of just the zero vector, and applying the Test for Linear Dependence/Independence to \(S\) yields the vector equation \(k\zerovec = \zerovec\text{,}\) which is satisfied by every nonzero scalar \(k\text{.}\) Since nontrivial solutions exist, \(S\) is again linearly dependent.
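This fact is easy to check numerically for vectors in \(\R^n\text{:}\) a finite set of such vectors is linearly independent exactly when the matrix having them as columns has rank equal to the number of columns. Here is a small sketch using NumPy (the particular vectors are illustrative, not part of the text's development):

```python
import numpy as np

# Columns of A are the vectors of S; one of them is the zero vector.
A = np.column_stack([
    np.array([1.0, 2.0, 0.0]),   # a nonzero vector v
    np.array([0.0, 0.0, 0.0]),   # the zero vector
])

# A finite set of vectors in R^n is linearly independent exactly when
# the matrix having those vectors as columns has rank equal to the
# number of columns.
num_vectors = A.shape[1]
rank = np.linalg.matrix_rank(A)
print(rank < num_vectors)  # True: the set is linearly dependent
```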

Here are some facts about how linear dependence/independence behaves when enlarging/reducing collections of vectors.

Suppose \(S\) is a collection of vectors in a vector space, and \(S'\) is a linearly dependent subcollection of \(S\text{.}\) Then some vector in \(S'\) can be expressed as a linear combination of other vectors in \(S'\text{.}\) But because \(S'\) is a subcollection of \(S\text{,}\) all these vectors in \(S'\) are also vectors in \(S\text{.}\) So we can also say that some vector in \(S\) can be expressed as a linear combination of other vectors in \(S\text{,}\) making \(S\) a linearly dependent set.

Suppose \(S\) is a linearly independent collection of vectors in a vector space. Then no subcollection of \(S\) can be linearly dependent, because if \(S\) contained such a linearly dependent subcollection then Statement 1 of this proposition would imply that \(S\) itself is linearly dependent, which we assume it is not. So every subcollection of \(S\) must be linearly independent.

Subsection 18.5.2 Linear dependence/independence of spanning sets

First we'll record our observation about preserving spans when reducing spanning sets. Then, in the following proposition, we'll take this idea to its logical conclusion.

Using Statement 2 of Proposition 17.5.6, we just need to show that every vector in \(S\) can be expressed as a linear combination of vectors in \(S'\text{,}\) and vice versa. However, \(S\) and \(S'\) are the same set of vectors except that \(S\) contains \(\uvec{w}\) while \(S'\) does not. So from the trivial expression \(\uvec{v} = 1\uvec{v}\text{,}\) we immediately have that every vector \(\uvec{v}\) in \(S\) (other than \(\uvec{w}\)) is a linear combination of itself (which is a vector in \(S'\)), and vice versa. And we have also assumed that \(\uvec{w}\) is expressible as a linear combination of vectors in \(S\) besides itself. Since the vectors in such a linear combination are also in \(S'\text{,}\) we know that \(\uvec{w}\) is expressible as a linear combination of vectors in \(S'\text{.}\)

If \(S\) is already linearly independent, then we have our desired linearly independent spanning set. Otherwise, there is some vector \(\uvec{w}\) in \(S\) that is a linear combination of the other vectors in \(S\text{.}\) If we set \(S'\) to be the subcollection of \(S\) consisting of every vector except \(\uvec{w}\text{,}\) then from Lemma 18.5.4 we know that \(S'\) will remain a spanning set for the vector space. If \(S'\) is linearly independent, then we have our desired linearly independent spanning set. Otherwise, we can continue removing redundant vectors in this way until we end up with a linearly independent spanning set. And since we assumed there were a finite number of vectors in \(S\text{,}\) this one-by-one removal process must come to an end at some point.
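For vectors in \(\R^n\text{,}\) this one-by-one removal process can be sketched numerically, with a rank comparison standing in for the check that a vector is a linear combination of the others (the function name and example vectors are illustrative):

```python
import numpy as np

def prune_to_independent(vectors):
    """Remove redundant vectors one at a time until the remaining
    collection is linearly independent, without shrinking the span.

    A vector is redundant when dropping it leaves the rank (and hence
    the span) unchanged.
    """
    S = list(vectors)
    i = 0
    while i < len(S):
        rest = S[:i] + S[i + 1:]
        full_rank = np.linalg.matrix_rank(np.column_stack(S))
        if rest and np.linalg.matrix_rank(np.column_stack(rest)) == full_rank:
            S = rest          # S[i] was a linear combination of the others
        else:
            i += 1            # S[i] is needed; keep it
    return S

# Example: three vectors in R^3 where the third is the sum of the first two.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + v2
reduced = prune_to_independent([v1, v2, v3])
print(len(reduced))  # 2: a linearly independent set with the same span
```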

Just as a vector that points up out of a plane in \(\R^3\) must be linearly independent from vectors parallel to the plane, in any vector space we can enlarge a linearly independent set by including new vectors that are not linear combinations of the old. The next statement encapsulates this idea, and will help us in the next chapter to develop a “bottom-up” approach to building an optimal spanning set for a vector space, as opposed to the “top-down” approach made possible by Lemma 18.5.4 and Proposition 18.5.5.

Write \(S'\) for the set of vectors containing both the vector \(\uvec{v}\) and every vector in \(S\text{.}\) The set \(S'\) will be linearly independent if none of its vectors can be expressed as a linear combination of other vectors in the set.

So suppose \(\uvec{w}\) is a vector in \(S'\text{.}\) There are two cases to consider.

Case \(\uvec{w} = \uvec{v}\).

In this case, we already know that \(\uvec{w}\) cannot be a linear combination of other vectors in \(S'\text{,}\) because the other vectors in \(S'\) are the vectors in \(S\text{,}\) and we assumed that \(\uvec{v}\) is not in \(\Span S\text{.}\)

Case \(\uvec{w} \neq \uvec{v}\).

In this case, \(\uvec{w}\) is in the set \(S\text{.}\) Since \(S\) is assumed to be linearly independent, we know that \(\uvec{w}\) cannot be a linear combination of other vectors from just \(S\text{.}\) Could it be a linear combination involving other vectors in \(S\) and \(\uvec{v}\text{?}\) Suppose we had

\begin{equation*} \uvec{w} = k_1\uvec{u}_1 + k_2\uvec{u}_2 + \dotsb + k_m\uvec{u}_m + k\uvec{v} \text{,} \end{equation*}

for some vectors \(\uvec{u}_1,\uvec{u}_2,\dotsc,\uvec{u}_m\) in \(S\) and scalars \(k_1,k_2,\dotsc,k_m,k\) (assuming \(k\neq 0\) so that \(\uvec{v}\) is indeed involved in the linear combination). But then we could isolate \(\uvec{v}\) as

\begin{equation*} \uvec{v} = \frac{1}{k}\uvec{w} - \frac{k_1}{k}\uvec{u}_1 - \frac{k_2}{k}\uvec{u}_2 - \dotsb - \frac{k_m}{k}\uvec{u}_m\text{,} \end{equation*}

a linear combination of vectors in \(S\text{,}\) which is not possible because we have assumed that \(\uvec{v}\) is not in \(\Span S\text{.}\)

The final fact below records our observation in Discovery 18.5 and Subsection 18.3.4 that beyond a certain size, a collection of vectors becomes too numerous to be linearly independent.

Suppose \(S\) is a set of \(n\) vectors in a vector space such that \(S\) is a spanning set for the space. By Proposition 18.5.5, there are vectors \(\uvec{v}_1,\uvec{v}_2,\dotsc,\uvec{v}_{n'}\) in \(S\) that are both linearly independent and also a spanning set for the vector space. Since this is a subcollection of \(S\text{,}\) we must have \(n' \le n\text{.}\) We'll refer to this subcollection of \(S\) as \(S'\text{.}\)

Now further suppose we have a collection of vectors \(\uvec{w}_1,\uvec{w}_2,\dotsc,\uvec{w}_m\) in the vector space, with \(m>n\text{.}\) Since \(S'\) is a spanning set, we can express each \(\uvec{w}_i\) as a linear combination of the vectors in \(S'\text{:}\)

\begin{align*} \uvec{w}_1 \amp= a_{11}\uvec{v}_1 + a_{21}\uvec{v}_2 + \dotsb + a_{n'1}\uvec{v}_{n'}, \\ \uvec{w}_2 \amp= a_{12}\uvec{v}_1 + a_{22}\uvec{v}_2 + \dotsb + a_{n'2}\uvec{v}_{n'}, \\ \amp\;\;\vdots \\ \uvec{w}_m \amp= a_{1m}\uvec{v}_1 + a_{2m}\uvec{v}_2 + \dotsb + a_{n'm}\uvec{v}_{n'}. \end{align*}

Let's apply the Test for Linear Dependence/Independence to \(\uvec{w}_1,\uvec{w}_2,\dotsc,\uvec{w}_m\text{:}\) suppose there are scalars \(k_1,k_2,\dotsc,k_m\) so that

\begin{equation*} k_1\uvec{w}_1 + k_2\uvec{w}_2 + \dotsb + k_m\uvec{w}_m = \zerovec \text{.} \end{equation*}

If we substitute in the above expressions for each \(\uvec{w}_i\) in terms of the vectors in \(S'\) and collect like terms, we get

\begin{gather*} (a_{11}k_1 + a_{12}k_2 + \dotsb + a_{1m}k_m) \uvec{v}_1 + (a_{21}k_1 + a_{22}k_2 + \dotsb + a_{2m}k_m) \uvec{v}_2\\ {} + \dotsb + (a_{n'1}k_1 + a_{n'2}k_2 + \dotsb + a_{n'm}k_m) \uvec{v}_{n'} = \zerovec\text{.} \end{gather*}

If we set \(c_1\) to be the coefficient expression on \(\uvec{v}_1\) in the expression above, and \(c_2\) to be the coefficient expression on \(\uvec{v}_2\text{,}\) and so on, then we obtain a vector equality

\begin{equation*} c_1\uvec{v}_1 + c_2\uvec{v}_2 + \dotsb + c_{n'}\uvec{v}_{n'} = \zerovec \text{.} \end{equation*}

But the vectors in this linear combination are the vectors in \(S'\text{,}\) and we have assumed that \(S'\) is a linearly independent set. So this vector equality can only be true for the trivial solution where each \(c_i = 0\text{.}\) This leads to the homogeneous system

\begin{equation*} \left\{\begin{array}{rcrcccrcr} a_{11}k_1 \amp + \amp a_{12}k_2 \amp + \amp \dotsb \amp + \amp a_{1m}k_m \amp = \amp 0 \text{,} \\ a_{21}k_1 \amp + \amp a_{22}k_2 \amp + \amp \dotsb \amp + \amp a_{2m}k_m \amp = \amp 0 \text{,} \\ \amp \amp \amp \amp \vdots \amp \amp \amp \\ a_{n'1}k_1 \amp + \amp a_{n'2}k_2 \amp + \amp \dotsb \amp + \amp a_{n'm}k_m \amp = \amp 0 \text{,} \end{array}\right. \end{equation*}

in the variables \(k_1,k_2,\dotsc,k_m\text{.}\) Now, we have assumed \(m > n \ge n'\text{,}\) so we have more variables than equations in the homogeneous system above. But then the solution will require parameters, which means there are nontrivial solutions. Thus, the Test for Linear Dependence/Independence tells us that the collection \(\uvec{w}_1,\uvec{w}_2,\dotsc,\uvec{w}_m\) is linearly dependent.
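For vectors in \(\R^n\text{,}\) the nontrivial solution guaranteed by this argument can be computed explicitly. The following NumPy sketch does so for four vectors in \(\R^3\) (so \(m = 4 > 3 = n\text{;}\) the particular vectors are illustrative), extracting a null-space direction from the singular value decomposition:

```python
import numpy as np

# Four vectors in R^3 (m = 4 > 3 = n), collected as the columns of A.
# The homogeneous system A k = 0 has more variables than equations,
# so it must have nontrivial solutions, exhibiting a linear dependence.
A = np.column_stack([
    np.array([1.0, 0.0, 0.0]),
    np.array([0.0, 1.0, 0.0]),
    np.array([0.0, 0.0, 1.0]),
    np.array([1.0, 2.0, 3.0]),
])

# A nontrivial solution k lies in the null space of A; the right
# singular vectors beyond the rank of A span that null space.
_, _, Vt = np.linalg.svd(A)
k = Vt[-1]                        # a unit vector in the null space

print(np.allclose(A @ k, 0))      # True: k_1 w_1 + ... + k_4 w_4 = 0
print(np.any(k != 0))             # True: the solution is nontrivial
```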