Section 18.5 Theory
Subsection 18.5.1 Basic facts about linear dependence/independence
First we'll formally record our test, but we will let our discussion in Subsection 18.3.2 serve as a proof.
Proposition 18.5.1. Test for Linear Dependence/Independence.
Vectors \(\uvec{v}_1,\uvec{v}_2,\dotsc,\uvec{v}_m\) are linearly dependent if the vector equation
\begin{equation*}
k_1 \uvec{v}_1 + k_2 \uvec{v}_2 + \dotsb + k_m \uvec{v}_m = \zerovec
\end{equation*}
has a nontrivial solution in the (scalar) variables \(k_1,k_2,\dotsc,k_m\text{.}\) Otherwise, if this vector equation has only the trivial solution \(k_1=0,k_2=0,\dotsc,k_m=0\text{,}\) then the vectors \(\uvec{v}_1,\uvec{v}_2,\dotsc,\uvec{v}_m\) are linearly independent.
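In coordinates, the test amounts to asking whether a homogeneous linear system has a nontrivial solution. The following NumPy sketch (an illustration we add here, not part of the text) checks this via a rank computation for column vectors in \(\R^n\text{:}\) the homogeneous system has only the trivial solution exactly when the matrix with the vectors as columns has rank equal to the number of vectors.

```python
import numpy as np

def linearly_independent(vectors):
    """Numerical sketch of the Test for Linear Dependence/Independence.

    The equation k1*v1 + ... + km*vm = 0 has only the trivial solution
    exactly when the matrix with v1, ..., vm as columns has rank m
    (no free variables in the homogeneous system).
    """
    A = np.column_stack(vectors)       # columns are v1, ..., vm
    m = A.shape[1]
    return np.linalg.matrix_rank(A) == m

# (1,1,0) is the sum of the first two vectors, so the second set is dependent
print(linearly_independent([np.array([1., 0., 0.]),
                            np.array([0., 1., 0.])]))            # True
print(linearly_independent([np.array([1., 0., 0.]),
                            np.array([0., 1., 0.]),
                            np.array([1., 1., 0.])]))            # False
```

Note that `matrix_rank` uses a numerical tolerance, so this is a floating-point approximation of the exact algebraic test.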
We will further explore the connection between linear dependence/independence and spanning sets in the next subsection, but for now recall that we introduced these new concepts to help us determine when a spanning set could be reduced. The next statement reflects the fact that the zero vector does not help span anything other than itself, so it is not useful as a member of a spanning set.
Proposition 18.5.2. Zero is linearly dependent.
Any set of vectors that contains the zero vector is linearly dependent.
Proof.
Suppose \(S\) is a set of vectors containing the zero vector. We'll break into cases depending on what else is in \(S\) besides \(\zerovec\text{.}\)
Case: \(S\) consists only of the zero vector.
Then \(S\) is linearly dependent by definition.
Case: \(S\) contains at least one nonzero vector \(\uvec{v}\).
But then \(\zerovec\) can be expressed as the linear combination \(\zerovec = 0\uvec{v}\text{.}\) Since we have exhibited a vector in \(S\) (namely, \(\zerovec\)) that can be expressed as a linear combination of other vectors in \(S\text{,}\) the set of vectors is linearly dependent.
Here are some facts about how linear dependence/independence behaves when enlarging/reducing collections of vectors.
Proposition 18.5.3. Dependence/independence versus subcollections.
- A collection of vectors that contains a subcollection that is linearly dependent is itself linearly dependent.
- In a linearly independent collection of vectors, every subcollection is also linearly independent.
Proof of Statement 1.
Suppose \(S\) is a collection of vectors in a vector space, and \(S'\) is a linearly dependent subcollection of \(S\text{.}\) Then some vector in \(S'\) can be expressed as a linear combination of other vectors in \(S'\text{.}\) But because \(S'\) is a subcollection of \(S\text{,}\) all these vectors in \(S'\) are also vectors in \(S\text{.}\) So we can also say that some vector in \(S\) can be expressed as a linear combination of other vectors in \(S\text{,}\) making \(S\) a linearly dependent set.
Proof of Statement 2.
Suppose \(S\) is a linearly independent collection of vectors in a vector space. Then no subcollection of \(S\) can be linearly dependent, because if \(S\) contained such a linearly dependent subcollection then Statement 1 of this proposition would imply that \(S\) itself is linearly dependent, which we assume it is not. So every subcollection of \(S\) must be linearly independent.
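Statement 1 can be checked numerically in coordinates. In the following sketch (an illustration we add here, not part of the proof), a dependent pair sits inside a larger collection, and the rank of the column matrix stays strictly below the number of vectors in both cases, confirming dependence of both the subcollection and the whole collection.

```python
import numpy as np

# {v1, v2} is dependent (v2 = 2*v1), and any larger collection
# containing that pair, such as {v1, v2, u}, is dependent as well:
# the rank stays strictly less than the number of vectors.
v1 = np.array([1., 1., 0.])
v2 = 2 * v1                            # dependent subcollection {v1, v2}
u  = np.array([0., 0., 1.])

sub = np.column_stack([v1, v2])
big = np.column_stack([v1, v2, u])

print(np.linalg.matrix_rank(sub) < 2)  # True: subcollection is dependent
print(np.linalg.matrix_rank(big) < 3)  # True: whole collection is dependent
```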
Subsection 18.5.2 Linear dependence/independence of spanning sets
First we'll record our observation about preserving spans when reducing spanning sets. Then, in the following proposition, we'll take this idea to its logical conclusion.
Lemma 18.5.4. Dependent spanning sets can be reduced.
Suppose \(S\) is a set of vectors in a vector space and \(\uvec{w}\) is both a vector in \(S\) and expressible as a linear combination of vectors in \(S\) besides itself. Then \(\Span S = \Span S'\text{,}\) where \(S'\) is the one-smaller set of vectors obtained by removing \(\uvec{w}\) from \(S\text{.}\)
Proof.
Using Statement 2 of Proposition 17.5.6, we just need to show that every vector in \(S\) can be expressed as a linear combination of vectors in \(S'\text{,}\) and vice versa. However, \(S\) and \(S'\) are the same set of vectors except that \(S\) contains \(\uvec{w}\) while \(S'\) does not. So from the trivial expression \(\uvec{v} = 1\uvec{v}\text{,}\) we immediately have that every vector \(\uvec{v}\) in \(S\) (other than \(\uvec{w}\)) is a linear combination of itself (which is a vector in \(S'\)), and vice versa. And we have also assumed that \(\uvec{w}\) is expressible as a linear combination of vectors in \(S\) besides itself. Since the vectors in such a linear combination are also in \(S'\text{,}\) we know that \(\uvec{w}\) is expressible as a linear combination of vectors in \(S'\text{.}\)
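The lemma can be illustrated numerically (a sketch we add here, not part of the proof). Since \(S'\) is a subcollection of \(S\) we always have \(\Span S' \subseteq \Span S\text{,}\) and in coordinates the two spans coincide exactly when the column-space dimensions agree:

```python
import numpy as np

# Removing a vector w that is a linear combination of the others
# from a spanning set S does not change the span: the column-space
# dimension (rank) is unchanged.
v1 = np.array([1., 0., 0.])
v2 = np.array([0., 1., 0.])
w  = 2 * v1 - 3 * v2               # w is redundant: a combination of v1, v2

S  = np.column_stack([v1, v2, w])  # the spanning set S
Sp = np.column_stack([v1, v2])     # S', with w removed

print(np.linalg.matrix_rank(S))    # 2
print(np.linalg.matrix_rank(Sp))   # 2  -- same dimension, so Span S = Span S'
```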
Proposition 18.5.5. Fully reducing finite spanning sets.
Every finite spanning set can be reduced to a linearly independent spanning set. That is, if \(S\) is a spanning set for a vector space and contains a finite number of vectors, then some subcollection of vectors in \(S\) will both span that vector space and be linearly independent.
Proof.
If \(S\) is already linearly independent, then we have our desired linearly independent spanning set. Otherwise, there is some vector \(\uvec{w}\) in \(S\) that is a linear combination of the other vectors in \(S\text{.}\) If we set \(S'\) to be the subcollection of \(S\) consisting of every vector except \(\uvec{w}\text{,}\) then from Lemma 18.5.4 we know that \(S'\) will remain a spanning set for the vector space. If \(S'\) is linearly independent, then we have our desired linearly independent spanning set. Otherwise, we can continue removing redundant vectors (vectors that are linear combinations of the others) in this way until we end up with a linearly independent spanning set. And since we assumed there are only finitely many vectors in \(S\text{,}\) this one-by-one removal process must indeed come to an end at some point.
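The one-by-one removal process in the proof can be sketched in coordinates (a numerical illustration we add here; the function name and rank-based redundancy check are our own). A vector is redundant exactly when removing it leaves the column-space dimension unchanged:

```python
import numpy as np

def reduce_spanning_set(vectors):
    """Sketch of the removal process from the proof of Proposition 18.5.5.

    Repeatedly discard a vector whose removal leaves the span's
    dimension (rank) unchanged, until the remaining vectors are
    linearly independent.  The span never changes along the way.
    """
    vecs = list(vectors)
    changed = True
    while changed:
        changed = False
        span_dim = np.linalg.matrix_rank(np.column_stack(vecs))
        for i in range(len(vecs)):
            rest = vecs[:i] + vecs[i + 1:]
            if rest and np.linalg.matrix_rank(np.column_stack(rest)) == span_dim:
                vecs = rest        # removing vecs[i] preserves the span
                changed = True
                break
    return vecs

S = [np.array([1., 0.]), np.array([0., 1.]),
     np.array([1., 1.]), np.array([2., 2.])]
print(len(reduce_spanning_set(S)))   # 2: an independent spanning set for R^2
```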
Just as a vector that points up out of a plane in \(\R^3\) must be linearly independent from vectors parallel to the plane, in any vector space we can enlarge a linearly independent set by including new vectors that are not linear combinations of the old. The next statement encapsulates this idea, and will help us in the next chapter to develop a “bottom-up” approach to building an optimal spanning set for a vector space, as opposed to the “top-down” approach made possible by Lemma 18.5.4 and Proposition 18.5.5.
Proposition 18.5.6. Enlarging independent sets.
If \(S\) is a linearly independent set of vectors and vector \(\uvec{v}\) is not in \(\Span S\text{,}\) then the set of vectors containing both \(\uvec{v}\) and every vector in \(S\) is still linearly independent.
Proof.
Write \(S'\) for the set of vectors containing both the vector \(\uvec{v}\) and every vector in \(S\text{.}\) The set \(S'\) will be linearly independent if none of its vectors can be expressed as a linear combination of other vectors in the set.
So suppose \(\uvec{w}\) is a vector in \(S'\text{.}\) There are two cases to consider.
Case \(\uvec{w} = \uvec{v}\).
In this case, we already know that \(\uvec{w}\) cannot be a linear combination of other vectors in \(S'\text{,}\) because the other vectors in \(S'\) are the vectors in \(S\text{,}\) and we assumed that \(\uvec{v}\) is not in \(\Span S\text{.}\)
Case \(\uvec{w} \neq \uvec{v}\).
In this case, \(\uvec{w}\) is in the set \(S\text{.}\) Since \(S\) is assumed to be linearly independent, we know that \(\uvec{w}\) cannot be a linear combination of other vectors from just \(S\text{.}\) Could it be a linear combination involving other vectors in \(S\) and \(\uvec{v}\text{?}\) Suppose we had
\begin{equation*}
\uvec{w} = k_1 \uvec{u}_1 + k_2 \uvec{u}_2 + \dotsb + k_m \uvec{u}_m + k \uvec{v}
\end{equation*}
for some vectors \(\uvec{u}_1,\uvec{u}_2,\dotsc,\uvec{u}_m\) in \(S\) and scalars \(k_1,k_2,\dotsc,k_m,k\) (assuming \(k\neq 0\) so that \(\uvec{v}\) is indeed involved in the linear combination). But then we could isolate \(\uvec{v}\) as
\begin{equation*}
\uvec{v} = \frac{1}{k} \uvec{w} - \frac{k_1}{k} \uvec{u}_1 - \frac{k_2}{k} \uvec{u}_2 - \dotsb - \frac{k_m}{k} \uvec{u}_m \text{,}
\end{equation*}
a linear combination of vectors in \(S\text{,}\) which is not possible because we have assumed that \(\uvec{v}\) is not in \(\Span S\text{.}\)
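In coordinates, the conclusion of the proposition can be observed as a rank increase (a numerical sketch we add here, not part of the proof): appending a vector \(\uvec{v}\) outside \(\Span S\) raises the rank by one, so it again equals the number of vectors and the enlarged set is independent.

```python
import numpy as np

# S is independent, and v is not in Span S (the xy-plane in R^3),
# so S together with v is still independent.
S = [np.array([1., 0., 0.]), np.array([0., 1., 0.])]   # independent
v = np.array([0., 0., 1.])                             # not in Span S

rank_S  = np.linalg.matrix_rank(np.column_stack(S))        # 2
rank_Sv = np.linalg.matrix_rank(np.column_stack(S + [v]))  # 3

# rank equals the number of vectors, so the enlarged set is independent
print(rank_Sv == len(S) + 1)   # True
```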
The final fact below records our observation in Discovery 18.5 and Subsection 18.3.4 that, past a certain size, a collection of vectors becomes too numerous to be linearly independent.
Lemma 18.5.7. Too-large sets must be dependent.
If a vector space can be spanned by \(n\) vectors, then every collection containing more than \(n\) vectors must be linearly dependent.
Proof.
Suppose \(S\) is a set of \(n\) vectors in a vector space so that \(S\) is a spanning set for the space. By Proposition 18.5.5, there are vectors \(\uvec{v}_1,\uvec{v}_2,\dotsc,\uvec{v}_{n'}\) in \(S\) that are both linearly independent and also a spanning set for the vector space. Since this is a subcollection of \(S\text{,}\) we must have \(n' \le n\text{.}\) We'll refer to this subcollection of \(S\) as \(S'\text{.}\)
Now further suppose we have a collection of vectors \(\uvec{w}_1,\uvec{w}_2,\dotsc,\uvec{w}_m\) in the vector space, with \(m>n\text{.}\) Since \(S'\) is a spanning set, we can express each \(\uvec{w}_i\) as a linear combination of the vectors in \(S'\text{:}\)
\begin{equation*}
\uvec{w}_i = a_{i1} \uvec{v}_1 + a_{i2} \uvec{v}_2 + \dotsb + a_{in'} \uvec{v}_{n'} \text{.}
\end{equation*}
Let's apply the Test for Linear Dependence/Independence to \(\uvec{w}_1,\uvec{w}_2,\dotsc,\uvec{w}_m\text{:}\) suppose there are scalars \(k_1,k_2,\dotsc,k_m\) so that
\begin{equation*}
k_1 \uvec{w}_1 + k_2 \uvec{w}_2 + \dotsb + k_m \uvec{w}_m = \zerovec \text{.}
\end{equation*}
If we substitute in the above expressions for each \(\uvec{w}_i\) in terms of the vectors in \(S'\) and collect like terms, we get
\begin{equation*}
(a_{11} k_1 + a_{21} k_2 + \dotsb + a_{m1} k_m) \uvec{v}_1 + \dotsb + (a_{1n'} k_1 + a_{2n'} k_2 + \dotsb + a_{mn'} k_m) \uvec{v}_{n'} = \zerovec \text{.}
\end{equation*}
If we set \(c_1\) to be the coefficient expression on \(\uvec{v}_1\) in the expression above, and \(c_2\) to be the coefficient expression on \(\uvec{v}_2\text{,}\) and so on, then we obtain the vector equality
\begin{equation*}
c_1 \uvec{v}_1 + c_2 \uvec{v}_2 + \dotsb + c_{n'} \uvec{v}_{n'} = \zerovec \text{.}
\end{equation*}
But the vectors in this linear combination are the vectors in \(S'\text{,}\) and we have assumed that \(S'\) is a linearly independent set. So this vector equality can only be true for the trivial solution where each \(c_i = 0\text{.}\) This leads to the homogeneous system
\begin{align*}
a_{11} k_1 + a_{21} k_2 + \dotsb + a_{m1} k_m &= 0 \text{,} \\
a_{12} k_1 + a_{22} k_2 + \dotsb + a_{m2} k_m &= 0 \text{,} \\
&\ \,\vdots \\
a_{1n'} k_1 + a_{2n'} k_2 + \dotsb + a_{mn'} k_m &= 0
\end{align*}
in the variables \(k_1,k_2,\dotsc,k_m\text{.}\) Now, we have assumed \(m > n \ge n'\text{,}\) so we have more variables than equations in the homogeneous system above. But then the solution will require parameters, which means there are nontrivial solutions. Thus, the Test for Linear Dependence/Independence tells us that the collection \(\uvec{w}_1,\uvec{w}_2,\dotsc,\uvec{w}_m\) is linearly dependent.
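The counting argument can be seen concretely in \(\R^2\) (a numerical sketch we add here, with \(n = 2\) and \(m = 3\text{;}\) not part of the proof): three vectors give a homogeneous system with more variables than equations, so a nontrivial solution exists and can be exhibited from the null space.

```python
import numpy as np

# Any m = 3 vectors in R^2 (spanned by n = 2 vectors) are dependent:
# the homogeneous system A k = 0 has 3 variables but only 2 equations.
w1, w2, w3 = np.array([1., 2.]), np.array([3., 4.]), np.array([5., 6.])
A = np.column_stack([w1, w2, w3])      # 2x3 coefficient matrix

# rank is at most 2 < 3 vectors, so a nontrivial solution exists
print(np.linalg.matrix_rank(A) < 3)    # True

# exhibit one nontrivial solution from the null space via the SVD
_, _, Vt = np.linalg.svd(A)
k = Vt[-1]                             # last right-singular vector (unit length)
print(np.allclose(A @ k, 0))           # True: k1*w1 + k2*w2 + k3*w3 = 0
```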