18.1 Discovery guide
Discovery 18.1.
Consider the vectors \(\uvec{v}_1 = (1,0,1)\text{,}\) \(\uvec{v}_2 = (1,1,2)\text{,}\) and \(\uvec{v}_3 = (1,-1,0)\text{.}\)
(a)
Do you remember what \(\Span\) means? Explain why the vector \(\uvec{x} = 3\uvec{v}_1 + 2\uvec{v}_2 - \uvec{v}_3\) is in \(\Span \{\uvec{v}_1,\uvec{v}_2,\uvec{v}_3\}\text{.}\)
(b)
Actually, \(\uvec{v}_2\) can be expressed as a linear combination of \(\uvec{v}_1\) and \(\uvec{v}_3\) — do you see how?
Use this and the expression for \(\uvec{x}\) in Task a to express \(\uvec{x}\) as a linear combination of just \(\uvec{v}_1\) and \(\uvec{v}_3\text{.}\)
(c)
Task b shows that \(\uvec{x}\) is in \(\Span \{\uvec{v}_1,\uvec{v}_3\}\text{.}\) Do you think that similar calculations and the same reasoning can be carried out for every vector in \(\Span \{\uvec{v}_1,\uvec{v}_2,\uvec{v}_3\}\text{?}\) What does this say about \(\Span \{\uvec{v}_1,\uvec{v}_2,\uvec{v}_3\}\) versus \(\Span \{\uvec{v}_1,\uvec{v}_3\}\text{?}\)
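After working through the activity, the computations can be verified numerically. A minimal sketch in Python (the specific coefficients below are one possible answer to Tasks a and b, so work the tasks first before peeking):

```python
# Numerical check of Discovery 18.1, using numpy arrays for the vectors.
import numpy as np

v1 = np.array([1, 0, 1])
v2 = np.array([1, 1, 2])
v3 = np.array([1, -1, 0])

# Task a: x is a linear combination of v1, v2, v3, hence in their span.
x = 3*v1 + 2*v2 - v3

# Task b: v2 is itself a combination of v1 and v3, so x can be
# re-expressed using only v1 and v3.
print(np.array_equal(v2, 2*v1 - v3))   # True
print(np.array_equal(x, 7*v1 - 3*v3))  # True: x is in Span{v1, v3}
```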
Discovery 18.1 demonstrates a common pattern: when one of the vectors in a spanning set can be expressed as a linear combination of the others, that vector is redundant, and a smaller spanning set can be used in place of the original one. We'll give this situation a name: a set of vectors is called linearly dependent if (at least) one of the vectors in the set can be written as a linear combination of other vectors in the set; otherwise the set of vectors is called linearly independent. However, it can be tedious to check the vectors in a set one by one to see whether each is a linear combination of the others. Luckily, for a finite set of vectors, there is a way to check all of them at once.
Test for Linear Dependence/Independence.
To test whether vectors \(\uvec{v}_1,\uvec{v}_2,\dotsc,\uvec{v}_m\) are linearly dependent or independent, set up the vector equation
\begin{equation*}
k_1 \uvec{v}_1 + k_2 \uvec{v}_2 + \dotsb + k_m \uvec{v}_m = \zerovec, \tag{\(\star\)}
\end{equation*}
where the coefficients \(k_1,k_2,\dotsc,k_m\) are (scalar) variables.
- If vector equation (\(\star\)) has a nontrivial solution in the variables \(k_1,k_2,\dotsc,k_m\text{,}\) then the vectors \(\uvec{v}_1,\uvec{v}_2,\dotsc,\uvec{v}_m\) are linearly dependent.
- Otherwise, if vector equation (\(\star\)) has only the trivial solution \(k_1=0,k_2=0,\dotsc,k_m=0\text{,}\) then the vectors \(\uvec{v}_1,\uvec{v}_2,\dotsc,\uvec{v}_m\) are linearly independent.
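In coordinates, equation (\(\star\)) is a homogeneous linear system in \(k_1,k_2,\dotsc,k_m\text{,}\) so the test can be sketched in code. A minimal sketch in Python, assuming each vector is given as a tuple of components (the helper name `linearly_independent` is ours, not standard):

```python
# Test for linear dependence/independence via the rank of the matrix
# whose columns are the given vectors: the homogeneous system
# k1 v1 + ... + km vm = 0 has only the trivial solution exactly when
# the rank equals the number of vectors.
import numpy as np

def linearly_independent(vectors):
    """Return True if the given coordinate vectors are linearly independent."""
    A = np.array(vectors, dtype=float).T   # columns are v1, ..., vm
    return np.linalg.matrix_rank(A) == len(vectors)

# The vectors from Discovery 18.1:
print(linearly_independent([(1, 0, 1), (1, 1, 2), (1, -1, 0)]))  # False (dependent)
print(linearly_independent([(1, 0, 1), (1, -1, 0)]))             # True (independent)
```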
Discovery 18.2.
(a)
Use the test to verify that \(\uvec{v}_1,\uvec{v}_2,\uvec{v}_3\) from Discovery 18.1 are linearly dependent.
(b)
Use the test to verify that \(\uvec{v}_1,\uvec{v}_3\) from Discovery 18.1 are linearly independent.
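One way to check both answers, assuming a computer algebra system such as sympy is available: the solutions of vector equation (\(\star\)) form the null space of the matrix whose columns are the given vectors, so a nonempty null space basis means a nontrivial solution exists.

```python
# Null-space check for Discovery 18.2 using sympy's exact arithmetic.
from sympy import Matrix

# Columns are v1, v2, v3 from Discovery 18.1.
A = Matrix([[1, 1, 1],
            [0, 1, -1],
            [1, 2, 0]])
print(A.nullspace())  # nonempty -> nontrivial solutions -> dependent

# Columns are v1, v3 only.
B = Matrix([[1, 1],
            [0, -1],
            [1, 0]])
print(B.nullspace())  # empty list -> only the trivial solution -> independent
```

The single null space basis vector for the first matrix records one nontrivial choice of \(k_1,k_2,k_3\) in the test.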
The next discovery activity will help you understand why the Test for Linear Dependence/Independence works. To keep things simple, we'll consider just three vectors at a time.
Discovery 18.3.
(a)
Consider abstract vectors \(\uvec{u}_1,\uvec{u}_2,\uvec{u}_3\text{,}\) and suppose the vector equation
\begin{equation*}
k_1 \uvec{u}_1 + k_2 \uvec{u}_2 + k_3 \uvec{u}_3 = \zerovec \tag{\(\star\star\)}
\end{equation*}
has a nontrivial solution. This means that there are values for the scalars \(k_1,k_2,k_3\text{,}\) at least one of which is not zero, so that equation (\(\star\star\)) is true.
Use some algebra to manipulate equation (\(\star\star\)) to demonstrate that one of the vectors can be expressed as a linear combination of the others (and hence, by definition, the vectors \(\uvec{u}_1,\uvec{u}_2,\uvec{u}_3\) are linearly dependent).
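For concreteness, here is one possible form of the manipulation, written under the assumption that it is the coefficient \(k_3\) that happens to be nonzero (the other cases are symmetric):

```latex
\begin{equation*}
k_1 \uvec{u}_1 + k_2 \uvec{u}_2 + k_3 \uvec{u}_3 = \zerovec
\qquad\implies\qquad
\uvec{u}_3 = -\frac{k_1}{k_3}\,\uvec{u}_1 - \frac{k_2}{k_3}\,\uvec{u}_2 .
\end{equation*}
```

Note that dividing by \(k_3\) is valid precisely because \(k_3 \neq 0\text{.}\)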
(b)
Consider abstract vectors \(\uvec{w}_1,\uvec{w}_2,\uvec{w}_3\text{,}\) and suppose the vector equation
\begin{equation*}
k_1 \uvec{w}_1 + k_2 \uvec{w}_2 + k_3 \uvec{w}_3 = \zerovec \tag{\(\star\star\star\)}
\end{equation*}
has only the trivial solution. We would like to see why this means that \(\uvec{w}_1,\uvec{w}_2,\uvec{w}_3\) are linearly independent.
Suppose they weren't: for example, suppose \(\uvec{w}_3 = c_1\uvec{w}_1 + c_2\uvec{w}_2\) were true for some scalars \(c_1,c_2\text{.}\) Manipulate this expression for \(\uvec{w}_3\) until it says something about equation (\(\star\star\star\)). Do you see now why \(\uvec{w}_1,\uvec{w}_2,\uvec{w}_3\) cannot satisfy the definition of linear dependence, and hence must be linearly independent?
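One way to organize the manipulation, starting from the supposed expression \(\uvec{w}_3 = c_1\uvec{w}_1 + c_2\uvec{w}_2\text{:}\) move everything to one side, giving

```latex
\begin{equation*}
c_1 \uvec{w}_1 + c_2 \uvec{w}_2 + (-1)\,\uvec{w}_3 = \zerovec ,
\end{equation*}
```

which exhibits a solution of equation (\(\star\star\star\)) with \(k_3 = -1 \neq 0\) — a nontrivial solution, contradicting the assumption that only the trivial solution exists.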
Discovery 18.4.
In each of the following vector spaces, practise using the Test for Linear Dependence/Independence of the given set of vectors.
(a)
\(V = \matrixring_2(\R)\text{,}\) \(S= \left\{\; \begin{bmatrix} 1 \amp 0 \\ 0 \amp 1 \end{bmatrix}, \;\; \begin{bmatrix} 0 \amp 1\\1 \amp 0 \end{bmatrix}, \;\; \begin{bmatrix} 0 \amp 0\\0 \amp 1 \end{bmatrix} \; \right\} \text{.}\)
(b)
\(V = \matrixring_2(\R)\text{,}\) \(S= \left\{\; \begin{bmatrix} 1 \amp 0 \\ 0 \amp 1 \end{bmatrix}, \;\; \begin{bmatrix} 1 \amp 0 \\ 0 \amp -1 \end{bmatrix}, \;\; \begin{bmatrix} 3 \amp 0 \\ 0 \amp -2 \end{bmatrix} \; \right\} \text{.}\)
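A sketch of a numerical check for this task, assuming we identify each \(2 \times 2\) matrix with the list of its entries read row by row; the vector equation of the test then becomes a homogeneous linear system in \(k_1,k_2,k_3\text{.}\)

```python
# Flatten each 2x2 matrix to a vector of entries and check the rank of
# the matrix whose columns are these flattened vectors.
import numpy as np

M1 = [1, 0, 0, 1]    # entries of [[1, 0], [0, 1]]
M2 = [1, 0, 0, -1]   # entries of [[1, 0], [0, -1]]
M3 = [3, 0, 0, -2]   # entries of [[3, 0], [0, -2]]

A = np.array([M1, M2, M3], dtype=float).T  # columns are the flattened matrices
print(np.linalg.matrix_rank(A))  # 2, which is less than 3: the set is dependent
```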
(c)
\(V = \poly(\R)\text{,}\) \(S = \{ 1+x, 1+x^2, 2 - x + 3x^2 \}\text{.}\)
After setting up the vector equation from the test for linear dependence/independence, you are solving for the scalars \(k_1,k_2,k_3\text{,}\) not for \(x\text{.}\) On the right-hand side, the zero represents the zero vector, which in this space is the zero polynomial. What are the coefficients on powers of \(x\) in the zero polynomial? The left-hand side, being equal, must have the same coefficients.
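The hint above can be sketched in code, assuming each polynomial is recorded by its vector of coefficients on \(1, x, x^2\text{;}\) equating coefficients with the zero polynomial turns the test into a homogeneous system in \(k_1,k_2,k_3\text{.}\)

```python
# Represent each polynomial by its coefficient vector (constant, x, x^2)
# and check the rank of the matrix whose columns are those vectors.
import numpy as np

p1 = [1, 1, 0]    # 1 + x
p2 = [1, 0, 1]    # 1 + x^2
p3 = [2, -1, 3]   # 2 - x + 3x^2

A = np.array([p1, p2, p3], dtype=float).T
print(np.linalg.matrix_rank(A))  # 2, which is less than 3: the set is dependent
```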
(d)
\(V = \poly(\R)\text{,}\) \(S = \{ 1, x, x^2, x^3 \}\text{.}\)
Discovery 18.5.
(a)
Do you think it's possible to have a set of three linearly independent vectors in \(\R^2\text{?}\) Why or why not?
(b)
Do you think it's possible to have a set of four linearly independent vectors in \(\R^3\text{?}\) Why or why not?
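After forming your own answer, the intuition can be probed numerically. A sketch, assuming coordinate vectors: an \(n \times m\) matrix has rank at most \(n\text{,}\) so \(m\) vectors in \(\R^n\) with \(m > n\) can never pass the independence test.

```python
# Four vectors in R^3: the 3x4 coefficient matrix has rank at most 3,
# so the homogeneous system always has a nontrivial solution.
import numpy as np

rng = np.random.default_rng(0)
vectors = rng.integers(-5, 6, size=(4, 3))  # four random vectors in R^3
A = vectors.T.astype(float)                 # columns are the vectors
print(np.linalg.matrix_rank(A) < 4)  # True: rank at most 3, so dependent
```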
Discovery 18.6.
(a)
What does the definition of linear dependence say in the case of just two vectors?
(b)
If the test for linear dependence/independence is to remain true in the case of a “set” of vectors consisting of just one vector, how should we define linear dependence/independence for such a set?
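Once you have settled on answers to Discovery 18.6, they can be checked numerically. A sketch, assuming the expected conclusions (for two vectors, dependence means one is a scalar multiple of the other; a one-vector "set" is dependent exactly when that vector is the zero vector):

```python
# Rank-based independence check applied to small sets in R^2.
import numpy as np

def independent(vectors):
    A = np.array(vectors, dtype=float).T   # columns are the vectors
    return np.linalg.matrix_rank(A) == len(vectors)

print(independent([(1, 2), (2, 4)]))   # False: (2, 4) = 2 * (1, 2)
print(independent([(1, 2), (2, 5)]))   # True: neither is a multiple of the other
print(independent([(0, 0)]))           # False: the zero vector alone
print(independent([(1, 2)]))           # True: a single nonzero vector
```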