Section 17.4 Examples
Subsection 17.4.1 Testing dependence/independence
Here we will work through several examples of applying the Test for Linear Dependence/Independence.
Example 17.4.1. Testing dependence/independence in \(\R^n\).
Are the vectors \((1,0,0,1),(1,1,0,-1),(2,1,0,0),(5,1,0,5)\) in \(\R^4\) linearly dependent or independent? Set up the test:
\begin{equation*}
k_1 (1,0,0,1) + k_2(1,1,0,-1) + k_3(2,1,0,0) + k_4(5,1,0,5) = (0,0,0,0) \text{.}
\end{equation*}
Notice how we have used the proper zero vector in this space on the right-hand side. On the left-hand side, we want to combine the expression into one vector so that we can compare with the zero vector.
\begin{align*}
(k_1,0,0,k_1) + (k_2,k_2,0,-k_2) + (2k_3,k_3,0,0) + (5k_4,k_4,0,5k_4) \amp= (0,0,0,0) \\
(k_1+k_2+2k_3+5k_4,k_2+k_3+k_4,0,k_1-k_2+5k_4) \amp= (0,0,0,0)
\end{align*}
Comparing components on either side, we obtain a system of four equations in the unknown scalars from the linear combination:
\begin{equation*}
\left\{\begin{array}{rcrcrcrcr}
k_1 \amp + \amp k_2 \amp + \amp 2k_3 \amp + \amp 5k_4 \amp = \amp 0, \\
\amp \amp k_2 \amp + \amp k_3 \amp + \amp k_4 \amp = \amp 0, \\
\amp \amp \amp \amp \amp \amp 0 \amp = \amp 0, \\
k_1 \amp - \amp k_2 \amp \amp \amp + \amp 5k_4 \amp = \amp 0. \\
\end{array}\right.
\end{equation*}
Now we’ll solve this homogeneous system by row reducing its coefficient matrix.
\begin{align}
\left[\begin{array}{rrrr}
1 \amp 1 \amp 2 \amp 5 \\
0 \amp 1 \amp 1 \amp 1 \\
0 \amp 0 \amp 0 \amp 0 \\
1 \amp -1 \amp 0 \amp 5
\end{array}\right]
\qquad\rowredarrow\qquad
\begin{bmatrix}
1 \amp 0 \amp 1 \amp 0 \\
0 \amp 1 \amp 1 \amp 0 \\
0 \amp 0 \amp 0 \amp 1 \\
0 \amp 0 \amp 0 \amp 0
\end{bmatrix}\tag{✶}
\end{align}
Note that it was not actually necessary to reduce all the way to RREF here: we are not interested in the solutions to this system themselves, only in whether nontrivial solutions exist, so a row echelon form would have sufficed. From the reduced matrix, we can see that \(k_3\) is a free variable and would be assigned a parameter in the general solution. The necessity of a parameter means there are infinitely many solutions, which in particular means there are nontrivial solutions. Therefore, this collection of vectors is linearly dependent.
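This sort of row reduction is easy to verify with a computer algebra system. Here is a minimal sketch using Python’s SymPy library (our choice of tool is purely illustrative, and is not assumed elsewhere in the text): we load the test vectors as the columns of a matrix, and `rref()` returns both the reduced matrix and the indices of the pivot columns, so a pivot-free column signals a free variable.
```python
from sympy import Matrix

# The four test vectors from R^4 as *columns*, exactly as they appear
# in the coefficient matrix of the homogeneous system.
A = Matrix([
    [1,  1, 2, 5],
    [0,  1, 1, 1],
    [0,  0, 0, 0],
    [1, -1, 0, 5],
])

rref_form, pivots = A.rref()
print(rref_form)  # agrees with the reduced matrix in (✶)
print(pivots)     # (0, 1, 3): no pivot in column 2, so k3 is free

# A free variable means nontrivial solutions exist: linearly dependent.
print("dependent" if len(pivots) < A.cols else "independent")
```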
Remark 17.4.2.
Notice how the vectors from \(\R^4\) that we were testing in the previous example ended up as columns in the coefficient matrix in (✶) — we saw a similar pattern in Example 16.4.10 (and in the other examples in Subsection 16.4.3), where we tested whether a particular vector was in the span of some collection of vectors.
Example 17.4.3. Testing dependence/independence in \(\matrixring_{m \times n}(\R)\).
- Consider the matrices in Discovery 17.4.a. First we set up the Test for Linear Dependence/Independence. Again, we use the proper zero vector on the right-hand side, and then we combine the expression on the left-hand side into one vector so that we may compare against the zero vector.
\begin{align*}
k_1\begin{bmatrix} 1 \amp 0 \\ 0 \amp 1 \end{bmatrix} + k_2\begin{bmatrix} 0 \amp 1 \\ 1 \amp 0 \end{bmatrix} + k_3\begin{bmatrix} 0 \amp 0 \\ 0 \amp 1 \end{bmatrix} \amp= \begin{bmatrix} 0 \amp 0 \\ 0 \amp 0 \end{bmatrix}\\
\begin{bmatrix} k_1 \amp 0 \\ 0 \amp k_1 \end{bmatrix} + \begin{bmatrix} 0 \amp k_2 \\ k_2 \amp 0 \end{bmatrix} + \begin{bmatrix} 0 \amp 0 \\ 0 \amp k_3 \end{bmatrix} \amp= \begin{bmatrix} 0 \amp 0 \\ 0 \amp 0 \end{bmatrix}\\
\begin{bmatrix} k_1 \amp k_2 \\ k_2 \amp k_1+k_3 \end{bmatrix} \amp= \begin{bmatrix} 0 \amp 0 \\ 0 \amp 0 \end{bmatrix}
\end{align*}
There is no need to set up a system of equations here — we can see from comparing the top rows on either side that \(k_1=0\) and \(k_2=0\text{.}\) Then, from the \((2,2)\) entries, we see that \(k_1+k_3=0\text{.}\) But since we already have \(k_1=0\text{,}\) we get \(k_3=0\) as well. So there is only the trivial solution, and these vectors are linearly independent.
- Consider the matrices in Discovery 17.4.b. Again, we start by setting up the Test for Linear Dependence/Independence using the appropriate zero vector.
\begin{equation*}
k_1 \begin{bmatrix} 1 \amp 0 \\ 0 \amp 1 \end{bmatrix} + k_2 \left[\begin{array}{rr} 1 \amp 0 \\ 0 \amp -1 \end{array}\right] + k_3 \left[\begin{array}{rr} 3 \amp 0 \\ 0 \amp -2 \end{array}\right] = \begin{bmatrix} 0 \amp 0 \\ 0 \amp 0 \end{bmatrix}\text{.}
\end{equation*}
As before, this will lead to a homogeneous system of equations in the unknown scalars \(k_1,k_2,k_3\text{,}\) and the coefficient matrix of this system will have the entries of the three vectors as columns:
\begin{equation*}
\left[\begin{array}{rrr} 1 \amp 1 \amp 3 \\ 0 \amp 0 \amp 0 \\ 0 \amp 0 \amp 0 \\ 1 \amp -1 \amp -2 \end{array}\right]
\qquad\rowredarrow\qquad
\begin{bmatrix} 1 \amp 0 \amp 1/2 \\ 0 \amp 1 \amp 5/2 \\ 0 \amp 0 \amp 0 \\ 0 \amp 0 \amp 0 \end{bmatrix}\text{.}
\end{equation*}
From the reduced matrix, we see that \(k_3\) is a free variable and will be assigned a parameter in the general solution. The necessity of a parameter implies nontrivial solutions, so these vectors are linearly dependent.
The reduced matrix can also be used to tell us exactly how these vectors are linearly dependent. Since \(k_3\) is free, we obtain a solution to the system for every possible value we assign to it. To get a simple nontrivial solution, let’s set \(k_3=1\text{.}\) Then solving the equations represented by the nonzero rows of the reduced matrix gives us \(k_1=-1/2\) and \(k_2=-5/2\text{.}\) Putting these back into the vector equation from when we first set up the Test for Linear Dependence/Independence, we get
\begin{align*}
\left(-\frac{1}{2}\right)\begin{bmatrix} 1 \amp 0 \\ 0 \amp 1 \end{bmatrix} + \left(-\frac{5}{2}\right)\left[\begin{array}{rr} 1 \amp 0 \\ 0 \amp -1 \end{array}\right] + \left[\begin{array}{rr} 3 \amp 0 \\ 0 \amp -2 \end{array}\right] \amp= \begin{bmatrix} 0 \amp 0 \\ 0 \amp 0 \end{bmatrix}\\
\implies\quad \left[\begin{array}{rr} 3 \amp 0 \\ 0 \amp -2 \end{array}\right] \amp= \frac{1}{2}\begin{bmatrix} 1 \amp 0 \\ 0 \amp 1 \end{bmatrix} + \frac{5}{2}\left[\begin{array}{rr} 1 \amp 0 \\ 0 \amp -1 \end{array}\right]\text{.}
\end{align*}
From this we see exactly how one of the vectors in our collection can be expressed as a linear combination of others in the collection. (The row reduction above is verified computationally in the sketch following this list.)
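The same computation works in a matrix space once each matrix is flattened into a column of entries. The sketch below (again in Python with SymPy, purely for illustration) reproduces the row reduction from the second item above, and `nullspace()` recovers exactly the nontrivial choice of scalars that we found by hand.
```python
from sympy import Matrix

# Each 2x2 matrix from Discovery 17.4.b flattened (row by row) into a
# column: (1,0,0,1), (1,0,0,-1), (3,0,0,-2).
A = Matrix([
    [1,  1,  3],
    [0,  0,  0],
    [0,  0,  0],
    [1, -1, -2],
])

print(A.rref())      # reduced matrix with k3 free, as computed above

# A basis for the solution space of the homogeneous system; each basis
# vector is a nontrivial choice of scalars (k1, k2, k3).
print(A.nullspace()) # [Matrix([[-1/2], [-5/2], [1]])], i.e. k3 = 1
```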
Example 17.4.4. Testing dependence/independence in \(\poly_n(\R)\).
Consider the polynomials from Discovery 17.4.c. Are they linearly dependent or independent? Set up the test, using the zero polynomial as the zero vector on the right-hand side:
\begin{equation*}
k_1(1+x) + k_2(1+x^2) + k_3(2-x+3x^2) = 0 \text{.}
\end{equation*}
As usual, we simplify the linear combination on the left-hand side into one vector. Here, this means collecting like terms.
\begin{equation*}
(k_1+k_2+2k_3) + (k_1-k_3)x + (k_2+3k_3)x^2 = 0 \text{.}
\end{equation*}
The polynomial on the left can only be equal to the zero polynomial if all its coefficients are zero, leading to the following system of equations:
\begin{equation*}
\left\{\begin{array}{rcrcrcr}
k_1 \amp + \amp k_2 \amp + \amp 2k_3 \amp = \amp 0, \\
k_1 \amp \amp \amp - \amp k_3 \amp = \amp 0, \\
\amp \amp k_2 \amp + \amp 3k_3 \amp = \amp 0.
\end{array}\right.
\end{equation*}
Once again, we reduce the coefficient matrix to determine if there are nontrivial solutions:
\begin{equation*}
\left[\begin{array}{rrr} 1 \amp 1 \amp 2 \\ 1 \amp 0 \amp -1 \\ 0 \amp 1 \amp 3 \end{array}\right]
\qquad\rowredarrow\qquad
\begin{bmatrix} 1 \amp 0 \amp -1 \\ 0 \amp 1 \amp 3 \\ 0 \amp 0 \amp 0 \end{bmatrix}\text{.}
\end{equation*}
Since \(k_3\) is a free variable, nontrivial solutions exist, and so these vectors are linearly dependent.
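For polynomials, the columns of the coefficient matrix are simply the coefficient lists of the test polynomials. A brief SymPy sketch (an illustrative tool choice, as before) confirms the reduction and exhibits an explicit dependence:
```python
from sympy import Matrix

# Columns hold the coefficients of 1 + x, 1 + x^2, and 2 - x + 3x^2,
# listed by increasing degree: constant term, x, x^2.
A = Matrix([
    [1, 1,  2],
    [1, 0, -1],
    [0, 1,  3],
])

rref_form, pivots = A.rref()
print(rref_form)  # [1 0 -1; 0 1 3; 0 0 0], as in the example
print(pivots)     # (0, 1): k3 is free, so the polynomials are dependent

# Taking k3 = 1 gives k1 = 1, k2 = -3, so explicitly
#   (1 + x) - 3(1 + x^2) + (2 - x + 3x^2) = 0.
```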
Example 17.4.5. Testing dependence/independence in \(F(D)\).
Let’s do an example in a function space. Consider vectors \(f(x) = x\text{,}\) \(g(x) = \sin(\pi x/2)\) and \(h(x) = \cos(\pi x/2)\) in \(F(\R)\text{,}\) the space of functions defined on the whole real number line. Are these functions linearly dependent or independent? Let’s start the test by setting up the vector equation
\begin{equation*}
k_1 x + k_2\sin(\pi x/2) + k_3\cos(\pi x/2) = 0 \text{.}
\end{equation*}
Here, there is no algebraic way we can simplify the expression on the left-hand side. However, remember that the \(0\) on the right-hand side represents the zero function, and that functions are only equal when they always produce the same output given the same input (Definition 15.5.1). So let’s try substituting some input \(x\)-values into the functions on either side of our vector equation above:
\begin{align*}
x=0 \amp\colon \amp k_1\cdot 0 + k_2\cdot 0 + k_3 \cdot 1 \amp= 0, \\
x=1 \amp\colon \amp k_1\cdot 1 + k_2\cdot 1 + k_3 \cdot 0 \amp= 0, \\
x=2 \amp\colon \amp k_1\cdot 2 + k_2\cdot 0 + k_3 \cdot (-1) \amp= 0.
\end{align*}
From the first equation we see \(k_3=0\text{.}\) Combining this with the third equation we also get \(k_1=0\text{.}\) Then combining that with the second equation we finally get \(k_2=0\text{.}\) Since only the trivial solution is possible, these vectors are linearly independent.
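The sampling strategy in this example also translates directly into a computation. The sketch below (Python with SymPy, still purely for illustration) evaluates the three functions at the sample inputs \(x = 0, 1, 2\) and row reduces the resulting coefficient matrix; a pivot in every column means only the trivial solution. Keep in mind that this test is one-sided: a good set of sample points can prove independence, but a dependence among sampled values would not prove the functions dependent, since the functions might still differ at unsampled inputs.
```python
from sympy import Matrix, sin, cos, pi

# The three functions under test.
f = lambda x: x
g = lambda x: sin(pi * x / 2)
h = lambda x: cos(pi * x / 2)

# One row per sample point, encoding k1*f(x) + k2*g(x) + k3*h(x) = 0.
samples = [0, 1, 2]
A = Matrix([[f(x), g(x), h(x)] for x in samples])
print(A)  # Matrix([[0, 0, 1], [1, 1, 0], [2, 0, -1]])

rref_form, pivots = A.rref()
print(pivots)  # (0, 1, 2): a pivot in every column, so only the
               # trivial solution -- linearly independent
```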
Subsection 17.4.2 Linear independence of “standard” spanning sets
Finally, let’s check the “standard” spanning sets of our favourite example vector spaces.
Example 17.4.6. Independence of the standard basis vectors in \(\R^n\).
The standard basis vectors \(\uvec{e}_1,\uvec{e}_2,\dotsc,\uvec{e}_n\) form a spanning set for \(\R^n\text{,}\) and they are also linearly independent, as we see if we apply the test:
\begin{align*}
k_1\uvec{e}_1 + k_2\uvec{e}_2 + \dotsb + k_n\uvec{e}_n \amp= \zerovec \amp
\amp\implies \amp
(k_1,k_2,\dotsc,k_n) \amp= (0,0,\dotsc,0)\text{.}
\end{align*}
So clearly each scalar \(k_j\) must be zero, which means there is only the trivial solution.
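In matrix form this computation is as simple as it gets: placing the standard basis vectors as columns produces the identity matrix, which is already in RREF with a pivot in every column. A quick check in SymPy (illustrative as always), for, say, \(n = 4\):
```python
from sympy import eye

# Standard basis vectors of R^4 as columns: the 4x4 identity matrix.
A = eye(4)

print(A.rref()[1])   # pivots (0, 1, 2, 3): one in every column
print(A.nullspace()) # []: only the trivial solution, so independent
```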
Example 17.4.7. Independence of the standard spanning vectors in \(\matrixring_{m \times n}(\R)\).
In Remark 16.4.16, we noted that there is also a “standard” set of spanning vectors in \(\matrixring_{m \times n}(\R)\text{,}\) consisting of those matrices that have all zero entries except for a single \(1\) in one specific entry. We might call these “standard basis vectors” for \(\matrixring_{m \times n}(\R)\text{.}\) Write \(E_{ij}\) for the matrix of this type with a \(1\) in the \(\nth[(i,j)]\) entry. These spanning vectors are also linearly independent. Here, when we apply the Test for Linear Dependence/Independence, it is best if we enumerate our scalars with the same scheme as the vectors:
\begin{align*}
k_{11}E_{11} + k_{12}E_{12} + \dotsb + k_{mn}E_{mn} \amp= \zerovec \amp
\amp\implies \amp
[k_{ij}] \amp= \zerovec\text{.}
\end{align*}
Again, we immediately see that only the trivial solution is possible.
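Flattening each \(E_{ij}\) into a column of entries turns this collection into the standard basis vectors of \(\R^{mn}\), so the coefficient matrix of the test is again an identity matrix. A short SymPy sketch (illustrative) for the \(2 \times 3\) case:
```python
from sympy import Matrix, zeros

m, n = 2, 3

# Build each standard matrix E_ij and flatten it (row by row) into a list.
flattened = []
for i in range(m):
    for j in range(n):
        E = zeros(m, n)
        E[i, j] = 1
        flattened.append([E[r, c] for r in range(m) for c in range(n)])

# Using the flattened matrices as columns yields the 6x6 identity matrix,
# so the test has only the trivial solution.
A = Matrix(flattened).T
print(A == Matrix.eye(m * n))  # True
```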
Example 17.4.8. Independence of the standard spanning vectors in \(\poly_n(\R)\).
Also in Remark 16.4.16, we noted that the powers of \(x\) (along with the constant polynomial \(1\)) form a spanning set for \(\poly_n(\R)\text{.}\) We might call these “standard basis vectors” for \(\poly_n(\R)\text{.}\) Are they linearly independent? Apply the Test:
\begin{equation*}
k_0 \cdot 1 + k_1x + k_2 x^2 + \dotsb + k_n x^n = 0 \text{.}
\end{equation*}
If any of these coefficients are nonzero, the polynomial on the left-hand side will be nonzero, so only the trivial solution is possible. Therefore, powers of \(x\) are always linearly independent in \(\poly_n(\R)\text{.}\)
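This argument can also be seen symbolically: the coefficients of a generic combination of the powers of \(x\) are exactly the scalars \(k_j\), so the combination is the zero polynomial only when every \(k_j\) is zero. A SymPy sketch (illustrative) for \(n = 3\):
```python
from sympy import symbols, Poly

x = symbols('x')
k0, k1, k2, k3 = symbols('k0 k1 k2 k3')

# A generic linear combination of the standard spanning polynomials
# 1, x, x^2, x^3 of P_3(R), viewed as a polynomial in x.
p = Poly(k0 + k1*x + k2*x**2 + k3*x**3, x)

# The coefficients (highest degree first) are precisely the scalars k_j,
# so p is the zero polynomial exactly when all of them are zero.
print(p.all_coeffs())  # [k3, k2, k1, k0]
```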