Section 18.4 Examples
Subsection 18.4.1 Testing dependence/independence
Here we will carry out examples of applying the Test for Linear Dependence/Independence.
Example 18.4.1. Testing dependence/independence in \(\R^n\).
Are the vectors \((1,0,0,1),(1,1,0,-1),(2,1,0,0),(5,1,0,5)\) in \(\R^4\) linearly dependent or independent? Set up the test:
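With unknown scalars \(k_1,k_2,k_3,k_4\) (named to match the discussion that follows), one way to write out the test is
\begin{equation*}
k_1 (1,0,0,1) + k_2 (1,1,0,-1) + k_3 (2,1,0,0) + k_4 (5,1,0,5) = (0,0,0,0)\text{.}
\end{equation*}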
Notice how we have used the proper zero vector in this space on the right-hand side. On the left-hand side, we want to combine the expression into one vector so that we can compare with the zero vector.
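Combining in this way, the equation becomes
\begin{equation*}
(k_1 + k_2 + 2 k_3 + 5 k_4,\; k_2 + k_3 + k_4,\; 0,\; k_1 - k_2 + 5 k_4) = (0,0,0,0)\text{.}
\end{equation*}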
Comparing components on either side, we obtain a system of four equations in the unknown scalars from the linear combination:
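Under the setup above, that system is
\begin{align*}
k_1 + k_2 + 2 k_3 + 5 k_4 \amp = 0\text{,}\\
k_2 + k_3 + k_4 \amp = 0\text{,}\\
0 \amp = 0\text{,}\\
k_1 - k_2 + 5 k_4 \amp = 0\text{.}
\end{align*}
(The third equation is trivially satisfied, since every vector in the collection has third component \(0\text{.}\))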
Now we'll solve this homogeneous system by row reducing its coefficient matrix.
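One possible sequence of row operations gives
\begin{equation}
\left[\begin{array}{rrrr}
1 \amp 1 \amp 2 \amp 5 \\
0 \amp 1 \amp 1 \amp 1 \\
0 \amp 0 \amp 0 \amp 0 \\
1 \amp -1 \amp 0 \amp 5
\end{array}\right]
\qquad\rowredarrow\qquad
\begin{bmatrix}
1 \amp 1 \amp 2 \amp 5 \\
0 \amp 1 \amp 1 \amp 1 \\
0 \amp 0 \amp 0 \amp 2 \\
0 \amp 0 \amp 0 \amp 0
\end{bmatrix}\text{.}
\tag{$\star$}
\end{equation}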
Note that here it was not necessary to reduce all the way to RREF, as we are not actually interested in the solutions to this system — we only need to know whether there exist nontrivial solutions. From the reduced matrix, we can see that \(k_3\) is a free variable and would be assigned a parameter in the general solution. The necessity of a parameter means there are an infinite number of solutions, which in particular means there are nontrivial solutions. Therefore, this collection of vectors is linearly dependent.
Remark 18.4.2.
Notice how the vectors from \(\R^4\) that we were testing in the previous example ended up as columns in the coefficient matrix in (\(\star\)) — we saw a similar pattern in Example 17.4.10 (and in the other examples in Subsection 17.4.3), where we tested whether a particular vector was in the span of some collection of vectors.
Example 18.4.3. Testing dependence/independence in \(\matrixring_{m \times n}(\R)\).
- Consider the matrices in Discovery 18.4.a. First we set up the Test for Linear Dependence/Independence. Again, we use the proper zero vector on the right-hand side, and then we combine the expression on the left-hand side into one vector so that we may compare against the zero vector.\begin{align*} k_1\begin{bmatrix}1 \amp 0\\0 \amp 1\end{bmatrix} + k_2\begin{bmatrix}0 \amp 1\\1 \amp 0\end{bmatrix} + k_3\begin{bmatrix}0 \amp 0\\0 \amp 1\end{bmatrix} \amp = \begin{bmatrix}0 \amp 0\\0 \amp 0\end{bmatrix}\\ \begin{bmatrix}k_1 \amp 0\\0 \amp k_1\end{bmatrix} + \begin{bmatrix}0 \amp k_2\\k_2 \amp 0\end{bmatrix} + \begin{bmatrix}0 \amp 0\\0 \amp k_3\end{bmatrix} \amp = \begin{bmatrix}0 \amp 0\\0 \amp 0\end{bmatrix}\\ \begin{bmatrix}k_1 \amp k_2\\k_2 \amp k_1+k_3\end{bmatrix} \amp = \begin{bmatrix}0 \amp 0\\0 \amp 0\end{bmatrix} \end{align*}There is no need to set up a system of equations here — we can see from comparing the top rows on either side that \(k_1=0\) and \(k_2=0\text{.}\) Then, from the \((2,2)\) entries, we see that \(k_1+k_3=0\text{.}\) But since we already have \(k_1=0\text{,}\) we get \(k_3=0\) as well. So there is only the trivial solution, and these vectors are linearly independent.
- Consider the matrices in Discovery 18.4.b. Again, we start by setting up the Test for Linear Dependence/Independence using the appropriate zero vector.
\begin{equation*} k_1 \begin{bmatrix} 1 \amp 0 \\ 0 \amp 1 \end{bmatrix} + k_2 \left[\begin{array}{rr} 1 \amp 0 \\ 0 \amp -1 \end{array}\right] + k_3 \left[\begin{array}{rr} 3 \amp 0 \\ 0 \amp -2 \end{array}\right] = \begin{bmatrix} 0 \amp 0 \\ 0 \amp 0 \end{bmatrix} \end{equation*}As before, this will lead to a homogeneous system of equations in the unknown scalars \(k_1,k_2,k_3\text{,}\) and the coefficient matrix of this system will have the entries of the three vectors as columns:
\begin{equation*} \left[\begin{array}{rrr} 1 \amp 1 \amp 3 \\ 0 \amp 0 \amp 0 \\ 0 \amp 0 \amp 0 \\ 1 \amp -1 \amp -2 \end{array}\right] \qquad\rowredarrow\qquad \begin{bmatrix} 1 \amp 0 \amp 1/2 \\ 0 \amp 1 \amp 5/2 \\ 0 \amp 0 \amp 0 \\ 0 \amp 0 \amp 0 \end{bmatrix}\text{.} \end{equation*}From the reduced matrix, we see that \(k_3\) is a free variable and will be assigned a parameter in the general solution. The necessity of a parameter implies nontrivial solutions, so these vectors are linearly dependent.
The reduced matrix can also be used to tell us exactly how these vectors are linearly dependent. Since \(k_3\) is free, we obtain a solution to the system for every possible value we assign to it. To get a simple nontrivial solution, let's set \(k_3=1\text{.}\) Then solving the equations represented by the nonzero rows of the reduced matrix gives us \(k_1=-1/2\) and \(k_2=-5/2\text{.}\) Putting these back into the vector equation from when we first set up the Test for Linear Dependence/Independence, we get
\begin{align*} \left(-\frac{1}{2}\right)\begin{bmatrix} 1 \amp 0 \\ 0 \amp 1 \end{bmatrix} + \left(-\frac{5}{2}\right)\left[\begin{array}{rr} 1 \amp 0\\0 \amp -1 \end{array}\right] + \left[\begin{array}{rr} 3 \amp 0 \\ 0 \amp -2 \end{array}\right] = \begin{bmatrix}0 \amp 0\\0 \amp 0\end{bmatrix}\\ \\ \implies\quad \left[\begin{array}{rr} 3 \amp 0 \\ 0 \amp -2 \end{array}\right] = \frac{1}{2}\begin{bmatrix} 1 \amp 0\\0 \amp 1 \end{bmatrix} + \frac{5}{2}\left[\begin{array}{rr} 1 \amp 0 \\ 0 \amp -1 \end{array}\right]\text{.} \end{align*}From this we see exactly how one of the vectors in our collection can be expressed as a linear combination of others in the collection.
Example 18.4.4. Testing dependence/independence in \(\poly_n(\R)\).
Consider the polynomials from Discovery 18.4.c. Are they linearly dependent or independent? Set up the test, using the zero polynomial as the zero vector on the right-hand side:
As usual, we simplify the linear combination on the left-hand side into one vector. Here, this means collecting like terms.
The polynomial on the left can only be equal to the zero polynomial if all its coefficients are zero, leading to the following system of equations:
Once again, we reduce the coefficient matrix to determine if there are nontrivial solutions:
Since \(k_3\) is a free variable, there exist nontrivial solutions, and so the vectors are linearly dependent.
Example 18.4.5. Testing dependence/independence in \(F(D)\).
Let's do an example in a function space. Consider vectors \(f(x) = x\text{,}\) \(g(x) = \sin(\pi x/2)\) and \(h(x) = \cos(\pi x/2)\) in \(F(\R)\text{,}\) the space of functions defined on the whole real number line. Are these functions linearly dependent or independent? Let's start the test by setting up the vector equation
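Using the zero function as the zero vector on the right-hand side, and scalars \(k_1,k_2,k_3\) attached to \(f\text{,}\) \(g\text{,}\) \(h\) in that order, the equation is
\begin{equation*}
k_1 x + k_2 \sin(\pi x / 2) + k_3 \cos(\pi x / 2) = 0\text{.}
\end{equation*}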
Here, there is no algebraic way we can simplify the expression on the left-hand side. However, remember that the \(0\) on the right-hand side represents the zero function, and that functions are only equal when they always produce the same output given the same input (Definition 16.5.1). So let's try substituting some input \(x\)-values into the functions on either side of our vector equation above:
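One convenient choice (any inputs that produce enough independent conditions would do) is to substitute \(x = 0\text{,}\) \(x = 1\text{,}\) and \(x = 2\text{,}\) which yield the equations
\begin{align*}
k_3 \amp = 0 \amp \amp (x = 0)\text{,}\\
k_1 + k_2 \amp = 0 \amp \amp (x = 1)\text{,}\\
2 k_1 - k_3 \amp = 0 \amp \amp (x = 2)\text{.}
\end{align*}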
From the first equation we see \(k_3=0\text{.}\) Combining this with the third equation we also get \(k_1=0\text{.}\) Then combining that with the second equation we finally get \(k_2=0\text{.}\) Since only the trivial solution is possible, these vectors are linearly independent.
Subsection 18.4.2 Linear independence of “standard” spanning sets
Finally, let's check the “standard” spanning sets of our favourite example vector spaces.
Example 18.4.6. Independence of the standard basis vectors in \(\R^n\).
The standard basis vectors \(\uvec{e}_1,\uvec{e}_2,\dotsc,\uvec{e}_n\) form a spanning set for \(\R^n\text{,}\) and they are also linearly independent, as we see if we apply the test:
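Written out in full, the test and its simplification look like
\begin{align*}
k_1 \uvec{e}_1 + k_2 \uvec{e}_2 + \dotsb + k_n \uvec{e}_n \amp = (0,0,\dotsc,0)\\
k_1 (1,0,\dotsc,0) + k_2 (0,1,\dotsc,0) + \dotsb + k_n (0,0,\dotsc,1) \amp = (0,0,\dotsc,0)\\
(k_1,k_2,\dotsc,k_n) \amp = (0,0,\dotsc,0)\text{.}
\end{align*}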
So clearly each scalar \(k_j\) must be zero, which means there is only the trivial solution.
Example 18.4.7. Independence of the standard spanning vectors in \(\matrixring_{m \times n}(\R)\).
In Remark 17.4.16, we noted that there is also a “standard” set of spanning vectors in \(\matrixring_{m \times n}(\R)\text{,}\) consisting of those matrices that have all zero entries except for a single \(1\) in one specific entry. We might call these “standard basis vectors” for \(\matrixring_{m \times n}(\R)\text{.}\) Write \(E_{ij}\) for the matrix of this type with a \(1\) in the \(\nth[(i,j)]\) entry. These spanning vectors are also linearly independent. Here, when we apply the Test for Linear Dependence/Independence, it is best if we enumerate our scalars with the same scheme as the vectors:
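With scalars \(k_{ij}\) indexed in the same way as the \(E_{ij}\text{,}\) the test equation reads
\begin{equation*}
k_{11} E_{11} + k_{12} E_{12} + \dotsb + k_{mn} E_{mn}
= \begin{bmatrix} 0 \amp \cdots \amp 0 \\ \vdots \amp \amp \vdots \\ 0 \amp \cdots \amp 0 \end{bmatrix}\text{,}
\end{equation*}
and combining the left-hand side into a single matrix turns this into
\begin{equation*}
\begin{bmatrix}
k_{11} \amp k_{12} \amp \cdots \amp k_{1n} \\
k_{21} \amp k_{22} \amp \cdots \amp k_{2n} \\
\vdots \amp \vdots \amp \amp \vdots \\
k_{m1} \amp k_{m2} \amp \cdots \amp k_{mn}
\end{bmatrix}
=
\begin{bmatrix}
0 \amp 0 \amp \cdots \amp 0 \\
0 \amp 0 \amp \cdots \amp 0 \\
\vdots \amp \vdots \amp \amp \vdots \\
0 \amp 0 \amp \cdots \amp 0
\end{bmatrix}\text{.}
\end{equation*}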
Again, we immediately see that only the trivial solution is possible.
Example 18.4.8. Independence of the standard spanning vectors in \(\poly_n(\R)\).
Also in Remark 17.4.16, we noted that the powers of \(x\) (along with the constant polynomial \(1\)) form a spanning set for \(\poly_n(\R)\text{.}\) We might call these “standard basis vectors” for \(\poly_n(\R)\text{.}\) Are they linearly independent? Apply the Test:
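Writing \(k_0,k_1,\dotsc,k_n\) for the unknown scalars, the test equation is
\begin{equation*}
k_0 \cdot 1 + k_1 x + k_2 x^2 + \dotsb + k_n x^n = 0\text{,}
\end{equation*}
where the \(0\) on the right-hand side is the zero polynomial.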
If any of these coefficients are nonzero, the polynomial on the left-hand side will be nonzero, so only the trivial solution is possible. Therefore, powers of \(x\) are always linearly independent in \(\poly_n(\R)\text{.}\)