
Section 20.1 Discovery guide

Subsection 20.1.1 Column space

Take a minute to remind yourself of the column-wise view of matrix multiplication from (✶✶✶) in Subsection 4.3.7. In words, this matrix multiplication pattern says that in a matrix product \(AB\text{,}\)
  • the first column of \(AB\) is the result of multiplying matrix \(A\) against the first column of \(B\text{,}\)
  • the second column of \(AB\) is the result of multiplying matrix \(A\) against the second column of \(B\text{,}\)
  • and so on.
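If you have access to a computer algebra system, you can verify this pattern numerically. Here is one possible check, a sketch using Python's SymPy library with arbitrarily chosen matrices \(A\) and \(B\text{:}\)

# Verify the column-wise view of matrix multiplication:
# column j of AB should equal A times column j of B.
from sympy import Matrix

A = Matrix([[1, 2], [3, 4], [5, 6]])       # arbitrary 3 x 2 matrix
B = Matrix([[7, 8, 9], [10, 11, 12]])      # arbitrary 2 x 3 matrix
AB = A * B

for j in range(B.cols):
    assert AB.col(j) == A * B.col(j)
print("Each column of AB is A times the corresponding column of B.")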
In the first discovery activity, we’ll use this pattern to obtain another important pattern involving the standard basis vectors.

Discovery 20.1.

Notice that the columns of the identity matrix are precisely the standard basis vectors \(\uvec{e}_1, \uvec{e}_2, \dotsc, \uvec{e}_n\) of \(\R^n\text{.}\) Use this observation, the matrix multiplication pattern described above, and the matrix identity \(A I = A\) to complete the following.
  • Product \(A \uvec{e}_1\) is equal to .
  • Product \(A \uvec{e}_2\) is equal to .
  • Product \(A \uvec{e}_j\) is equal to .
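The pattern in this activity can also be checked on a specific example. Here is a sketch using Python's SymPy library, with an arbitrarily chosen matrix \(A\text{:}\)

# Check that the product A * e_j picks out the j-th column of A.
from sympy import Matrix, eye

A = Matrix([[1, 2, 3], [4, 5, 6]])   # arbitrary 2 x 3 matrix
I = eye(3)                           # the columns of I are e_1, e_2, e_3

for j in range(3):
    assert A * I.col(j) == A.col(j)
print("A * e_j is the j-th column of A, for each j.")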

Discovery 20.2.

Think of an \(m\times 3\) matrix \(A\) as being made out of three column vectors from \(\R^m\text{:}\)
\begin{equation*} A = \begin{bmatrix} | \amp | \amp | \\ \uvec{a}_1 \amp \uvec{a}_2 \amp \uvec{a}_3 \\ | \amp | \amp | \end{bmatrix}\text{.} \end{equation*}
(a)
Suppose we want to compute \(A\uvec{x}\text{,}\) where \(\uvec{x} = (5,3,-1)\) (but as a column vector). Use the pattern you discovered in Discovery 20.1 to fill in the following.
Since
\begin{equation*} \left[\begin{array}{r} 5 \\ 3 \\ -1 \end{array}\right] = 5 \uvec{e}_1 + 3 \uvec{e}_2 + (-1) \uvec{e}_3\text{,} \end{equation*}
then
\begin{equation*} A \left[\begin{array}{r} 5 \\ 3 \\ -1 \end{array}\right] = A (5\uvec{e}_1 + 3\uvec{e}_2 + (-1)\uvec{e}_3) = 5 \fillinmath{XX} + 3 \fillinmath{XX} + (-1) \fillinmath{XX}\text{.} \end{equation*}
From this, we see that the column vector \(A\uvec{x}\) is in the span of
.
(b)
Convince yourself that the details/conclusion of Task a would be the same for every \(\uvec{x}\text{,}\) not just the example \(\uvec{x}\) we used.
(c)
Now consider the system \(A\uvec{x}=\uvec{b}\text{.}\) If this system is consistent (i.e., it has at least one solution), then our final conclusion from Task a is also true of the column vector \(\uvec{b}\text{,}\) since \(\uvec{b} = A\uvec{x}\) for at least one \(\uvec{x}\text{.}\)
So system \(A\uvec{x} = \uvec{b}\) can only be consistent if \(\uvec{b}\) is in the span of
.
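The conclusion of this activity can be illustrated numerically as well. Below is a sketch using Python's SymPy library; the matrix \(A\) is chosen arbitrarily, and \(\uvec{x} = (5,3,-1)\) is the example vector from Task a.

# A*x is a linear combination of the columns of A, with the entries of x as
# coefficients; so A*x = b can only be consistent when b is in the span of
# the columns of A.
from sympy import Matrix

A = Matrix([[1, 0, 2], [3, 1, -1], [0, 4, 4], [2, 2, 2]])   # arbitrary 4 x 3 matrix
x = Matrix([5, 3, -1])

assert A * x == 5 * A.col(0) + 3 * A.col(1) + (-1) * A.col(2)
print("A*x lies in the span of the columns of A.")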
From Discovery 20.2, it appears that for an \(m\times n\) matrix \(A\text{,}\) the subspace of \(\R^m\) spanned by the columns of \(A\) is important when considering the consistency of the system \(A\uvec{x} = \uvec{b}\text{.}\) We call this subspace the column space of \(A\). Let’s explore how to reduce our spanning set (the columns of \(A\)) down to a basis. For this task we’ll need a fact about how multiplication by a matrix affects the linear independence of column vectors, which we will state as Statement 1 of Proposition 20.5.1 in Subsection 20.5.1. You should read that statement before proceeding.

Discovery 20.3.

The following matrix is in RREF:
\begin{equation*} B = \begin{bmatrix} 1 \amp 2 \amp 0 \amp 3 \amp 0 \amp 5 \\ 0 \amp 0 \amp 1 \amp 4 \amp 0 \amp 6 \\ 0 \amp 0 \amp 0 \amp 0 \amp 1 \amp 7 \\ 0 \amp 0 \amp 0 \amp 0 \amp 0 \amp 0 \end{bmatrix}. \end{equation*}
(a)
Build a linearly independent set of column vectors from \(B\) by working from left to right, and either including or discarding each column based on whether it is linearly independent from the vectors you have already accumulated. (You should, of course, begin by “including” the first column.) What do you notice about your final set of linearly independent columns, relative to the reduced form of \(B\text{?}\)
(b)
Suppose \(A\) is a matrix that can be reduced to \(B\) by a single elementary operation. Then there is an elementary matrix \(E\) so that
\begin{equation*} B = EA = \begin{bmatrix} | \amp | \amp | \amp | \amp | \amp |\\ E\uvec{a}_1 \amp E\uvec{a}_2 \amp E\uvec{a}_3 \amp E\uvec{a}_4 \amp E\uvec{a}_5 \amp E\uvec{a}_6 \\ | \amp | \amp | \amp | \amp | \amp | \end{bmatrix}, \end{equation*}
where the \(\uvec{a}_j\) are the columns of \(A\text{.}\) Use your answer to Task a along with the above-referenced Statement 1 to determine which columns of \(A\) form a linearly independent set.
(c)
Now suppose \(A\) is a matrix that can be reduced to \(B\) by two elementary operations. Then there are elementary matrices \(E_1,E_2\) so that \(B = E_2E_1A\text{.}\) Similarly to Task b, from \(B = E_2(E_1A)\text{,}\) decide which columns of \(E_1A\) are linearly independent. Then from the above-referenced Statement 1 and
\begin{equation*} E_1A = \begin{bmatrix} | \amp | \amp | \amp | \amp | \amp |\\ E_1\uvec{a}_1 \amp E_1\uvec{a}_2 \amp E_1\uvec{a}_3 \amp E_1\uvec{a}_4 \amp E_1\uvec{a}_5 \amp E_1\uvec{a}_6 \\ | \amp | \amp | \amp | \amp | \amp | \end{bmatrix} \end{equation*}
(where the \(\uvec{a}_j\) are the columns of \(A\)), decide which columns of \(A\) are linearly independent.
(d)
Now extrapolate to any number of row operations to complete the following statement: to create a linearly independent set of column vectors from a matrix \(A\text{,}\) row reduce \(A\) to RREF, and then take those columns of \(A\) that correspond to in \(\RREF(A)\text{.}\)
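Here is a sketch of this procedure using Python's SymPy library. The matrix \(A\) below is one possible (arbitrarily chosen) matrix whose RREF is the matrix \(B\) of this activity; SymPy's rref() reports the pivot columns, and its columnspace() method applies the same column-selection idea, so the two computations should agree.

# Basis for the column space: row reduce A, then take the columns of the
# *original* matrix A in the pivot positions of RREF(A).
from sympy import Matrix

A = Matrix([
    [1, 2, 0,  3, 0,  5],
    [2, 4, 1, 10, 0, 16],
    [0, 0, 1,  4, 1, 13],
    [1, 2, 1,  7, 1, 18],
])   # one matrix whose RREF is the matrix B above

R, pivot_cols = A.rref()               # rref() returns (RREF, pivot column indices)
basis = [A.col(j) for j in pivot_cols]

assert basis == A.columnspace()        # SymPy's built-in computation agrees
print("Pivot columns (indexed from 0):", pivot_cols)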

Discovery 20.4.

(a)
Use the procedure you’ve developed in Discovery 20.3.d to develop a reinterpretation of the Test for Linear Dependence/Independence for vectors in \(\R^m\text{:}\) if \(\uvec{v}_1,\uvec{v}_2,\dotsc,\uvec{v}_n\) are vectors in \(\R^m\text{,}\) write these vectors as columns in a matrix, row reduce, and then you will know the original vectors are linearly independent if .
(b)
Recall that a square matrix is invertible if and only if it can be row reduced to \(I\text{.}\) Use the procedure for testing linear independence that you’ve developed in Task a to create another condition that is equivalent to invertibility: a square matrix is invertible if and only if its columns .
(c)
Let’s go full circle. Combine Task a and Task b to complete the following condition: a collection of \(n\) vectors in \(\R^n\) is a basis of \(\R^n\) if and only if the square matrix formed by using the vectors as columns has determinant .
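As a concrete illustration of the condition in Task c, here is a sketch using Python's SymPy library with three arbitrarily chosen vectors in \(\R^3\text{:}\)

# Test whether n vectors in R^n form a basis: use them as the columns of a
# square matrix and check whether the determinant is nonzero.
from sympy import Matrix

v1 = Matrix([1, 0, 2])
v2 = Matrix([0, 1, 1])
v3 = Matrix([1, 1, 0])   # arbitrary vectors in R^3

M = Matrix.hstack(v1, v2, v3)
print("det =", M.det())
print("Do the vectors form a basis of R^3?", M.det() != 0)

For these particular vectors the determinant works out to be nonzero, so they do form a basis of \(\R^3\text{.}\)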

Subsection 20.1.2 Row space

Why let the columns of a matrix have all the fun? Let’s now explore the subspace of \(\R^n\) formed by the span of the rows in an \(m\times n\) matrix, called the row space of the matrix.
In the next discovery activity, we’ll need to recall Statement 2 of Proposition 16.5.6 that gives us a way to determine when two spans are the same. You should re-read that statement before proceeding.

Discovery 20.5.

Let \(\uvec{v}_1,\uvec{v}_2,\uvec{v}_3,\uvec{v}_4\) be vectors in some vector space \(V\text{.}\)
(a)
What does the above-referenced Statement 2 say about \(\Span\{\uvec{v}_1,\uvec{v}_2,\uvec{v}_3,\uvec{v}_4\}\) and \(\Span\{\uvec{v}_1,\uvec{v}_4,\uvec{v}_3,\uvec{v}_2\}\text{?}\)
(b)
Complete the statement: if matrix \(A'\) is obtained from \(A\) by swapping two rows, then the row spaces of \(A'\) and of \(A\) are .
(c)
What does the above-referenced Statement 2 say about \(\Span \{\uvec{v}_1,\uvec{v}_2,\uvec{v}_3,\uvec{v}_4\}\) and \(\Span \{\uvec{v}_1,\uvec{v}_2,-7\uvec{v}_3,\uvec{v}_4\}\text{?}\)
(d)
Complete the statement: if matrix \(A'\) is obtained from \(A\) by multiplying some row by a nonzero constant, then the row spaces of \(A'\) and of \(A\) are .
(e)
What does the above-referenced Statement 2 say about \(\Span \{\uvec{v}_1,\uvec{v}_2,\uvec{v}_3,\uvec{v}_4\}\) and \(\Span \{\uvec{v}_1,\uvec{v}_2+3\uvec{v}_1,\uvec{v}_3,\uvec{v}_4\}\text{?}\)
(f)
Complete the statement: if matrix \(A'\) is obtained from \(A\) by adding a multiple of one row to another, then the row spaces of \(A'\) and of \(A\) are .
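You can also check these statements computationally. The following sketch uses Python's SymPy library; since the RREF of a matrix is unique and is determined by its row space, two matrices of the same size have the same row space exactly when they have the same RREF.

# Each type of elementary row operation leaves the row space unchanged.
from sympy import Matrix

A = Matrix([[1, 2, 0, 3], [0, 1, 1, 1], [2, 5, 1, 7]])   # arbitrary matrix

swapped = A.copy()
swapped.row_swap(0, 2)                                   # swap rows 1 and 3

scaled = A.copy()
scaled[1, :] = -7 * scaled[1, :]                         # multiply row 2 by -7

added = A.copy()
added[1, :] = added[1, :] + 3 * added[0, :]              # add 3 times row 1 to row 2

for A_prime in (swapped, scaled, added):
    assert A_prime.rref()[0] == A.rref()[0]
print("Each modified matrix has the same row space as A.")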

Discovery 20.6.

(a)
Based on Discovery 20.5, the row spaces of a matrix and of its RREF are .
(b)
Determine a basis for the row space of a matrix \(A\) for which
\begin{equation*} \RREF(A) = \begin{bmatrix} 1 \amp 0 \amp 2 \amp 0 \amp 3\\ 0 \amp 1 \amp 4 \amp 0 \amp 5\\ 0 \amp 0 \amp 0 \amp 1 \amp 6\\ 0 \amp 0 \amp 0 \amp 0 \amp 0 \end{bmatrix}. \end{equation*}
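Here is a sketch of Task b using Python's SymPy library. The basis consists of the nonzero rows of the given RREF; the matrix \(A\) included below is one arbitrary choice of matrix having that RREF, for comparison.

# Basis for the row space of A: the nonzero rows of RREF(A).
from sympy import Matrix, zeros

R = Matrix([
    [1, 0, 2, 0, 3],
    [0, 1, 4, 0, 5],
    [0, 0, 0, 1, 6],
    [0, 0, 0, 0, 0],
])   # the given RREF(A)

basis = [R.row(i) for i in range(R.rows) if R.row(i) != zeros(1, R.cols)]
print("Dimension of the row space:", len(basis))

# One (of many) matrices A whose RREF is R; its row space is spanned by the
# vectors collected in basis above.
A = Matrix([
    [1, 0,  2, 0,  3],
    [2, 1,  8, 0, 11],
    [1, 1,  6, 1, 14],
    [3, 1, 10, 1, 20],
])
assert A.rref()[0] == R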

Discovery 20.7.

If you have a collection of vectors in \(\R^n\) and you want to obtain a basis for the subspace that the collection spans, you now have two options: either use those vectors as the columns in a matrix and row reduce to determine a basis for its column space, or use those vectors as the rows in a matrix and row reduce to determine a basis for its row space. Can you think of a reason you might choose to use column space instead of row space? And a reason you might choose to use row space instead of column space?

Subsection 20.1.3 Null space

There is one more subspace of \(\R^n\) associated to a matrix \(A\) — the solution space of the homogeneous system \(A\uvec{x}=\zerovec\text{.}\) Instead of solution space, from this point forward we will refer to it as the null space of \(A\text{.}\)

Discovery 20.8.

Suppose \(A\) is a matrix whose RREF is as given below. Use the “independent parameter” method to determine a basis for the null space of \(A\text{.}\)
\begin{equation*} \RREF(A) = \left[\begin{array}{rrrrr} 1 \amp -1 \amp 0 \amp 2 \amp 3 \\ 0 \amp 0 \amp 1 \amp 2 \amp -2 \\ 0 \amp 0 \amp 0 \amp 0 \amp 0 \end{array}\right] \end{equation*}
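After carrying out the parameter method by hand, you might compare your answer with a computer algebra system. Here is a sketch using Python's SymPy library, applied directly to the given RREF (row operations do not change the solutions of \(A\uvec{x}=\zerovec\text{,}\) so \(A\) and \(\RREF(A)\) have the same null space):

# Basis for the null space, computed from RREF(A).
from sympy import Matrix

R = Matrix([
    [1, -1, 0, 2,  3],
    [0,  0, 1, 2, -2],
    [0,  0, 0, 0,  0],
])   # the given RREF(A)

null_basis = R.nullspace()       # one basis vector per free variable
for v in null_basis:
    print(v.T)                   # print each basis vector as a row, for readability
print("Dimension of the null space:", len(null_basis))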

Subsection 20.1.4 Relationship between the three spaces

Discovery 20.9.

(a)
How can you determine the dimensions of the column/row/null spaces of a matrix from its RREF?
(b)
For an \(m \times n\) matrix \(A\text{,}\) what is the relationship between the dimension of its column space, the dimension of its null space, and its size?
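To explore Task b concretely, here is a sketch using Python's SymPy library that computes all three dimensions for an arbitrary matrix and compares them with the number of columns. (This particular \(A\) happens to row reduce to the RREF used in Discovery 20.8, as you can check.)

# Compare the dimensions of the column, row, and null spaces of an m x n matrix.
from sympy import Matrix

A = Matrix([
    [1, -1, 0, 2,  3],
    [2, -2, 1, 6,  4],
    [1, -1, 1, 4,  1],
])   # arbitrary 3 x 5 matrix

dim_col  = len(A.columnspace())
dim_row  = len(A.rowspace())
dim_null = len(A.nullspace())

print("number of columns n =", A.cols)
print("dim(column space)   =", dim_col)
print("dim(row space)      =", dim_row)
print("dim(null space)     =", dim_null)
print("dim(column space) + dim(null space) == n?", dim_col + dim_null == A.cols)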