Section 21.3 Concepts
Subsection 21.3.1 Column space
The "consistent space" of a coefficient matrix.
The solution set of a nonhomogeneous system Ax=b with m×n coefficient matrix A cannot be a subspace of R^n because it can never contain the zero vector. Even worse, if the system is inconsistent, then the solution set does not contain any vectors at all.

Question 21.3.1.
Amongst all systems with coefficient matrix A, which are consistent?
Consistent space versus column space.
To better understand this so-called consistent space (the collection of all vectors b in R^m for which the system Ax=b is consistent), we should relate it back to the matrix A as we did in Discovery 21.2, because A is the only thing common to all the b vectors in this space. Let's again think of A as being made up of column vectors in R^m, say A = [c_1 | c_2 | ... | c_n]. Then the product Ax is exactly the linear combination x_1 c_1 + x_2 c_2 + ... + x_n c_n, so the system Ax=b is consistent precisely when b can be expressed as a linear combination of the columns of A. In other words, the consistent space of A is the column space of A, the span of its columns.
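For a concrete check of this idea, here is a minimal computational sketch (assuming Python with the sympy library; the matrix A and the test vectors below are made up purely for illustration). It tests whether Ax=b is consistent, i.e. whether b lies in the column space of A, by comparing the rank of A with the rank of the augmented matrix [A | b]:

```python
from sympy import Matrix

# An illustrative 3x3 coefficient matrix whose third column is the sum of the
# first two, so its column space is a plane in R^3.
A = Matrix([[1, 0, 1],
            [2, 1, 3],
            [0, 1, 1]])

b_in = Matrix([1, 3, 1])    # equals column 1 + column 2, so it lies in the column space
b_out = Matrix([0, 0, 1])   # does not lie in that plane

def is_consistent(A, b):
    # Ax = b is consistent exactly when appending b creates no new pivot,
    # i.e. rank([A | b]) == rank(A).
    return A.row_join(b).rank() == A.rank()

print(is_consistent(A, b_in))   # True
print(is_consistent(A, b_out))  # False
```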
Determining a basis for a column space.
Since the columns of A are, by definition, a spanning set for the column space of A, we can reduce this spanning set to a basis. Once again, we can apply row reduction to this task. Row reducing is equivalent to multiplying on the left by elementary matrices, and when we defined matrix multiplication in Subsection 4.3.7 we did so column-by-column: if E is the (invertible) product of those elementary matrices, then the columns of EA are just E times the corresponding columns of A, so the columns of the reduced matrix satisfy exactly the same linear dependence/independence relationships as the columns of A. This justifies the following procedure.

Procedure 21.3.2. To determine a basis for the column space of matrix A.
- Reduce to at least REF.
- Extract from A all those columns in positions corresponding to the positions of the leading ones in the reduced matrix.
These extracted columns will form a basis for the column space of A.
Remark 21.3.3.
It is important that you take the basis vectors from the columns of A, not from the columns of the reduced matrix: row operations do not change independence/dependence relationships amongst the columns, but they do change the column space.
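To see Procedure 21.3.2 carried out computationally, here is a small sketch (again assuming Python with sympy; the matrix is made up for illustration). sympy's rref() reports the pivot-column positions, and the basis is then taken from the corresponding columns of A itself, as Remark 21.3.3 insists:

```python
from sympy import Matrix

# Same illustrative matrix as above: the third column is the sum of the first two.
A = Matrix([[1, 0, 1],
            [2, 1, 3],
            [0, 1, 1]])

# rref() returns the reduced matrix together with the tuple of pivot-column positions.
R, pivot_cols = A.rref()

# Per Procedure 21.3.2, the basis comes from the columns of A itself
# (not from R) in the pivot positions.
basis = [A.col(j) for j in pivot_cols]

print(pivot_cols)  # (0, 1): leading ones appear in the first two columns
for v in basis:
    print(v.T)     # the basis vectors [1, 2, 0] and [0, 1, 1], displayed as rows
```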
Using column space to determine linear dependence/independence.
In Discovery 21.4.a, we used this new procedure to create a reinterpretation of the Test for Linear Dependence/Independence for vectors in R^m.

Procedure 21.3.4. To use row reduction to test linear dependence/independence in R^m.
To determine whether a collection of n vectors in R^m is linearly dependent or independent, form an m×n matrix by using the vectors as columns, and then row reduce to determine the rank of the matrix. If the rank is equal to n (i.e. there is a leading one in every column of the reduced matrix), then the vectors are linearly independent. If the rank is less than n (i.e. at least one column of the reduced matrix does not contain a leading one), then the vectors are linearly dependent.
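As an illustration of Procedure 21.3.4 (a sketch assuming Python with sympy; the vectors are invented for the example), we place three vectors in R^4 as the columns of a 4×3 matrix and compare its rank to the number of vectors:

```python
from sympy import Matrix

# Three made-up vectors in R^4, used as the columns of a 4x3 matrix.
v1 = Matrix([1, 0, 2, 1])
v2 = Matrix([0, 1, 1, 0])
v3 = Matrix([1, 1, 3, 1])   # equals v1 + v2, so the collection is dependent

M = v1.row_join(v2).row_join(v3)   # 4x3 matrix with the vectors as columns

n = M.cols
print(M.rank() == n)   # False: rank 2 < 3, so the vectors are linearly dependent
```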
Subsection 21.3.2 Row space
Analyzing the row space of a matrix is considerably easier. As we discovered in Discovery 21.5 and Discovery 21.6, elementary row operations do not change the row space of a matrix, so the row spaces of a matrix and each of its REFs are the same space. Clearly we do not need the zero rows from an REF to span this space. But the pattern of leading ones guarantees that the nonzero rows in an REF are linearly independent.

Procedure 21.3.5. To determine a basis for the row space of matrix A.
- Reduce to at least REF.
- Extract the nonzero rows from the REF you have computed.
These extracted rows will form a basis for the row space of A.
Remark 21.3.6.
Note the difference from the column space procedure: in this procedure we get the basis vectors from the reduced matrix, not from the original matrix.
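Here is the corresponding sketch for Procedure 21.3.5 (same assumptions: Python with sympy, illustrative matrix). This time the basis vectors are read off from the reduced matrix, not from A:

```python
from sympy import Matrix

# Same illustrative matrix as before.
A = Matrix([[1, 0, 1],
            [2, 1, 3],
            [0, 1, 1]])

R, pivots = A.rref()   # any REF would do; sympy conveniently computes the RREF

# In an RREF the nonzero rows come first, one per leading one (pivot), so per
# Procedure 21.3.5 those rows form a basis for the row space of A.
basis = [R.row(i) for i in range(len(pivots))]
for r in basis:
    print(r)   # the nonzero rows [1, 0, 1] and [0, 1, 1]
```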
Procedure 21.3.7. A second way to use row reduction to test linear dependence/independence in R^n.
To determine whether a collection of m vectors in R^n is linearly dependent or independent, form an m×n matrix by using the vectors as rows, and then row reduce to determine the rank of the matrix. If the rank is equal to m (i.e. no zero rows can be produced by reducing), then the vectors are linearly independent. If the rank is less than m (i.e. reducing produces at least one zero row), then the vectors are linearly dependent.
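And a sketch of Procedure 21.3.7 under the same assumptions: the same three invented vectors, now laid out as the rows of a 3×4 matrix, with the rank compared against the number of vectors (rows):

```python
from sympy import Matrix

# The same three made-up vectors as before, this time as the rows of a 3x4 matrix.
M = Matrix([[1, 0, 2, 1],
            [0, 1, 1, 0],
            [1, 1, 3, 1]])   # third row = first row + second row

m = M.rows
print(M.rank() == m)   # False: rank 2 < 3, so the vectors are linearly dependent
```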
Subsection 21.3.3 Column space versus row space
Question 21.3.8.
Which procedure, column space or row space, should we use?
- Column space: produces a basis involving vectors from the original collection.
- Row space: produces a "simplified" basis.