Section 18.4 Examples
Subsection 18.4.1 Checking a basis
Let’s start by working through Discovery 18.1, where we were asked to determine whether a collection of vectors forms a basis for a vector space. In each case we are looking to check two properties: that the collection is linearly independent, and that it forms a spanning set for the whole vector space.
Example 18.4.1. A collection of vectors too large to be a basis.
We already know that $\mathbb{R}^3$ can be spanned by the three standard basis vectors, and so Lemma 17.5.7 tells us that any set of more than three vectors in $\mathbb{R}^3$ must be linearly dependent. The set in this discovery activity contains four vectors, so it cannot be a basis because it is linearly dependent. However, it is a spanning set. Can you see how?
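The specific set from the discovery activity is not reproduced here, but a hypothetical set with the same behaviour is
\[ S = \{\, \mathbf{e}_1, \mathbf{e}_2, \mathbf{e}_3, (1,1,1) \,\}. \]
Every vector $(x,y,z)$ in $\mathbb{R}^3$ is already the combination $x\mathbf{e}_1 + y\mathbf{e}_2 + z\mathbf{e}_3$ of the first three vectors, so $S$ spans $\mathbb{R}^3$, while the relation $(1,1,1) = \mathbf{e}_1 + \mathbf{e}_2 + \mathbf{e}_3$ exhibits the linear dependence.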
Example 18.4.2. A nonstandard basis for $\mathbb{R}^3$.
This set is linearly independent, which can be verified using the Test for Linear Dependence/Independence. As we saw in many examples in Section 17.4, the vector equation
\[ k_1 \mathbf{v}_1 + k_2 \mathbf{v}_2 + k_3 \mathbf{v}_3 = \mathbf{0} \]
(where $\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3$ denote the vectors in the set) that we use to begin the Test for Linear Dependence/Independence leads to a homogeneous system. In this case, that system has a coefficient matrix in which the vectors of the set appear as columns. This matrix can be reduced to the identity matrix in two operations, and so only the trivial solution is possible.
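To illustrate the computation with a hypothetical nonstandard basis (not necessarily the one from the discovery activity), take the basis vectors to be $(1,1,1)$, $(0,1,1)$, $(0,0,1)$. The test equation
\[ k_1 (1,1,1) + k_2 (0,1,1) + k_3 (0,0,1) = (0,0,0) \]
has coefficient matrix
\[ \begin{bmatrix} 1 & 0 & 0 \\ 1 & 1 & 0 \\ 1 & 1 & 1 \end{bmatrix}, \]
and the two operations $R_3 \to R_3 - R_2$ and $R_2 \to R_2 - R_1$ reduce it to the identity, so $k_1 = k_2 = k_3 = 0$ is the only solution.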
The set from the discovery activity is also a spanning set for $\mathbb{R}^3$. To check this, we need to make sure that every vector in $\mathbb{R}^3$ can be expressed as a linear combination of the vectors in the set. That is, we need to check that if $\mathbf{w}$ is an arbitrary vector in $\mathbb{R}^3$, then we can always determine scalars $k_1, k_2, k_3$ so that
\[ k_1 \mathbf{v}_1 + k_2 \mathbf{v}_2 + k_3 \mathbf{v}_3 = \mathbf{w}. \]
Similar to the Test for Linear Dependence/Independence, the above vector equation leads to a system of equations whose augmented matrix has the vectors of the set as its first three columns and the components of $\mathbf{w}$ as its column of constants. The same two operations as before will reduce the coefficient part of this matrix to the identity, so that a solution always exists, regardless of the components of $\mathbf{w}$. But it's also possible to determine a solution directly by inspection of the vector equation above.
Because this set is both linearly independent and a spanning set, it is a basis for the space.
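Continuing the hypothetical basis from the earlier sketch, the spanning computation would go like this: we want scalars $k_1, k_2, k_3$ with
\[ k_1 (1,1,1) + k_2 (0,1,1) + k_3 (0,0,1) = (x,y,z), \]
leading to the augmented matrix
\[ \left[\begin{array}{ccc|c} 1 & 0 & 0 & x \\ 1 & 1 & 0 & y \\ 1 & 1 & 1 & z \end{array}\right]. \]
The same two row operations reduce the coefficient part to the identity, and indeed $k_1 = x$, $k_2 = y - x$, $k_3 = z - y$ can be read off by inspection.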
Example 18.4.3. An independent set that does not span.
This set is linearly independent (check using the test!), but it is not a spanning set. We can see that a linear combination of these vectors will never have a nonzero value in one particular entry. In particular, any vector that is nonzero in that entry cannot be in the span of the set, so the set does not span the whole space.
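For a concrete (hypothetical) instance of this phenomenon, the set $\{(1,0,0), (0,1,0)\}$ in $\mathbb{R}^3$ is linearly independent, but
\[ a(1,0,0) + b(0,1,0) = (a, b, 0) \]
always has third entry $0$, so a vector such as $(0,0,1)$ is not in the span.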
Example 18.4.4. A set that neither spans nor is independent.
In Discovery 18.1.d, we considered a set
of four vectors in the space of all upper triangular $2 \times 2$ matrices. This set of vectors is not a basis because it is neither a spanning set nor linearly independent.
- It can’t be a spanning set for the space of upper triangular $2 \times 2$ matrices, because a linear combination of these vectors will always have the same number in both diagonal entries. In particular, an upper triangular matrix whose two diagonal entries are different is in the space but is not in the span of the set.
- While we could use the test to determine that these vectors are linearly dependent, we can see directly that one of these vectors is a linear combination of the others. A hypothetical illustration of both failures is sketched below.
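The set from the discovery activity is not reproduced here, but here is a hypothetical set of four upper triangular $2 \times 2$ matrices exhibiting both failures:
\[ \left\{ \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}, \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}, \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}, \begin{bmatrix} 2 & 1 \\ 0 & 2 \end{bmatrix} \right\}. \]
Every linear combination of these matrices has the form $\begin{bmatrix} p & q \\ 0 & p \end{bmatrix}$, so the upper triangular matrix $\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}$ is not in the span, and the relation
\[ \begin{bmatrix} 2 & 1 \\ 0 & 2 \end{bmatrix} = 2 \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} + \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} \]
shows the dependence.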
Example 18.4.5. The standard basis for the space of lower triangular matrices.
In Discovery 18.1.e, we considered a set of matrices in the space of all lower triangular $2 \times 2$ matrices. We might call these matrices the “standard basis vectors” for the space of lower triangular matrices, since when we simplify a linear combination of them, such as
\[ a \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} + b \begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix} + c \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} a & 0 \\ b & c \end{bmatrix}, \tag{✶} \]
we see that the coefficients in the linear combination on the left correspond directly to the entries in the resulting sum matrix on the right, just as with other “standard” bases that we’ve encountered.
The set is a spanning set for this space, since we can clearly achieve every possible vector in the space as a linear combination of vectors in the set by varying the coefficients in the general linear combination (✶) above.
The left-hand side of (✶) is also the left-hand side of the vector equation that we use in the Test for Linear Dependence/Independence, and from the right-hand side of (✶) we can see that if we set this linear combination to equal the zero vector (which is the zero matrix here), the only solution is the trivial one.
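Assuming the $2 \times 2$ reading of (✶) above, the independence check is immediate: setting the combination equal to the zero matrix gives
\[ \begin{bmatrix} a & 0 \\ b & c \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 0 & 0 \end{bmatrix}, \]
which forces $a = b = c = 0$, the trivial solution.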
Example 18.4.6. Another independent set that does not span.
In Discovery 18.1.f, we considered a set of three vectors in the space $P_3$ of polynomials of degree at most $3$. We have already seen in Subsection 17.4.2 that powers of $x$ are always linearly independent in a space of polynomials. But this set of polynomials cannot be a spanning set for $P_3$, because no linear combination of these three powers of $x$ will ever produce a polynomial of degree $3$. So the set is not a basis.
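Assuming the set consists of the powers $1$, $x$, $x^2$ (the most natural reading of the discussion above), the obstruction is plain: every linear combination
\[ c_0 \cdot 1 + c_1 x + c_2 x^2 \]
has degree at most $2$, so it can never equal a degree-$3$ polynomial such as $x^3$.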
Example 18.4.7. The standard basis for $P_3$.
In Discovery 18.1.g, we considered a set of four vectors in the space $P_3$: the powers $1, x, x^2, x^3$. Again, we know that powers of $x$ are linearly independent in a space of polynomials. However, this time the set is also a spanning set, since we naturally write polynomials of degree $3$ as linear combinations of powers of $x$. Such linear combinations can also be used to produce polynomials of degree less than $3$ by setting the coefficients on the higher powers to $0$. Since the set is both independent and a spanning set, it is a basis for $P_3$.
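For example, an arbitrary polynomial in $P_3$ decomposes as
\[ a_0 + a_1 x + a_2 x^2 + a_3 x^3 = a_0 \cdot 1 + a_1 \cdot x + a_2 \cdot x^2 + a_3 \cdot x^3, \]
and a lower-degree polynomial such as $5 - x$ is obtained by taking $a_2 = a_3 = 0$.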
Remark 18.4.8.
After we study the concept of dimension in the next chapter, the process of determining whether a set of vectors is a basis will become simpler. It is fairly straightforward to check the linear independence condition, since this usually reduces to solving a homogeneous system of linear equations, but checking the spanning condition directly is more tedious. In Chapter 19, we will see that if we know the correct number of vectors required in a basis, we only need to check one of the two conditions in the definition of basis (Corollary 19.5.6). And, as mentioned, usually it is the linear independence condition that is easier to verify.
Subsection 18.4.2 Standard bases
In Subsection 17.4.2, we checked that certain “standard” spanning sets for our main examples of vector spaces were also linearly independent. Since they both span and are linearly independent, that makes each of them a basis for the space that contains them. We’ll list them again here.
Example 18.4.9. The standard basis of $\mathbb{R}^n$.
The standard basis vectors $\mathbf{e}_1, \mathbf{e}_2, \ldots, \mathbf{e}_n$ form a basis for $\mathbb{R}^n$, justifying the word “basis” in our description “standard basis vectors” for these vectors.
Example 18.4.10. The standard basis of $M_{m \times n}$.
The space $M_{m \times n}$ of $m \times n$ matrices also has a standard basis: the collection of matrices that have all entries equal to $0$ except for a single $1$ in one entry, with one such matrix for each entry position.
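For instance, in the $2 \times 2$ case this standard basis is
\[ \left\{ \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}, \begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix}, \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix} \right\}, \]
with one basis matrix for each of the four entry positions.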
Example 18.4.11. The two standard bases of $P_n$.
A space $P_n$ of polynomials also has a standard basis: the collection $1, x, x^2, \ldots, x^n$ of powers of $x$. As an ordered basis, we have two reasonable choices here: the order just presented, and the reverse order $x^n, \ldots, x^2, x, 1$. We will stick with the order of increasing powers of $x$, so that when we index the coefficients in a linear combination, as in
\[ a_0 \cdot 1 + a_1 x + a_2 x^2 + \cdots + a_n x^n, \]
their indices increase along with the exponents on $x$.
Subsection 18.4.3 Coordinate vectors
Finally, we’ll do some computations with coordinate vectors, by working Discovery 18.4 and Discovery 18.5.
First, some examples from Discovery 18.4.
Example 18.4.12. Determining a coordinate vector relative to the standard basis of .
First, decompose the given vector as a linear combination of the vectors in the basis. Since the basis in question is the standard basis for the space, this can be done by inspection:
To get the coordinate vector, we wrap the four coefficients up (in order) in an $\mathbb{R}^4$ vector:
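The particular vector from the discovery activity is not reproduced here, but as a hypothetical illustration in the space of $2 \times 2$ matrices,
\[ \begin{bmatrix} 3 & -1 \\ 2 & 5 \end{bmatrix} = 3 \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} + (-1) \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} + 2 \begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix} + 5 \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}, \]
so the coordinate vector relative to the standard basis would be $(3, -1, 2, 5)$.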
Example 18.4.13. Determining a coordinate vector relative to a nonstandard basis of .
In Discovery 18.4.b, we considered the same vector as in the previous example, but relative to a nonstandard basis.
We could probably also decompose by inspection here, but instead we’ll demonstrate the general method. Write the vector as an unknown linear combination of the basis vectors, and then simplify the linear combination:
Comparing entries on left- and right-hand sides, we obtain a system of equations:
If we had more complicated basis vectors, we would have a more complicated system, which we could solve by forming an augmented matrix and row reducing. As it is, we can solve by inspection:
We collect these four coefficients (in order) in an $\mathbb{R}^4$ vector:
Even though we were working with the same vector as in the previous example, we ended up with a different coordinate vector because it is relative to a different basis.
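Here is a sketch of the general method using a hypothetical nonstandard basis of $2 \times 2$ matrices (not the basis from the discovery activity) and the hypothetical vector from the previous sketch. Setting
\[ c_1 \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} + c_2 \begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix} + c_3 \begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix} + c_4 \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix} = \begin{bmatrix} 3 & -1 \\ 2 & 5 \end{bmatrix} \]
and comparing entries gives the system $c_1 + c_2 + c_3 + c_4 = 3$, $c_2 + c_3 + c_4 = -1$, $c_3 + c_4 = 2$, $c_4 = 5$, which back-substitution solves as $c_4 = 5$, $c_3 = -3$, $c_2 = -3$, $c_1 = 4$. The coordinate vector relative to this basis would be $(4, -3, -3, 5)$, different from the standard-basis coordinate vector $(3, -1, 2, 5)$.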
Example 18.4.14. Determining a coordinate vector relative to the standard basis of $P_3$.
The standard basis of $P_3$ consists of powers of $x$ (along with the constant polynomial $1$), and our polynomial is naturally written as a linear combination of powers of $x$. However, this particular polynomial is missing one of those powers, so we need to insert that term with a zero coefficient:
Once again, we wrap up these four coefficients (in order) in an $\mathbb{R}^4$ vector:
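As a hypothetical illustration (the actual polynomial from the discovery activity is not reproduced here), the polynomial $2 - x^2 + 4x^3$ has no $x$ term, and inserting it with a zero coefficient gives
\[ 2 - x^2 + 4x^3 = 2 \cdot 1 + 0 \cdot x + (-1) \cdot x^2 + 4 \cdot x^3, \]
so the coordinate vector relative to the standard basis $\{1, x, x^2, x^3\}$ would be $(2, 0, -1, 4)$.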
Example 18.4.15. Determining a coordinate vector relative to a nonstandard basis of .
Rather than try to guess, we should set up equations and solve. Start by writing the vector as an unknown combination of the basis vectors, and combine the result into a single vector expression:
This leads to a system of equations:
We could probably solve by inspection, but let’s form an augmented matrix and reduce:
Notice again how the columns in the initial augmented matrix, including the column of constants, are the vectors involved. The column of constants in the final reduced matrix is our coordinate vector:
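For a sketch of this procedure with a hypothetical nonstandard basis $\{\,1,\ 1 + x,\ 1 + x + x^2,\ 1 + x + x^2 + x^3\,\}$ of $P_3$ and the hypothetical polynomial $2 - x^2 + 4x^3$ from the previous sketch, the vector equation
\[ c_1 \cdot 1 + c_2 (1 + x) + c_3 (1 + x + x^2) + c_4 (1 + x + x^2 + x^3) = 2 + 0x - x^2 + 4x^3 \]
leads to the augmented matrix
\[ \left[\begin{array}{cccc|c} 1 & 1 & 1 & 1 & 2 \\ 0 & 1 & 1 & 1 & 0 \\ 0 & 0 & 1 & 1 & -1 \\ 0 & 0 & 0 & 1 & 4 \end{array}\right], \qquad\text{which reduces to}\qquad \left[\begin{array}{cccc|c} 1 & 0 & 0 & 0 & 2 \\ 0 & 1 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 & -5 \\ 0 & 0 & 0 & 1 & 4 \end{array}\right], \]
so the coordinate vector would be $(2, 1, -5, 4)$.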
Example 18.4.16. Determining a coordinate vector relative to the standard basis of $\mathbb{R}^3$.
Since we are using the standard basis for $\mathbb{R}^3$, it is simple to decompose the vector as a linear combination of the vectors in the basis:
Collect these coefficients together into an $\mathbb{R}^3$ vector:
Remark 18.4.17.
The last two examples above might seem kind of weird, but the point is all about point of view.
Relative to the standard basis, a vector in $\mathbb{R}^3$ is equal to its own coordinate vector. In other words, the standard basis of $\mathbb{R}^3$ is standard because it corresponds to the natural way that we think of vectors in $\mathbb{R}^3$: in terms of their $x$-, $y$-, and $z$-coordinates. This is similar to how the standard basis for a polynomial space leads to coordinate vectors that just record the coefficients of polynomials, or how the standard basis for a matrix space leads to coordinate vectors that just record the entries of the matrices.
But if we change our point of view and use a nonstandard basis for $\mathbb{R}^3$, then coordinate vectors allow us to use vectors in $\mathbb{R}^3$ to represent other vectors in $\mathbb{R}^3$, where everything is “tuned” to the perspective of the nonstandard basis. And similarly if we use nonstandard bases in other spaces.
Now we’ll work through Discovery 18.5. This activity is the same as the previous one, but in reverse: we are given a coordinate vector, and we use its components as the coefficients in a linear combination of the basis vectors. We’ll complete some of the examples from this discovery activity, and leave the rest to you.
Example 18.4.18. Determining a vector in $M_{2 \times 2}$ from its standard-basis coordinate vector.
In Discovery 18.5.a, we were tasked with determining the vector in $M_{2 \times 2}$ that has a given coordinate vector relative to the standard basis.
To do this, simply compute the linear combination using the coordinate vector components as coefficients, in the proper order:
This result should not be surprising, as both a $2 \times 2$ matrix and a vector in $\mathbb{R}^4$ are just collections of four numbers.
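To see why this is so in general: if the coordinate vector relative to the standard basis of $2 \times 2$ matrices is $(a, b, c, d)$, then
\[ a \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} + b \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} + c \begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix} + d \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix} = \begin{bmatrix} a & b \\ c & d \end{bmatrix}, \]
so the resulting matrix simply has the coordinate-vector components as its entries.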
Example 18.4.19. Determining a vector in $M_{2 \times 2}$ from its coordinate vector relative to a nonstandard basis.
In Discovery 18.5.b, we were tasked with determining the vector in $M_{2 \times 2}$ that has a given coordinate vector relative to a given nonstandard basis.
Note that this is the same coordinate vector from Discovery 18.5.a (and Example 18.4.18), but a different basis.
Again, to do this we simply compute the linear combination using the coordinate vector components as coefficients, in the proper order:
Even though we were working with the same coordinate vector as in the previous example, we ended up with a different matrix result because it is relative to a different basis.
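To see in general why a nonstandard basis changes the result, here is a sketch with the same hypothetical nonstandard basis of $2 \times 2$ matrices used earlier in this section: the coordinate vector $(a, b, c, d)$ relative to that basis produces
\[ a \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} + b \begin{bmatrix} 1 & 1 \\ 0 & 0 \end{bmatrix} + c \begin{bmatrix} 1 & 1 \\ 1 & 0 \end{bmatrix} + d \begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix} = \begin{bmatrix} a + b + c + d & b + c + d \\ c + d & d \end{bmatrix}, \]
which in general is not the matrix $\begin{bmatrix} a & b \\ c & d \end{bmatrix}$ produced by the same coordinate vector relative to the standard basis.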
Example 18.4.20. Determining a vector in from its standard-basis coordinate vector.
In Discovery 18.5.c, we were tasked with determining the vector that has a given coordinate vector relative to the standard basis of its space.