First we collect some of our observations about eigenvalues and eigenvectors from Section 21.4. We omit their proofs, as we have already discussed the ideas behind them in that section.
Proposition 21.6.1. Eigenvalues of special forms.
If square matrix \(A\) is diagonal or triangular, then the eigenvalues of \(A\) are precisely its diagonal entries.
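The idea behind this proposition can be seen in a quick computation: for a triangular matrix, \(A - \lambda I\) is again triangular, so its determinant is the product of its diagonal entries. A sketch for a \(2 \times 2\) upper triangular example (the particular matrix here is illustrative, not one from the text):

```latex
A = \begin{bmatrix} 2 & 1 \\ 0 & 5 \end{bmatrix},
\qquad
\det(A - \lambda I)
  = \det \begin{bmatrix} 2 - \lambda & 1 \\ 0 & 5 - \lambda \end{bmatrix}
  = (2 - \lambda)(5 - \lambda).
```

The characteristic polynomial has roots \(\lambda = 2\) and \(\lambda = 5\), which are exactly the diagonal entries of \(A\text{.}\)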
Proposition 21.6.2. Eigenspaces.
For an \(n \times n\) matrix \(A\text{,}\) the collection of all eigenvectors that correspond to a specific eigenvalue \(\lambda\text{,}\) along with the zero vector, forms a subspace of \(\R^n\text{.}\)
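The subspace property comes down to the linearity of matrix multiplication: sums and scalar multiples of eigenvectors for \(\lambda\) are again eigenvectors for \(\lambda\) (or the zero vector). For \(A\uvec{u} = \lambda\uvec{u}\) and \(A\uvec{v} = \lambda\uvec{v}\) and scalar \(k\text{,}\) the closure computations are:

```latex
A(\uvec{u} + \uvec{v})
  = A\uvec{u} + A\uvec{v}
  = \lambda\uvec{u} + \lambda\uvec{v}
  = \lambda(\uvec{u} + \uvec{v}),
\qquad
A(k\uvec{u})
  = k(A\uvec{u})
  = k(\lambda\uvec{u})
  = \lambda(k\uvec{u}).
```

Equivalently, this subspace is the null space of \(A - \lambda I\text{,}\) which is why it is a subspace of \(\R^n\text{.}\)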
Subsection 21.6.2 Eigenvalues and invertibility
Our observation in Subsection 21.4.5 about the possibility of the eigenvalue \(\lambda=0\) allows us to add another property to the list of characterizations of invertibility that we began in Theorem 6.5.2 and continued in Theorem 10.5.3 and Theorem 20.5.5.
Theorem 21.6.3. Characterizations of invertibility.
For a square matrix \(A\text{,}\) the following are equivalent.
1. Matrix \(A\) is invertible.
2. Every linear system that has \(A\) as a coefficient matrix has a unique solution.
3. The homogeneous system \(A\uvec{x} = \zerovec\) has only the trivial solution.
4. There is some linear system that has \(A\) as a coefficient matrix and has a unique solution.
5. The rank of \(A\) is equal to the size of \(A\text{.}\)
6. The RREF of \(A\) is the identity.
7. Matrix \(A\) can be expressed as a product of some number of elementary matrices.
8. The determinant of \(A\) is nonzero.
9. The columns of \(A\) are linearly independent.
10. The columns of \(A\) form a basis for \(\R^n\text{,}\) where \(n\) is the size of \(A\text{.}\)
11. The rows of \(A\) are linearly independent.
12. The rows of \(A\) form a basis for \(\R^n\text{,}\) where \(n\) is the size of \(A\text{.}\)
13. The scalar \(\lambda=0\) is not an eigenvalue of \(A\text{.}\)
In particular, a square matrix \(A\) is invertible if and only if \(\lambda=0\) is not an eigenvalue of \(A\text{.}\)
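The connection between the eigenvalue condition and the other statements can be sketched directly: \(\lambda = 0\) is an eigenvalue of \(A\) exactly when \(A\uvec{x} = 0\uvec{x} = \zerovec\) for some nonzero \(\uvec{x}\text{,}\) that is, exactly when the homogeneous system has a nontrivial solution:

```latex
\lambda = 0 \text{ is an eigenvalue of } A
  \iff A\uvec{x} = \zerovec \text{ has a nontrivial solution }
  \iff A \text{ is not invertible.}
```

The same conclusion follows from the determinant characterization, since \(\det(A - 0 I) = \det A\text{,}\) so \(\lambda = 0\) is a root of the characteristic polynomial precisely when \(\det A = 0\text{.}\)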