Section 5.5 Theory
Subsection 5.5.1 Properties of the identity matrix
Here are some important facts about the identity matrix and inverses of matrices. You could consider this proposition as a continuation of Proposition 4.5.1.
Proposition 5.5.1. Algebra rules involving the identity matrix.
Let \(I\) represent the \(n\times n\) identity matrix.
- For every \(m\times n\) matrix \(A\) and every \(n\times k\) matrix \(B\text{,}\) we have \(AI = A\) and \(IB = B\text{.}\)
- For every positive integer \(p\text{,}\) we have \(I^p = I\text{.}\)
- An identity matrix is its own inverse; i.e. \(\inv{I} = I\text{.}\)
- An identity matrix is equal to its own transpose; i.e. \(\utrans{I} = I\text{.}\)
Proof.
We will leave the proof of these properties up to you, the reader.
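For example, here is the first rule in action for a particular \(2\times 3\) matrix (the entries are just an illustrative choice), with \(I\) the \(3\times 3\) identity:\begin{equation*} \left[\begin{array}{rrr} 1 \amp 2 \amp 3 \\ 4 \amp 5 \amp 6 \end{array}\right] \left[\begin{array}{rrr} 1 \amp 0 \amp 0 \\ 0 \amp 1 \amp 0 \\ 0 \amp 0 \amp 1 \end{array}\right] = \left[\begin{array}{rrr} 1 \amp 2 \amp 3 \\ 4 \amp 5 \amp 6 \end{array}\right]. \end{equation*}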
Subsection 5.5.2 Properties of the inverse
And now some first properties of the inverse. We will explore inverses more in the next chapter.
Theorem 5.5.2. Uniqueness of inverses.
A square matrix is either singular or has one unique inverse.
Proof.
A square matrix either has an inverse (i.e. is invertible) or it doesn't (i.e. is singular). We would like to know that in the invertible case, there can be only one inverse. So suppose that \(A\) is a square matrix, and that \(B\) is an inverse for \(A\text{.}\) Then, by definition we have both \(BA=I\) and \(AB=I\) (see Section 5.2). What if we had another inverse for \(A\text{?}\) Suppose \(C\) was also an inverse for \(A\text{,}\) so that both \(CA=I\) and \(AC=I\) were true. Here, all of \(A,B,C,I\) are square of the same size. But then,
\begin{equation*} C = CI = C(AB) = (CA)B = IB = B, \end{equation*}
with justifications
- Rule 1 of Proposition 5.5.1;
- \(B\) is an inverse for \(A\text{;}\)
- Rule 1.e of Proposition 4.5.1;
- \(C\) is an inverse for \(A\text{;}\) and
- Rule 1 of Proposition 5.5.1.
So \(C\) and \(B\) must actually be the same inverse for \(A\text{.}\) Since we can apply the same reasoning to any inverse for \(A\text{,}\) there can only be one inverse for \(A\text{.}\)
Proposition 5.5.3. Singularity of zero matrices.
A square zero matrix is always singular.
Proof.
Rule 3.c and Rule 3.d of Proposition 4.5.1 say that a product involving a zero matrix is always again a zero matrix, and so never the identity matrix. Thus there is no matrix \(B\) for which \(B\zerovec = I\) and \(\zerovec B = I\text{,}\) and it is impossible for \(A=\zerovec\) to work in the definition of inverse from Section 5.2.
Let's record the formula for \(2\times 2\) inverses that we encountered in Subsection 5.4.1.
Proposition 5.5.4. \(2 \times 2\) inversion formula.
Consider the general \(2\times 2\) matrix \(A = \left[\begin{smallmatrix} a \amp b\\c \amp d \end{smallmatrix}\right] \text{.}\) If \(ad-bc\neq 0\text{,}\) then \(A\) is invertible with inverse\begin{equation*} \inv{A} = \frac{1}{ad-bc} \left[\begin{array}{rr} d \amp -b \\ -c \amp a \end{array}\right]. \end{equation*}
Proof idea.
You can check by direct computation that these two matrices multiply to the identity matrix, in either order.
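For example, taking entries \(a = 2\text{,}\) \(b = 5\text{,}\) \(c = 1\text{,}\) \(d = 3\) (an illustrative choice) gives \(ad - bc = 2\cdot 3 - 5\cdot 1 = 1\text{,}\) so the formula says\begin{equation*} \inv{\left[\begin{array}{rr} 2 \amp 5 \\ 1 \amp 3 \end{array}\right]} = \left[\begin{array}{rr} 3 \amp -5 \\ -1 \amp 2 \end{array}\right], \end{equation*}which you can verify by multiplying the two matrices together, in either order.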
Here are the properties of inverses we explored in Discovery 5.3. We have changed some of the letters to avoid confusion with the \(A\) and \(B\) in the definition of inverse in Section 5.2.
Proposition 5.5.5. Algebra rules involving inverses.
- If \(M\) is an invertible square matrix, then its inverse \(\inv{M}\) is also invertible with inverse \(\inv{(\inv{M})} = M\text{.}\)
- If \(M\) is an invertible square matrix, then for every nonzero scalar \(k\) the scalar multiple \(kM\) is also invertible with inverse \(\inv{(kM)} = \inv{k}\inv{M}\text{.}\)
- If \(M\) and \(N\) are both invertible square matrices of the same size, then their product \(MN\) is also invertible with inverse \(\inv{(MN)} = \inv{N}\inv{M}\text{.}\)
- If \(M_1,M_2,\dotsc,M_{\ell-1},M_\ell \) are all invertible square matrices of the same size, then their product\begin{equation*} M_1 M_2 \dotsm M_{\ell-1} M_\ell \end{equation*}is also invertible with inverse\begin{equation*} \inv{(M_1 M_2 \dotsm M_{\ell-1} M_\ell)} = \inv{M}_\ell \inv{M}_{\ell-1}\dotsm\inv{M}_2\inv{M}_1. \end{equation*}
- If \(M\) is an invertible square matrix, then for every positive integer \(\ell\) the power \(M^\ell\) is also invertible with inverse \(\inv{\bigl(M^\ell\bigr)} = \bigl(\inv{M}\bigr)^\ell\text{.}\)
Proof of Statement 1.
We have a square matrix \(A=\inv{M}\) and would like to find an inverse \(B\) for it, so that both \(BA=I\) and \(AB=I\) are true. But we already know this is true for \(B = M\text{,}\) since then\begin{equation*} BA = M\inv{M} = I, \qquad AB = \inv{M}M = I. \end{equation*}
Proof of Statement 2.
We have a square matrix \(A = kM\text{,}\) with \(k\neq 0\text{,}\) and would like to find an inverse \(B\) for it. Let's try \(B = \inv{k}\inv{M}\text{:}\)\begin{equation*} BA = (\inv{k}\inv{M})(kM) = (\inv{k}k)(\inv{M}M) = 1 \cdot I = I, \qquad AB = (kM)(\inv{k}\inv{M}) = (k\inv{k})(M\inv{M}) = 1 \cdot I = I, \end{equation*}
where in the first steps we have applied Rule 2.c and Rule 2.d of Proposition 4.5.1.
Since both \(BA=I\) and \(AB=I\) are true, then \(B=\inv{k}\inv{M}\) is the inverse of \(A=kM\text{.}\)
Proof of Statement 3.
We have a square matrix \(A = MN\) and would like to find an inverse \(B\) for it. Let's try \(B = \inv{N}\inv{M}\text{:}\)\begin{equation*} BA = (\inv{N}\inv{M})(MN) = \inv{N}(\inv{M}M)N = \inv{N}IN = \inv{N}N = I, \qquad AB = (MN)(\inv{N}\inv{M}) = M(N\inv{N})\inv{M} = MI\inv{M} = M\inv{M} = I, \end{equation*}
where in the first steps we have applied Rule 1.e of Proposition 4.5.1.
Since both \(BA=I\) and \(AB=I\) are true, then \(B=\inv{N}\inv{M}\) is the inverse of \(A=MN\text{.}\)
Proof of Statement 4.
We leave this proof to you, the reader.
Proof of Statement 5.
This is the special case of Statement 4 where each of \(M_1,M_2,\dotsc,M_{\ell-1},M_\ell \) is equal to \(M\text{.}\)
Remark 5.5.6.
In light of Statement 5 of the proposition, for an invertible matrix \(M\) and a positive integer \(k\) we can write \(M^{-k}\) to mean either the inverse \(\inv{(M^k)}\) or the power \((\inv{M})^k\text{,}\) since they are the same. This answers the question in Discovery 5.3.e.
We can turn some of the statements of Proposition 5.5.5 around to create new facts about singular (i.e. non-invertible) matrices.
Proposition 5.5.7. Singular products have singular factors.
- If the product \(MN\) is singular, where \(M\) and \(N\) are square matrices of the same size, then at least one of \(M,N\) must be singular.
- If the product\begin{equation*} M_1 M_2 \dotsm M_{\ell-1} M_\ell \end{equation*}is singular, where \(M_1,M_2,\dotsc,M_{\ell-1},M_\ell \) are square matrices of all the same size, then at least one of these matrices must be singular.
- If some power \(M^\ell\) is singular, where \(M\) is a square matrix and \(\ell\) is a positive integer, then \(M\) must be singular.
Proof of Statement 1.
If both \(M\) and \(N\) were invertible, then Statement 3 of Proposition 5.5.5 says that the product \(MN\) would be invertible. But we are assuming that the product \(MN\) is singular, so it is not possible for both \(M\) and \(N\) to be invertible.
Outline of proof for Statement 2.
The proof of this statement is similar to the one above for Statement 1, relying on Statement 4 of Proposition 5.5.5 instead. We leave the details to you, the reader.
Outline of proof for Statement 3.
This proof again is similar to that above for Statement 1, relying on Statement 5 of Proposition 5.5.5 instead. Alternatively, one could view this as the special case of Statement 2 of the current proposition, where each factor \(M_i\) is taken to be equal to \(M\text{.}\)
We did not explore this in our discovery guide, but we can also record how the inverse interacts with the transpose.
Proposition 5.5.8. Inverse of a transpose.
If \(A\) is invertible, then so is \(\utrans{A}\text{,}\) with\begin{equation*} \inv{(\utrans{A})} = \utrans{(\inv{A})}. \end{equation*}
Proof.
Suppose \(A\) is an invertible square matrix, and write \(B\) for \(\utrans{(\inv{A})}\text{.}\) If we can show that both \(B \utrans{A} = I\) and \(\utrans{A} B = I\text{,}\) then by definition we will have shown that \(\utrans{A}\) is invertible, and by Theorem 5.5.2 we will have shown that the inverse of \(\utrans{A}\) is \(B = \utrans{(\inv{A})}\text{.}\) Let's check the first required equality:
\begin{equation*} B \utrans{A} = \utrans{(\inv{A})} \utrans{A} = \utrans{(A\inv{A})} = \utrans{I} = I, \end{equation*}
with justifications
- definition of \(B\text{;}\)
- Rule 5.d from Proposition 4.5.1;
- definition of inverse;
- Rule 4 from Proposition 5.5.1.
The verification of \(\utrans{A}B=I\) is similar, and we leave it up to you, the reader.
Using Statement 5 of Proposition 5.5.5 along with Proposition 5.5.8, we can expand the scope of our algebra rules for matrix powers.
Proposition 5.5.9. Algebra involving matrix powers with negative exponents.
With the convention that \(A^0\) should be equal to \(I\) for any invertible square matrix \(A\text{,}\) the matrix algebra rules involving matrix powers in Proposition 4.5.1 (including the property of the transpose relative to powers in Rule 5.e) and in Proposition 5.5.1 remain valid for all integers \(p\) and \(q\text{,}\) positive or negative (or zero).
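For instance, for an invertible square matrix \(M\) these extended rules justify computations such as\begin{equation*} M^3 M^{-5} = M^{3 + (-5)} = M^{-2}, \qquad \utrans{\bigl(M^{-2}\bigr)} = \bigl(\utrans{M}\bigr)^{-2}, \end{equation*}where the second equality is the negative-exponent version of Rule 5.e, made possible by Proposition 5.5.8.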
Finally, we will record the observation of Discovery 5.6.
Proposition 5.5.10. Consistency of invertible coefficient matrix.
If the coefficient matrix for a linear system is square and invertible, then the system has one unique solution.
Proof.
Consider a system \(A\uvec{x}=\uvec{b}\) where the coefficient matrix \(A\) is square and invertible. Then we can apply \(\inv{A}\) to both sides of this matrix equation, just as in Subsection 5.3.5 and in Example 5.4.3, to isolate \(\uvec{x}=\inv{A}\uvec{b}\text{.}\) Thus, \(\uvec{x} = \inv{A}\uvec{b}\) is the only possible solution to the system. And it really is a solution, since\begin{equation*} A(\inv{A}\uvec{b}) = (A\inv{A})\uvec{b} = I\uvec{b} = \uvec{b}. \end{equation*}
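For example, take (with illustrative entries)\begin{equation*} A = \left[\begin{array}{rr} 2 \amp 5 \\ 1 \amp 3 \end{array}\right], \qquad \uvec{b} = \left[\begin{array}{r} 1 \\ 4 \end{array}\right]. \end{equation*}Since \(ad - bc = 2\cdot 3 - 5\cdot 1 = 1 \neq 0\text{,}\) Proposition 5.5.4 gives \(\inv{A} = \left[\begin{smallmatrix} 3 \amp -5 \\ -1 \amp 2 \end{smallmatrix}\right]\text{,}\) and so the one unique solution to \(A\uvec{x}=\uvec{b}\) is\begin{equation*} \uvec{x} = \inv{A}\uvec{b} = \left[\begin{array}{rr} 3 \amp -5 \\ -1 \amp 2 \end{array}\right] \left[\begin{array}{r} 1 \\ 4 \end{array}\right] = \left[\begin{array}{r} -17 \\ 7 \end{array}\right]. \end{equation*}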