
Section 5.1 Discovery guide

Discovery 5.1.

The number one is important in algebra: it lets us do things like

\begin{align*} 5 a \amp = 15 \\ \frac{1}{5}\cdot 5 \cdot a \amp = \frac{1}{5} \cdot 15 \\ 1 a \amp = 3 \\ a \amp = 3. \end{align*}

The critical step for us right now is the last simplification of the left-hand side:

\begin{equation*} 1 a = a \text{.} \end{equation*}
(a)

What matrix do you think plays the same role in the algebra of \(2\times 2\) matrices that the number \(1\) plays in the algebra of numbers? To answer this, try to fill in the first matrix below so that the matrix equality is always true, no matter the values of \(a,b,c,d\text{.}\)

\begin{equation*} \begin{bmatrix} \underline{\hspace{1em}} \amp \underline{\hspace{1em}} \\ \underline{\hspace{1em}} \amp \underline{\hspace{1em}} \end{bmatrix} \begin{bmatrix} a \amp b \\ c \amp d \end{bmatrix} = \begin{bmatrix} a \amp b \\ c \amp d \end{bmatrix} \end{equation*}
(b)

Write \(I\) for your \(2\times 2\) matrix from Task a (\(I\) for identity matrix).

(i)

Does \(IA=A\) work for every \(2 \times 2\) matrix \(A\text{?}\) For every \(2 \times 3\) matrix \(A\text{?}\) For every \(2\times\ell\) matrix \(A\text{,}\) no matter the number \(\ell\) of columns?

(ii)

Does \(BI = B\) also work for every \(2\times 2\) matrix \(B\text{?}\) For every \(\ell\times 2\) matrix \(B\text{?}\)

(c)

Extend: What is the \(3\times 3\) version of \(I\text{?}\) The \(4\times 4\) version? The \(n\times n\) version?
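Aside. If you would like to test a guess by machine, here is a minimal sketch in Python with NumPy (not part of the original activity). The placeholder entries in I2 are for you to replace with your own answer to Task a; the final comment points at NumPy's built-in identity for comparison with your answer to Task c.

import numpy as np

# Placeholder entries: replace these with your answer to Task a.
I2 = np.array([[0.0, 0.0],
               [0.0, 0.0]])

rng = np.random.default_rng(1)
for cols in (2, 3, 7):                       # test 2x2, 2x3, and a wider 2xl matrix A
    A = rng.integers(-5, 5, size=(2, cols))
    print(cols, np.array_equal(I2 @ A, A))   # does your I2 leave A unchanged?

# For Task c: compare your n x n answer with NumPy's built-in np.eye(n).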

Discovery 5.2.

In the preamble to Discovery 5.1, there were two ingredients necessary to make the algebra work:

  • there is a special number \(1\) so that \(1a=a\) for all numbers \(a\text{;}\) and
  • for a nonzero number like \(5\text{,}\) there is a multiplicative inverse \(1/5\) so that \((1/5)\cdot 5 = 1\text{.}\)

Multiplicative inverses are very useful in algebra, so we would also like to have them in matrix algebra.

(a)

Consider

\begin{equation*} A = \left[\begin{array}{rr} 0 \amp -1 \\ 1 \amp 2 \end{array}\right] \text{.} \end{equation*}

Can you determine

\begin{equation*} B = \begin{bmatrix} a \amp b \\ c \amp d \end{bmatrix} \end{equation*}

so that \(BA = I\text{?}\) If so, check that \(AB = I\) also.
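Aside. If you want to check a candidate \(B\) numerically, here is a minimal sketch in Python with NumPy (an aside, not part of the activity). The entries a, b, c, d below are placeholders for whatever values you determine by hand.

import numpy as np

A = np.array([[0, -1],
              [1,  2]])
I = np.eye(2)

a, b, c, d = 0.0, 0.0, 0.0, 0.0      # placeholders: substitute your candidate entries
B = np.array([[a, b],
              [c, d]])

print(np.allclose(B @ A, I))         # is BA = I ?
print(np.allclose(A @ B, I))         # is AB = I as well?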

(b)

Consider

\begin{equation*} A = \begin{bmatrix} 0 \amp 1 \\ 0 \amp 0 \end{bmatrix}. \end{equation*}

Can you determine

\begin{equation*} B = \begin{bmatrix} a \amp b \\ c \amp d \end{bmatrix} \end{equation*}

so that \(BA = I\text{?}\)

Discovery 5.2 demonstrates that some square matrices have multiplicative inverses (i.e., are invertible) and some do not (such matrices are called singular). If a square matrix \(A\) is invertible, write \(\inv{A}\) for its inverse. (But never write \(1/A\text{!}\)) This inverse is defined by its relationship to \(A\) and \(I\text{:}\) \(\inv{A}\) is the square matrix of the same size as \(A\) such that both \(A\inv{A} = I\) and \(\inv{A}A = I\) are true.
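Aside. The same dichotomy can be seen numerically (a sketch in Python with NumPy, not part of the activity): NumPy happily inverts the matrix from Task a of Discovery 5.2 but reports the matrix from Task b as singular.

import numpy as np

invertible = np.array([[0, -1],
                       [1,  2]])
print(np.linalg.inv(invertible))          # an inverse exists for this matrix

singular = np.array([[0, 1],
                     [0, 0]])
try:
    np.linalg.inv(singular)
except np.linalg.LinAlgError as err:
    print("no inverse:", err)             # NumPy reports: Singular matrix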

Discovery 5.3.

In the following, assume \(A,B,C\) are invertible square matrices, all of the same size, and assume that \(k\) is a nonzero scalar. Do not just look up the answers in the rest of this chapter; try to come up with them yourselves.

For this activity, it might be helpful to think of the pattern of the inverse in the following way: given a square matrix \(M\text{,}\) the inverse of \(M\) is the square matrix of the same size that can fill both of the boxes below to create true matrix equalities.

\begin{align} M \boxed{\phantom{X}} \amp = I \amp \boxed{\phantom{X}} M \amp = I\label{equation-inverses-pattern-of-inverses}\tag{\(\star\)} \end{align}
(a)

What do you think is the inverse of \(\inv{A}\text{?}\) In other words, if you use \(M = \inv{A}\) in (\(\star\)), what single choice of matrix can be used to fill in both boxes?

(b)

Determine a formula for the inverse of \(kA\) in terms of \(k\) and \(\inv{A}\text{.}\) In other words, if you use \(M = kA\) in (\(\star\)), what formula involving \(k\) and \(\inv{A}\) can be used to fill in both boxes?

(c)

Explain why the formula for the inverse of the product \(AB\) is not \(\inv{A}\inv{B}\text{.}\) Then determine a correct formula in terms of \(\inv{A}\) and \(\inv{B}\text{.}\) (Again, to determine the correct formula for \(\inv{(AB)}\text{,}\) use \(M = AB\) in (\(\star\)), and then try to figure out what single formula you can enter into both boxes so that both left-hand sides reduce to \(I\text{.}\))
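Aside. To test whichever formula you conjecture against the pattern (\(\star\)), a small numerical checker can help. Here is a sketch in Python with NumPy (not part of the activity); the helper name fills_both_boxes is ours, not a standard function.

import numpy as np

def fills_both_boxes(M, candidate, tol=1e-10):
    # True if `candidate` can fill both boxes in the pattern (star) for the matrix M.
    I = np.eye(M.shape[0])
    return np.allclose(M @ candidate, I, atol=tol) and np.allclose(candidate @ M, I, atol=tol)

rng = np.random.default_rng(0)
A = rng.random((3, 3))                        # random matrices are almost surely invertible
B = rng.random((3, 3))
Ainv, Binv = np.linalg.inv(A), np.linalg.inv(B)

print(fills_both_boxes(A @ B, Ainv @ Binv))   # test one candidate formula for the inverse of AB
print(fills_both_boxes(A @ B, Binv @ Ainv))   # ... and the other order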

(d)

Extend: Determine a formula for the inverse of the product \(ABC\) in terms of the inverses \(\inv{A}\text{,}\) \(\inv{B}\text{,}\) and \(\inv{C}\text{.}\)

(e)

What do you think \(A^{-2}\) means? There are two possibilities, because the notation suggests applying two different processes: squaring and inverting. Do both interpretations work out to be the same? Try it with the matrix \(A\) given below. (For convenience, its inverse is also given.)

\begin{align*} A \amp = \left[\begin{array}{rr} 0 \amp -1 \\ 1 \amp 2 \end{array}\right] \amp \inv{A} \amp = \left[\begin{array}{rr} 2 \amp 1 \\ -1 \amp 0 \end{array}\right] \end{align*}
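Aside. A quick numerical comparison of the two interpretations, using the matrices above (a sketch in Python with NumPy, not part of the activity):

import numpy as np

A    = np.array([[ 0, -1],
                 [ 1,  2]])
Ainv = np.array([[ 2,  1],
                 [-1,  0]])

square_then_invert = np.linalg.inv(A @ A)   # invert the square of A
invert_then_square = Ainv @ Ainv            # square the inverse of A

print(square_then_invert)
print(invert_then_square)
print(np.allclose(square_then_invert, invert_then_square))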
Discovery 5.4.
(a)

In algebra, when \(AB=AC\) we would usually conclude that \(B=C\text{.}\) Try this out for the matrices below.

\begin{align*} A \amp = \begin{bmatrix}0 \amp 1\\0 \amp 0\end{bmatrix} \amp B \amp = \begin{bmatrix}1 \amp 1\\2 \amp 3\end{bmatrix} \amp C \amp = \left[\begin{array}{rr} -1 \amp -1 \\ 2 \amp 3 \end{array}\right] \end{align*}
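Aside. If you prefer to verify the products by machine rather than by hand, here is a minimal sketch in Python with NumPy (not part of the activity):

import numpy as np

A = np.array([[0, 1],
              [0, 0]])
B = np.array([[1, 1],
              [2, 3]])
C = np.array([[-1, -1],
              [ 2,  3]])

print(np.array_equal(A @ B, A @ C))   # compare the two products ...
print(np.array_equal(B, C))           # ... and compare the two factors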

What is it about matrix \(A\) that is making the usual algebra of “cancellation” fail?

Hint

Think about the “hidden” algebra behind the cancellation \(ab=ac\Longrightarrow b=c\) for numbers.

(b)

In what circumstance is the algebra \(AB=AC \implies B=C\) valid? What explicit algebra steps go into this deduction?

(c)

Is the algebra \(AB=CA \implies B=C\) ever valid?

Discovery 5.5.

If we have a linear system \(A\uvec{x}=\uvec{b}\) with a square and invertible coefficient matrix \(A\text{,}\) we can use matrix algebra to solve the system instead of row reducing, by using \(\inv{A}\) to isolate \(\uvec{x}\text{.}\)

Here is an invertible \(3 \times 3\) matrix \(A\) and its inverse:

\begin{align*} A \amp = \left[\begin{array}{rrr} 0 \amp 1 \amp -2 \\ 1 \amp 2 \amp 0 \\ -2 \amp -4 \amp 1 \end{array}\right], \amp \inv{A} \amp = \left[\begin{array}{rrr} -2 \amp -7 \amp -4 \\ 1 \amp 4 \amp 2 \\ 0 \amp 2 \amp 1 \end{array}\right]. \end{align*}

Use matrix algebra (not row reducing!) to solve the system \(A\uvec{x} = \uvec{b}\) for

\begin{equation*} \uvec{b} = \left[\begin{array}{r} -1 \\ 1 \\ 3 \end{array}\right]. \end{equation*}

Now use the same method to solve the system \(A\uvec{x} = \uvec{b}\) for

\begin{equation*} \uvec{b} = \left[\begin{array}{r} -2 \\ 0 \\ 2 \end{array}\right]. \end{equation*}
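Aside. If you would like to check your hand computations, the same method can be carried out in Python with NumPy (a sketch, not part of the activity), using the matrix \(A\) and the inverse \(\inv{A}\) given above.

import numpy as np

A    = np.array([[ 0,  1, -2],
                 [ 1,  2,  0],
                 [-2, -4,  1]])
Ainv = np.array([[-2, -7, -4],
                 [ 1,  4,  2],
                 [ 0,  2,  1]])

for b in (np.array([-1, 1, 3]), np.array([-2, 0, 2])):
    x = Ainv @ b                      # multiply both sides of Ax = b by the inverse
    print(x, np.allclose(A @ x, b))   # the solution, and a check that it satisfies the system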
Discovery 5.6.

In general, for system \(A\uvec{x} = \uvec{b}\) with a coefficient matrix \(A\) that is square and invertible, how many solutions does the system have? Justify your answer.

Hint

How many solutions did each of the systems in Discovery 5.5 have? Why?