Recall that the main diagonal of a square matrix refers to the entries on the diagonal from top left to bottom right. Here are some special types of square matrices for consideration.
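For example, in the case of a \(3\times 3\) matrix
\begin{equation*}
A = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix} \text{,}
\end{equation*}
the main diagonal consists of the entries \(a_{11}\text{,}\) \(a_{22}\text{,}\) and \(a_{33}\text{.}\)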
Carry out the following tasks for each of the special types of matrices defined above. Think in general, and consider every possible size of matrix, not just \(2\times 2\) and \(3\times 3\text{!}\) You don’t need to prove each answer, but you should be able to articulate an informal justification for each answer that doesn’t rely on examples (unless it’s a counterexample).
Tip. When considering the questions in this activity for symmetric matrices, rather than trying to figure things out with examples, it is much easier to work algebraically with a letter \(A\) representing an arbitrary symmetric matrix, and use the definition of symmetric (\(\utrans{A} = A\)).
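For example, to see in this algebraic way that a scalar multiple \(kA\) of a symmetric matrix \(A\) is again symmetric, one can compute, using the transpose rule \(\utrans{(kA)} = k \utrans{A}\text{,}\)
\begin{equation*}
\utrans{(kA)} = k \utrans{A} = k A \text{,}
\end{equation*}
so that \(kA\) satisfies the definition of symmetric, with no need to examine individual entries.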
Recall that a matrix is invertible if and only if its RREF is the identity matrix. Based on this, can you come up with a simple condition by which you can determine whether a matrix of this type is invertible or not?
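As one concrete illustration (not the general condition itself), in the case of diagonal matrices: the matrix \(\begin{bmatrix} 2 & 0 \\ 0 & -3 \end{bmatrix}\) reduces to the identity by dividing each row by its diagonal entry, so it is invertible, while
\begin{equation*}
\begin{bmatrix} 2 & 0 \\ 0 & 0 \end{bmatrix}
\qquad \text{has RREF} \qquad
\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} \text{,}
\end{equation*}
which is not the identity, so that matrix is not invertible.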
For the case of symmetric matrices, it would be too complicated to work through examples. Instead, consider the formula \(\utrans{(\inv{A})} = \inv{(\utrans{A})}\) from Proposition 5.5.8 along with the definition of symmetric matrix above.
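As a concrete check of this formula (not a proof), take \(A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}\text{,}\) so that
\begin{equation*}
\inv{A} = \begin{bmatrix} -2 & 1 \\ \frac{3}{2} & -\frac{1}{2} \end{bmatrix}
\qquad \text{and} \qquad
\utrans{A} = \begin{bmatrix} 1 & 3 \\ 2 & 4 \end{bmatrix} \text{.}
\end{equation*}
Transposing the first of these matrices and inverting the second both produce
\begin{equation*}
\begin{bmatrix} -2 & \frac{3}{2} \\ 1 & -\frac{1}{2} \end{bmatrix} \text{,}
\end{equation*}
so \(\utrans{(\inv{A})} = \inv{(\utrans{A})}\) for this particular \(A\text{,}\) as the formula predicts.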
Come up with a condition or set of conditions on the entries \(a_{ij}\) of a square matrix \(A\) by which you can determine whether or not \(A\) is of this type.
Here is an example of the type of condition we’re looking for, using the identity matrix: a square matrix \(A\) is equal to the identity matrix if and only if \(a_{ii} = 1\) for all indices \(i\text{,}\) and \(a_{ij} = 0\) for all pairs of indices \(i,j\) with \(i\neq j\text{.}\)
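In the \(3\times 3\) case, for example, the condition \(a_{11} = a_{22} = a_{33} = 1\) with all other entries equal to \(0\) forces
\begin{equation*}
A = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \text{,}
\end{equation*}
the \(3\times 3\) identity matrix.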
Make a conjecture (i.e. a guess based on previous examples) about what will happen if you compute powers of a \(5\times 5\) matrix of a similar form, with all entries equal to \(0\) except for a line of \(1\)s down the first “superdiagonal” (the diagonal immediately above the main diagonal).
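For reference, here is presumably the sort of smaller example the conjecture should be based on: writing \(N\) for the \(3\times 3\) version of this matrix,
\begin{equation*}
N = \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{bmatrix} \text{,} \qquad
N^2 = \begin{bmatrix} 0 & 0 & 1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} \text{,} \qquad
N^3 = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix} \text{,}
\end{equation*}
so the line of \(1\)s moves one diagonal further from the main diagonal with each power, until the result is the zero matrix.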
Suppose that \(A\) and \(B\) are diagonal matrices of the same size. (But do not assume that they have a particular size such as \(2\times 2\) or \(3\times 3\text{.}\))
Describe what our assumption that \(A\) is diagonal means about the entries of \(A\) in terms of your answer to Task g of Discovery 7.1. Then do the same for \(B\text{.}\)
Decide exactly what you need to check in order to be sure that the sum \(A + B\) is diagonal, in terms of your answer to Task g of Discovery 7.1. Then carry out that check, using your answer to Task a.
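If it helps to compare with your own reasoning, here is a sketch of how such a check might go, under the assumption that your answer to Task g of Discovery 7.1 characterizes a diagonal matrix by the condition that \(a_{ij} = 0\) whenever \(i \neq j\text{:}\) writing \(b_{ij}\) for the entries of \(B\text{,}\) for each pair of indices \(i \neq j\) the \((i,j)\) entry of \(A + B\) is
\begin{equation*}
a_{ij} + b_{ij} = 0 + 0 = 0 \text{,}
\end{equation*}
so that \(A + B\) satisfies the same condition and hence is diagonal.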
This activity will guide you through proving that the sum of two symmetric matrices is again symmetric. Unlike the proof in Discovery 7.5, we will not need to consider individual entries, since the definition of symmetric matrix does not refer to individual entries like the definition of diagonal matrix does.
Suppose that \(A\) and \(B\) are symmetric matrices of the same size. (But do not assume that they have a particular size such as \(2\times 2\) or \(3\times 3\text{.}\))
Express what it means for \(A\) to be symmetric in mathematical notation, using the symbols \(A\text{,}\) \(\utrans{}\text{,}\) and \(=\text{.}\) Then do the same for \(B\text{.}\)
Your expressions from Task a are things we are assuming to be true. Your expression from Task b is the condition that needs to be verified. Carry out this verification, making sure to use proper LHS vs RHS procedure. In this verification, you will need to use your assumed knowledge from Task a as well as an algebra rule from Proposition 4.5.1.
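For comparison with your own write-up, here is a sketch of one way the verification might go, assuming Proposition 4.5.1 contains the rule \(\utrans{(A+B)} = \utrans{A} + \utrans{B}\text{:}\) starting from the left-hand side of the condition to be verified,
\begin{equation*}
\utrans{(A+B)} = \utrans{A} + \utrans{B} = A + B \text{,}
\end{equation*}
where the first equality is the algebra rule and the second uses the assumptions from Task a. The left-hand side has been transformed into the right-hand side, so \(A + B\) satisfies the definition of symmetric.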