
Discovery guide

Discovery 42.1.

An \(m \times n\) real matrix \(A\) defines a function \(\funcdef{T_A}{\R^n}{\R^m}\) by matrix multiplication:

\begin{equation*} T_A(\uvec{x}) = A \uvec{x} \text{.} \end{equation*}

We will call such a function a matrix transformation \(\R^n \to \R^m\text{.}\)
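As a computational illustration of this definition (our own sketch, not part of the text; the helper name `mat_vec` is hypothetical), a matrix transformation can be applied to a vector using plain Python lists:

```python
# Minimal sketch of a matrix transformation T_A(x) = A x,
# using plain Python lists (no external libraries).

def mat_vec(A, x):
    """Apply the m x n matrix A (a list of rows) to a vector x in R^n."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

# A 2 x 3 matrix, so T_A maps R^3 into R^2.
A = [[1, 2, -3],
     [2, -1, 5]]

x = [1, 1, 1]
w = mat_vec(A, x)  # the image T_A(x), a vector in R^2
```

Each entry of the output is the dot product of one row of \(A\) with the input vector, which is exactly how the component formulas in the next task arise.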

(a)

Write out linear input-output component formulas for the function \(T_A\) associated to the matrix

\begin{equation*} A = \left[\begin{array}{rrr} 1 \amp 2 \amp -3 \\ 2 \amp -1 \amp 5 \end{array}\right] \text{,} \end{equation*}

so that \(\uvec{w} = T_A(\uvec{x})\text{.}\)

\begin{equation*} \left\{\begin{array}{rcrcrcr} w_1 \amp = \amp \underline{\hspace{0.909090909090909em}} x_1 \amp + \amp \underline{\hspace{0.909090909090909em}} x_2 \amp + \amp \underline{\hspace{0.909090909090909em}} x_3 \text{,} \\ w_2 \amp = \amp \underline{\hspace{0.909090909090909em}} x_1 \amp + \amp \underline{\hspace{0.909090909090909em}} x_2 \amp + \amp \underline{\hspace{0.909090909090909em}} x_3 \text{.} \end{array}\right. \end{equation*}
(b)

Determine the matrix \(B\) so that the linear input-output component formulas below correspond to a matrix transformation \(\uvec{w} = T_B(\uvec{x})\text{.}\)

\begin{equation*} \left\{\begin{array}{rcrcr} w_1 \amp = \amp 3 x_1 \amp - \amp x_2 \\ w_2 \amp = \amp 5 x_1 \amp + \amp 5 x_2 \\ w_3 \amp = \amp \amp + \amp 7 x_2 \\ w_4 \amp = \amp - x_1 \amp + \amp x_2 \end{array}\right. \end{equation*}
(c)

Suppose you know that matrix transformation \(\funcdef{T_C}{\R^3}{\R^3}\) satisfies

\begin{align*} T_C(\uvec{e}_1) \amp = \left[\begin{array}{r} 2 \\ -3 \\ 5 \end{array}\right] \text{,} \amp T_C(\uvec{e}_2) \amp = \left[\begin{array}{r} -7 \\ 11 \\ 13 \end{array}\right] \text{,} \amp T_C(\uvec{e}_3) \amp = \left[\begin{array}{r} 17 \\ 19 \\ -23 \end{array}\right] \text{.} \end{align*}

Do you have enough information to determine matrix \(C\text{?}\)

A function between two “spaces” of the same kind is often referred to as a morphism. Just as we used \(\R^n\) as the model for the ten vector space axioms, and used the dot products on \(\R^n\) and \(\C^n\) as the models for the four inner product space axioms, we will use matrix transformations \(\R^n \to \R^m\) as the model for the desired properties of vector space morphisms.

Discovery 42.2.

Suppose \(\funcdef{T_A}{\R^n}{\R^m}\) is the matrix transformation associated to \(m \times n\) matrix \(A\text{,}\) so that

\begin{equation*} T_A(\uvec{x}) = A \uvec{x} \text{.} \end{equation*}
(a)

How does \(T_A\) interact with the vector operations of the domain space \(\R^n\) and the codomain space \(\R^m\text{?}\)

That is, how does \(T_A\) interact with

(i)

vector addition?

(ii)

scalar multiplication?

(iii)

linear combinations?

(iv)

negatives?

(v)

the zero vector?
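One way to explore these questions before answering them in general is a numerical spot check (our own illustration, not a proof): compare both sides of each proposed interaction for a sample matrix and sample vectors.

```python
# Numerical spot check (not a proof) of how T_A interacts with
# vector addition and scalar multiplication, for one sample matrix.

def mat_vec(A, x):
    return [sum(a * xj for a, xj in zip(row, x)) for row in A]

def vec_add(u, v):
    return [ui + vi for ui, vi in zip(u, v)]

def vec_scale(k, u):
    return [k * ui for ui in u]

A = [[1, 2, -3],
     [2, -1, 5]]
u = [1, 0, 2]
v = [3, -1, 4]
k = 5

# Compare T_A(u + v) with T_A(u) + T_A(v).
sum_inside = mat_vec(A, vec_add(u, v))
sum_outside = vec_add(mat_vec(A, u), mat_vec(A, v))

# Compare T_A(k u) with k T_A(u).
scale_inside = mat_vec(A, vec_scale(k, u))
scale_outside = vec_scale(k, mat_vec(A, u))
```

A check like this can only suggest a pattern; deciding which patterns always hold, and why, is the point of the task.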

(b)

Which of the patterns from Task a can be deduced from others of the patterns?

Based on this, which of these patterns should be designated as the basic axioms of vector space morphisms?

A function \(\funcdef{T}{V}{W}\) between abstract vector spaces \(V,W\) that satisfies the axioms we have identified in Discovery 42.2.b will be called a linear transformation (or a vector space homomorphism).

Discovery 42.3.

In each of the following, determine whether the provided vector space function is a linear transformation.

(a)

Left-multiplication by \(m \times n\) matrix \(A\text{:}\)

\(\funcdef{L_A}{\matrixring_{n \times \ell}(\R)}{\matrixring_{m \times \ell}(\R)}\) by \(L_A(X) = A X\text{.}\)

(b)

Right-multiplication by \(m \times n\) matrix \(A\text{:}\)

\(\funcdef{R_A}{\matrixring_{\ell \times m}(\R)}{\matrixring_{\ell \times n}(\R)}\) by \(R_A(X) = X A\text{.}\)

(c)

Translation by a fixed nonzero vector \(\uvec{a}\) in vector space \(V\text{:}\)

\(\funcdef{t_{\uvec{a}}}{V}{V}\) by \(t_{\uvec{a}}(\uvec{v}) = \uvec{v} + \uvec{a} \text{.}\)

(d)

Multiplication by a fixed scalar \(a\) in vector space \(V\text{:}\)

\(\funcdef{m_a}{V}{V}\) by \(m_a(\uvec{v}) = a \uvec{v} \text{.}\)

(e)

Evaluation of polynomials at fixed \(x\)-value \(x = a\text{:}\)

\(\funcdef{E_a}{\poly(\R)}{\R^1}\) by \(E_a(p) = p(a) \text{.}\)

(f)

Determinant of square matrices: \(\funcdef{\det}{\matrixring_n(\R)}{\R^1}\text{.}\)

(g)

Differentiation: let \(F(a,b)\) represent the space of functions defined on the interval \(a \lt x \lt b\text{,}\) and let \(D(a,b)\) represent the subspace of \(F(a,b)\) consisting of differentiable functions.

Consider \(\funcdef{\ddx}{D(a,b)}{F(a,b)}\) by \(\ddx(f) = f'\text{.}\)

(h)

Integration: let \(C[a,b]\) represent the space of continuous functions defined on the interval \(a \le x \le b\text{.}\)

Consider \(\funcdef{I_{a,b}}{C[a,b]}{\R^1}\) by \(I_{a,b}(f) = \integral{a}{b}{f(x)}{x}\text{.}\)
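Some of these maps can be probed numerically before settling the question in general. As one hedged illustration (our own, not the text's), here is a probe of translation from Task c in \(\R^2\text{,}\) comparing \(t_{\uvec{a}}(\uvec{u} + \uvec{v})\) against \(t_{\uvec{a}}(\uvec{u}) + t_{\uvec{a}}(\uvec{v})\text{:}\)

```python
# Numerical probe of translation t_a(v) = v + a in R^2 (Task c).
# A single agreement cannot prove linearity, but a single
# mismatch is enough to disprove it.

def vec_add(u, v):
    return [ui + vi for ui, vi in zip(u, v)]

def translate(a):
    """Return the translation map t_a(v) = v + a."""
    return lambda v: vec_add(v, a)

a = [1, 1]          # a fixed nonzero translation vector
t_a = translate(a)

u = [2, 3]
v = [-1, 4]

left = t_a(vec_add(u, v))         # t_a(u + v)
right = vec_add(t_a(u), t_a(v))   # t_a(u) + t_a(v)
```

The same style of probe can be adapted to the other tasks where the domain objects can be represented concretely.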

Discovery 42.4.

Suppose \(V\) is a finite-dimensional vector space with

\begin{equation*} V = \Span \{ \uvec{v}_1, \uvec{v}_2, \uvec{v}_3 \} \text{,} \end{equation*}

and \(\funcdef{T}{V}{\R^2}\) is a linear transformation such that

\begin{align*} T(\uvec{v}_1) \amp = (1,2) \text{,} \amp T(\uvec{v}_2) \amp = (3,-5) \text{,} \amp T(\uvec{v}_3) \amp = (0,4)\text{.} \end{align*}
(a)

Based on this information, can you determine \(T(3 \uvec{v}_1 - \uvec{v}_2 + 5 \uvec{v}_3)\text{?}\)

(b)

Would you be able to answer Task a for other linear combinations of \(\uvec{v}_1,\uvec{v}_2,\uvec{v}_3\text{?}\)

(c)

Describe the pattern: in order to be able to compute every output of a linear transformation, the only output information required is \(\underline{\hspace{9.090909090909092em}}\).
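Assuming \(T\) is linear, the three given outputs can be combined to compute further outputs; the following sketch (our own, using only the output data stated in this discovery activity) carries out that computation for the combination in Task a.

```python
# Sketch: if T is linear, the image of a linear combination of
# v1, v2, v3 is the same combination of the known images T(v_i).

def vec_add(u, v):
    return [ui + vi for ui, vi in zip(u, v)]

def vec_scale(k, u):
    return [k * ui for ui in u]

# The given outputs T(v1), T(v2), T(v3).
images = [[1, 2], [3, -5], [0, 4]]

def image_of_combination(coeffs):
    """Image of c1 v1 + c2 v2 + c3 v3 under a linear T."""
    total = [0, 0]
    for c, img in zip(coeffs, images):
        total = vec_add(total, vec_scale(c, img))
    return total

result = image_of_combination([3, -1, 5])  # T(3 v1 - v2 + 5 v3)
```

Note that no formula for \(T\) itself was needed, only its values on the spanning set.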

Discovery 42.5.

Suppose \(\funcdef{T}{\R^3}{\R^3}\) is a linear transformation such that

\begin{align*} T(\uvec{e}_1) \amp = \left[\begin{array}{r} 2 \\ -3 \\ 5 \end{array}\right] \text{,} \amp T(\uvec{e}_2) \amp = \left[\begin{array}{r} -7 \\ 11 \\ 13 \end{array}\right] \text{,} \amp T(\uvec{e}_3) \amp = \left[\begin{array}{r} 17 \\ 19 \\ -23 \end{array}\right] \text{.} \end{align*}
(a)

Do you know any other linear transformation \(\R^3 \to \R^3\) that has the same outputs for the standard basis vectors as inputs?

Hint

Look back at Discovery 42.1.c.

(c)

Describe the pattern: every linear transformation \(\R^n \to \R^m\) is effectively \(\underline{\hspace{9.090909090909092em}}\).
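As a hedged numerical illustration of the connection to Discovery 42.1.c (our own sketch, not the text's), one can assemble a matrix whose columns are the given images of the standard basis vectors and check that multiplication by it reproduces those outputs:

```python
# Sketch: build the matrix whose j-th column is T(e_j), then verify
# that multiplication by this matrix reproduces the given outputs.

def mat_vec(A, x):
    return [sum(a * xj for a, xj in zip(row, x)) for row in A]

# The given images T(e1), T(e2), T(e3), listed as columns.
cols = [[2, -3, 5], [-7, 11, 13], [17, 19, -23]]

# Transpose the column list into a row-by-row matrix.
C = [list(row) for row in zip(*cols)]

e1, e2, e3 = [1, 0, 0], [0, 1, 0], [0, 0, 1]
```

Checking \(C\uvec{e}_j\) against the given outputs is one concrete way to approach the pattern asked for above.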

Discovery 42.6.
(a)

Suppose \(\funcdef{T}{\R^n}{\R^1}\) is a linear transformation. What size of matrix would represent this linear transformation?

What word would we normally use to describe a matrix of those dimensions, instead of “matrix”?

(b)

Describe the pattern: every linear transformation \(\R^n \to \R^1\) corresponds to a \(\underline{\hspace{9.090909090909092em}}\).

Discovery 42.7.

For vector spaces \(V,W\text{,}\) let \(L(V,W)\) represent the collection of all linear transformations \(V \to W\text{.}\)

(a)

How could transformations in \(L(V,W)\) be added?

That is, if \(\funcdef{T_1,T_2}{V}{W}\) are objects in \(L(V,W)\text{,}\) what transformation should \(T_1 + T_2\) represent?

\begin{equation*} (T_1 + T_2)(\uvec{v}) = \underline{\hspace{9.090909090909092em}} \end{equation*}

Is the sum transformation \(T_1 + T_2\) still in \(L(V,W)\text{?}\) (i.e. Is it still linear?)

(b)

How could transformations in \(L(V,W)\) be scalar multiplied?

That is, if \(\funcdef{T}{V}{W}\) is an object in \(L(V,W)\text{,}\) what transformation should \(k T\) represent for scalar \(k\text{?}\)

\begin{equation*} (k T)(\uvec{v}) = \underline{\hspace{9.090909090909092em}} \end{equation*}

Is the scaled transformation \(k T\) still in \(L(V,W)\text{?}\) (i.e. Is it still linear?)

(c)

Is \(L(V,W)\) a vector space under the operations of addition and scalar multiplication of linear transformations?

That is, do your operations satisfy the ten vector space axioms?
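The pointwise operations being proposed can be sketched concretely by modelling transformations \(\R^2 \to \R^2\) as Python functions (the names `add_transformations` and `scale_transformation` are our own, for illustration):

```python
# Sketch of pointwise operations on transformations V -> W,
# modelling transformations R^2 -> R^2 as Python functions.

def vec_add(u, v):
    return [ui + vi for ui, vi in zip(u, v)]

def vec_scale(k, u):
    return [k * ui for ui in u]

def add_transformations(T1, T2):
    """(T1 + T2)(v) = T1(v) + T2(v), defined pointwise."""
    return lambda v: vec_add(T1(v), T2(v))

def scale_transformation(k, T):
    """(k T)(v) = k * T(v), defined pointwise."""
    return lambda v: vec_scale(k, T(v))

# Two sample linear maps on R^2.
T1 = lambda v: [2 * v[0], 3 * v[1]]   # scale the axes
T2 = lambda v: [v[1], v[0]]           # swap the coordinates

S = add_transformations(T1, T2)       # the sum T1 + T2
kT = scale_transformation(4, T1)      # the scaled map 4 T1

out_sum = S([1, 2])        # T1([1, 2]) + T2([1, 2])
out_scaled = kT([1, 2])    # 4 * T1([1, 2])
```

This only defines the operations; whether the resulting maps are still linear, and whether the ten axioms hold for \(L(V,W)\text{,}\) is for the reader to decide.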