42.1 Discovery guide
Discovery 42.1.
An \(m \times n\) real matrix \(A\) creates a function \(\funcdef{T_A}{\R^n}{\R^m}\) by matrix multiplication:
\begin{equation*}
T_A(\uvec{x}) = A \uvec{x} \text{.}
\end{equation*}
We will call such a function a matrix transformation \(\R^n \to \R^m\text{.}\)
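As a quick numerical sketch of the definition, the snippet below applies a matrix transformation \(T_A\) to a vector using plain Python lists; the matrix \(A\) here is an arbitrary \(2 \times 3\) example chosen for illustration, not a matrix from this discovery guide.

```python
# Matrix transformation T_A(x) = A x, with A stored as a list of rows.

def mat_vec(A, x):
    """Multiply matrix A (list of rows) by vector x (list of numbers)."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

A = [[1, 2, 3],
     [4, 5, 6]]          # a 2x3 matrix, so T_A maps R^3 -> R^2

x = [1, 0, -1]
w = mat_vec(A, x)        # w = T_A(x)
print(w)                 # [-2, -2]
```

Note that the shape of \(A\) dictates the domain and codomain: a \(2 \times 3\) matrix consumes vectors from \(\R^3\) and produces vectors in \(\R^2\).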
(a)
Write out linear input-output component formulas for the function \(T_A\) associated to the matrix
so that \(\uvec{w} = T_A(\uvec{x})\text{.}\)
(b)
Determine the matrix \(B\) so that the linear input-output component formulas below correspond to a matrix transformation \(\uvec{w} = T_B(\uvec{x})\text{.}\)
(c)
Suppose you know that matrix transformation \(\funcdef{T_C}{\R^3}{\R^3}\) satisfies
Do you have enough information to determine matrix \(C\text{?}\)
A function between two “spaces” of the same kind is often referred to as a morphism. Just as we used \(\R^n\) as the model for the ten vector space axioms, and used the dot products on \(\R^n\) and \(\C^n\) as the models for the four inner product space axioms, we will use matrix transformations \(\R^n \to \R^m\) as the model for the desired properties of vector space morphisms.
Discovery 42.2.
Suppose \(\funcdef{T_A}{\R^n}{\R^m}\) is the matrix transformation associated to the \(m \times n\) matrix \(A\text{,}\) so that
\begin{equation*}
T_A(\uvec{x}) = A \uvec{x} \text{.}
\end{equation*}
(a)
How does \(T_A\) interact with the vector operations of the domain space \(\R^n\) and the codomain space \(\R^m\text{?}\)
That is, how does \(T_A\) interact with
(i)
vector addition?
(ii)
scalar multiplication?
(iii)
linear combinations?
(iv)
negatives?
(v)
the zero vector?
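One way to explore these interactions is to test them on concrete inputs. The sketch below checks the addition, scalar multiplication, and zero-vector patterns numerically for an arbitrary example matrix (the matrix and vectors are assumptions for illustration only; a numerical check is evidence, not a proof).

```python
# Checking how T_A interacts with the vector operations of R^3 and R^2.

def mat_vec(A, x):
    """Multiply matrix A (list of rows) by vector x."""
    return [sum(a * xj for a, xj in zip(row, x)) for row in A]

A = [[2, -1, 0],
     [1,  3, 5]]                 # arbitrary 2x3 example matrix
T = lambda x: mat_vec(A, x)      # T = T_A : R^3 -> R^2

u = [1, 2, 3]
v = [-1, 0, 4]
k = 7

u_plus_v = [ui + vi for ui, vi in zip(u, v)]
k_times_u = [k * ui for ui in u]

# (i) addition:              T(u + v) = T(u) + T(v)
assert T(u_plus_v) == [a + b for a, b in zip(T(u), T(v))]
# (ii) scalar multiplication: T(k u) = k T(u)
assert T(k_times_u) == [k * a for a in T(u)]
# (v) zero vector:            T(0) = 0
assert T([0, 0, 0]) == [0, 0]
```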
(b)
Which of the patterns from Task a can be deduced from the others?
Based on this, which of these patterns should be designated as the basic axioms of vector space morphisms?
A function \(\funcdef{T}{V}{W}\) between abstract vector spaces \(V,W\) that satisfies the axioms we have identified in Discovery 42.2.b will be called a linear transformation (or a vector space homomorphism).
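For reference, the two properties conventionally taken as the defining axioms of a linear transformation (presumably the basic patterns singled out in Discovery 42.2.b) are additivity and homogeneity:

```latex
\begin{align*}
T(\uvec{u} + \uvec{v}) &= T(\uvec{u}) + T(\uvec{v}) \text{,} &
T(k \uvec{v}) &= k \, T(\uvec{v}) \text{,}
\end{align*}
```

required to hold for all vectors \(\uvec{u},\uvec{v}\) in \(V\) and all scalars \(k\text{.}\)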
Discovery 42.3.
In each of the following, determine whether the provided vector space function is a linear transformation.
(a)
Left-multiplication by \(m \times n\) matrix \(A\text{:}\)
\(\funcdef{L_A}{\matrixring_{n \times \ell}(\R)}{\matrixring_{m \times \ell}(\R)}\) by \(L_A(X) = A X\text{.}\)
(b)
Right-multiplication by \(m \times n\) matrix \(A\text{:}\)
\(\funcdef{R_A}{\matrixring_{\ell \times m}(\R)}{\matrixring_{\ell \times n}(\R)}\) by \(R_A(X) = X A\text{.}\)
(c)
Translation by a fixed nonzero vector \(\uvec{a}\) in vector space \(V\text{:}\)
\(\funcdef{t_{\uvec{a}}}{V}{V}\) by \(t_{\uvec{a}}(\uvec{v}) = \uvec{v} + \uvec{a} \text{.}\)
(d)
Multiplication by a fixed scalar \(a\) in vector space \(V\text{:}\)
\(\funcdef{m_a}{V}{V}\) by \(m_a(\uvec{v}) = a \uvec{v} \text{.}\)
(e)
Evaluation of polynomials at fixed \(x\)-value \(x = a\text{:}\)
\(\funcdef{E_a}{\poly(\R)}{\R^1}\) by \(E_a(p) = p(a) \text{.}\)
(f)
Determinant of square matrices: \(\funcdef{\det}{\matrixring_n(\R)}{\R^1}\text{.}\)
(g)
Differentiation: let \(F(a,b)\) represent the space of functions defined on the interval \(a \lt x \lt b\text{,}\) and let \(D(a,b)\) represent the subspace of \(F(a,b)\) consisting of differentiable functions.
Consider \(\funcdef{\ddx}{D(a,b)}{F(a,b)}\) by \(\ddx(f) = f'\text{.}\)
(h)
Integration: let \(C[a,b]\) represent the space of continuous functions defined on the interval \(a \le x \le b\text{.}\)
Consider \(\funcdef{I_{a,b}}{C[a,b]}{\R^1}\) by \(I_{a,b}(f) = \integral{a}{b}{f(x)}{x}\text{.}\)
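A useful first step in each case is to test the additivity axiom on concrete inputs: a single failing example rules a map out, while passing examples suggest (but do not prove) linearity. The sketch below applies such a test to two of the maps above in the special case \(V = \R^2\); the particular vectors, the fixed vector \(\uvec{a}\text{,}\) and the fixed scalar are arbitrary choices for illustration.

```python
# Numerically testing T(u + v) = T(u) + T(v) for two maps on R^2.

def add(u, v): return [a + b for a, b in zip(u, v)]
def scale(k, u): return [k * a for a in u]

def respects_addition(T, u, v):
    """True when T(u + v) equals T(u) + T(v) for these particular u, v."""
    return T(add(u, v)) == add(T(u), T(v))

a_vec = [1, 1]                       # fixed nonzero vector for translation
t_a = lambda v: add(v, a_vec)        # translation:            v |-> v + a
m_3 = lambda v: scale(3, v)          # scalar multiplication:  v |-> 3v

u, v = [2, 0], [0, 5]
print(respects_addition(m_3, u, v))  # True:  m_3(u + v) = 3u + 3v
print(respects_addition(t_a, u, v))  # False: t_a(u) + t_a(v) has two copies of a
```

The translation example also illustrates a quicker test: a linear transformation must send the zero vector to the zero vector, but \(t_{\uvec{a}}(\zerovec) = \uvec{a} \ne \zerovec\text{.}\)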
Discovery 42.4.
Suppose \(V\) is a finite-dimensional vector space with basis \(\{ \uvec{v}_1, \uvec{v}_2, \uvec{v}_3 \}\text{,}\)
and \(\funcdef{T}{V}{\R^2}\) is a linear transformation such that
(a)
Based on this information, can you determine \(T(3 \uvec{v}_1 - \uvec{v}_2 + 5 \uvec{v}_3)\text{?}\)
(b)
Would you be able to answer Task a for other linear combinations of \(\uvec{v}_1,\uvec{v}_2,\uvec{v}_3\text{?}\)
(c)
Describe the pattern: in order to be able to compute every output of a linear transformation, the only output information required is .
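The computation in Task a can be sketched mechanically: expand along the linear combination and substitute the known outputs. The output values below are hypothetical placeholders, not the values given in this discovery activity.

```python
# Computing T(3 v1 - v2 + 5 v3) from the values T(v1), T(v2), T(v3) alone,
# by linearity: T(c1 v1 + c2 v2 + c3 v3) = c1 T(v1) + c2 T(v2) + c3 T(v3).

T_v1, T_v2, T_v3 = [1, 0], [0, 1], [1, 1]   # assumed outputs in R^2

def lin_comb(coeffs, vecs):
    """Return the linear combination sum of c * v over paired inputs."""
    out = [0] * len(vecs[0])
    for c, v in zip(coeffs, vecs):
        out = [o + c * vi for o, vi in zip(out, v)]
    return out

result = lin_comb([3, -1, 5], [T_v1, T_v2, T_v3])
print(result)   # [8, 4]
```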
Discovery 42.5.
Suppose \(\funcdef{T}{\R^3}{\R^3}\) is a linear transformation such that
(a)
Do you know any other linear transformation \(\R^3 \to \R^3\) that has the same outputs for the standard basis vectors as inputs?
Look back at Discovery 42.1.c.
(b)
Based on Task a, and in light of Discovery 42.4.c, what can you say about \(T\text{?}\)
(c)
Describe the pattern: every linear transformation \(\R^n \to \R^m\) is effectively .
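The pattern in this activity can be made concrete: the outputs of a linear transformation on the standard basis vectors, placed as columns, assemble a matrix that reproduces the transformation. The map \(T\) below is an arbitrary example assumed linear, not a map from the text.

```python
# Recovering the matrix of a linear transformation T : R^n -> R^m from the
# outputs T(e_1), ..., T(e_n): those outputs are exactly the columns.

def mat_vec(A, x):
    """Multiply matrix A (list of rows) by vector x."""
    return [sum(a * xj for a, xj in zip(row, x)) for row in A]

def matrix_of(T, n):
    """Build the matrix whose j-th column is T(e_j)."""
    cols = [T([1 if i == j else 0 for i in range(n)]) for j in range(n)]
    m = len(cols[0])
    return [[cols[j][i] for j in range(n)] for i in range(m)]

T = lambda x: [x[0] + 2 * x[1], 3 * x[2], x[1] - x[0]]   # linear, R^3 -> R^3

A = matrix_of(T, 3)
print(A)                                 # [[1, 2, 0], [0, 0, 3], [-1, 1, 0]]
assert mat_vec(A, [4, 5, 6]) == T([4, 5, 6])   # A reproduces T
```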
Discovery 42.6.
(a)
Suppose \(\funcdef{T}{\R^n}{\R^1}\) is a linear transformation. What size of matrix would represent this linear transformation?
What word would we normally use to describe a matrix of those dimensions, instead of “matrix”?
(b)
Describe the pattern: every linear transformation \(\R^n \to \R^1\) corresponds to a .
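As a small illustration of the codomain-\(\R^1\) case, applying such a transformation amounts to multiplying by a matrix with a single row, which is the same as taking a dot product; the particular row below is an arbitrary example.

```python
# A linear transformation R^3 -> R^1 given by a 1 x 3 matrix: applying it
# is a dot product with the matrix's single row.

row = [2, -1, 5]                          # the 1 x n matrix, as one row

def T(x):
    """Dot product of the fixed row with the input vector."""
    return sum(a * xi for a, xi in zip(row, x))

print(T([1, 1, 1]))   # 6  (= 2 - 1 + 5)
```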
Discovery 42.7.
For vector spaces \(V,W\text{,}\) let \(L(V,W)\) represent the collection of all linear transformations \(V \to W\text{.}\)
(a)
How could transformations in \(L(V,W)\) be added?
That is, if \(\funcdef{T_1,T_2}{V}{W}\) are objects in \(L(V,W)\text{,}\) what transformation should \(T_1 + T_2\) represent?
Is the sum transformation \(T_1 + T_2\) still in \(L(V,W)\text{?}\) (i.e. Is it still linear?)
(b)
How could transformations in \(L(V,W)\) be scalar multiplied?
That is, if \(\funcdef{T}{V}{W}\) is an object in \(L(V,W)\text{,}\) what transformation should \(k T\) represent for scalar \(k\text{?}\)
Is the scaled transformation \(k T\) still in \(L(V,W)\text{?}\) (i.e. Is it still linear?)
(c)
Is \(L(V,W)\) a vector space under the operations of addition and scalar multiplication of linear transformations?
That is, do your operations satisfy the ten vector space axioms?
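The natural candidate operations from Tasks a and b are pointwise: \((T_1 + T_2)(\uvec{v}) = T_1(\uvec{v}) + T_2(\uvec{v})\) and \((k T)(\uvec{v}) = k \, T(\uvec{v})\text{.}\) The sketch below realizes these operations in the special case \(V = W = \R^2\text{,}\) with two arbitrary linear example maps.

```python
# Pointwise sum and scalar multiple of linear transformations R^2 -> R^2.

def add_maps(T1, T2):
    """(T1 + T2)(v) = T1(v) + T2(v), computed componentwise."""
    return lambda v: [a + b for a, b in zip(T1(v), T2(v))]

def scale_map(k, T):
    """(k T)(v) = k * T(v), computed componentwise."""
    return lambda v: [k * a for a in T(v)]

T1 = lambda v: [v[0], 0]          # projection onto the first coordinate
T2 = lambda v: [0, v[1]]          # projection onto the second coordinate

S = add_maps(T1, T2)
print(S([3, 4]))                  # [3, 4]: here T1 + T2 is the identity map

kT1 = scale_map(5, T1)
print(kT1([3, 4]))                # [15, 0]
```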