
42.1 Discovery guide

Discovery 42.1.

An m×n real matrix A creates a function TA:Rn→Rm by matrix multiplication:

TA(x)=Ax.

We will call such a function a matrix transformation Rn→Rm.
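As a quick numeric illustration of the definition, here is a minimal NumPy sketch; the particular matrix and input vector are arbitrary choices, not taken from the text:

```python
import numpy as np

# An arbitrary 2x3 matrix, so T_A maps R^3 to R^2.
A = np.array([[1.0, 0.0, 2.0],
              [3.0, -1.0, 4.0]])

def T_A(x):
    """The matrix transformation T_A(x) = A x."""
    return A @ x

x = np.array([1.0, 1.0, 1.0])
w = T_A(x)  # w is a vector in R^2
```

Note that the number of columns of A must match the dimension of the input vector, and the output lives in R^m where m is the number of rows.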

(a)

Write out linear input-output component formulas for the function TA associated to the matrix

A = \begin{bmatrix} 1 & 2 & -3 \\ 2 & -1 & 5 \end{bmatrix},

so that w=TA(x).

w1 = _ x1 + _ x2 + _ x3,
w2 = _ x1 + _ x2 + _ x3.
(b)

Determine the matrix B so that the linear input-output component formulas below correspond to a matrix transformation w=TB(x).

w1 = 3x1 − x2
w2 = 5x1 + 5x2
w3 = 7x2
w4 = −x1 + x2
(c)

Suppose you know that matrix transformation TC:R3→R3 satisfies

T_C(e_1) = \begin{bmatrix} 2 \\ -3 \\ 5 \end{bmatrix}, \quad T_C(e_2) = \begin{bmatrix} -7 \\ 11 \\ 13 \end{bmatrix}, \quad T_C(e_3) = \begin{bmatrix} 17 \\ 19 \\ -23 \end{bmatrix}.

Do you have enough information to determine matrix C?
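After you have thought Task c through, the underlying fact (that C e_j picks out the j-th column of C, so the three given outputs determine C completely) can be sketched in NumPy; the entries below are read off from the displayed outputs:

```python
import numpy as np

# Given outputs of T_C on the standard basis vectors (from the task).
Ce1 = np.array([2.0, -3.0, 5.0])
Ce2 = np.array([-7.0, 11.0, 13.0])
Ce3 = np.array([17.0, 19.0, -23.0])

# Since C e_j is the j-th column of C, stack the outputs as columns.
C = np.column_stack([Ce1, Ce2, Ce3])

# Sanity check: multiplying by e_1 recovers the first given output.
e1 = np.array([1.0, 0.0, 0.0])
assert np.allclose(C @ e1, Ce1)
```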

A function between two “spaces” of the same kind is often referred to as a morphism. Just as we used Rn as the model for the ten vector space axioms, and used the dot products on Rn and Cn as the models for the four inner product space axioms, we will use matrix transformations Rn→Rm as the model for the desired properties of vector space morphisms.

Discovery 42.2.

Suppose TA:Rn→Rm is the matrix transformation associated to m×n matrix A, so that

TA(x)=Ax.
(a)

How does TA interact with the vector operations of the domain space Rn and the codomain space Rm?

That is, how does TA interact with vector addition, and how does it interact with scalar multiplication?
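Whatever patterns you conjecture, they can be tested numerically before proving anything; this NumPy sketch checks the two natural candidates (additivity and homogeneity) on randomly chosen data, which of course suggests rather than proves them:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))   # an arbitrary 3x4 matrix
u = rng.standard_normal(4)
v = rng.standard_normal(4)
k = 2.5

# Additivity: does T_A(u + v) equal T_A(u) + T_A(v)?
additive = np.allclose(A @ (u + v), A @ u + A @ v)

# Homogeneity: does T_A(k u) equal k T_A(u)?
homogeneous = np.allclose(A @ (k * u), k * (A @ u))
```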

(b)

Which of the patterns from Task a can be deduced from others of the patterns?

Based on this, which of these patterns should be designated as the basic axioms of vector space morphisms?

A function T:V→W between abstract vector spaces V,W that satisfies the axioms we have identified in Discovery 42.2.b will be called a linear transformation (or a vector space homomorphism).

Discovery 42.3.

In each of the following, determine whether the provided vector space function is a linear transformation.

(a)

Left-multiplication by m×n matrix A:

LA : Mn×ℓ(R) → Mm×ℓ(R) by LA(X) = AX.

(b)

Right-multiplication by m×n matrix A:

RA : Mℓ×m(R) → Mℓ×n(R) by RA(X) = XA.
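Both of these can be probed numerically first; this sketch checks additivity and homogeneity of left-multiplication L_A on random matrices (right-multiplication R_A can be checked the same way, with the shapes transposed):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((2, 3))            # fixed m x n matrix (m=2, n=3)
X = rng.standard_normal((3, 4))            # n x l inputs (l=4)
Y = rng.standard_normal((3, 4))
k = -1.5

def L_A(X):
    """Left-multiplication L_A(X) = A X."""
    return A @ X

# Does L_A respect matrix addition and scalar multiplication?
left_additive = np.allclose(L_A(X + Y), L_A(X) + L_A(Y))
left_homog = np.allclose(L_A(k * X), k * L_A(X))
```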

(c)

Translation by a fixed nonzero vector a in vector space V:

ta:V→V by ta(v)=v+a.
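If you suspect translation fails to be linear, a single numeric counterexample settles it; this sketch (with arbitrary choices of a, u, v in R^2) shows additivity failing, and also that t_a does not send the zero vector to the zero vector:

```python
import numpy as np

a = np.array([1.0, 1.0])   # a fixed nonzero translation vector

def t_a(v):
    """Translation t_a(v) = v + a."""
    return v + a

u = np.array([1.0, 0.0])
v = np.array([0.0, 2.0])

# t_a(u + v) and t_a(u) + t_a(v) differ (by a copy of a).
fails_additivity = not np.allclose(t_a(u + v), t_a(u) + t_a(v))

# A linear map must fix the zero vector, but t_a(0) = a != 0.
moves_zero = not np.allclose(t_a(np.zeros(2)), np.zeros(2))
```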

(d)

Multiplication by a fixed scalar a in vector space V:

ma:V→V by ma(v)=av.

(e)

Evaluation of polynomials at fixed x-value x=a:

Ea : P(R) → R1 by Ea(p) = p(a).

(f)

Determinant of square matrices: det : Mn(R) → R1.
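Again a single counterexample is decisive; here det(X + Y) and det(X) + det(Y) already disagree for X = Y = I in M2(R):

```python
import numpy as np

# det is not additive: det(X + Y) != det(X) + det(Y) in general.
X = np.eye(2)
Y = np.eye(2)

lhs = np.linalg.det(X + Y)                 # det(2I) = 4 for 2x2
rhs = np.linalg.det(X) + np.linalg.det(Y)  # 1 + 1 = 2

not_additive = not np.isclose(lhs, rhs)
```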

(g)

Differentiation: let F(a,b) represent the space of functions defined on the interval a<x<b, and let D(a,b) represent the subspace of F(a,b) consisting of differentiable functions.

Consider d/dx : D(a,b) → F(a,b) by d/dx(f) = f′.

(h)

Integration: let C[a,b] represent the space of continuous functions defined on the interval a ≤ x ≤ b.

Consider Ia,b : C[a,b] → R1 by Ia,b(f) = ∫_a^b f(x) dx.

Discovery 42.4.

Suppose V is a finite-dimensional vector space with

V=Span{v1,v2,v3},

and T:V→R2 is a linear transformation such that

T(v1) = (1,2), T(v2) = (3,−5), T(v3) = (0,4).
(a)

Based on this information, can you determine T(3v1 − v2 + 5v3)?
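Once you have worked Task a by hand, the linearity computation T(3v1 − v2 + 5v3) = 3T(v1) − T(v2) + 5T(v3) can be checked numerically using only the given outputs:

```python
import numpy as np

# The given outputs of T (from the task).
Tv1 = np.array([1.0, 2.0])
Tv2 = np.array([3.0, -5.0])
Tv3 = np.array([0.0, 4.0])

# Assuming T is linear, the output for the combination 3 v1 - v2 + 5 v3
# is the same combination of the outputs.
result = 3 * Tv1 - Tv2 + 5 * Tv3
```

The same pattern handles any linear combination of v1, v2, v3, which is the point of Tasks b and c.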

(b)

Would you be able to answer Task a for other linear combinations of v1,v2,v3?

(c)

Describe the pattern: in order to be able to compute every output of a linear transformation, the only output information required is __________.

Discovery 42.5.

Suppose T:R3→R3 is a linear transformation such that

T(e_1) = \begin{bmatrix} 2 \\ -3 \\ 5 \end{bmatrix}, \quad T(e_2) = \begin{bmatrix} -7 \\ 11 \\ 13 \end{bmatrix}, \quad T(e_3) = \begin{bmatrix} 17 \\ 19 \\ -23 \end{bmatrix}.
(a)

Do you know any other linear transformation R3β†’R3 that has the same outputs for the standard basis vectors as inputs?

Hint

Look back at Discovery 42.1.c.

(c)

Describe the pattern: every linear transformation Rn→Rm is effectively __________.

Discovery 42.6.
(a)

Suppose T:Rn→R1 is a linear transformation. What size of matrix would represent this linear transformation?

What word would we normally use to describe a matrix of those dimensions, instead of β€œmatrix”?

(b)

Describe the pattern: every linear transformation Rn→R1 corresponds to a __________.
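The pattern can be made concrete in NumPy: a 1×n matrix is a single row, and applying it to x in R^n amounts to a dot product (the particular entries here are arbitrary examples):

```python
import numpy as np

# A linear transformation R^3 -> R^1 is represented by a 1x3 matrix,
# i.e. a row; applying it is a dot product.
row = np.array([[2.0, -1.0, 4.0]])   # arbitrary 1x3 example
x = np.array([1.0, 2.0, 3.0])

w = row @ x   # shape (1,): a single number

# The same number, computed as a dot product of vectors.
same = np.isclose(w[0], np.dot(row[0], x))
```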

Discovery 42.7.

For vector spaces V,W, let L(V,W) represent the collection of all linear transformations V→W.

(a)

How could transformations in L(V,W) be added?

That is, if T1,T2:V→W are objects in L(V,W), what transformation should T1+T2 represent?

(T1+T2)(v)=_

Is the sum transformation T1+T2 still in L(V,W)? (i.e. Is it still linear?)

(b)

How could transformations in L(V,W) be scalar multiplied?

That is, if T:V→W is an object in L(V,W), what transformation should kT represent for scalar k?

(kT)(v)=_

Is the scaled transformation kT still in L(V,W)? (i.e. Is it still linear?)

(c)

Is L(V,W) a vector space under the operations of addition and scalar multiplication of linear transformations?

That is, do your operations satisfy the ten vector space axioms?
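To see what the proposed operations look like in the model case, take V = R^3 and W = R^2 with matrix transformations; this sketch defines T1 + T2 and kT1 pointwise and checks that each is again a matrix transformation (the matrices A1, A2 are arbitrary examples, not from the text):

```python
import numpy as np

# Model case: V = R^3, W = R^2, with linear transformations given by matrices.
A1 = np.array([[1.0, 0.0, 2.0],
               [0.0, 1.0, -1.0]])
A2 = np.array([[0.0, 3.0, 0.0],
               [1.0, 1.0, 1.0]])

def T1(v): return A1 @ v
def T2(v): return A2 @ v

# Pointwise operations on transformations:
def T_sum(v): return T1(v) + T2(v)   # (T1 + T2)(v) = T1(v) + T2(v)
k = 2.0
def T_scaled(v): return k * T1(v)    # (k T1)(v) = k T1(v)

# Each result is again a matrix transformation, with matrices A1 + A2 and k A1.
v = np.array([1.0, 2.0, 3.0])
sum_is_matrix = np.allclose(T_sum(v), (A1 + A2) @ v)
scaled_is_matrix = np.allclose(T_scaled(v), (k * A1) @ v)
```

That the sum and scalar multiple correspond to the matrices A1 + A2 and k A1 is one hint toward Task c.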