Section 42.5 Theory
Subsection 42.5.1 Properties of linear transformations
We begin by demonstrating that we chose the correct two axioms in Discovery 42.2 for our definition of linear transformation. We leave the verification of these properties to you, the reader.
Proposition 42.5.1. Properties of linear transformations.
A linear transformation \(\funcdef{T}{V}{W}\) satisfies the following additional linearity properties.
- The image of zero is zero. That is, \(T(\zerovec_V) = \zerovec_W\text{.}\)
- The image of a negative is the negative of the image. That is, \(T(-\uvec{v}) = -T(\uvec{v})\) for all \(\uvec{v}\) in \(V\text{.}\)
- The image of a linear combination is a linear combination. That is,
\begin{equation*} T(a_1 \uvec{v}_1 + a_2 \uvec{v}_2 + \dotsb + a_m \uvec{v}_m) = a_1 T(\uvec{v}_1) + a_2 T(\uvec{v}_2) + \dotsb + a_m T(\uvec{v}_m) \end{equation*}
for all scalars \(a_1,a_2,\dotsc,a_m\) and all vectors \(\uvec{v}_1,\uvec{v}_2,\dotsc,\uvec{v}_m\) in \(V\text{.}\)
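As a concrete illustration of the third property, if \(T(\uvec{v}_1) = \uvec{w}_1\) and \(T(\uvec{v}_2) = \uvec{w}_2\text{,}\) then
\begin{equation*} T(3 \uvec{v}_1 - 2 \uvec{v}_2) = 3 T(\uvec{v}_1) - 2 T(\uvec{v}_2) = 3 \uvec{w}_1 - 2 \uvec{w}_2 \text{,} \end{equation*}
so the images of \(\uvec{v}_1,\uvec{v}_2\) already determine the image of every linear combination of them.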
Theorem 42.5.2. Uniqueness of vector images for a domain spanning set.
Suppose \(\funcdef{T}{V}{W}\) is a linear transformation and \(\{\uvec{v}_1,\dotsc,\uvec{v}_m\}\) is a spanning set for the domain space \(V\text{.}\) Then \(T\) is uniquely determined by the image vectors \(T(\uvec{v}_1),\dotsc,T(\uvec{v}_m)\text{.}\)
In other words, \(T\) is the one unique linear transformation \(V \to W\) with these spanning set image vectors, and every other image vector for \(T\) can be determined from them.
Proof.
Write \(\uvec{w}_j\) for the \(\nth[j]\) spanning set image vector \(T(\uvec{v}_j)\text{.}\) Since each vector \(\uvec{v}\) in the domain space \(V\) can be expressed as a linear combination
\begin{equation*} \uvec{v} = k_1 \uvec{v}_1 + k_2 \uvec{v}_2 + \dotsb + k_m \uvec{v}_m \text{,} \end{equation*}
we can use the linearity of \(T\) to compute
\begin{align*} T(\uvec{v}) & = T(k_1 \uvec{v}_1 + k_2 \uvec{v}_2 + \dotsb + k_m \uvec{v}_m) \\ & = k_1 T(\uvec{v}_1) + k_2 T(\uvec{v}_2) + \dotsb + k_m T(\uvec{v}_m) \\ & = k_1 \uvec{w}_1 + k_2 \uvec{w}_2 + \dotsb + k_m \uvec{w}_m \text{.} \end{align*}
This demonstrates that \(T\) is completely determined by the image vectors \(\uvec{w}_1,\dotsc,\uvec{w}_m\text{.}\)
Now assume \(\funcdef{T'}{V}{W}\) is another linear transformation with \(T'(\uvec{v}_j) = \uvec{w}_j\text{.}\) But then for
\begin{equation*} \uvec{v} = k_1 \uvec{v}_1 + k_2 \uvec{v}_2 + \dotsb + k_m \uvec{v}_m \text{,} \end{equation*}
the linearity of \(T'\) will lead to
\begin{align*} T'(\uvec{v}) & = k_1 T'(\uvec{v}_1) + k_2 T'(\uvec{v}_2) + \dotsb + k_m T'(\uvec{v}_m) \\ & = k_1 \uvec{w}_1 + k_2 \uvec{w}_2 + \dotsb + k_m \uvec{w}_m \\ & = T(\uvec{v}) \text{.} \end{align*}
Since \(T,T'\) agree on all input domain vectors, they must be the same linear transformation.
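For example, suppose \(\funcdef{T}{\R^2}{\R^2}\) is linear with \(T(1,0) = (1,2)\) and \(T(1,1) = (0,3)\text{.}\) Since \(\{(1,0),(1,1)\}\) spans \(\R^2\text{,}\) these two image vectors determine all of \(T\text{:}\) from \((x,y) = (x-y)(1,0) + y(1,1)\) we get
\begin{equation*} T(x,y) = (x-y) T(1,0) + y T(1,1) = (x-y)(1,2) + y(0,3) = (x-y, 2x+y) \text{.} \end{equation*}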
Corollary 42.5.3. Creating a transformation from a domain space basis.
Given a basis \(\basisfont{B} = \{\uvec{v}_1,\dotsc,\uvec{v}_n\}\) of vector space \(V\text{,}\) and arbitrary vectors \(\uvec{w}_1,\dotsc,\uvec{w}_n\) in vector space \(W\) (with duplicates permitted), there exists one unique linear transformation \(\funcdef{T}{V}{W}\) satisfying \(T(\uvec{v}_j) = \uvec{w}_j\) for each index \(j\text{.}\)
Proof outline.
If we can show that there exists at least one such linear transformation \(\funcdef{T}{V}{W}\text{,}\) then Theorem 42.5.2 will guarantee that it is the only one.
We want \(T\) to both be linear and to satisfy \(T(\uvec{v}_j) = \uvec{w}_j\) for each index \(j\text{.}\) Given that each vector \(\uvec{v}\) in the domain space \(V\) has a unique expansion
\begin{equation*} \uvec{v} = k_1 \uvec{v}_1 + k_2 \uvec{v}_2 + \dotsb + k_n \uvec{v}_n \end{equation*}
in terms of the basis vectors in \(\basisfont{B}\) (Theorem 19.5.3), there is no ambiguity in defining \(T\) relative to such expansions:
\begin{equation*} T(\uvec{v}) = k_1 \uvec{w}_1 + k_2 \uvec{w}_2 + \dotsb + k_n \uvec{w}_n \text{.} \end{equation*}
Then \(T\) will be linear (check!), and since each basis vector is expanded as \(\uvec{v}_j = 1 \uvec{v}_j\text{,}\) with zero coefficients on each of the other basis vectors, we will have
\begin{equation*} T(\uvec{v}_j) = 0 \uvec{w}_1 + \dotsb + 0 \uvec{w}_{j-1} + 1 \uvec{w}_j + 0 \uvec{w}_{j+1} + \dotsb + 0 \uvec{w}_n = \uvec{w}_j \text{,} \end{equation*}
as desired.
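To illustrate, take \(V\) to be the space of polynomials of degree at most \(1\text{,}\) with basis \(\basisfont{B} = \{1, x\}\text{,}\) and take \(W = \R^2\text{.}\) Choosing image vectors \(\uvec{w}_1 = (1,1)\) for basis vector \(1\) and \(\uvec{w}_2 = (0,1)\) for basis vector \(x\) produces the one unique linear transformation
\begin{equation*} T(a + bx) = a(1,1) + b(0,1) = (a, a+b) \text{.} \end{equation*}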
Corollary 42.5.4. Transformations \(\R^n \to \R^m\) or \(\C^n \to \C^m\) are matrix transformations.
- If \(\funcdef{T}{\R^n}{\R^m}\) is a linear transformation, then there exists a unique \(m \times n\) real matrix \(A\) so that \(T = T_A\text{,}\) where \(\funcdef{T_A}{\R^n}{\R^m}\) is defined by \(T_A(\uvec{x}) = A \uvec{x}\text{,}\) as usual.
- If \(\funcdef{T}{\C^n}{\C^m}\) is a linear transformation, then there exists a unique \(m \times n\) complex matrix \(A\) so that \(T = T_A\text{,}\) where \(\funcdef{T_A}{\C^n}{\C^m}\) is defined by \(T_A(\uvec{x}) = A \uvec{x}\text{,}\) as usual.
Proof idea.
The proof is identical for both the real and complex versions of the statement. By Theorem 42.5.2, the transformation \(T\) is uniquely determined by the output vectors
\begin{equation*} T(\uvec{e}_1), \dotsc, T(\uvec{e}_n) \text{,} \end{equation*}
where \(\basisfont{S} = \{\uvec{e}_1,\dotsc,\uvec{e}_n\}\) is the standard basis for \(\R^n\) or \(\C^n\text{,}\) as appropriate. But if \(A\) is the matrix whose columns are these output vectors, then since
\begin{equation*} T_A(\uvec{e}_j) = A \uvec{e}_j \end{equation*}
is equal to the \(\nth[j]\) column of \(A\text{,}\) the transformations \(T\) and \(T_A\) agree on each of the vectors in basis \(\basisfont{S}\text{.}\) From this we can conclude that \(T\) and \(T_A\) are in fact the same transformation.
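For example, if \(\funcdef{T}{\R^2}{\R^3}\) is defined by \(T(x,y) = (x+y, x-y, 2x)\text{,}\) then the standard basis image vectors are \(T(\uvec{e}_1) = (1,1,2)\) and \(T(\uvec{e}_2) = (1,-1,0)\text{,}\) so \(T = T_A\) for
\begin{equation*} A = \begin{bmatrix} 1 & 1 \\ 1 & -1 \\ 2 & 0 \end{bmatrix} \text{.} \end{equation*}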
Subsection 42.5.2 Spaces of linear transformations
As in Discovery 42.7 and Subsection 42.3.6, the collection of all linear transformations from one vector space to another is itself a vector space.
Theorem 42.5.5. Transformations form a vector space.
For vector spaces \(V, W\text{,}\) the collection \(L(V,W)\) of all linear transformations \(V \to W\) is a vector space under the operations
\begin{align*} (T_1 + T_2)(\uvec{v}) & = T_1(\uvec{v}) + T_2(\uvec{v}) \text{,} & (k T)(\uvec{v}) & = k \, T(\uvec{v}) \text{.} \end{align*}
Proof.
We will verify Axiom A 1, and leave the other nine axioms to you, the reader.
If \(\funcdef{T_1,T_2}{V}{W}\) are linear, is \(\funcdef{T_1 + T_2}{V}{W}\) linear? First check additivity:
\begin{align*} (T_1 + T_2)(\uvec{u} + \uvec{v}) & = T_1(\uvec{u} + \uvec{v}) + T_2(\uvec{u} + \uvec{v}) \\ & = T_1(\uvec{u}) + T_1(\uvec{v}) + T_2(\uvec{u}) + T_2(\uvec{v}) \\ & = \bigl( T_1(\uvec{u}) + T_2(\uvec{u}) \bigr) + \bigl( T_1(\uvec{v}) + T_2(\uvec{v}) \bigr) \\ & = (T_1 + T_2)(\uvec{u}) + (T_1 + T_2)(\uvec{v}) \text{,} \end{align*}
with justifications
- definition of addition of \(T_1,T_2\text{;}\)
- additivity of \(T_1,T_2\text{;}\)
- vector algebra in the codomain space \(W\text{;}\) and
- definition of addition of \(T_1,T_2\text{.}\)
Now check homogeneity:
\begin{align*} (T_1 + T_2)(k \uvec{v}) & = T_1(k \uvec{v}) + T_2(k \uvec{v}) \\ & = k \, T_1(\uvec{v}) + k \, T_2(\uvec{v}) \\ & = k \bigl( T_1(\uvec{v}) + T_2(\uvec{v}) \bigr) \\ & = k \, (T_1 + T_2)(\uvec{v}) \text{,} \end{align*}
with justifications
- definition of addition of \(T_1,T_2\text{;}\)
- homogeneity of \(T_1,T_2\text{;}\)
- vector algebra in the codomain space \(W\text{;}\) and
- definition of addition of \(T_1,T_2\text{.}\)
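For a simple example of this vector algebra of transformations, if \(\funcdef{T_1,T_2}{\R^2}{\R^2}\) are the linear projections \(T_1(x,y) = (x,0)\) and \(T_2(x,y) = (0,y)\text{,}\) then
\begin{equation*} (T_1 + T_2)(x,y) = (x,0) + (0,y) = (x,y) \text{,} \end{equation*}
so that \(T_1 + T_2\) is the identity transformation on \(\R^2\text{.}\)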
Corollary 42.5.6. Dual spaces are vector spaces.
- For every real vector space \(V\text{,}\) the dual space \(\vecdual{V} = L(V,\R)\) is a real vector space.
- For every complex vector space \(V\text{,}\) the dual space \(\vecdual{V} = L(V,\C)\) is a complex vector space.
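For example, by Corollary 42.5.4 (with \(m = 1\)), every linear functional on \(\R^3\) has the form
\begin{equation*} f(x,y,z) = a x + b y + c z \end{equation*}
for a unique choice of scalars \(a,b,c\text{,}\) and the dual space of \(\R^3\) consists of precisely these functionals.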
Theorem 42.5.7.
Suppose \(V\) is a finite-dimensional vector space. For every basis
\begin{equation*} \basisfont{B} = \{ \uvec{e}_1, \uvec{e}_2, \dotsc, \uvec{e}_n \} \end{equation*}
for \(V\text{,}\) we can define a dual basis
\begin{equation*} \vecdual{\basisfont{B}} = \{ \vecdual{\uvec{e}}_1, \vecdual{\uvec{e}}_2, \dotsc, \vecdual{\uvec{e}}_n \} \end{equation*}
for \(\vecdual{V}\) as follows.
For each index \(i\text{,}\) take \(\funcdef{\vecdual{\uvec{e}}_i}{V}{\R}\) (or \(\funcdef{\vecdual{\uvec{e}}_i}{V}{\C}\text{,}\) as appropriate) to be the unique linear transformation such that
\begin{equation*} \vecdual{\uvec{e}}_i(\uvec{e}_j) = \begin{cases} 1 \text{,} & j = i \text{,} \\ 0 \text{,} & j \neq i \text{,} \end{cases} \end{equation*}
for each index \(j\text{.}\)
Proof.
First, note that the existence and uniqueness of each \(\vecdual{\uvec{e}}_i\) linear functional is guaranteed by Corollary 42.5.3. So we just need to establish that these special linear functionals form a basis of \(\vecdual{V}\text{.}\)
Before we do that, it will help to establish a pattern of evaluating these special functionals. As \(\vecdual{V}\) is a vector space, a linear combination of linear functionals is also a linear functional on \(V\text{,}\) and so can be evaluated as a function on vectors in \(V\text{.}\) In particular, consider a linear combination
\begin{equation*} k_1 \vecdual{\uvec{e}}_1 + k_2 \vecdual{\uvec{e}}_2 + \dotsb + k_n \vecdual{\uvec{e}}_n \end{equation*}
evaluated at \(\uvec{e}_1\text{.}\) Using the definition of the \(\vecdual{\uvec{e}}_i\) functionals in the statement of the theorem, we calculate:
\begin{align*} (k_1 \vecdual{\uvec{e}}_1 + k_2 \vecdual{\uvec{e}}_2 + \dotsb + k_n \vecdual{\uvec{e}}_n)(\uvec{e}_1) & = k_1 \vecdual{\uvec{e}}_1(\uvec{e}_1) + k_2 \vecdual{\uvec{e}}_2(\uvec{e}_1) + \dotsb + k_n \vecdual{\uvec{e}}_n(\uvec{e}_1) \\ & = k_1 \cdot 1 + k_2 \cdot 0 + \dotsb + k_n \cdot 0 \\ & = k_1 \text{.} \end{align*}
Similarly,
\begin{equation*} (k_1 \vecdual{\uvec{e}}_1 + k_2 \vecdual{\uvec{e}}_2 + \dotsb + k_n \vecdual{\uvec{e}}_n)(\uvec{e}_j) = k_j \tag{\(\star\)} \end{equation*}
for each index \(j\text{.}\)
Linear independence.
To apply the Test for Linear Dependence/Independence, we begin with a homogeneous vector equation
\begin{equation*} k_1 \vecdual{\uvec{e}}_1 + k_2 \vecdual{\uvec{e}}_2 + \dotsb + k_n \vecdual{\uvec{e}}_n = \zerovec \text{.} \tag{\(\star\star\)} \end{equation*}
The zero vector on the right is the zero linear functional, which evaluates to zero at all vectors in \(V\text{.}\) So evaluating both sides of (\(\star\star\)) at vector \(\uvec{e}_j\text{,}\) and using (\(\star\)) to simplify on the left, we get \(k_j = 0\) for each index \(j\text{.}\) This means that vector equation (\(\star\star\)) has only the trivial solution, so that these βdualβ linear functionals are linearly independent.
Spans.
We must show that every linear functional in \(\vecdual{V}\) is a linear combination of the special \(\vecdual{\uvec{e}}_i\) functionals. So suppose that \(f\) is an arbitrary functional on \(V\text{.}\) Since \(\basisfont{B}\) is a basis for \(V\text{,}\) Theorem 42.5.2 says that \(f\) is uniquely determined by the values
\begin{equation*} f(\uvec{e}_1), f(\uvec{e}_2), \dotsc, f(\uvec{e}_n) \text{.} \end{equation*}
Let \(a_1,a_2,\dotsc,a_n\) represent these values, and let \(g\) be the linear functional
\begin{equation*} g = a_1 \vecdual{\uvec{e}}_1 + a_2 \vecdual{\uvec{e}}_2 + \dotsb + a_n \vecdual{\uvec{e}}_n \text{.} \end{equation*}
The pattern of (\(\star\)) says that
\begin{equation*} g(\uvec{e}_j) = a_j = f(\uvec{e}_j) \end{equation*}
for each index \(j\text{.}\)
But since \(f\) is uniquely determined by the \(a_j\) values, we can conclude that \(f = g\text{,}\) a linear combination of the \(\vecdual{\uvec{e}}_i\) functionals.
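For example, for the standard basis \(\{\uvec{e}_1,\uvec{e}_2\}\) of \(\R^2\text{,}\) the dual basis functionals are the coordinate functions \(\vecdual{\uvec{e}}_1(x,y) = x\) and \(\vecdual{\uvec{e}}_2(x,y) = y\text{,}\) and a functional such as \(f(x,y) = 2x - 3y\) decomposes as
\begin{equation*} f = 2 \vecdual{\uvec{e}}_1 - 3 \vecdual{\uvec{e}}_2 \text{,} \end{equation*}
with coefficients \(f(\uvec{e}_1) = 2\) and \(f(\uvec{e}_2) = -3\text{,}\) just as in the proof.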
Corollary 42.5.8.
For finite-dimensional \(V\text{,}\) we have \(\dim \vecdual{V} = \dim V\text{.}\)
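For example, the dual space of \(\R^n\) has dimension \(n\text{:}\) by Corollary 42.5.4, each linear functional \(\funcdef{f}{\R^n}{\R}\) is multiplication by a unique \(1 \times n\) matrix, and the space of \(1 \times n\) matrices has dimension \(n\text{.}\)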