Appendix
1 Linear Algebra
1.1 $\mathbb{R}^n$ as a vector space
As a set, $\mathbb{R}^n$ is the collection of all $n$-tuples $(x_1, \dots, x_n)$ with $x_k \in \mathbb{R}$. Equivalently, $\mathbb{R}^n$ can be defined as the Cartesian product of $n$ copies of $\mathbb{R}$, that is, $\mathbb{R}^n = \mathbb{R} \times \cdots \times \mathbb{R}$. A tuple $u = (x_1, \dots, x_n)$ has a double life: as a point in Cartesian space with coordinates $x_k$, and as a vector $\vec{u}$ with tail at the origin and head at $u$. For this reason, $\mathbb{R}^n$ can be considered as a set of points or as a set of vectors.
[Figure: the point $u$ with coordinates $x_1, x_2, \dots, x_n$, and the same data viewed as the vector $\vec{u}$ from the origin to $u$.]
We need two more structures on $\mathbb{R}^n$ to make it a vector space, i.e., the scalar multiplication and the vector addition. Let $c \in \mathbb{R}$ be a scalar, and let $\vec{u} = (x_1, \dots, x_n)$, $\vec{v} = (y_1, \dots, y_n)$ be two arbitrary vectors. We define
\[ c\vec{u} = (cx_1, \dots, cx_n), \qquad \vec{u} + \vec{v} = (x_1 + y_1, \dots, x_n + y_n). \]
The geometry of these two operations is shown in Fig. 1.
Figure 1. [Scalar multiples $c\vec{u}$ for $c > 0$ and $c < 0$, and the sum $\vec{u} + \vec{v}$ and difference $\vec{u} - \vec{v}$ of two vectors.]
It is seen that $\mathbb{R}^n$ is closed under the defined operations, that is, for any two vectors $\vec{u}, \vec{v}$ in $\mathbb{R}^n$ and any $c_1, c_2 \in \mathbb{R}$, the vector $c_1\vec{u} + c_2\vec{v}$ is in $\mathbb{R}^n$. Moreover, the following properties hold for any vectors $\vec{u}, \vec{v}, \vec{w} \in \mathbb{R}^n$ and constant $c \in \mathbb{R}$:
i. $\vec{u} + \vec{v} = \vec{v} + \vec{u}$
ii. $c(\vec{u} + \vec{v}) = c\vec{u} + c\vec{v}$
iii. $(c_1 + c_2)\vec{u} = c_1\vec{u} + c_2\vec{u}$
iv. $(\vec{u} + \vec{v}) + \vec{w} = \vec{u} + (\vec{v} + \vec{w})$
$\mathbb{R}^n$ has $n$ standard unit vectors $\hat{e}_1, \dots, \hat{e}_n$ defined below:
\[ \hat{e}_1 = \begin{pmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{pmatrix}, \quad \hat{e}_2 = \begin{pmatrix} 0 \\ 1 \\ \vdots \\ 0 \end{pmatrix}, \quad \dots, \quad \hat{e}_n = \begin{pmatrix} 0 \\ 0 \\ \vdots \\ 1 \end{pmatrix}. \]
The set $\{\hat{e}_k\}_{k=1}^{n}$ is a basis for $\mathbb{R}^n$ in the sense that any vector $\vec{u} = (x_1, \dots, x_n)$ can be uniquely represented as
\[ \vec{u} = x_1 \hat{e}_1 + \cdots + x_n \hat{e}_n. \]
Remark. The magnitude or norm of a vector $\vec{u} = (x_1, \dots, x_n)$ is defined as follows:
\[ \|\vec{u}\| = \sqrt{\sum_{k=1}^{n} x_k^2}. \]
A vector is called a unit vector, or a direction vector, if its norm is equal to 1. We usually denote a unit vector by the hat notation $\hat{\ }$.
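For instance, for $\vec{u} = (3, 4)$ in $\mathbb{R}^2$ (a vector chosen here only for illustration),
\[ \|\vec{u}\| = \sqrt{3^2 + 4^2} = 5, \qquad \hat{u} = \frac{\vec{u}}{\|\vec{u}\|} = \left(\tfrac{3}{5}, \tfrac{4}{5}\right), \]
and $\hat{u}$ is the unit vector pointing in the direction of $\vec{u}$.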
1.2 Dot and Cross products
The dot product of two arbitrary vectors $\vec{u} = (x_1, \dots, x_n)$, $\vec{v} = (y_1, \dots, y_n)$ is defined as
\[ \vec{u} \cdot \vec{v} = \sum_{k=1}^{n} x_k y_k. \]
In particular, $\hat{e}_i \cdot \hat{e}_j = \delta_{i,j}$, where $\delta_{i,j} = \begin{cases} 1 & i = j \\ 0 & i \neq j \end{cases}$.
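For example, for $\vec{u} = (1, 2, 3)$ and $\vec{v} = (4, -1, 2)$ in $\mathbb{R}^3$ (values chosen only for illustration),
\[ \vec{u} \cdot \vec{v} = (1)(4) + (2)(-1) + (3)(2) = 8. \]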
Problem 1. Show that the dot product enjoys the following properties for any vectors $\vec{u}, \vec{v}, \vec{w}$ and for an arbitrary constant $c$:
i. $(\vec{u} + \vec{v}) \cdot \vec{w} = \vec{u} \cdot \vec{w} + \vec{v} \cdot \vec{w}$,
ii. $\vec{u} \cdot \vec{v} = \vec{v} \cdot \vec{u}$,
iii. $(c\vec{u}) \cdot \vec{v} = c\,(\vec{u} \cdot \vec{v})$.
Problem 2. Show the following relations:
a) $\|\vec{u}\|^2 = \vec{u} \cdot \vec{u}$
b) $\|\vec{u} + \vec{v}\| \leq \|\vec{u}\| + \|\vec{v}\|$
c) $\|\vec{u} + \vec{v}\|^2 - \|\vec{u} - \vec{v}\|^2 = 4\, \vec{u} \cdot \vec{v}$
Problem 3. The Cauchy inequality is as follows:
\[ |\vec{u} \cdot \vec{v}| \leq \|\vec{u}\| \, \|\vec{v}\|. \]
Try to prove the inequality by the following method: $\|\vec{u} + t\vec{v}\|^2 \geq 0$ for all $t \in \mathbb{R}$. Expand $\|\vec{u} + t\vec{v}\|^2$ in terms of $t$ and conclude the inequality. By this inequality, one can define the angle between two nonzero vectors $\vec{u}, \vec{v}$ as follows:
\[ \cos(\theta) = \frac{\vec{u} \cdot \vec{v}}{\|\vec{u}\| \, \|\vec{v}\|}. \]
By the above equality, one can write $\vec{u} \cdot \vec{v} = \|\vec{u}\| \, \|\vec{v}\| \cos(\theta)$. Note that if $\vec{u} \cdot \vec{v} = 0$ for nonzero vectors $\vec{u}, \vec{v}$, then $\cos(\theta) = 0$.
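For illustration, take the nonzero vectors $\vec{u} = (1, 0)$ and $\vec{v} = (1, 1)$ in $\mathbb{R}^2$. Then
\[ \cos(\theta) = \frac{\vec{u} \cdot \vec{v}}{\|\vec{u}\| \, \|\vec{v}\|} = \frac{1}{1 \cdot \sqrt{2}} = \frac{1}{\sqrt{2}}, \]
so the angle between them is $\theta = \pi/4$.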
Problem 4. Show that if $\vec{u}_1, \dots, \vec{u}_m$ are mutually orthogonal, that is, if $\vec{u}_i \cdot \vec{u}_j = 0$ for $i \neq j$, then
\[ \|\vec{u}_1 + \cdots + \vec{u}_m\|^2 = \|\vec{u}_1\|^2 + \cdots + \|\vec{u}_m\|^2. \]
There is a standard product in $\mathbb{R}^3$ called the cross or external product. For $\vec{u} = (x_1, y_1, z_1)$, $\vec{v} = (x_2, y_2, z_2)$, the cross product is defined as follows:
\[ \vec{u} \times \vec{v} = \begin{vmatrix} \hat{e}_1 & \hat{e}_2 & \hat{e}_3 \\ x_1 & y_1 & z_1 \\ x_2 & y_2 & z_2 \end{vmatrix} = (y_1 z_2 - z_1 y_2)\,\hat{e}_1 + (z_1 x_2 - x_1 z_2)\,\hat{e}_2 + (x_1 y_2 - y_1 x_2)\,\hat{e}_3. \]
Note that $\vec{u} \times \vec{v}$ is a vector, while the dot product $\vec{u} \cdot \vec{v}$ is a scalar.
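For instance, for $\vec{u} = (1, 2, 3)$ and $\vec{v} = (4, 5, 6)$ (values chosen only for illustration), the formula gives
\[ \vec{u} \times \vec{v} = (2 \cdot 6 - 3 \cdot 5, \ 3 \cdot 4 - 1 \cdot 6, \ 1 \cdot 5 - 2 \cdot 4) = (-3, 6, -3), \]
and one checks directly that $(-3, 6, -3)$ is perpendicular to both $\vec{u}$ and $\vec{v}$.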
Problem 5. Show the relation $\vec{u} \times \vec{v} = -\vec{v} \times \vec{u}$.
Problem 6. Show that $\vec{u} \times \vec{v}$ is perpendicular to $\vec{u}$ and $\vec{v}$, that is,
\[ (\vec{u} \times \vec{v}) \cdot \vec{u} = (\vec{u} \times \vec{v}) \cdot \vec{v} = 0. \]
Problem 7. Show the identity
\[ \|\vec{u} \times \vec{v}\|^2 = \|\vec{u}\|^2 \, \|\vec{v}\|^2 - |\vec{u} \cdot \vec{v}|^2, \]
and conclude the relation
\[ \|\vec{u} \times \vec{v}\| = \|\vec{u}\| \, \|\vec{v}\| \sin(\theta), \]
where $\theta$ is the angle between $\vec{u}, \vec{v}$ in $[0, \pi]$.
By the above problem, we can write
\[ \vec{u} \times \vec{v} = \|\vec{u}\| \, \|\vec{v}\| \sin(\theta)\, \hat{n}, \]
where $\hat{n}$ is the unit vector perpendicular to the plane containing $\vec{u}, \vec{v}$, that is, $\hat{n} = \dfrac{\vec{u} \times \vec{v}}{\|\vec{u} \times \vec{v}\|}$.
[Figure: $\vec{u} \times \vec{v}$ and $\vec{v} \times \vec{u}$ point in opposite directions perpendicular to the plane of $\vec{u}, \vec{v}$; the parallelogram spanned by $\vec{u}, \vec{v}$ has area $A = \|\vec{u} \times \vec{v}\|$.]
Problem 8. Show the following relation for any three vectors $\vec{u}, \vec{v}, \vec{w}$:
\[ \vec{u} \cdot (\vec{v} \times \vec{w}) = \vec{v} \cdot (\vec{w} \times \vec{u}). \]
Problem 9. If $\vec{v}, \vec{w}$ are orthogonal, show the following relation:
\[ \vec{u} \times (\vec{v} \times \vec{w}) = (\vec{u} \cdot \vec{w})\,\vec{v} - (\vec{u} \cdot \vec{v})\,\vec{w}. \]
Use this result to relax the condition that $\vec{v}, \vec{w}$ be orthogonal. Then use the formula to determine the conditions under which the following relation holds:
\[ \vec{u} \times (\vec{v} \times \vec{w}) = (\vec{u} \times \vec{v}) \times \vec{w}. \]
1.3 Subspaces and direct sum
Definition 1. Two vectors $\vec{u}, \vec{v}$ in $\mathbb{R}^n$ are called linearly dependent if there is a scalar $c$ such that $\vec{u} = c\vec{v}$ or $\vec{v} = c\vec{u}$. A vector $\vec{u}$ is linearly dependent on vectors $\vec{v}_1, \dots, \vec{v}_m$ if there are scalars $c_1, \dots, c_m$ such that
\[ \vec{u} = c_1 \vec{v}_1 + \cdots + c_m \vec{v}_m. \]
Vectors $\vec{v}_1, \dots, \vec{v}_m$ in $\mathbb{R}^n$ are linearly independent if the linear combination
\[ c_1 \vec{v}_1 + \cdots + c_m \vec{v}_m = 0 \]
implies $c_1 = \cdots = c_m = 0$.
Problem 10. The vectors $\hat{e}_1, \dots, \hat{e}_n$ are linearly independent in $\mathbb{R}^n$. Show that any $n + 1$ vectors of $\mathbb{R}^n$ are linearly dependent.
Let $\{\vec{v}_1, \dots, \vec{v}_d\}$ for $d \leq n$ be a set of linearly independent vectors in $\mathbb{R}^n$. The span of the vectors in the given set is the set of all possible linear combinations of $\vec{v}_1, \dots, \vec{v}_d$, i.e.,
\[ \operatorname{span}\{\vec{v}_1, \dots, \vec{v}_d\} = \{c_1 \vec{v}_1 + \cdots + c_d \vec{v}_d;\ c_k \in \mathbb{R}\}. \]
Note that $\mathbb{R}^n$ is itself equal to $\operatorname{span}\{\hat{e}_1, \dots, \hat{e}_n\}$.
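For example, in $\mathbb{R}^3$ the subspace $\operatorname{span}\{\hat{e}_1, \hat{e}_2\} = \{(c_1, c_2, 0);\ c_1, c_2 \in \mathbb{R}\}$ is the $x_1 x_2$-plane.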
Proposition 1. $V_d := \operatorname{span}\{\vec{v}_1, \dots, \vec{v}_d\}$ is closed under the vector addition and scalar multiplication of $\mathbb{R}^n$. For this reason, $V_d$ is called a linear subspace of $\mathbb{R}^n$.
Definition 2. Let $V$ be a linear subspace of $\mathbb{R}^n$. The dimension of $V$ is the maximum number of linearly independent vectors in $V$.
Example 1. Technically speaking, $\mathbb{R}^m$ is not a subspace of $\mathbb{R}^n$ for $m < n$; however, if we interpret $\mathbb{R}^m$ as $\operatorname{span}\{\hat{e}_1, \dots, \hat{e}_m\}$, where each $\hat{e}_j$ is a vector in $\mathbb{R}^n$, then $\mathbb{R}^m$ is a linear subspace of $\mathbb{R}^n$.
If $V$ is a linear subspace of $\mathbb{R}^n$, then its orthogonal subspace $V^{\perp}$ is defined as follows:
\[ V^{\perp} = \{\vec{w} \in \mathbb{R}^n;\ \vec{w} \cdot \vec{v} = 0 \text{ for all } \vec{v} \in V\}. \]
Obviously, $V^{\perp}$ is a linear subspace of $\mathbb{R}^n$ equipped with the vector addition and scalar multiplication operations.
Problem 11. Show that if $V = \operatorname{span}\{(1, 1, 0), (0, 1, 1)\}$, then $V^{\perp}$ is the one-dimensional subspace spanned by $(1, -1, 1)$.
Problem 12. Find the orthogonal subspace of $V = \operatorname{span}\{(1, 0, 1)\}$ in $\mathbb{R}^3$.
Suppose $U, V$ are two subspaces of $\mathbb{R}^n$ and $U \cap V = \{0\}$. The direct sum $U \oplus V$ is defined as follows:
\[ U \oplus V = \{c_1 \vec{u} + c_2 \vec{v};\ \vec{u} \in U,\ \vec{v} \in V,\ c_1, c_2 \in \mathbb{R}\}. \]
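For example, in $\mathbb{R}^3$, if $U = \operatorname{span}\{\hat{e}_1, \hat{e}_2\}$ and $V = \operatorname{span}\{\hat{e}_3\}$, then $U \cap V = \{0\}$ and $U \oplus V = \mathbb{R}^3$.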
Problem 13. If $V$ is a subspace of $\mathbb{R}^n$, show that $\mathbb{R}^n = V \oplus V^{\perp}$.
Problem 14. Let $V$ be an arbitrary subspace of $\mathbb{R}^n$. Show that every vector $\vec{u}$ in $\mathbb{R}^n$ can be represented uniquely as $\vec{u} = c_1 \vec{v} + c_2 \vec{w}$ for $\vec{v} \in V$ and $\vec{w} \in V^{\perp}$.
1.4 Matrices and linear mappings
Definition 3. A mapping $f: \mathbb{R}^n \to \mathbb{R}^m$ is called linear if for any constants $c_1, c_2$ and any vectors $\vec{u}, \vec{v} \in \mathbb{R}^n$, the following relation holds:
\[ f(c_1 \vec{u} + c_2 \vec{v}) = c_1 f(\vec{u}) + c_2 f(\vec{v}). \]
Problem 15. Show that if $f: \mathbb{R}^n \to \mathbb{R}^m$ is linear, then $f(0) = 0$.
Proposition 2. A linear mapping $f: \mathbb{R}^n \to \mathbb{R}^m$ can be represented by an $m \times n$ matrix.
Proof. Remember that a matrix $A = [a_{ij}]_{m \times n}$ is a structure of $n$ columns, each a vector belonging to $\mathbb{R}^m$, i.e., $A = [A_1 | \cdots | A_n]$, where $A_k \in \mathbb{R}^m$. The action of $A$ on $\hat{e}_k$ is defined by the relation $A(\hat{e}_k) = A_k$. Now define $A_f$ as
\[ A_f = [f(\hat{e}_1) \,|\, f(\hat{e}_2) \,|\, \cdots \,|\, f(\hat{e}_n)]. \]
It is simply seen that for an arbitrary vector $\vec{u} \in \mathbb{R}^n$, the following relation holds: $f(\vec{u}) = A_f(\vec{u})$.
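As an illustration (the map here is chosen only for this example), consider the linear map $f: \mathbb{R}^2 \to \mathbb{R}^2$, $f(x, y) = (x + y,\ 2x)$. Then $f(\hat{e}_1) = (1, 2)$ and $f(\hat{e}_2) = (1, 0)$, so
\[ A_f = [f(\hat{e}_1) \,|\, f(\hat{e}_2)] = \begin{pmatrix} 1 & 1 \\ 2 & 0 \end{pmatrix}, \]
and indeed $A_f \begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} x + y \\ 2x \end{pmatrix} = f(x, y)$.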
Problem 16. Let $T: \mathbb{R}^2 \to \mathbb{R}^2$ have the matrix representation $A_{2 \times 2} = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$ in the standard basis. Find the matrix representation of $T$ in the basis $\vec{v}_1 = \begin{pmatrix} 1 \\ 1 \end{pmatrix}$, $\vec{v}_2 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}$.
Problem 17. Prove that a $2 \times 2$ matrix maps any parallelogram to a parallelogram.
Problem 18. Verify that the matrix $R_{\theta} = \begin{pmatrix} \cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta) \end{pmatrix}$ rotates vectors in the plane by the angle $\theta$ counter-clockwise. Verify that $R_{\theta_1} R_{\theta_2} = R_{\theta_1 + \theta_2}$ and conclude that $R_{\theta} R_{-\theta}$ is the identity matrix $\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$.
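A quick numerical check of these identities can be sketched with NumPy; the angles below are arbitrary, and the check is an illustration rather than a proof.

import numpy as np

def rotation(theta):
    # counter-clockwise rotation matrix R_theta (theta in radians)
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

t1, t2 = 0.3, 1.1  # sample angles, chosen only for illustration

# composition law R_{t1} R_{t2} = R_{t1 + t2}
print(np.allclose(rotation(t1) @ rotation(t2), rotation(t1 + t2)))  # True
# R_theta R_{-theta} is the identity matrix
print(np.allclose(rotation(t1) @ rotation(-t1), np.eye(2)))         # True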
Definition 4. Let $f: \mathbb{R}^n \to \mathbb{R}^m$ be a linear mapping. The kernel (or null space) of $f$, denoted by $\ker(f)$ (or just $N_f$), is the set of all vectors $\vec{n}$ of $\mathbb{R}^n$ such that $f(\vec{n}) = 0 \in \mathbb{R}^m$. The image of $f$, denoted by $\mathrm{Im}(f)$, is the set of all vectors $\vec{w} \in \mathbb{R}^m$ such that $\vec{w} = f(\vec{v})$ for some $\vec{v} \in \mathbb{R}^n$.
Proposition 3. The kernel of a linear mapping $f: \mathbb{R}^n \to \mathbb{R}^m$ is a vector subspace of $\mathbb{R}^n$. The image of $f$ is a vector subspace of $\mathbb{R}^m$.
Problem 19. Prove the proposition.
Theorem 1. Assume that $f: \mathbb{R}^n \to \mathbb{R}^m$ is a linear mapping. The following relation holds:
\[ n = \dim \ker(f) + \dim \mathrm{Im}(f). \qquad (1) \]
Problem 20. Let $S$ denote the orthogonal subspace of $\ker(f)$. Show that $\dim S = \dim \mathrm{Im}(f)$ and conclude that $f(S) = \mathrm{Im}(f)$.
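To illustrate Theorem 1, consider the projection $f: \mathbb{R}^3 \to \mathbb{R}^2$, $f(x, y, z) = (x, y)$, a map chosen here only as an example. Its kernel is $\operatorname{span}\{\hat{e}_3\}$, of dimension 1, and its image is all of $\mathbb{R}^2$, of dimension 2, so relation (1) reads $3 = 1 + 2$.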
1.5 Linear mappings from $\mathbb{R}^n$ to $\mathbb{R}^n$
1.5.1 Determinant
Let $f: \mathbb{R}^n \to \mathbb{R}^n$ be a linear mapping, and let $C$ be the unit cube constructed on $\hat{e}_k$, $k = 1, \dots, n$. The image of $C$ under $f$, that is $f(C)$, is a parallelepiped. In fact, every vector $\vec{u} \in C$ is represented by the linear combination
\[ \vec{u} = c_1 \hat{e}_1 + \cdots + c_n \hat{e}_n, \]
for $0 \leq c_k \leq 1$, and thus
\[ f(\vec{u}) = c_1 \vec{f}_1 + \cdots + c_n \vec{f}_n, \]
where $\vec{f}_k = f(\hat{e}_k)$. The set $\{c_1 \vec{f}_1 + \cdots + c_n \vec{f}_n\}$ for $0 \leq c_k \leq 1$ is a parallelepiped constructed on $\vec{f}_1, \dots, \vec{f}_n$; see Fig. 2.
Figure 2. [The unit square on $\hat{e}_1, \hat{e}_2$ and its image under $f$, the parallelogram on $\vec{f}_1 = f(\hat{e}_1)$ and $\vec{f}_2 = f(\hat{e}_2)$.]
Definition 5. Let $f: \mathbb{R}^n \to \mathbb{R}^n$ be a linear mapping, and let $C$ be the unit cube constructed on $\{\hat{e}_k\}_{k=1}^{n}$. The determinant of $f$, denoted by $\det(f)$, is the algebraic volume of the parallelepiped $f(C)$. The algebraic volume is the signed volume, with a positive or negative sign.
Example 2. In $\mathbb{R}^2$, the determinant of $A = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}$ is defined by the following formula:
\[ \det \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} = a_{11} a_{22} - a_{12} a_{21}. \qquad (2) \]
It is simply verified that $|\det(A)| = \|A(\hat{e}_1)\| \, \|A(\hat{e}_2)\| \sin(\theta)$, where $\theta$ is the angle between the two columns of $A$.
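For instance, with entries chosen only for illustration,
\[ \det \begin{pmatrix} 2 & 1 \\ 1 & 3 \end{pmatrix} = 2 \cdot 3 - 1 \cdot 1 = 5, \]
which is the area of the parallelogram constructed on the columns $(2, 1)$ and $(1, 3)$.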
If $\det(f) = 0$, then the volume degenerates, which means the vectors $\vec{f}_1, \dots, \vec{f}_n$ are linearly dependent. If $\det(f) < 0$, then $f$ changes the standard orientation of the basis $\{\hat{e}_k\}_{k=1}^{n}$ (remember the standard orientations of $\mathbb{R}^2$ and $\mathbb{R}^3$). For example, $f(x, y) = (y, x)$ with the matrix representation
\[ A = [f(\hat{e}_1) \,|\, f(\hat{e}_2)] = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \]
changes the standard orientation. Let $A_f = [\vec{f}_1 | \cdots | \vec{f}_n]$ be the representation of $f: \mathbb{R}^n \to \mathbb{R}^n$ in the standard basis $\{\hat{e}_k\}_{k=1}^{n}$. The determinant $\det(A_f)$ satisfies the following properties:
i. If $\vec{f}_1, \dots, \vec{f}_n$ are linearly dependent, then $\det[A_f] = 0$.
ii. $\det[\vec{f}_2 | \vec{f}_1 | \cdots | \vec{f}_n] = -\det[A_f]$. In general, switching any two columns $i$ and $j$ multiplies the determinant by the factor $-1$.
iii. $\det[c \vec{f}_1 | \vec{f}_2 | \cdots | \vec{f}_n] = c \det(A_f)$.
iv. $\det[c_1 \vec{f}_1 + c_2 \vec{f}_k \,|\, \vec{f}_2 | \cdots | \vec{f}_n] = c_1 \det[A_f]$ for any $k = 2, \dots, n$.
By the above properties, it is seen that if $f, g: \mathbb{R}^n \to \mathbb{R}^n$ are two linear mappings with matrix representations $A$ and $B$, then
\[ \det(AB) = \det(A) \det(B). \]
Problem 21. Verify the above claim directly for $2 \times 2$ matrices.
1.5.2 Injective and surjective mappings
Definition 6. A linear mapping $f: \mathbb{R}^n \to \mathbb{R}^m$ is called one-to-one or injective if the equality $f(\vec{u}) = f(\vec{v})$ implies $\vec{u} = \vec{v}$. A linear mapping $f: \mathbb{R}^n \to \mathbb{R}^m$ is called onto or surjective if $\mathbb{R}^m = f(\mathbb{R}^n)$.
Problem 22. Show that a linear mapping $f: \mathbb{R}^n \to \mathbb{R}^n$ is one-to-one if and only if $\ker(f) = \{0\}$, and if and only if it is onto.
Problem 23. Let $f: \mathbb{R}^n \to \mathbb{R}^m$ be a linear mapping. Show that if $m > n$, then $f$ cannot be onto, and if $m < n$, then $f$ cannot be one-to-one.
If $f: \mathbb{R}^n \to \mathbb{R}^n$ is one-to-one (and then onto), the mapping $f^{-1}: \mathbb{R}^n \to \mathbb{R}^n$ is called the inverse mapping of $f$ if the following relation holds:
\[ f f^{-1} = f^{-1} f = \mathrm{Id}, \]
where $\mathrm{Id}$ is the identity mapping on $\mathbb{R}^n$. The identity mapping has the matrix representation $\operatorname{diag}(1, \dots, 1)$, the matrix with 1 on the main diagonal and zero everywhere else. Note that $\mathrm{Id}(\vec{u}) = \vec{u}$ for any vector $\vec{u}$.
Problem 24. If $f: \mathbb{R}^n \to \mathbb{R}^n$ is a one-to-one linear map, show that $f^{-1}$ is also a one-to-one linear map.
1.5.3 Eigenvalues and Eigenvectors
A vector $\vec{v} \in \mathbb{R}^n \setminus \{0\}$ is called an eigenvector of a linear mapping $f: \mathbb{R}^n \to \mathbb{R}^n$ if there is a scalar $\lambda$ such that $f(\vec{v}) = \lambda \vec{v}$. It is seen that if $\vec{v}$ is an eigenvector, then the vector $\vec{w} = t\vec{v}$ for an arbitrary nonzero scalar $t$ is also an eigenvector. Accordingly, one can define an eigendirection of $f$, that is, $\operatorname{span}\{\vec{v}\} := \{t\vec{v};\ t \in \mathbb{R}\}$; see Fig. 3.
Figure 3. [An eigenvector $\vec{v}$ and its image $f(\vec{v}) = \lambda \vec{v}$, which lies along the same direction.]
Example 3. The vector $\vec{v} = (1, 1)$ is an eigenvector of the matrix $A = \begin{pmatrix} 1 & 1 \\ -1 & 3 \end{pmatrix}$ with the eigenvalue $\lambda = 2$, because
\[ \begin{pmatrix} 1 & 1 \\ -1 & 3 \end{pmatrix} \begin{pmatrix} 1 \\ 1 \end{pmatrix} = 2 \begin{pmatrix} 1 \\ 1 \end{pmatrix}. \]
The matrix $A = \begin{pmatrix} 1 & 1 \\ -1 & 3 \end{pmatrix}$ has only one linearly independent eigenvector, while the matrix $A = \begin{pmatrix} 2 & 3 \\ -4 & -5 \end{pmatrix}$ has two linearly independent eigenvectors $\vec{v}_1 = (1, -1)$ and $\vec{v}_2 = (3, -4)$ with eigenvalues $\lambda_1 = -1$ and $\lambda_2 = -2$, respectively. The rotation matrix $R_{\theta} = \begin{pmatrix} \cos(\theta) & -\sin(\theta) \\ \sin(\theta) & \cos(\theta) \end{pmatrix}$ has no (real) eigenvector for $\theta \neq 0, \pi$. Recall that $R_{\theta}$ rotates vectors counterclockwise by the angle $\theta$. The identity matrix $I_{2 \times 2} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}$ has infinitely many eigenvectors. In fact, every vector in $\mathbb{R}^2$ is an eigenvector of $I_{2 \times 2}$ with eigenvalue $\lambda = 1$.
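These eigenpairs can also be checked numerically, for instance with NumPy; this is only a sketch for verification, not part of the argument.

import numpy as np

A = np.array([[2.0, 3.0],
              [-4.0, -5.0]])
eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)  # [-1. -2.], up to ordering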
Proposition 4. If $f: \mathbb{R}^n \to \mathbb{R}^n$ has $n$ distinct eigenvalues $\lambda_1, \dots, \lambda_n$, then their associated eigenvectors $\vec{v}_1, \dots, \vec{v}_n$ are linearly independent.
Problem 25. Prove the proposition.
If $\vec{v}$ is an eigenvector of a linear mapping $f$ with eigenvalue $\lambda$, then $(f - \lambda \mathrm{Id})\vec{v} = 0$, so $\vec{v}$ belongs to the kernel of $f - \lambda \mathrm{Id}$; since $\vec{v}$ is nonzero, this kernel is nontrivial and $f - \lambda \mathrm{Id}$ is not invertible. Therefore, if $A_f$ is a matrix representation of $f$, the following relation holds:
\[ \det(A_f - \lambda \mathrm{Id}) = 0. \]
The above equation, which is an algebraic equation in $\lambda$, is called the characteristic equation of $f$. If $A = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}$, the characteristic equation is as follows:
\[ \lambda^2 - \operatorname{tr}(A)\, \lambda + \det(A) = 0, \qquad (3) \]
where $\operatorname{tr}(A)$ (read: the trace of $A$) is equal to $a_{11} + a_{22}$.
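For example, for the matrix $A = \begin{pmatrix} 2 & 3 \\ -4 & -5 \end{pmatrix}$ of Example 3, $\operatorname{tr}(A) = -3$ and $\det(A) = -10 + 12 = 2$, so equation (3) reads
\[ \lambda^2 + 3\lambda + 2 = 0, \]
with roots $\lambda_1 = -1$ and $\lambda_2 = -2$, in agreement with the eigenvalues found in Example 3.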
Problem 26. Show that if $A_{2 \times 2}$ has a repeated eigenvalue $\lambda$ with two linearly independent eigenvectors, then every vector of $\mathbb{R}^2$ is an eigenvector of $A$.
Problem 27. Let $A$ be a $2 \times 2$ matrix. Prove that the following statements are equivalent:
i. $A$ is invertible.
ii. The two columns of $A$ are linearly independent.
iii. The determinant of $A$ is non-zero.
iv. No eigenvalue of $A$ is zero.
Problem 28. If $\lambda_1, \lambda_2$ are the two eigenvalues of $A_{2 \times 2}$, show that $\det(A) = \lambda_1 \lambda_2$ and $\operatorname{tr}(A) = \lambda_1 + \lambda_2$.
Problem 29. If $Q_{2 \times 2}$ is an invertible matrix, show the following relations:
\[ \operatorname{tr}(Q^{-1} A Q) = \operatorname{tr}(A), \qquad \det(Q^{-1} A Q) = \det(A). \]
1.5.4 Symmetric mappings and Jordan forms
Definition 7. A linear mapping $f: \mathbb{R}^n \to \mathbb{R}^n$ is called symmetric if the following equality holds for arbitrary vectors $\vec{u}, \vec{v} \in \mathbb{R}^n$:
\[ f(\vec{u}) \cdot \vec{v} = \vec{u} \cdot f(\vec{v}). \]
Theorem 2. If the linear mapping $f: \mathbb{R}^n \to \mathbb{R}^n$ is symmetric, then there are $n$ mutually orthogonal eigenvectors $\vec{v}_1, \dots, \vec{v}_n$ of $f$. Moreover, all eigenvalues of $f$ are real.
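As a simple illustration (the matrix below is chosen only for this example), the symmetric matrix $A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}$ has the mutually orthogonal eigenvectors $\vec{v}_1 = (1, 1)$ and $\vec{v}_2 = (1, -1)$ with real eigenvalues $\lambda_1 = 3$ and $\lambda_2 = 1$.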
Problem 30. Assume $\vec{v}_1, \vec{v}_2$ are two eigenvectors of a symmetric mapping $f$ associated with distinct eigenvalues. Show that $\langle \vec{v}_1, \vec{v}_2 \rangle = 0$.
If $f: \mathbb{R}^n \to \mathbb{R}^n$ has $n$ linearly independent eigenvectors $\vec{v}_1, \dots, \vec{v}_n$, then $\mathbb{R}^n$ can be decomposed by the direct sum $\mathbb{R}^n = V_1 \oplus \cdots \oplus V_n$, where $V_k = \operatorname{span}\{\vec{v}_k\}$. The restriction of $f$ to each $V_k$ is a linear mapping $f_k: V_k \to V_k$, and thus we can decompose $f$ as the direct sum $f = f_1 \oplus \cdots \oplus f_n$. With this interpretation, every vector $\vec{v} \in \mathbb{R}^n$ has a unique representation $\vec{v} = c_1 \vec{v}_1 + \cdots + c_n \vec{v}_n$, and thus $f(\vec{v})$ is
\[ f(\vec{v}) = f_1(c_1 \vec{v}_1) + \cdots + f_n(c_n \vec{v}_n) = c_1 \lambda_1 \vec{v}_1 + \cdots + c_n \lambda_n \vec{v}_n, \]
where $\lambda_k$ is the eigenvalue associated with $\vec{v}_k$.