Discover Linear Algebra: A first course in linear algebra
Jeremy Sylvestre
Contents
Front Matter
Colophon
Author Biography
Preface
I Systems of Equations and Matrices
1 Systems of linear equations
1.1 Discovery guide
1.2 Terminology and notation
1.3 Concepts
1.3.1 System solutions
1.3.2 Determining solutions
1.4 Examples
1.4.1 Row operations versus equation manipulations
2 Solving systems using matrices
2.1 Discovery guide
2.2 Terminology and notation
2.3 Concepts
2.3.1 Reducing matrices
2.3.2 Solving systems
2.4 Examples
2.4.1 Worked examples from the discovery guide
2.5 Theory
2.5.1 Reduced matrices
2.5.2 Solving systems using matrices
3 Using systems of equations
3.1 Discovery guide
3.2 Examples
3.2.1 A simple example
3.2.2 Flow in networks
3.2.3 Balancing chemical equations
3.2.4 Polynomial interpolation
3.3 Terminology and notation
3.4 Theory
3.4.1 Polynomial interpolation
4 Matrices and matrix operations
4.1 Discovery guide
4.2 Terminology and notation
4.3 Concepts
4.3.1 Matrix entries
4.3.2 Matrix dimensions
4.3.3 Matrix equality
4.3.4 Basic matrix operations
4.3.5 The zero matrix
4.3.6 Linear systems as matrix equations
4.3.7 Matrix multiplication
4.3.8 Matrix powers
4.3.9 Transpose
4.4 Examples
4.4.1 Basic matrix operations
4.4.2 Matrix multiplication
4.4.3 Combining operations
4.4.4 Linear systems as matrix equations
4.4.4.1 A first example
4.4.4.2 Expressing system solutions in vector form
4.4.5 Transpose
4.5 Theory
4.5.1 Rules of matrix algebra
4.5.2 Linear systems as matrix equations
5 Matrix inverses
5.1 Discovery guide
5.2 Terminology and notation
5.3 Concepts
5.3.1 The identity matrix
5.3.2 Inverse matrices
5.3.3 Matrix division
5.3.4 Cancellation
5.3.5 Solving systems using inverses
5.4 Examples
5.4.1 Inverses of \(2\times 2\) matrices
5.4.2 Solving systems using inverses
5.4.3 Solving other matrix equations using inverses
5.5 Theory
5.5.1 Properties of the identity matrix
5.5.2 Properties of the inverse
6 Elementary matrices
6.1 Discovery guide
6.2 Terminology and notation
6.3 Concepts
6.3.1 Elementary matrices
6.3.2 Inverses by elementary matrices
6.3.3 Inverses of elementary matrices
6.3.4 Decomposition of invertible matrices
6.3.5 Inverses by row reduction
6.4 Examples
6.4.1 Elementary matrices and their inverses
6.4.2 Decomposing an invertible matrix and its inverse into elementary matrices
6.4.3 Inversion by row reduction
6.5 Theory
6.5.1 Inverses of elementary matrices
6.5.2 Inverses versus row operations
6.5.3 More properties of inverses
6.5.4 Solution sets of row equivalent matrices
7 Special forms of square matrices
7.1 Discovery guide
7.2 Terminology and notation
7.3 Concepts
7.3.1 Algebra with scalar matrices
7.3.2 Inverses of special forms
7.3.3 Decompositions using special forms
7.4 Examples
7.4.1 Computation patterns
7.5 Theory
7.5.1 Algebra of special forms
7.5.2 Invertibility of special forms
8 Determinants
8.1 Discovery guide
8.2 Terminology and notation
8.3 Concepts
8.3.1 Definition of the determinant
8.3.2 Determinants of \(1 \times 1\) matrices
8.3.3 Determinants of \(2 \times 2\) matrices
8.3.4 Determinants of larger matrices
8.3.5 Determinants of special forms
8.4 Examples
8.4.1 Determinants of \(2 \times 2\) matrices
8.4.2 Minors and cofactors of \(3 \times 3\) matrices
8.4.2.1 Minors
8.4.2.2 Cofactors
8.4.3 Determinants of \(3 \times 3\) matrices
8.4.4 Minors and cofactors of \(4 \times 4\) matrices
8.4.5 Determinants of \(4 \times 4\) matrices
8.5 Theory
8.5.1 Basic properties of determinants
9 Determinants versus row operations
9.1 Discovery guide
9.2 Concepts
9.2.1 Swapping rows: effect on determinant
9.2.2 Multiplying rows: effect on determinant
9.2.3 Combining rows: effect on determinant
9.2.4 Column operations and the transpose
9.2.5 Determinants by row reduction
9.3 Examples
9.3.1 Determinants by row reduction
9.3.2 Matrices of determinant zero
9.4 Theory
9.4.1 Effect of row operations on the determinant
9.4.2 Determinants of elementary matrices
10 Determinants, the adjoint, and inverses
10.1 Discovery guide
10.2 Terminology and notation
10.3 Concepts
10.3.1 The classical adjoint
10.3.2 Determinants determine invertibility
10.3.3 Determinants versus matrix multiplication: case of elementary matrices
10.3.4 Determinants versus matrix multiplication: invertible case
10.3.5 Determinants versus matrix multiplication: singular case
10.3.6 Determinants versus matrix multiplication: all cases
10.3.7 Determinant of an inverse
10.3.8 Cramer’s rule
10.4 Examples
10.4.1 The \(2\times 2\) case
10.4.2 Computing an inverse using the adjoint
10.4.3 Cramer’s rule
10.5 Theory
10.5.1 Adjoints and inverses
10.5.2 Determinants determine invertibility
10.5.3 Determinant formulas
10.5.4 Cramer’s rule
II Vector Spaces
11 Introduction to vectors
11.1 Discovery guide
11.2 Terminology and notation
11.3 Concepts
11.3.1 Vectors
11.3.2 Vector addition
11.3.3 The zero vector
11.3.4 Vector negatives and vector subtraction
11.3.5 Scalar multiplication
11.3.6 Vector algebra
11.3.7 The standard basis vectors
11.4 Examples
11.4.1 Vectors in \(\R^n\)
11.4.2 Vector operations
11.5 Theory
11.5.1 Vector algebra
12 Geometry of vectors
12.1 Discovery guide
12.2 Terminology and notation
12.3 Concepts
12.3.1 Geometric length of a vector: the norm
12.3.2 Properties of the norm
12.3.3 Unit vectors and normalization
12.3.4 Distance between vectors
12.3.5 Angle between vectors in the plane and in space
12.3.6 Dot product
12.3.7 Angle between vectors in \(\R^n\)
12.3.8 Dot product versus norm
12.3.9 Dot product as matrix multiplication
12.4 Examples
12.4.1 The norm of a vector
12.4.2 Dot product and the angle between vectors
12.5 Theory
12.5.1 Norm and dot product
12.5.2 Vector geometry inequalities and uniqueness of vector angles
13 Orthogonal vectors
13.1 Discovery guide
13.2 Terminology and notation
13.3 Concepts
13.3.1 Values of \(\udotprod{u}{v}\)
13.3.2 Orthogonal vectors
13.3.2.1 Orthogonal vectors in \(\R^2\)
13.3.3 Orthogonal projection
13.3.4 Normal vectors of lines in the plane
13.3.5 Normal vectors of planes in space
13.3.6 The cross product
13.4 Examples
13.4.1 Orthogonal vectors
13.4.2 Orthogonal projection
13.4.3 Cross product
13.5 Theory
13.5.1 Properties of orthogonal vectors and orthogonal projection
13.5.2 Decomposition of a vector into orthogonal components
13.5.3 Properties of the cross product
14 Geometry of linear systems
14.1 Discovery guide
14.2 Terminology and notation
14.3 Concepts
14.3.1 Lines in the plane
14.3.2 Lines in space
14.3.3 Planes in space
14.3.4 Parallel vectors as a “basis” for lines and planes
14.3.5 Summary
14.4 Examples
14.4.1 Describing lines and planes parametrically
14.4.2 Determining points of intersection
15 Abstract vector spaces
15.1 Discovery guide
15.2 Motivation
15.3 Terminology and notation
15.4 Concepts
15.4.1 The ten vector space axioms
15.4.2 Instances of vector spaces
15.5 Examples
15.5.1 Verifying axioms: the space of positive numbers
15.5.2 Verifying axioms: the space of functions
15.6 Theory
15.6.1 Uniqueness of the zero vector and of negatives
15.6.2 Basic vector algebra rules
16 Subspaces
16.1 Discovery guide
16.2 Terminology and notation
16.3 Concepts
16.3.1 Recognizing subspaces
16.3.2 Building subspaces
16.3.3 The subspaces of \(\R^n\)
16.3.4 Recognizing when two subspaces are the same
16.4 Examples
16.4.1 The Subspace Test
16.4.2 Important subspace examples
16.4.3 Determining if a vector is in a span
16.4.4 Determining if a spanning set generates the whole vector space
16.5 Theory
16.5.1 The Subspace Test
16.5.2 Universal examples of subspaces
16.5.3 Equality of subspaces created via spanning sets
16.6 More examples
17 Linear independence
17.1 Discovery guide
17.2 Terminology and notation
17.3 Concepts
17.3.1 Reducing spanning sets
17.3.2 Linear dependence and independence
17.3.3 Linear dependence and independence of just one or two vectors
17.3.4 Linear dependence and independence in \(\R^n\)
17.4 Examples
17.4.1 Testing dependence/independence
17.4.2 Linear independence of “standard” spanning sets
17.5 Theory
17.5.1 Basic facts about linear dependence and independence
17.5.2 Linear dependence and independence of spanning sets
18 Basis and Coordinates
18.1 Discovery guide
18.2 Terminology and notation
18.3 Concepts
18.3.1 Basis as a minimal spanning set
18.3.2 Basis as a maximal linearly independent set
18.3.3 Basis is not unique
18.3.4 Ordered versus unordered basis
18.3.5 Coordinates of a vector
18.3.5.1 Basic concept of coordinates relative to a basis
18.3.5.2 Linearity of coordinates
18.4 Examples
18.4.1 Checking a basis
18.4.2 Standard bases
18.4.3 Coordinate vectors
18.5 Theory
18.5.1 Reducing to a basis
18.5.2 Basis as optimal spanning set
19 Dimension
19.1 Discovery guide
19.2 Terminology and notation
19.3 Concepts
19.3.1 The “just-right” number of vectors in a spanning set
19.3.2 Dimension as geometric “degrees of freedom”
19.3.3 Dimension as algebraic “degrees of freedom”
19.3.4 The dimension of a subspace
19.3.5 The dimension of the trivial vector space
19.4 Examples
19.4.1 Determining a basis from a parametric expression
19.4.2 An infinite-dimensional example
19.4.3 Enlarging a linearly independent set to a basis
19.5 Theory
19.5.1 Dimension as size of a basis
19.5.2 Consequences for the theory of linear dependence/independence and spanning
19.5.3 Dimension of subspaces
20 Column, row, and null spaces
20.1 Discovery guide
20.1.1 Column space
20.1.2 Row space
20.1.3 Null space
20.1.4 Relationship between the three spaces
20.2 Terminology and notation
20.3 Concepts
20.3.1 Column space
20.3.2 Row space
20.3.3 Column space versus row space
20.3.4 Null space and the dimensions of the three spaces
20.4 Examples
20.4.1 The three spaces
20.4.2 Enlarging a linearly independent set
20.5 Theory
20.5.1 Column space
20.5.2 Row space
20.5.3 Column and row spaces versus rank and invertibility
III Introduction to Matrix Forms
21 Eigenvalues and eigenvectors
21.1 Discovery guide
21.2 Terminology and notation
21.3 Motivation
21.4 Concepts
21.4.1 Determining eigenvalues
21.4.2 Eigenvalues for special forms of matrices
21.4.3 Determining eigenvectors
21.4.4 Eigenspaces
21.4.5 Connection to invertibility
21.4.6 The geometry of eigenvectors
21.5 Examples
21.6 Theory
21.6.1 Basic facts
21.6.2 Eigenvalues and invertibility
22 Diagonalization
22.1 Discovery guide
22.2 Terminology and notation
22.3 Motivation
22.4 Concepts
22.4.1 The transition matrix and the diagonal form
22.4.2 Diagonalizable matrices
22.4.3 Diagonalization procedure
22.5 Examples
22.5.1 Carrying out the diagonalization procedure
22.5.2 Determining diagonalizability from multiplicities
22.5.3 A different kind of example
22.6 Theory
22.6.1 Similar matrices
22.6.2 Diagonalizable matrices
22.6.3 The geometry of eigenvectors
22.6.4 More about diagonalizable matrices
Back Matter
A GNU Free Documentation License
Bibliography
Index
Colophon
Author Biography
Author Biography
Jeremy Sylvestre is Associate Professor of Mathematics at the University of Alberta’s Augustana Campus.