
Discover Linear Algebra

Section 36.6 Theory

Subsection 36.6.1 Properties of inner products

Here we list algebraic properties of inner products, including those prescribed by the inner product axioms.

Proof.

We will prove only Rule 9 and leave the rest up to you, the reader. (Though some of these are just restatements of the inner product axioms.)
Following the advice of Discovery 36.4, consider
\begin{equation*} \inprod{\zerovec}{\uvec{v}} = \inprod{k \zerovec}{\uvec{v}} = k \inprod{\zerovec}{\uvec{v}} \text{,} \end{equation*}
which holds for every scalar \(k\) by Rule 6 (or, really, by Axiom RIP 3). If we take scalar \(k = 0\text{,}\) then we get
\begin{equation*} \inprod{\zerovec}{\uvec{v}} = 0 \inprod{\zerovec}{\uvec{v}} = 0 \text{,} \end{equation*}
as desired.
The symmetric formula \(\inprod{\uvec{v}}{\zerovec} = 0\) then follows by Rule 1 (or, really, by Axiom RIP 1).
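The proposition is about abstract inner products, but it can be sanity-checked numerically in one concrete inner product space. The sketch below (not from the text) uses the ordinary dot product on \(\R^3\) as that particular example.

```python
# Sanity check of Rule 9 in one particular real inner product space:
# R^3 with the ordinary dot product as the inner product.
def inprod(u, v):
    """Ordinary dot product on R^n."""
    return sum(x * y for x, y in zip(u, v))

zerovec = [0.0, 0.0, 0.0]
v = [3.0, -1.0, 4.0]

assert inprod(zerovec, v) == 0.0   # <0, v> = 0
assert inprod(v, zerovec) == 0.0   # <v, 0> = 0, by symmetry
```

Of course, a numerical check in one example space is not a proof; the argument above is what establishes the rule for every inner product.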
Most of the rules above also hold for complex inner products, but there are some wrinkles due to the complex conjugate.

Proof.

We will prove only Rule 7 and leave the rest up to you, the reader. (Though some of these are just restatements of the inner product axioms.)
Calculate
\begin{align*} \inprod{\uvec{u}}{a \uvec{v}} \amp = \lcconj{\inprod{a \uvec{v}}{\uvec{u}}} \amp \amp\text{(i)}\\ \amp = \lcconj{a \inprod{\uvec{v}}{\uvec{u}}} \amp \amp\text{(ii)} \\ \amp = \cconj{a} \lcconj{\inprod{\uvec{v}}{\uvec{u}}} \amp \amp\text{(iii)} \\ \amp = \cconj{a} \uvecinprod{u}{v} \amp \amp\text{(iv)} \text{,} \end{align*}
with justifications
  1. Rule 1 (or, really, Axiom CIP 1);
  2. Rule 6 (or, really, Axiom CIP 3);
  3. the fact that complex conjugation is multiplicative;
  4. Rule 1 (or, really, Axiom CIP 1) again.
The addition and scalar multiple rules for inner products can be combined to demonstrate that a real inner product is bilinear (i.e. linear in both terms), while a complex inner product is sesquilinear (i.e. linear in the first term but conjugate-linear in the second).
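Sesquilinearity can be illustrated numerically. The sketch below (not from the text) uses the standard complex inner product \(\inprod{\uvec{u}}{\uvec{v}} = \sum u_i \cconj{v_i}\) on \(\C^2\), which under this book's conventions is linear in the first term and conjugate-linear in the second.

```python
# Illustration of sesquilinearity for the standard complex inner
# product <u, v> = sum(u_i * conj(v_i)) on C^2: linear in the first
# slot, conjugate-linear in the second.
def inprod(u, v):
    return sum(x * y.conjugate() for x, y in zip(u, v))

u = [1 + 2j, 3 - 1j]
v = [2 - 1j, 1j]
a = 2 + 5j

# Axiom CIP 1: <u, v> = conj(<v, u>).
assert inprod(u, v) == inprod(v, u).conjugate()
# Linear in the first slot: <a u, v> = a <u, v>.
assert inprod([a * x for x in u], v) == a * inprod(u, v)
# Conjugate-linear in the second slot (Rule 7): <u, a v> = conj(a) <u, v>.
assert inprod(u, [a * y for y in v]) == a.conjugate() * inprod(u, v)
```

A real inner product satisfies the same identities with the conjugates removed, which is exactly bilinearity.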


The properties of norm that we encountered using the dot product in \(\R^n\) remain true in every inner product space.
Both the Cauchy-Schwarz inequality (Theorem 13.5.4) and the Triangle inequality (Theorem 13.5.6) remain true in every inner product space, real or complex.
A vector can always be normalized to a unit vector.
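These three norm properties can be checked numerically in a concrete example. The sketch below (not from the text) again uses \(\R^3\) with the ordinary dot product as its inner product.

```python
import math

# Numerical check of the Cauchy-Schwarz inequality, the triangle
# inequality, and normalization, for the ordinary dot product on R^3.
def inprod(u, v):
    return sum(x * y for x, y in zip(u, v))

def norm(v):
    return math.sqrt(inprod(v, v))

u = [1.0, 2.0, 2.0]
v = [3.0, 0.0, -4.0]

# Cauchy-Schwarz: |<u, v>| <= ||u|| ||v||.
assert abs(inprod(u, v)) <= norm(u) * norm(v)
# Triangle inequality: ||u + v|| <= ||u|| + ||v||.
w = [x + y for x, y in zip(u, v)]
assert norm(w) <= norm(u) + norm(v)
# Normalizing a nonzero vector yields a unit vector (up to float error).
unit = [x / norm(u) for x in u]
assert math.isclose(norm(unit), 1.0)
```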

Proof idea.

Remark 36.6.8.

As usual, in the above proposition \(\abs{k}\) indicates the ordinary absolute value in the real context and the complex modulus in the complex context. The only two real scalars that satisfy \(\abs{k} = 1\) are \(k = \pm 1\text{.}\) But in the complex context, there are an infinite number of scalars that satisfy this condition: the entire unit circle in the complex plane.
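The unit-circle observation is easy to see computationally: every scalar of the form \(e^{i\theta}\) has modulus \(1\). A small sketch (not from the text):

```python
import cmath
import math

# Every complex number e^{i*theta} lies on the unit circle, so there
# are infinitely many complex scalars k with |k| = 1.
for theta in [0.0, math.pi / 6, 1.0, 2.5, math.pi]:
    k = cmath.exp(1j * theta)
    assert math.isclose(abs(k), 1.0)

# In contrast, the only real scalars with |k| = 1 are k = 1 and k = -1.
assert abs(1) == 1 and abs(-1) == 1
```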

Subsection 36.6.2 Inner products of \(\R^n\) and \(\C^n\)

Finally, we'll address what we learned about alternative inner products in Discovery 36.7 and Subsection 36.4.7.

Proof outline.

The converse of the above proposition is true as well.

Proof.

We will prove only the real case; the complex case is similar.
First, notice that for every matrix \(A\text{,}\)
\begin{equation*} \utrans{\uvec{e}}_i A \uvec{e}_j = a_{ij} \text{,} \end{equation*}
where \(a_{ij}\) is the \((i,j)\) entry of \(A\) and \(\uvec{e}_i,\uvec{e}_j\) are standard basis vectors, as usual. So if we set \(A\) to be the matrix with entries
\begin{equation*} a_{ij} = \inprod{\uvec{e}_j}{\uvec{e}_i} \text{,} \end{equation*}
then we have
\begin{equation*} \inprod{\uvec{e}_i}{\uvec{e}_j} = a_{ji} = \utrans{\uvec{e}}_j A \uvec{e}_i \text{,} \end{equation*}
which is a start. And, as is usually the case, what is true for basis vectors will be true for all vectors.
Letโ€™s check: given \(\uvec{u},\uvec{v}\) in \(\R^n\text{,}\) we have
\begin{align*} \utrans{\uvec{v}} A \uvec{u} \amp = \utrans{(v_1 \uvec{e}_1 + \dotsb + v_n \uvec{e}_n)} A (u_1 \uvec{e}_1 + \dotsb + u_n \uvec{e}_n)\\ \amp = (v_1 \utrans{\uvec{e}}_1 + \dotsb + v_n \utrans{\uvec{e}}_n) (u_1 A \uvec{e}_1 + \dotsb + u_n A \uvec{e}_n)\\ \amp = \sum_{i=1}^n \sum_{j=1}^n v_j u_i \utrans{\uvec{e}}_j A \uvec{e}_i\\ \amp = \sum_{i=1}^n \sum_{j=1}^n u_i v_j \inprod{\uvec{e}_i}{\uvec{e}_j}\\ \amp = \inprod{u_1 \uvec{e}_1 + \dotsb + u_n \uvec{e}_n}{v_1 \uvec{e}_1 + \dotsb + v_n \uvec{e}_n}\\ \amp = \uvecinprod{u}{v}\text{,} \end{align*}
as desired.
It only remains to verify that \(A\) is positive definite. Using Axiom RIP 1, we have
\begin{equation*} a_{ij} = \inprod{\uvec{e}_j}{\uvec{e}_i} = \inprod{\uvec{e}_i}{\uvec{e}_j} = a_{ji}\text{,} \end{equation*}
so \(A\) is symmetric. And the fact that
\begin{equation*} \utrans{\uvec{v}} A \uvec{u} = \uvecinprod{u}{v} \text{,} \end{equation*}
that we have verified above, guarantees that
\begin{equation*} \utrans{\uvec{v}} A \uvec{v} \gt 0 \end{equation*}
for all nonzero \(\uvec{v}\text{,}\) by Axiom RIP 4.
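The construction in this proof can be sketched numerically. The example below is not from the text: it assumes the weighted inner product \(\inprod{\uvec{u}}{\uvec{v}} = 2 u_1 v_1 + 3 u_2 v_2\) on \(\R^2\), builds \(A\) from inner products of standard basis vectors as in the proof, and checks that \(\utrans{\uvec{v}} A \uvec{u} = \uvecinprod{u}{v}\).

```python
# Sketch of the proof's construction for an assumed example inner
# product on R^2: the weighted inner product <u, v> = 2 u1 v1 + 3 u2 v2.
def inprod(u, v):
    return 2 * u[0] * v[0] + 3 * u[1] * v[1]

e = [[1.0, 0.0], [0.0, 1.0]]  # standard basis of R^2

# Build A with entries a_ij = <e_j, e_i>, as in the proof.
A = [[inprod(e[j], e[i]) for j in range(2)] for i in range(2)]
assert A == [[2.0, 0.0], [0.0, 3.0]]  # symmetric, as expected

def vT_A_u(v, A, u):
    """Compute the matrix product v^T A u."""
    Au = [sum(A[i][j] * u[j] for j in range(2)) for i in range(2)]
    return sum(v[i] * Au[i] for i in range(2))

u, v = [1.0, 4.0], [-2.0, 5.0]
# Verify v^T A u = <u, v>.
assert vT_A_u(v, A, u) == inprod(u, v)
```

Note how the matrix \(A\) recovered here is symmetric with positive diagonal, in line with the proposition: every inner product on \(\R^n\) arises from some positive definite matrix in this way.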
Finally, we verify the method of constructing positive definite matrices discussed in Subsection 36.4.7.

Proof.

Again, we will provide the proof only in the real context; the complex context is similar.
It is straightforward to verify that \(A = \utrans{P} P\) is symmetric by showing \(\utrans{A} = A\text{.}\) So we will focus on the positive part of the definition of positive definite real matrix. Suppose \(\uvec{x}\) is a nonzero column vector in \(\R^n\text{.}\) Then,
\begin{align*} \utrans{\uvec{x}} A \uvec{x} \amp = \utrans{\uvec{x}} \utrans{P} P \uvec{x} \\ \amp = \utrans{(P \uvec{x})} (P \uvec{x}) \\ \amp = \dotprod{(P \uvec{x})}{(P \uvec{x})} \\ \amp = \norm{P \uvec{x}}^2 \text{.} \end{align*}
We assumed both that \(P\) is invertible and that \(\uvec{x} \neq \zerovec\text{,}\) so \(P \uvec{x} \neq \zerovec\) must be true as well (Theorem 6.5.2). This in turn means \(\norm{P \uvec{x}} \neq 0\) (Rule 1 of Proposition 36.6.4). And therefore
\begin{equation*} \utrans{\uvec{x}} A \uvec{x} = \norm{P \uvec{x}}^2 > 0 \text{,} \end{equation*}
as desired.
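The key identity \(\utrans{\uvec{x}} A \uvec{x} = \norm{P \uvec{x}}^2\) can also be checked numerically. The sketch below (not from the text) uses an assumed example of an invertible \(2 \times 2\) matrix \(P\).

```python
# Numerical sketch: for an invertible P, the matrix A = P^T P satisfies
# x^T A x = ||P x||^2 > 0 for nonzero x.  P below is an assumed example.
P = [[1.0, 2.0],
     [0.0, 3.0]]  # invertible: upper triangular with nonzero diagonal

def matvec(M, x):
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# Form P^T and then A = P^T P.
PT = [[P[j][i] for j in range(2)] for i in range(2)]
A = [[sum(PT[i][k] * P[k][j] for k in range(2)) for j in range(2)]
     for i in range(2)]
assert A == [[A[j][i] for j in range(2)] for i in range(2)]  # symmetric

x = [5.0, -1.0]
Px = matvec(P, x)
# x^T A x equals ||P x||^2, which is positive for nonzero x.
assert dot(x, matvec(A, x)) == dot(Px, Px)
assert dot(Px, Px) > 0
```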