
Section 36.6 Theory

Subsection 36.6.1 Properties of inner products

Here we list algebraic properties of inner products, including those prescribed by the inner product axioms.

We will prove only Rule 9 and leave the rest up to you, the reader. (Though some of these are just restatements of the inner product axioms.)

Following the advice of Discovery 36.4, consider

\begin{equation*} \inprod{\zerovec}{\uvec{v}} = \inprod{k \zerovec}{\uvec{v}} = k \inprod{\zerovec}{\uvec{v}} \text{,} \end{equation*}

where the first equality uses the fact that \(k \zerovec = \zerovec\) for every scalar \(k\text{,}\) and the second holds by Rule 6 (or, really, by Axiom RIP 3). If we take scalar \(k = 0\text{,}\) then we get

\begin{equation*} \inprod{\zerovec}{\uvec{v}} = 0 \inprod{\zerovec}{\uvec{v}} = 0 \text{,} \end{equation*}

as desired.

The symmetric formula \(\inprod{\uvec{v}}{\zerovec} = 0\) then follows by Rule 1 (or, really, by Axiom RIP 1).
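For example, in \(\R^2\) with the ordinary dot product, Rule 9 is visible directly from the component formula:

\begin{equation*} \dotprod{\zerovec}{\uvec{v}} = 0 v_1 + 0 v_2 = 0 \text{.} \end{equation*}

The content of the rule is that the same conclusion holds in every inner product space, even when there is no component formula to compute with.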

Most of the rules above also hold for complex inner products, but there are some wrinkles due to the complex conjugate.

We will prove only Rule 7 and leave the rest up to you, the reader. (Though some of these are just restatements of the inner product axioms.)

Calculate

\begin{align*} \inprod{\uvec{u}}{a \uvec{v}} \amp = \lcconj{\inprod{a \uvec{v}}{\uvec{u}}} \amp \amp\text{(i)}\\ \amp = \lcconj{a \inprod{\uvec{v}}{\uvec{u}}} \amp \amp\text{(ii)} \\ \amp = \cconj{a} \lcconj{\inprod{\uvec{v}}{\uvec{u}}} \amp \amp\text{(iii)} \\ \amp = \cconj{a} \uvecinprod{u}{v} \amp \amp\text{(iv)} \text{,} \end{align*}

with justifications

  1. Rule 1 (or, really, Axiom CIP 1);
  2. Rule 6 (or, really, Axiom CIP 3);
  3. Rule 3 of Proposition A.2.11; and
  4. Rule 1 (or, really, Axiom CIP 1).

The addition and scalar multiple rules for inner products can be combined to demonstrate that a real inner product is bilinear (i.e. linear in both terms), while a complex inner product is sesquilinear (i.e. linear in the first term but conjugate-linear in the second).
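For example, expanding an inner product of linear combinations makes the contrast explicit: in the real case,

\begin{equation*} \inprod{a \uvec{u} + b \uvec{v}}{c \uvec{w}} = a c \inprod{\uvec{u}}{\uvec{w}} + b c \inprod{\uvec{v}}{\uvec{w}} \text{,} \end{equation*}

while in the complex case the scalar attached to the second argument emerges conjugated:

\begin{equation*} \inprod{a \uvec{u} + b \uvec{v}}{c \uvec{w}} = a \cconj{c} \inprod{\uvec{u}}{\uvec{w}} + b \cconj{c} \inprod{\uvec{v}}{\uvec{w}} \text{.} \end{equation*}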

The properties of norm that we encountered using the dot product in \(\R^n\) remain true in every inner product space.

Both the Cauchy-Schwarz inequality and the Triangle inequality remain true in every inner product space, real or complex.
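For instance, taking \(\uvec{u} = (1,0)\) and \(\uvec{v} = (1,1)\) in \(\R^2\) with the ordinary dot product, we have

\begin{equation*} \abs{\dotprod{\uvec{u}}{\uvec{v}}} = 1 \le \sqrt{2} = \unorm{u} \unorm{v} \qquad \text{and} \qquad \norm{\uvec{u} + \uvec{v}} = \sqrt{5} \le 1 + \sqrt{2} = \unorm{u} + \unorm{v} \text{,} \end{equation*}

in agreement with both inequalities.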

A nonzero vector can always be normalized to a unit vector.

Just calculate

\begin{equation*} \norm{\frac{k}{\unorm{v}} \uvec{v}} \end{equation*}

using Rule 3 of Proposition 36.6.4: since \(\abs{k} = 1\text{,}\) the norm simplifies to \(\frac{\abs{k}}{\unorm{v}} \unorm{v} = 1\text{,}\) as desired.
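For example, for \(\uvec{v} = (3,4)\) in \(\R^2\) under the dot product, \(\unorm{v} = 5\text{,}\) and (taking \(k = 1\))

\begin{equation*} \norm{\frac{1}{5} \uvec{v}} = \norm{\left( \frac{3}{5}, \frac{4}{5} \right)} = \sqrt{\frac{9}{25} + \frac{16}{25}} = 1 \text{,} \end{equation*}

so \(\frac{1}{5} \uvec{v}\) is a unit vector.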

Remark 36.6.8.

As usual, in the above proposition \(\abs{k}\) indicates the ordinary absolute value in the real context and the complex modulus in the complex context. The only two real scalars that satisfy \(\abs{k} = 1\) are \(k = \pm 1\text{.}\) But in the complex context, there are infinitely many scalars that satisfy this condition: they make up the entire unit circle in the complex plane.

Subsection 36.6.2 Inner products of \(\R^n\) and \(\C^n\)

Finally, we'll address what we learned about alternative inner products in Discovery 36.7 and Subsection 36.4.7.

In the real case, assuming \(A\) is positive definite means, by definition, that \(A\) is symmetric and that the pairing defined in the statement satisfies Axiom RIP 4. The symmetry of \(A\) addresses Axiom RIP 1, leaving only Axiom RIP 2 and Axiom RIP 3, which are easily confirmed using the properties of matrix algebra (Proposition 4.5.1).

The complex case is similar, with self-adjointness taking the place of symmetry.
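For instance, the symmetric matrix

\begin{equation*} A = \begin{bmatrix} 2 \amp 1 \\ 1 \amp 2 \end{bmatrix} \end{equation*}

is positive definite, since for nonzero \(\uvec{x}\) in \(\R^2\) we have

\begin{equation*} \utrans{\uvec{x}} A \uvec{x} = 2 x_1^2 + 2 x_1 x_2 + 2 x_2^2 = x_1^2 + x_2^2 + (x_1 + x_2)^2 \gt 0 \text{,} \end{equation*}

and so the pairing in the statement defines an inner product on \(\R^2\) different from the standard dot product.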

The converse of the above proposition is true as well.

We will prove only the real case; the complex case is similar.

First, notice that for every matrix \(A\text{,}\)

\begin{equation*} \utrans{\uvec{e}}_i A \uvec{e}_j = a_{ij} \text{,} \end{equation*}

where \(a_{ij}\) is the \((i,j)\) entry of \(A\) and \(\uvec{e}_i,\uvec{e}_j\) are standard basis vectors, as usual. So if we set \(A\) to be the matrix with entries

\begin{equation*} a_{ij} = \inprod{\uvec{e}_j}{\uvec{e}_i} \text{,} \end{equation*}

then we have

\begin{equation*} \inprod{\uvec{e}_i}{\uvec{e}_j} = a_{ji} = \utrans{\uvec{e}}_j A \uvec{e}_i \text{,} \end{equation*}

which is a start. And, as is usually the case, what is true for basis vectors will be true for all vectors.

Let's check: given \(\uvec{u},\uvec{v}\) in \(\R^n\text{,}\) we have

\begin{align*} \utrans{\uvec{v}} A \uvec{u} \amp = \utrans{(v_1 \uvec{e}_1 + \dotsb + v_n \uvec{e}_n)} A (u_1 \uvec{e}_1 + \dotsb + u_n \uvec{e}_n)\\ \amp = (v_1 \utrans{\uvec{e}}_1 + \dotsb + v_n \utrans{\uvec{e}}_n) (u_1 A \uvec{e}_1 + \dotsb + u_n A \uvec{e}_n)\\ \amp = \sum_{i=1}^n \sum_{j=1}^n v_j u_i \utrans{\uvec{e}}_j A \uvec{e}_i\\ \amp = \sum_{i=1}^n \sum_{j=1}^n u_i v_j \inprod{\uvec{e}_i}{\uvec{e}_j}\\ \amp = \inprod{u_1 \uvec{e}_1 + \dotsb + u_n \uvec{e}_n}{v_1 \uvec{e}_1 + \dotsb + v_n \uvec{e}_n}\\ \amp = \uvecinprod{u}{v}\text{,} \end{align*}

as desired.

It only remains to verify that \(A\) is positive definite. Using Axiom RIP 1, we have

\begin{equation*} a_{ij} = \inprod{\uvec{e}_j}{\uvec{e}_i} = \inprod{\uvec{e}_i}{\uvec{e}_j} = a_{ji}\text{,} \end{equation*}

so \(A\) is symmetric. And the fact that

\begin{equation*} \utrans{\uvec{v}} A \uvec{u} = \uvecinprod{u}{v} \text{,} \end{equation*}

which we verified above, guarantees that

\begin{equation*} \utrans{\uvec{v}} A \uvec{v} \gt 0 \end{equation*}

for all nonzero \(\uvec{v}\text{,}\) by Axiom RIP 4.
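To illustrate the recipe, if an inner product on \(\R^2\) happens to satisfy \(\inprod{\uvec{e}_1}{\uvec{e}_1} = 2\text{,}\) \(\inprod{\uvec{e}_1}{\uvec{e}_2} = \inprod{\uvec{e}_2}{\uvec{e}_1} = 1\text{,}\) and \(\inprod{\uvec{e}_2}{\uvec{e}_2} = 3\text{,}\) then the matrix built from these values is

\begin{equation*} A = \begin{bmatrix} 2 \amp 1 \\ 1 \amp 3 \end{bmatrix} \text{,} \end{equation*}

and the computation above says that \(\utrans{\uvec{v}} A \uvec{u} = \uvecinprod{u}{v}\) for all \(\uvec{u},\uvec{v}\) in \(\R^2\text{.}\)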

Finally, we verify the method of constructing positive definite matrices discussed in Subsection 36.4.7.

Again, we will provide the proof only in the real context; the complex context is similar.

It is straightforward to verify that \(A = \utrans{P} P\) is symmetric by showing \(\utrans{A} = A\text{.}\) So we will focus on the positive part of the definition of a positive definite real matrix. Suppose \(\uvec{x}\) is a nonzero column vector in \(\R^n\text{.}\) Then,

\begin{align*} \utrans{\uvec{x}} A \uvec{x} \amp = \utrans{\uvec{x}} \utrans{P} P \uvec{x} \\ \amp = \utrans{(P \uvec{x})} (P \uvec{x}) \\ \amp = \dotprod{(P \uvec{x})}{(P \uvec{x})} \\ \amp = \norm{P \uvec{x}}^2 \text{.} \end{align*}

We assumed both that \(P\) is invertible and that \(\uvec{x} \neq \zerovec\text{,}\) so \(P \uvec{x} \neq \zerovec\) must be true as well (Theorem 6.5.2), which also means that \(\norm{P \uvec{x}} \neq 0\) (Rule 1 of Proposition 36.6.4). And therefore

\begin{equation*} \utrans{\uvec{x}} A \uvec{x} = \norm{P \uvec{x}}^2 \gt 0 \text{,} \end{equation*}

as desired.
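For example, the invertible matrix

\begin{equation*} P = \begin{bmatrix} 1 \amp 1 \\ 0 \amp 1 \end{bmatrix} \end{equation*}

produces

\begin{equation*} A = \utrans{P} P = \begin{bmatrix} 1 \amp 1 \\ 1 \amp 2 \end{bmatrix} \text{,} \end{equation*}

and indeed

\begin{equation*} \utrans{\uvec{x}} A \uvec{x} = \norm{P \uvec{x}}^2 = (x_1 + x_2)^2 + x_2^2 \gt 0 \end{equation*}

for every nonzero \(\uvec{x}\) in \(\R^2\text{.}\)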