
Section 36.4 Concepts

Subsection 36.4.1 Inner product axioms

We begin by listing the axioms for a real inner product space once again.

Definition 36.4.1. Real inner product axioms.

A real inner product is a pairing that assigns a scalar (i.e. real number) to each (ordered) pair of vectors in a real vector space, according to the following axioms. As usual, bold variable letters represent arbitrary vectors in the vector space, and ordinary variable letters represent arbitrary scalars (i.e. real numbers).

List 36.4.2. (RIP) Real inner product axioms
  1. Symmetry.

    Every \(\uvec{v},\uvec{w}\) satisfy \(\uvecinprod{w}{v} = \uvecinprod{v}{w}\text{.}\)

  2. Additivity.

    Every \(\uvec{u},\uvec{v},\uvec{w}\) satisfy \(\inprod{\uvec{u}+\uvec{v}}{\uvec{w}} = \uvecinprod{u}{w} + \uvecinprod{v}{w}\text{.}\)

  3. Homogeneity.

    Every \(k,\uvec{v},\uvec{w}\) satisfy \(\inprod{k \uvec{v}}{\uvec{w}} = k \uvecinprod{v}{w}\text{.}\)

  4. Positive definiteness.

    Every nonzero \(\uvec{v}\) satisfies \(\uvecinprod{v}{v} \gt 0\text{.}\)

The first three axioms mimic the algebra rules satisfied by the dot product on \(\R^n\text{.}\) (See Proposition 13.5.3.) And as explored in Discovery 36.5, Axiom RIP 4 is included to ensure that we will always be able to compute the square root of the result of \(\uvecinprod{v}{v}\) for all nonzero vectors \(\uvec{v}\text{.}\) (See Subsection 36.4.2.)
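
If you would like to check these axioms computationally, here is a brief sketch in Python (using NumPy) that tests each one for the ordinary dot product on \(\R^5\) with randomly chosen vectors and a random scalar; the dimension, the random seed, and the use of NumPy are illustrative choices only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
u, v, w = rng.standard_normal((3, n))   # three random vectors in R^n
k = rng.standard_normal()               # a random scalar

inprod = np.dot   # the ordinary dot product on R^n

# RIP 1: symmetry.
assert np.isclose(inprod(w, v), inprod(v, w))
# RIP 2: additivity in the first argument.
assert np.isclose(inprod(u + v, w), inprod(u, w) + inprod(v, w))
# RIP 3: homogeneity in the first argument.
assert np.isclose(inprod(k * v, w), k * inprod(v, w))
# RIP 4: positive definiteness (v is nonzero with probability 1).
assert inprod(v, v) > 0
```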

In Discovery 36.6, we found that there might be issues with naively transplanting the dot product into \(\C^n\text{.}\) We will return to these issues and their resolution below, but for now we list the properties that the “modified” complex dot product satisfies as the axioms that every complex inner product should satisfy.

Definition 36.4.3. Complex inner product axioms.

A complex inner product is a pairing that assigns a scalar (i.e. complex number) to each (ordered) pair of vectors in a complex vector space, according to the following axioms. As usual, bold variable letters represent arbitrary vectors in the vector space, and ordinary variable letters represent arbitrary scalars (i.e. complex numbers).

List 36.4.4. (CIP) Complex inner product axioms
  1. Symmetry.

    Every \(\uvec{v},\uvec{w}\) satisfy \(\uvecinprod{w}{v} = \lcconj{\uvecinprod{v}{w}}\text{.}\)

  2. Additivity.

    Every \(\uvec{u},\uvec{v},\uvec{w}\) satisfy \(\inprod{\uvec{u}+\uvec{v}}{\uvec{w}} = \uvecinprod{u}{w} + \uvecinprod{v}{w}\text{.}\)

  3. Homogeneity.

    Every \(k,\uvec{v},\uvec{w}\) satisfy \(\inprod{k \uvec{v}}{\uvec{w}} = k \uvecinprod{v}{w}\text{.}\)

  4. Positive definiteness.

    For every nonzero \(\uvec{v}\text{,}\) the result of \(\uvecinprod{v}{v}\) is a real number that satisfies \(\uvecinprod{v}{v} \gt 0\text{.}\)

In both cases, an inner product is required to produce scalar results from vector inputs — real scalars in the case of a real inner product, and complex scalars in the case of a complex inner product. It is for this reason that an inner product is also sometimes referred to as a scalar product.

Subsection 36.4.2 Geometry in real inner product spaces

In analogy with the formulas

\begin{align*} \unorm{v} \amp = \sqrt{\udotprod{v}{v}} \text{,} \amp \theta \amp = \inv{\cos} \left( \frac{\udotprod{u}{v}}{\unorm{u}\unorm{v}} \right) \end{align*}

relating the dot product to the geometry of \(\R^n\text{,}\) we make the following definitions in real inner product spaces:

\begin{align*} \unorm{v} \amp = \sqrt{\uvecinprod{v}{v}} \text{,} \amp \theta \amp = \inv{\cos} \left( \frac{\uvecinprod{u}{v}}{\unorm{u}\unorm{v}} \right) \text{.} \end{align*}

Axiom RIP 4 ensures that the formula for the norm of a vector always makes sense. Also, the Cauchy-Schwarz inequality remains true when the dot product is replaced by the inner product of any real inner product space, which ensures that the formula for the angle between two vectors always makes sense. (See the discussion in Subsection 13.3.7, imagining the dot product in \(\R^n\) replaced by the inner product in a real inner product space.)

These formulas answer the questions in Section 36.1: yes, a matrix does have “length” (though we will call it norm instead of length); and yes, two functions can be “perpendicular” to each other (though we will call it orthogonal, as in Chapter 14). See Subsection 36.5.2 for examples of these calculations.
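
To make those answers concrete, here is a small computational sketch in Python (using NumPy and SciPy). It assumes two inner products of the kind illustrated in Subsection 36.5.2: \(\inprod{A}{B} = \operatorname{trace}(\utrans{B} A)\) on \(2 \times 2\) matrices, and \(\inprod{f}{g} = \int_0^{2\pi} f(x) g(x) \, dx\) on continuous functions. Both the choice of inner products and the specific matrix and functions are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import quad

# "Length" (norm) of a matrix, using the assumed inner product <A, B> = trace(B^T A).
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
matrix_norm = np.sqrt(np.trace(A.T @ A))
print(matrix_norm)   # sqrt(1 + 4 + 9 + 16), approximately 5.48

# "Perpendicular" functions, using the assumed inner product
# <f, g> = integral of f(x) g(x) over [0, 2*pi].
inner_fg, _ = quad(lambda x: np.sin(x) * np.cos(x), 0, 2 * np.pi)
print(np.isclose(inner_fg, 0.0))   # True: sin and cos are orthogonal for this inner product
```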

Subsection 36.4.3 Norm and the dot product in \(\C^n\)

As we reminded ourselves in Discovery 36.6, we already have a way to compute length in \(\C^1\text{,}\) the complex plane: for \(z = a + b \ci\text{,}\) the complex modulus

\begin{gather} \cmodulus{z} = \sqrt{a^2 + b^2}\label{equation-inner-prod-concepts-complex-modulus}\tag{\(\star\)} \end{gather}

is the distance from the origin to the point in the complex plane corresponding to \(z\text{.}\) If we imagine \(z\) as not just a point but as a vector from the origin to that terminal point instead, then \(\cmodulus{z}\) is the length of that vector.

Just as norm in \(\R^n\) for increasing \(n\) follows the pattern

\begin{align*} \text{in } \R^1 \amp \text{:} \amp \norm{(x_1)} \amp = \sqrt{x_1^2} = \abs{x_1} \text{,} \\ \text{in } \R^2 \amp \text{:} \amp \norm{(x_1,x_2)} \amp = \sqrt{x_1^2 + x_2^2} \text{,} \\ \text{in } \R^3 \amp \text{:} \amp \norm{(x_1,x_2,x_3)} \amp = \sqrt{x_1^2 + x_2^2 + x_3^2} \text{,} \end{align*}

and so on, it seems it would make sense to have norm in \(\C^n\) follow a similar pattern, starting at (\(\star\)):

\begin{align*} \text{in } \C^1 \amp \text{:} \amp \norm{(z_1)} \amp = \sqrt{\cmodulus{z_1}^2} = \cmodulus{z_1} \text{,} \\ \text{in } \C^2 \amp \text{:} \amp \norm{(z_1,z_2)} \amp = \sqrt{\cmodulus{z_1}^2 + \cmodulus{z_2}^2} \text{,} \\ \text{in } \C^3 \amp \text{:} \amp \norm{(z_1,z_2,z_3)} \amp = \sqrt{\cmodulus{z_1}^2 + \cmodulus{z_2}^2 + \cmodulus{z_3}^2} \text{,} \end{align*}

and so on.

To trace this back to an appropriate complex inner product on \(\C^n\) (as we attempted to do in Discovery 36.6), the formula

\begin{equation*} \cmodulus{z}^2 = z\cconj{z} \end{equation*}

seems to suggest that the dot product

\begin{equation*} \udotprod{x}{y} = x_1 y_1 + x_2 y_2 + \dotsb + x_n y_n \end{equation*}

from \(\R^n\) needs a complex conjugate in it in order to adapt it for use in \(\C^n\text{:}\)

\begin{gather} \udotprod{w}{z} = w_1 \cconj{z}_1 + w_2 \cconj{z}_2 + \dotsb + w_n \cconj{z}_n\text{.}\label{equation-inner-prod-concepts-complex-dot-product}\tag{\(\star\star\)} \end{gather}

This definition of the standard inner product on \(\C^n\) satisfies Axiom CIP 4, and so the formula

\begin{equation*} \unorm{z} = \sqrt{\udotprod{z}{z}} \end{equation*}

both makes sense mathematically and matches up with the pattern of norms in \(\C^n\) explored above.
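
The following short Python sketch (using NumPy; the particular vector is an arbitrary choice) illustrates the difference the conjugate in (\(\star\star\)) makes: without it, \(\udotprod{z}{z}\) need not even be real, while with it we recover exactly the pattern of norms above.

```python
import numpy as np

z = np.array([1 + 2j, 3j, -1 + 1j])

# Naive transplant of the real dot product: z_1*z_1 + z_2*z_2 + z_3*z_3.
print(np.sum(z * z))              # (-12+2j): not real, so no real square root

# Complex dot product (two-star): z_1*conj(z_1) + z_2*conj(z_2) + z_3*conj(z_3).
zz = np.sum(z * np.conj(z))
print(zz)                         # (16+0j): real and positive

# The resulting norm matches the pattern sqrt(|z_1|^2 + |z_2|^2 + |z_3|^2).
print(np.sqrt(zz.real), np.linalg.norm(z))   # both equal 4.0
```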

Subsection 36.4.4 Complex inner products

Just as we used dot product on \(\R^n\) as the model for the axioms for real inner products, we use the complex dot product on \(\C^n\) as the model for the axioms for complex inner products. This explains the differences between the two sets of axioms (real and complex).

Using (\(\star\star\)), we see that

\begin{align*} \udotprod{z}{w} \amp = z_1 \cconj{w}_1 + z_2 \cconj{w}_2 + \dotsb + z_n \cconj{w}_n \\ \amp = \lcconj{\cconj{z}_1 w_1 + \cconj{z}_2 w_2 + \dotsb + \cconj{z}_n w_n} \\ \amp = \lcconj{\udotprod{w}{z}} \text{,} \end{align*}

which leads to Axiom CIP 1.

As well, in analogy with geometric length in \(\R^2\) and \(\R^3\text{,}\) we would like a norm to always be a real number. So while Axiom CIP 4 is essentially the same in content as Axiom RIP 4, we chose in Axiom CIP 4 to explicitly specify that the result of \(\uvecinprod{v}{v}\) should be real, since in general a complex inner product will output complex results.
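
Here is a quick numerical check of Axiom CIP 1 and Axiom CIP 4 for the complex dot product (\(\star\star\)), as a Python sketch (using NumPy, with arbitrary randomly generated vectors).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
w = rng.standard_normal(n) + 1j * rng.standard_normal(n)
z = rng.standard_normal(n) + 1j * rng.standard_normal(n)

def cdot(w, z):
    """Complex dot product (two-star): w_1*conj(z_1) + ... + w_n*conj(z_n)."""
    return np.sum(w * np.conj(z))

# CIP 1: swapping the arguments conjugates the result.
assert np.isclose(cdot(z, w), np.conj(cdot(w, z)))
# CIP 4: z . z is real and positive for nonzero z.
assert np.isclose(cdot(z, z).imag, 0.0) and cdot(z, z).real > 0
```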

Subsection 36.4.5 Geometry in complex inner product spaces

Norm in \(\C^n\) was our guide to creating the complex dot product, and Axiom CIP 4 allows us to use the same definition of norm in any complex inner product space:

\begin{equation*} \unorm{z} = \sqrt{\uvecinprod{z}{z}} \text{.} \end{equation*}

It is possible to also use a complex inner product to define angle between vectors in a complex inner product space, but we will not have a need to, so we will leave that matter for your future studies. However, there is one special angle, \(\theta = \pi/2\text{,}\) that will play an important role in the following chapters, so while we will not study angles in general we will still make use of the concept of orthogonal vectors.
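
For a concrete instance of orthogonality, here is a small sketch assuming the standard complex dot product (\(\star\star\)) on \(\C^2\text{;}\) the two vectors are an arbitrary illustrative choice.

```python
import numpy as np

w = np.array([1, 1j])
z = np.array([1j, 1])

# Standard complex dot product (two-star): w . z = w_1*conj(z_1) + w_2*conj(z_2).
print(np.sum(w * np.conj(z)))   # 0j, so w and z are orthogonal
```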

Subsection 36.4.6 Dot products as matrix multiplication

In Subsection 13.3.9, we discussed how dot product on \(\R^n\) is essentially just matrix multiplication. Viewing vectors in \(\R^n\) as \(n \times 1\) column vectors, we have

\begin{equation*} \udotprod{u}{v} = \utrans{\uvec{u}} \uvec{v} = \utrans{\uvec{v}} \uvec{u} \text{.} \end{equation*}

As in Axiom RIP 1, the order doesn't matter, so we can use either of the two expressions above.

For a complex inner product, the order does matter. We can still realize the complex dot product on \(\C^n\) as matrix multiplication, but it will look different in different orders of matrix multiplication:

\begin{equation*} \udotprod{w}{z} = \utrans{\uvec{w}} \cconj{\uvec{z}} = \utrans{\cconj{\uvec{z}}} \uvec{w} \text{.} \end{equation*}

Since we have already combined conjugate-transpose into a single computation, the complex adjoint, we will prefer the “reversed order” expression:

\begin{equation*} \udotprod{w}{z} = \adjoint{\uvec{z}} \uvec{w} \text{.} \end{equation*}

And to maintain consistency, we will also prefer the “reversed order” expression for the real dot product:

\begin{equation*} \udotprod{u}{v} = \utrans{\uvec{v}} \uvec{u} \text{.} \end{equation*}
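
These matrix-multiplication expressions are easy to confirm numerically; in the following Python sketch (using NumPy, with arbitrary random column vectors), `.conj().T` plays the role of the adjoint \(\adjoint{\uvec{z}}\text{.}\)

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3
# View vectors as n x 1 column vectors, as in the text.
u = rng.standard_normal((n, 1))
v = rng.standard_normal((n, 1))
w = rng.standard_normal((n, 1)) + 1j * rng.standard_normal((n, 1))
z = rng.standard_normal((n, 1)) + 1j * rng.standard_normal((n, 1))

# Real dot product as matrix multiplication: either order gives the same 1 x 1 result.
assert np.isclose((u.T @ v).item(), (v.T @ u).item())

# Complex dot product: w . z = w^T conj(z) = z^* w (adjoint of z times w).
cdot = np.sum(w * np.conj(z))
assert np.isclose(cdot, (w.T @ np.conj(z)).item())
assert np.isclose(cdot, (z.conj().T @ w).item())
```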

Subsection 36.4.7 Other inner products on \(\R^n\) and \(\C^n\)

Inner products on \(\R^n\).

In Discovery 36.7, we explored modifying the formula

\begin{equation*} \udotprod{u}{v} = \utrans{\uvec{v}} \uvec{u} \end{equation*}

to produce other inner products on \(\R^n\text{.}\) Recognizing that there is a secret identity matrix in the dot-product-as-matrix-multiplication formula,

\begin{equation*} \udotprod{u}{v} = \utrans{\uvec{v}} I \uvec{u} \text{,} \end{equation*}

we explored the conditions on an \(n \times n\) matrix \(A\) so that

\begin{gather} \uvecinprod{u}{v} = \utrans{\uvec{v}} A \uvec{u}\label{equation-inner-prod-concepts-modified-real-dot-product}\tag{\(\dagger\)} \end{gather}

would also satisfy the axioms for a real inner product.

While we came up with some interesting observations in the case \(n = 2\) in Discovery 36.7, the direct approach is best here. The rules of matrix algebra guarantee that the pairing proposed in (\(\dagger\)) always satisfies Axiom RIP 2 and Axiom RIP 3, so only the first and last axioms are in question. They will be satisfied precisely when:

  1. \(A\) is symmetric, so that \(\utrans{\uvec{v}} A \uvec{u} = \utrans{\uvec{u}} A \uvec{v}\) always holds (Axiom RIP 1); and

  2. \(\utrans{\uvec{x}} A \uvec{x} \gt 0\) for every nonzero column vector \(\uvec{x}\) in \(\R^n\) (Axiom RIP 4).

The last condition is best left as it is, instead of trying to get more specific about properties of \(A\) and its entries that would guarantee this property. A real matrix \(A\) that satisfies both of the above conditions necessary to generate an inner product on \(\R^n\) is called a positive definite matrix.

Positive definite matrices are easy to construct: if \(P\) is any invertible matrix, then \(A = \utrans{P} P\) is always positive definite. And it also turns out that every inner product on \(\R^n\) is of the form (\(\dagger\)) for some positive definite matrix \(A\). (See Subsection 36.6.2, where these two facts are stated formally.)

Notice what happens if we construct a pairing using \(A = \utrans{P} P\text{:}\)

\begin{equation*} \uvecinprod{u}{v} = \utrans{\uvec{v}} \utrans{P} P \uvec{u} = \utrans{(P \uvec{v})} (P \uvec{u}) = \dotprod{(P \uvec{u})}{(P \uvec{v})}\text{.} \end{equation*}

Recall that every invertible matrix \(P\) is somehow a transition matrix (Proposition 22.5.6). So the above calculation can be interpreted as saying that the new inner product on \(\R^n\) afforded by positive definite matrix \(A = \utrans{P} P\) is equivalent to first transforming \(\R^n\) by \(P\) and then taking the dot product.
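
Both facts above are easy to test numerically. The following Python sketch (using NumPy; the invertible matrix \(P\) and the vectors are arbitrary random choices) builds \(A = \utrans{P} P\text{,}\) checks the real inner product axioms for the pairing (\(\dagger\)), and confirms that the pairing agrees with the dot product of the transformed vectors \(P\uvec{u}\) and \(P\uvec{v}\text{.}\)

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3
P = rng.standard_normal((n, n))   # a random matrix; invertible with probability 1
A = P.T @ P                       # positive definite by construction

def ip(u, v):
    """The pairing (dagger): <u, v> = v^T A u."""
    return v.T @ A @ u

u, v, w = rng.standard_normal((3, n))
k = rng.standard_normal()

assert np.isclose(ip(w, v), ip(v, w))                  # RIP 1 (A is symmetric)
assert np.isclose(ip(u + v, w), ip(u, w) + ip(v, w))   # RIP 2
assert np.isclose(ip(k * v, w), k * ip(v, w))          # RIP 3
assert ip(v, v) > 0                                    # RIP 4

# The pairing is the dot product "after transforming by P".
assert np.isclose(ip(u, v), np.dot(P @ u, P @ v))
```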

Inner products on \(\C^n\).

Everything works almost exactly the same for \(\C^n\) as for \(\R^n\text{,}\) except that now the complex matrix \(A\) is required to be self-adjoint positive definite (instead of merely symmetric positive definite) in order for the pairing

\begin{gather} \uvecinprod{w}{z} = \adjoint{\uvec{z}} A \uvec{w}\label{equation-inner-prod-concepts-modified-complex-dot-product}\tag{\(\dagger\dagger\)} \end{gather}

to be a complex inner product. Regarding the positive definite condition, it is not obvious, but the self-adjoint condition also guarantees that \(\adjoint{\uvec{z}} A \uvec{z}\) is always real for every column vector \(\uvec{z}\) in \(\C^n\text{,}\) making the comparison \(\adjoint{\uvec{z}} A \uvec{z} \gt 0\) actually meaningful.

Just as in the real case, complex positive definite matrices are easy to construct: if \(P\) is any invertible complex matrix, then \(A = \adjoint{P} P\) is always positive definite. And again, it also turns out that every inner product on \(\C^n\) is of the form (\(\dagger\dagger\)) for some positive definite matrix \(A\). (Again, see Subsection 36.6.2, where these two facts are stated formally.)

Just as in the real case, notice what happens if we construct a pairing using \(A = \adjoint{P} P\text{:}\)

\begin{equation*} \uvecinprod{w}{z} = \adjoint{\uvec{z}} \adjoint{P} P \uvec{w} = \adjoint{(P \uvec{z})} (P \uvec{w}) = \dotprod{(P \uvec{w})}{(P \uvec{z})}\text{.} \end{equation*}

So again we can say that the new inner product on \(\C^n\) afforded by positive definite matrix \(A = \adjoint{P} P\) is equivalent to first transforming \(\C^n\) by \(P\) and then taking the complex dot product.
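
And here is a matching Python sketch for the complex case (again with NumPy and arbitrary random choices of \(P\text{,}\) \(\uvec{w}\text{,}\) and \(\uvec{z}\)): build \(A = \adjoint{P} P\text{,}\) check that (\(\dagger\dagger\)) behaves like a complex inner product, and confirm that it agrees with the complex dot product of \(P\uvec{w}\) and \(P\uvec{z}\text{.}\)

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3
P = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))   # invertible with probability 1
A = P.conj().T @ P                                                   # self-adjoint positive definite

def cip(w, z):
    """The pairing (double dagger): <w, z> = z^* A w."""
    return z.conj().T @ A @ w

w = rng.standard_normal(n) + 1j * rng.standard_normal(n)
z = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# z^* A z is real and positive, as Axiom CIP 4 requires.
assert np.isclose(cip(z, z).imag, 0.0) and cip(z, z).real > 0
# Conjugate symmetry (Axiom CIP 1).
assert np.isclose(cip(z, w), np.conj(cip(w, z)))
# The pairing is the complex dot product "after transforming by P".
assert np.isclose(cip(w, z), np.sum((P @ w) * np.conj(P @ z)))
```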