
Section 37.1 Discovery guide

Recall that two vectors \(u, v\) in \(\mathbb{R}^n\) are called orthogonal if \(u \cdot v = 0\).

In analogy with this, we will also call two vectors \(u, v\) in an inner product space orthogonal if \(\langle u, v \rangle = 0\).

Suppose \(V\) is an inner product space, and \(U\) is a subspace of \(V\). The collection of all vectors orthogonal to \(U\) is called the orthogonal complement of \(U\), and is denoted \(U^\perp\). That is, \(U^\perp\) consists of all vectors that are orthogonal to every vector in \(U\).
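To get a concrete feel for the definition, here is a minimal NumPy sketch (an aside, not part of the activities below) that tests whether a vector belongs to \(U^\perp\) in \(\mathbb{R}^3\) with the dot product as the inner product; the vectors u1, u2, v are made-up examples.

import numpy as np

# Made-up example in R^3 with the dot product as the inner product.
# U is the x-y plane, spanned by u1 and u2.
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])

# Candidate member of the orthogonal complement of U.
v = np.array([0.0, 0.0, 5.0])

# v lies in the orthogonal complement exactly when it is orthogonal to
# every vector in U; for a spanned subspace it is enough to check the
# spanning vectors (compare Discovery 37.2 below).
print(np.isclose(u1 @ v, 0.0) and np.isclose(u2 @ v, 0.0))  # True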

Discovery 37.1.

Suppose \(V = \mathbb{R}^3\) with the usual Euclidean inner product (i.e. the dot product). Then orthogonal is the same as perpendicular.

(a)

Describe \(U^\perp\) if \(U\) is a plane through the origin.

(b)

Describe \(W^\perp\) if \(W\) is a line through the origin.

(c)

Based on your two answers, make a general conjecture about \((U^\perp)^\perp\) in an inner product space.

Note: In Task a and Task b keep in mind that changing the initial point of a geometric vector doesn't change the vector.

Discovery 37.2.

Suppose \(U = \operatorname{Span}\{u_1, u_2, u_3\}\) is a subspace of an inner product space \(V\). Convince yourself that a vector \(v\) is in \(U^\perp\) if and only if \(v\) is orthogonal to each of \(u_1, u_2, u_3\).

Since \(U^\perp\) is defined by a homogeneous condition (inner product equals \(0\)), we expect it to be a subspace. The orthogonality condition can be used to determine a basis for \(U^\perp\).
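In \(\mathbb{R}^n\) with the dot product, that orthogonality condition is a homogeneous linear system, so a basis for \(U^\perp\) can be read off from its solution space. Here is a minimal sketch of this idea using SciPy, with made-up spanning vectors.

import numpy as np
from scipy.linalg import null_space

# Made-up spanning vectors for a subspace U of R^4 (dot product).
u1 = np.array([1.0, 0.0, 1.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0, 1.0])

# v is orthogonal to U exactly when A @ v = 0, where the rows of A
# are the spanning vectors of U.
A = np.vstack([u1, u2])

# The columns of this array form a basis for the orthogonal complement.
basis_for_U_perp = null_space(A)
print(basis_for_U_perp.shape)  # (4, 2): two basis vectors, as expected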

Discovery 37.3.

Consider \(V = M_{2 \times 2}(\mathbb{R})\) as an inner product space with \(\langle A, B \rangle = \operatorname{trace}(B^T A)\). Let \(U\) be the subspace consisting of all upper triangular matrices whose upper-right entry is equal to the trace of the matrix.
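As an aside, this inner product is straightforward to compute; a minimal sketch with made-up matrices (for real matrices, \(\operatorname{trace}(B^T A)\) agrees with the sum of entrywise products):

import numpy as np

# Made-up 2x2 matrices to illustrate <A, B> = trace(B^T A).
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
B = np.array([[4.0, 0.0],
              [1.0, 2.0]])

inner = np.trace(B.T @ A)   # trace(B^T A)
entrywise = np.sum(A * B)   # sum of entrywise products gives the same value
print(inner, entrywise)     # 10.0 10.0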

(a)

Determine a basis for U.

Hint

Describe a typical element in \(U\) using parameters, then associate a basis vector to each independent parameter.

(b)

Determine a basis for UβŠ₯.

Hint

Use the idea of Discovery 37.2 applied to your basis for \(U\) from Task a to set up a homogeneous system of equations to solve.

A set of vectors in an inner product space is called an orthogonal set if each vector in the set is orthogonal to every other vector in the set. A set of vectors is called an orthonormal set if it is an orthogonal set where every member is a unit vector.

Geometrically we think of linearly independent vectors as “pointing in different directions,” so it is reasonable to expect an orthogonal set of vectors to be independent.
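As a quick computational illustration of these two definitions, here is a minimal sketch (with a made-up set of vectors in \(\mathbb{R}^3\)) that tests a set for being orthogonal and for being orthonormal under the dot product.

import numpy as np

# Made-up set of vectors in R^3.
vectors = [np.array([1.0, 1.0, 0.0]),
           np.array([1.0, -1.0, 0.0]),
           np.array([0.0, 0.0, 2.0])]

# Orthogonal set: every pair of distinct vectors has inner product 0.
orthogonal = all(np.isclose(vectors[i] @ vectors[j], 0.0)
                 for i in range(len(vectors))
                 for j in range(i + 1, len(vectors)))

# Orthonormal set: orthogonal, and every member is a unit vector.
orthonormal = orthogonal and all(np.isclose(np.linalg.norm(v), 1.0)
                                 for v in vectors)

print(orthogonal, orthonormal)  # True False (orthogonal, but not unit vectors)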

Discovery 37.4.

Suppose \(\{v_1, v_2, v_3\}\) is an orthogonal set of nonzero vectors in an inner product space. To test for independence, we start with the homogeneous vector equation

\[ k_1 v_1 + k_2 v_2 + k_3 v_3 = 0. \tag{$\star$} \]
(a)

From our initial equation (⋆), we have

\[ \langle k_1 v_1 + k_2 v_2 + k_3 v_3, v_1 \rangle = \langle 0, v_1 \rangle. \]

Simplify the left-hand side of this equality to discover something about \(k_1\).

(b)

Convince yourself that similar reasoning will work for \(k_2, k_3\).

(c)

Is \(\{v_1, v_2, v_3\}\) an independent set?

Discovery 37.5.

Suppose \(B = \{e_1, e_2, e_3\}\) is both a basis and an orthogonal set in an inner product space \(V\). Since \(B\) is a basis, every vector \(v\) in \(V\) has a unique expression

\[ v = k_1 e_1 + k_2 e_2 + k_3 e_3 \tag{$\star\star$} \]

for some scalars \(k_1, k_2, k_3\).

(a)

Substitute (⋆⋆) into \(\langle v, e_1 \rangle\) to obtain an expression for \(\langle v, e_1 \rangle\) in terms of the \(k_j\) and \(e_j\). Then isolate \(k_1\).

(b)

Similar to Task a, use (⋆⋆) in \(\langle v, e_2 \rangle\) to obtain a formula for \(k_2\), and in \(\langle v, e_3 \rangle\) to obtain a formula for \(k_3\).

(c)

Extract the pattern: If \(V\) has dimension \(n\) (instead of dimension \(3\)), then the coordinates of a vector \(v\) relative to an orthogonal basis \(B = \{e_1, e_2, \ldots, e_n\}\) are ______.

(d)

How does your pattern from Task c change for an orthonormal basis?

A basic problem in an inner product space is how to come up with an orthogonal basis. So let's invent a procedure for doing so.

Discovery 37.6.

To keep it simple, let's suppose \(V\) has dimension \(3\). The beginning ingredient for our procedure is some (probably non-orthogonal) basis \(B_0 = \{v_1, v_2, v_3\}\), and the end result should be some definitely orthogonal basis \(B = \{e_1, e_2, e_3\}\).

To get the process started, we might as well first choose \(e_1 = v_1\), since we don't yet have any other \(e_j\) vectors to which \(e_1\) needs to be orthogonal.

The rest of the activity requires us to choose \(e_2, e_3\) to complete the orthogonal basis. (A small computational sketch for experimenting with the finished procedure appears after this activity.)

(a)

If we already knew the answer \(B = \{e_1, e_2, e_3\}\), and we expanded \(v_2 = k_1 e_1 + k_2 e_2 + k_3 e_3\) relative to \(B\), what would the coefficient \(k_1\) be?

Hint

We already considered this sort of question in Discovery 37.5 above.

(b)

Draw a diagram of \(v_2\), \(e_1\), and \(k_1 e_1\) as if these were vectors in \(\mathbb{R}^n\), keeping in mind that \(k_1 e_1\) should be exactly that part of \(v_2\) that is parallel to \(e_1\). (Does this diagram remind you of some previous concept?)

Use your diagram to propose a choice of vector \(e_2\) that is orthogonal to \(e_1\).

(c)

Carry out an exploration process similar to that of Task a and Task b, but for \(v_3\).

(i)

For the expansion \(v_3 = k_1 e_1 + k_2 e_2 + k_3 e_3\) relative to \(B\), what would be the coordinates \(k_1\) and \(k_2\)?

(ii)

Draw a diagram of \(v_3\), \(e_1\), \(e_2\), and \(k_1 e_1 + k_2 e_2\), keeping in mind that \(k_1 e_1 + k_2 e_2\) should be exactly that part of \(v_3\) that is “parallel” to \(\operatorname{Span}\{e_1, e_2\}\).

Use your diagram to propose a choice of vector \(e_3\) that is orthogonal to both \(e_1\) and \(e_2\).

(d)

Extract the pattern: If \(V\) has dimension \(n\) (instead of dimension \(3\)), then the next step in the procedure would be

\(e_4 =\) ______.

And then

\(e_5 =\) ______.

And so on.

(e)

At the end of all this, what would you do if you wanted an orthonormal basis for V?
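Once you have worked out the pattern in Discovery 37.6, you may want to experiment with it numerically (this is also handy for Discovery 37.7 below). Here is a minimal sketch of the resulting procedure, commonly known as the Gram-Schmidt process, for the dot product on \(\mathbb{R}^n\); the starting basis is made up.

import numpy as np

def orthogonalize(vectors):
    # Build an orthogonal list spanning the same subspace: from each new
    # vector, subtract its components parallel to the vectors chosen so far.
    ortho = []
    for v in vectors:
        e = v - sum(((v @ e_j) / (e_j @ e_j)) * e_j for e_j in ortho)
        ortho.append(e)
    return ortho

# Made-up (non-orthogonal) starting basis B0 of R^3.
B0 = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]

B = orthogonalize(B0)

# Verify pairwise orthogonality of the result.
print([round(B[i] @ B[j], 10) for i in range(3) for j in range(i + 1, 3)])

# For an orthonormal basis, normalize each vector at the end.
B_normalized = [e / np.linalg.norm(e) for e in B]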

Discovery 37.7.

What do you think the result would be if you unknowingly applied the procedure of Discovery 37.6 to a starting basis B0 that was already orthogonal?

Discovery 37.8.

Suppose U is a subspace of an inner product space V.

(b)

Every basis for U can be enlarged to a basis for V (Statement 1 of Proposition 20.5.8).

Suppose

\[ B_0 = \{u_1, \ldots, u_m, v_1, \ldots, v_\ell\} \]

is such an enlarged basis for \(V\), so that the \(u_j\) form a basis for \(U\). If we apply the procedure of Discovery 37.6 to \(B_0\) to obtain an orthogonal basis

\[ B = \{e_1, \ldots, e_m, f_1, \ldots, f_\ell\} \]

for \(V\), what subspace do the \(f_j\) span?