
Orthonormal Bases; Gram-Schmidt Process; QR-Decomposition

MATH 322, Linear Algebra I

J. Robert Buchanan

Department of Mathematics

Spring 2015

Motivation

When working with an inner product space, the most convenient bases are those that

1. consist of orthogonal vectors, and
2. consist of vectors of length 1.

Orthogonal Sets

Definition
A set of vectors in an inner product space is an orthogonal set if all the vectors are pairwise orthogonal. An orthogonal set in which each vector has norm (length) 1 is called an orthonormal set.

Example
The standard basis {e1, e2, . . . , en} is an orthonormal set in Rⁿ.

Recall: if v ∈ V and v ≠ 0 then

u = v/‖v‖

has norm 1.


Example

Create an orthonormal set of vectors from:

v1 = (0, 1, 0), v2 = (1, 0, 1), v3 = (1, 0, −1).

These vectors are already pairwise orthogonal, so normalizing each one gives the orthonormal set

u1 = (0, 1, 0), u2 = (1/√2, 0, 1/√2), u3 = (1/√2, 0, −1/√2).
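A quick numerical check of this example, using NumPy (an illustrative sketch, not part of the original slides; `normalize` is just a local helper name):

```python
import numpy as np

def normalize(v):
    """Return v divided by its Euclidean norm."""
    return v / np.linalg.norm(v)

v1 = np.array([0.0, 1.0, 0.0])
v2 = np.array([1.0, 0.0, 1.0])
v3 = np.array([1.0, 0.0, -1.0])

# The vectors are already pairwise orthogonal:
print(np.dot(v1, v2), np.dot(v1, v3), np.dot(v2, v3))   # 0.0 0.0 0.0

# so normalizing each one produces the orthonormal set u1, u2, u3.
u1, u2, u3 = normalize(v1), normalize(v2), normalize(v3)
print(u2)   # [0.70710678 0.         0.70710678], i.e. (1/sqrt(2), 0, 1/sqrt(2))
```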


Orthogonality and Linear Independence

Theorem
If S = {v1, v2, . . . , vn} is an orthogonal set of nonzero vectors in an inner product space, then S is linearly independent.

Proof

▶ Suppose there are scalars k1, k2, . . . , kn such that

k1 v1 + k2 v2 + · · · + kn vn = 0.

▶ Each scalar ki = 0: taking the inner product of both sides with vi and using orthogonality (⟨vj, vi⟩ = 0 for j ≠ i),

⟨k1 v1 + k2 v2 + · · · + kn vn, vi⟩ = ⟨0, vi⟩
k1⟨v1, vi⟩ + k2⟨v2, vi⟩ + · · · + kn⟨vn, vi⟩ = 0
ki⟨vi, vi⟩ = 0
ki‖vi‖² = 0
ki = 0.

▶ Since this is true for all i we have k1 = k2 = · · · = kn = 0.
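The argument can be mirrored numerically: for an orthogonal set of nonzero vectors the Gram matrix of pairwise inner products is diagonal with nonzero diagonal entries, which forces every ki to be 0. A small NumPy sketch using the orthogonal set from the previous example (illustrative, not from the slides):

```python
import numpy as np

# Orthogonal set from the previous example, stored as the columns of V.
V = np.column_stack([[0.0, 1.0, 0.0],
                     [1.0, 0.0, 1.0],
                     [1.0, 0.0, -1.0]])

# Gram matrix G[i, j] = <v_i, v_j>; it is diagonal for an orthogonal set.
G = V.T @ V
print(G)

# If V k = 0 then G k = V^T V k = 0, and a diagonal G with nonzero diagonal
# forces k = 0; equivalently, the columns of V have full rank.
print(np.linalg.matrix_rank(V))   # 3, so the set is linearly independent
```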

Orthonormal Basis

Definition
A basis for an inner product space consisting of orthonormal vectors is called an orthonormal basis. A basis for an inner product space consisting of orthogonal vectors is called an orthogonal basis.

Theorem
If B = {v1, v2, . . . , vn} is an orthonormal basis for an inner product space V, and if u ∈ V, then

u = ⟨u, v1⟩v1 + ⟨u, v2⟩v2 + · · · + ⟨u, vn⟩vn.

Remark: The coordinates of u relative to B are ⟨u, v1⟩, ⟨u, v2⟩, . . . , ⟨u, vn⟩.


Example

Let u1 = (1, 1, 1), u2 = (2, 1, −3), u3 = (4, −5, 1).

Verify that {u1, u2, u3} is an orthogonal basis for R³ and find an orthonormal basis for R³. Express u = (1, 0, 1) in terms of this orthonormal basis.

Solution

Since ‖u1‖ = √3, ‖u2‖ = √14, and ‖u3‖ = √42, the vectors

v1 = (1/√3)(1, 1, 1), v2 = (1/√14)(2, 1, −3), v3 = (1/√42)(4, −5, 1)

are orthonormal. Then

u = ⟨u, v1⟩v1 + ⟨u, v2⟩v2 + ⟨u, v3⟩v3 = (2/√3) v1 − (1/√14) v2 + (5/√42) v3.
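A numerical check of these coordinates with NumPy (an illustrative sketch; the variable names are ours, not from the slides):

```python
import numpy as np

u1 = np.array([1.0, 1.0, 1.0])
u2 = np.array([2.0, 1.0, -3.0])
u3 = np.array([4.0, -5.0, 1.0])
v1, v2, v3 = (u / np.linalg.norm(u) for u in (u1, u2, u3))

u = np.array([1.0, 0.0, 1.0])
coords = [np.dot(u, v) for v in (v1, v2, v3)]
print(coords)   # approx [1.1547, -0.2673, 0.7715] = [2/sqrt(3), -1/sqrt(14), 5/sqrt(42)]
print(coords[0] * v1 + coords[1] * v2 + coords[2] * v3)   # recovers [1. 0. 1.]
```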

Coordinates Relative to Orthonormal Bases

Theorem
If B is an orthonormal basis for an n-dimensional inner product space, and if

(u)B = (u1, u2, . . . , un) and (v)B = (v1, v2, . . . , vn)

then

▶ ‖u‖ = √(u1² + u2² + · · · + un²)
▶ d(u, v) = √((u1 − v1)² + (u2 − v2)² + · · · + (un − vn)²)
▶ ⟨u, v⟩ = u1 v1 + u2 v2 + · · · + un vn

Remark: computing norms, distances, and inner products with coordinates relative to orthonormal bases is equivalent to computing them in Euclidean coordinates.
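A short NumPy illustration of the theorem, reusing the orthonormal basis from the previous example; the second vector v is an arbitrary choice of ours for the comparison:

```python
import numpy as np

# Orthonormal basis of R^3 from the previous example, as columns of B.
B = np.column_stack([[1.0, 1.0, 1.0], [2.0, 1.0, -3.0], [4.0, -5.0, 1.0]])
B /= np.linalg.norm(B, axis=0)

u = np.array([1.0, 0.0, 1.0])
v = np.array([2.0, -1.0, 3.0])      # an arbitrary second vector for comparison

uB, vB = B.T @ u, B.T @ v           # coordinates relative to B

print(np.linalg.norm(u), np.linalg.norm(uB))              # equal norms
print(np.dot(u, v), np.dot(uB, vB))                       # equal inner products
print(np.linalg.norm(u - v), np.linalg.norm(uB - vB))     # equal distances
```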


Coordinates Relative to Orthogonal Bases

If B′ = {v1, v2, . . . , vn} is merely an orthogonal basis for V, then

B = { v1/‖v1‖, v2/‖v2‖, . . . , vn/‖vn‖ }

is an orthonormal basis for V.

Thus if u ∈ V,

u = (⟨u, v1⟩/‖v1‖²) v1 + (⟨u, v2⟩/‖v2‖²) v2 + · · · + (⟨u, vn⟩/‖vn‖²) vn.

Orthogonal Projections

Question: how do you create an orthogonal basis for an arbitrary finite-dimensional inner product space?

Theorem (Projection Theorem)
If W is a finite-dimensional subspace of an inner product space V, then every vector u ∈ V can be expressed uniquely as

u = w1 + w2

where w1 ∈ W and w2 ∈ W⊥.

Remark: w1 is called the orthogonal projection of u on W. w2 is called the component of u orthogonal to W.


Illustration

u = w1 + w2 = projW u + projW⊥ u

Thus

projW⊥ u = u − projW u
u = projW u + (u − projW u)

[Figure: u drawn with its orthogonal projection projW u lying in W and the component u − projW u perpendicular to W.]

Calculating Orthogonal Projections

Theorem
Suppose W is a finite-dimensional subspace of an inner product space V.

1. If B = {v1, v2, . . . , vr} is an orthonormal basis for W and u ∈ V, then

projW u = ⟨u, v1⟩v1 + ⟨u, v2⟩v2 + · · · + ⟨u, vr⟩vr.

2. If B′ = {v1, v2, . . . , vr} is an orthogonal basis for W and u ∈ V, then

projW u = (⟨u, v1⟩/‖v1‖²) v1 + (⟨u, v2⟩/‖v2‖²) v2 + · · · + (⟨u, vr⟩/‖vr‖²) vr.

Example

B = {v1, v2, v3} = { (1/√3)(1, 1, 1), (1/√14)(2, 1, −3), (1/√42)(4, −5, 1) }

is an orthonormal basis for R³. Let W = span{v1, v2}. Calculate

projW (1, 0, 1).

Solution

projW (1, 0, 1) = ⟨(1, 0, 1), (1/√3)(1, 1, 1)⟩ v1 + ⟨(1, 0, 1), (1/√14)(2, 1, −3)⟩ v2
= (2/√3) v1 − (1/√14) v2
= (11/21, 25/42, 37/42)
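The same projection in NumPy, using the formula for an orthonormal basis of W (an illustrative sketch; `proj` is a local helper name):

```python
import numpy as np

v1 = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)
v2 = np.array([2.0, 1.0, -3.0]) / np.sqrt(14)
u = np.array([1.0, 0.0, 1.0])

def proj(u, orthonormal_basis):
    """Orthogonal projection of u onto the span of an orthonormal basis."""
    return sum(np.dot(u, v) * v for v in orthonormal_basis)

p = proj(u, [v1, v2])
print(p)                                                  # approx [0.5238 0.5952 0.881 ]
print(np.array([11, 25, 37]) / np.array([21, 42, 42]))    # (11/21, 25/42, 37/42), the same vector
print(np.dot(u - p, v1), np.dot(u - p, v2))               # both ~0: u - p is orthogonal to W
```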

Gram-Schmidt Process

Remark: we know that every nonzero finite-dimensional vector space has a basis. We can develop a stronger statement for inner product spaces.

Theorem
Every nonzero finite-dimensional inner product space has an orthonormal basis.

Proof.
The proof is an algorithm known as the Gram-Schmidt Process.


Gram-Schmidt Algorithm

Let S = {v1, v2, . . . , vn} be a basis for V. To construct an orthonormal basis for V we perform the following steps.

1. Let u1 = v1 and define W1 = span{u1}.

2. Let u2 = v2 − projW1 v2 = v2 − (⟨v2, u1⟩/‖u1‖²) u1 and define W2 = span{u1, u2}.

3. Let u3 = v3 − projW2 v3 = v3 − (⟨v3, u1⟩/‖u1‖²) u1 − (⟨v3, u2⟩/‖u2‖²) u2 and define W3 = span{u1, u2, u3}.

4. Continue in this manner until we have defined B′ = {u1, u2, . . . , un}, an orthogonal basis for V.

5. B = { u1/‖u1‖, u2/‖u2‖, . . . , un/‖un‖ } is an orthonormal basis for V.

A code sketch of these steps appears below.
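The steps above translate directly into code. Below is a minimal NumPy sketch (the function name `gram_schmidt` is ours); it assumes the input vectors are linearly independent and uses the Euclidean dot product as the inner product:

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal basis for the span of the given vectors.

    Assumes the vectors are linearly independent and uses the Euclidean
    dot product as the inner product.
    """
    ortho = []
    for v in vectors:
        u = np.array(v, dtype=float)
        for q in ortho:
            u -= np.dot(u, q) * q            # subtract projection onto span of previous vectors
        ortho.append(u / np.linalg.norm(u))  # normalize (step 5), done as we go
    return ortho

# The vectors from the next example:
for q in gram_schmidt([[1, 2, -2], [4, 3, 2], [1, 2, 1]]):
    print(np.round(q, 4))
# approx [0.3333, 0.6667, -0.6667], [0.6667, 0.3333, 0.6667], [-0.6667, 0.6667, 0.3333]
```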


Example

Given the vectors (1, 2, −2), (4, 3, 2), (1, 2, 1), find an orthonormal basis for R³.

Solution

Step 1: v1 = (1, 2, −2)/‖(1, 2, −2)‖ = (1/3, 2/3, −2/3)

Step 2: v2 = (4, 3, 2) − (⟨(4, 3, 2), v1⟩/‖v1‖²) v1 = (4, 3, 2) − ⟨(4, 3, 2), v1⟩ v1 since v1 is a unit vector.
v2 = (4, 3, 2) − 2 v1 = (10/3, 5/3, 10/3)
We will replace v2 with a unit vector in the same direction, so v2 = (2/3, 1/3, 2/3).

Step 3: v3 = (1, 2, 1) − ⟨(1, 2, 1), v1⟩ v1 − ⟨(1, 2, 1), v2⟩ v2 (since v1 and v2 are unit vectors).
v3 = (1, 2, 1) − (1) v1 − (2) v2 = (−2/3, 2/3, 1/3)
Vector v3 is already a unit vector.
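A quick orthonormality check of this result (illustrative only): the matrix with these vectors as columns should satisfy VᵀV = I.

```python
import numpy as np

V = np.column_stack([[1/3, 2/3, -2/3],
                     [2/3, 1/3, 2/3],
                     [-2/3, 2/3, 1/3]])
print(np.round(V.T @ V, 12))   # the 3x3 identity, so the columns form an orthonormal set
```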


Example

Let the vector space P2 have the inner product

⟨p, q⟩ = ∫_{−1}^{1} p(x) q(x) dx.

Given {1, x, x²}, find an orthonormal basis for P2.

Solution (1 of 2)

⟨1, 1⟩ = ∫_{−1}^{1} (1)² dx = 2, so v1 = 1/√2.

u2 = x − ⟨x, 1/√2⟩ (1/√2) = x − (1/√2) ∫_{−1}^{1} (1/√2) x dx = x

v2 = x/√⟨x, x⟩ = √(3/2) x


Solution (2 of 2)

v1 = 1/√2, v2 = √(3/2) x

u3 = x² − ⟨x², 1/√2⟩ (1/√2) − ⟨x², √(3/2) x⟩ √(3/2) x
= x² − (1/√2)(√2/3) − (0) √(3/2) x
= x² − 1/3

v3 = (x² − 1/3)/√⟨x² − 1/3, x² − 1/3⟩ = √(5/8) (3x² − 1)
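The same computation can be reproduced symbolically; here is a sketch with SymPy (not part of the slides), using ⟨p, q⟩ = ∫_{−1}^{1} p(x)q(x) dx as the inner product:

```python
from sympy import symbols, integrate, sqrt, simplify

x = symbols('x')

def inner(p, q):
    """Inner product <p, q> = integral of p(x) q(x) over [-1, 1]."""
    return integrate(p * q, (x, -1, 1))

def gram_schmidt(polys):
    """Orthonormalize the given polynomials with respect to `inner`."""
    ortho = []
    for p in polys:
        u = p - sum(inner(p, q) * q for q in ortho)
        ortho.append(simplify(u / sqrt(inner(u, u))))
    return ortho

print(gram_schmidt([1, x, x**2]))
# equivalent to [1/sqrt(2), sqrt(3/2)*x, sqrt(5/8)*(3*x**2 - 1)] up to simplification
```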

Extending Orthonormal Sets

Recall: earlier we stated a theorem which said that a linearly independent set in a finite-dimensional vector space can be enlarged to a basis by adding appropriate vectors.

Theorem
If W is a finite-dimensional inner product space, then:

▶ Every orthogonal set of nonzero vectors in W can be enlarged to an orthogonal basis for W.
▶ Every orthonormal set of nonzero vectors in W can be enlarged to an orthonormal basis for W.
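One common way to carry out such an extension in Rⁿ (our sketch, not a method spelled out in the slides) is to append the standard basis vectors and run Gram-Schmidt, discarding any vector whose remaining component is (numerically) zero:

```python
import numpy as np

def extend_to_orthonormal_basis(orthonormal_vectors, tol=1e-12):
    """Extend an orthonormal list of vectors in R^n to an orthonormal basis of R^n."""
    basis = [np.asarray(v, dtype=float) for v in orthonormal_vectors]
    n = basis[0].size
    for e in np.eye(n):                    # try each standard basis vector in turn
        u = e - sum(np.dot(e, q) * q for q in basis)
        if np.linalg.norm(u) > tol:        # keep it only if something new remains
            basis.append(u / np.linalg.norm(u))
    return basis

# Extend the single unit vector (1/sqrt(2), 0, 1/sqrt(2)) to an orthonormal basis of R^3.
for q in extend_to_orthonormal_basis([np.array([1.0, 0.0, 1.0]) / np.sqrt(2)]):
    print(np.round(q, 4))
```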


QR-Decomposition

Remark: the Gram-Schmidt Process can be used to factor certain matrices.

Theorem (QR-Decomposition)
If A is an m × n matrix with linearly independent column vectors, then A can be factored as A = QR where Q is an m × n matrix with orthonormal column vectors, and R is an n × n invertible upper triangular matrix.
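In practice the factorization is usually obtained from a library routine; NumPy's `numpy.linalg.qr` returns such a Q and R, though its sign conventions may differ column-by-column from the Gram-Schmidt construction. A minimal usage sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))       # 5x3 matrix; random columns are (almost surely) independent

Q, R = np.linalg.qr(A)                # Q: 5x3 with orthonormal columns, R: 3x3 upper triangular
print(np.round(Q.T @ Q, 12))          # the 3x3 identity, so Q has orthonormal columns
print(np.round(np.tril(R, -1), 12))   # strictly lower triangle of R is zero
print(np.allclose(Q @ R, A))          # True: A = QR
```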


QR-Decomposition (continued)

Proof.
▶ Suppose A = [u1 u2 · · · un] and rank(A) = n.
▶ Apply the Gram-Schmidt process to {u1, u2, . . . , un} to produce the orthonormal basis {q1, q2, . . . , qn} and define the matrix Q = [q1 q2 · · · qn].


QR-Decomposition (continued)

Proof.
▶ Note that

u1 = ⟨u1, q1⟩q1 + ⟨u1, q2⟩q2 + · · · + ⟨u1, qn⟩qn
u2 = ⟨u2, q1⟩q1 + ⟨u2, q2⟩q2 + · · · + ⟨u2, qn⟩qn
...
un = ⟨un, q1⟩q1 + ⟨un, q2⟩q2 + · · · + ⟨un, qn⟩qn

▶ Writing this system of equations in matrix form yields

A = Q [ ⟨u1, q1⟩  ⟨u2, q1⟩  · · ·  ⟨un, q1⟩ ]
      [ ⟨u1, q2⟩  ⟨u2, q2⟩  · · ·  ⟨un, q2⟩ ]
      [    ...       ...              ...   ]
      [ ⟨u1, qn⟩  ⟨u2, qn⟩  · · ·  ⟨un, qn⟩ ]
    = QR


QR-Decomposition (continued)

Proof.
▶ By definition

qi = ui − ∑_{j=1}^{i−1} ⟨ui, qj⟩ qj
ui = qi + ∑_{j=1}^{i−1} ⟨ui, qj⟩ qj
ui ∈ span{q1, q2, . . . , qi}

▶ Thus for j ≥ 2 the vector qj is orthogonal to u1, u2, . . . , uj−1.


QR-Decomposition (continued)

Proof.
▶ Thus

R = [ ⟨u1, q1⟩  ⟨u2, q1⟩  · · ·  ⟨un, q1⟩ ]
    [     0      ⟨u2, q2⟩  · · ·  ⟨un, q2⟩ ]
    [    ...        ...              ...   ]
    [     0          0     · · ·  ⟨un, qn⟩ ]

▶ We must still show that ⟨ui, qi⟩ ≠ 0 for i = 1, 2, . . . , n in order for R to be nonsingular.


QR-Decomposition (continued)

Proof.
Recall

ui = qi + ∑_{j=1}^{i−1} ⟨ui, qj⟩ qj

so

⟨ui, qi⟩ = ⟨ qi + ∑_{j=1}^{i−1} ⟨ui, qj⟩ qj , qi ⟩
         = ⟨qi, qi⟩ + ∑_{j=1}^{i−1} ⟨ui, qj⟩ ⟨qj, qi⟩   (each ⟨qj, qi⟩ = 0)
         = ‖qi‖² ≠ 0

QR-Decomposition (continued)

Proof.
Thus (finally)

R = [ ⟨u1, q1⟩  ⟨u2, q1⟩  · · ·  ⟨un, q1⟩ ]
    [     0      ⟨u2, q2⟩  · · ·  ⟨un, q2⟩ ]
    [    ...        ...              ...   ]
    [     0          0     · · ·  ⟨un, qn⟩ ]

which is invertible.

Example

Find the QR-decomposition of

A = [ 1 1 2 ]
    [ 1 2 3 ]
    [ 1 2 1 ]
    [ 1 1 6 ]

Hint: an orthonormal basis for the column space of A is

(1/2)(1, 1, 1, 1), (1/2)(−1, 1, 1, −1), (1/10)(−2√10, √10, −√10, 2√10).
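Using the hint, Q has these three vectors as columns, and R can then be recovered as QᵀA because the entries of R are the dot products ⟨uj, qi⟩. An illustrative NumPy check (not part of the slides):

```python
import numpy as np

A = np.array([[1.0, 1.0, 2.0],
              [1.0, 2.0, 3.0],
              [1.0, 2.0, 1.0],
              [1.0, 1.0, 6.0]])

# Columns of Q: the orthonormal basis from the hint.
Q = np.column_stack([
    np.array([1.0, 1.0, 1.0, 1.0]) / 2,
    np.array([-1.0, 1.0, 1.0, -1.0]) / 2,
    np.array([-2.0, 1.0, -1.0, 2.0]) / np.sqrt(10),
])

R = Q.T @ A              # R[i, j] = <u_j, q_i>, upper triangular since q_i is orthogonal to u_j for j < i
print(np.round(R, 4))
# [[ 2.      3.      6.    ]
#  [ 0.      1.     -2.    ]
#  [ 0.      0.      3.1623]]   (3.1623 is sqrt(10))
print(np.allclose(Q @ R, A))    # True: A = QR
```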


Proof of Projection Theorem

Theorem (Projection Theorem)
If W is a finite-dimensional subspace of an inner product space V, then every vector u ∈ V can be expressed uniquely as

u = w1 + w2

where w1 ∈ W and w2 ∈ W⊥.

Proof.
There are two steps:

1. Find w1 and w2 with the stated properties.
2. Show that w1 and w2 are unique.


Homework

▶ Read Section 6.3
▶ Exercises: 1, 3, 7, 9, 15, 19, 27, 29