Page 1:

Lecture 24: Quick review from previous lecture

• A square matrix $A$ is called an orthogonal matrix if it satisfies
$$A^T A = A A^T = I.$$

• If $A$ is an orthogonal matrix, then its columns (rows) form an orthonormal basis of $\mathbb{R}^n$ with respect to the Euclidean dot product.

—————————————————————————————————

Today we will discuss the QR factorization and orthogonal projections.

- Lecture will be recorded -

—————————————————————————————————

Lecture video can be found in Canvas "Media Gallery"

MATH 4242-Week 10-3 1 Spring 2020


Page 2:

§ The QR Factorization

Revisit the Gram-Schmidt process:

Let $A = [a_1 \cdots a_n]$ be an $n \times n$ nonsingular matrix, where $a_j$ is the $j$-th column vector of $A$. Thus, $a_1, \dots, a_n$ are linearly independent.

Turn $a_1, \dots, a_n$ into orthogonal vectors $v_1, \dots, v_n$:

$$v_1 = a_1$$
$$v_2 = a_2 - \frac{\langle a_2, v_1\rangle}{\|v_1\|^2}\, v_1$$
$$v_3 = a_3 - \frac{\langle a_3, v_1\rangle}{\|v_1\|^2}\, v_1 - \frac{\langle a_3, v_2\rangle}{\|v_2\|^2}\, v_2$$
$$\vdots$$
$$v_n = a_n - \frac{\langle a_n, v_1\rangle}{\|v_1\|^2}\, v_1 - \cdots - \frac{\langle a_n, v_{n-1}\rangle}{\|v_{n-1}\|^2}\, v_{n-1}$$

Normalize $v_1, \dots, v_n$ to get orthonormal vectors $q_1, \dots, q_n$:

That is,
$$q_j = \frac{v_j}{\|v_j\|}, \qquad\text{so setting } r_{jj} = \|v_j\| \text{ gives } r_{jj}\, q_j = v_j.$$

We can get

$$r_{11} q_1 = a_1$$
$$r_{22} q_2 = a_2 - \langle a_2, q_1\rangle q_1$$
$$r_{33} q_3 = a_3 - \langle a_3, q_1\rangle q_1 - \langle a_3, q_2\rangle q_2$$
$$\vdots$$
$$r_{nn} q_n = a_n - \langle a_n, q_1\rangle q_1 - \cdots - \langle a_n, q_{n-1}\rangle q_{n-1}$$

MATH 4242-Week 10-3 2 Spring 2020

"114119 , CAZ

, 9178 ' 9g- = yjll- a-97nF.ae z. + " i. case. > a

.

Vj=HVjH9j .HzT

= rjj 9J .

I3 723- -

Vin rn-I, h .

Page 3:

Let $r_{ij} = \langle a_j, q_i\rangle$; then we further have

$$r_{11} q_1 = a_1$$
$$r_{22} q_2 = a_2 - r_{12} q_1$$
$$r_{33} q_3 = a_3 - r_{13} q_1 - r_{23} q_2$$
$$\vdots$$
$$r_{nn} q_n = a_n - r_{1n} q_1 - \cdots - r_{n-1,n} q_{n-1}$$

We reach

$$a_1 = r_{11} q_1$$
$$a_2 = r_{12} q_1 + r_{22} q_2$$
$$a_3 = r_{13} q_1 + r_{23} q_2 + r_{33} q_3$$
$$\vdots$$
$$a_n = r_{1n} q_1 + r_{2n} q_2 + \cdots + r_{n-1,n} q_{n-1} + r_{nn} q_n$$

The Gram-Schmidt equations can then be recast into an equivalent matrix form:

$$\underbrace{\begin{pmatrix} a_1 & a_2 & \cdots & a_n \end{pmatrix}}_{A}
=
\underbrace{\begin{pmatrix} q_1 & q_2 & \cdots & q_n \end{pmatrix}}_{Q}
\underbrace{\begin{pmatrix} r_{11} & r_{12} & \cdots & r_{1n} \\ & r_{22} & \cdots & r_{2n} \\ & & \ddots & \vdots \\ & & & r_{nn} \end{pmatrix}}_{R}$$

where $r_{jj} = \|v_j\|$ and $r_{ij} = \langle a_j, q_i\rangle$. This is called the QR factorization.
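As a concrete companion to the recipe above, here is a minimal sketch of classical Gram-Schmidt producing $Q$ and $R$. The function name `gram_schmidt_qr` and the use of NumPy are illustrative choices, not part of the notes.

```python
import numpy as np

def gram_schmidt_qr(A):
    """Classical Gram-Schmidt QR factorization of a nonsingular square matrix A.

    Columns of Q are the orthonormal vectors q_j; R is upper triangular with
    r_jj = ||v_j|| > 0 and r_ij = <a_j, q_i>, so that A = Q R.
    """
    A = np.asarray(A, dtype=float)
    n = A.shape[1]
    Q = np.zeros_like(A)
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # r_ij = <a_j, q_i>
            v -= R[i, j] * Q[:, i]        # subtract the component along q_i
        R[j, j] = np.linalg.norm(v)       # r_jj = ||v_j||
        Q[:, j] = v / R[j, j]             # q_j = v_j / ||v_j||
    return Q, R
```

For the $3 \times 3$ example worked out below, `gram_schmidt_qr` reproduces the $Q$ and $R$ obtained by hand, up to floating-point rounding.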

MATH 4242-Week 10-3 3 Spring 2020


Page 4:

Fact: Every nonsingular matrix can be factored as $A = QR$, the product of an orthogonal matrix $Q$ and an upper triangular matrix $R$.

The factorization is unique if $R$ is positive upper triangular, meaning that all its diagonal entries are positive: $r_{jj} > 0$.
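In practice the factorization is usually computed with a library routine. The sketch below uses NumPy's `numpy.linalg.qr`, which returns a valid $Q$ and $R$ but does not promise positive diagonal entries in $R$; the helper `positive_qr` (my own name, not a library function) flips signs to recover the unique positive-diagonal factorization described in the Fact.

```python
import numpy as np

def positive_qr(A):
    """QR factorization with the positive-diagonal convention r_jj > 0.

    numpy.linalg.qr may return negative diagonal entries in R; flipping the
    signs of the matching columns of Q and rows of R preserves Q R = A and
    makes R positive upper triangular.
    """
    Q, R = np.linalg.qr(A)
    signs = np.sign(np.diag(R))   # +1 or -1 for each diagonal entry (A nonsingular)
    return Q * signs, signs[:, None] * R
```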

Example. Let $A = [a_1\ a_2\ a_3]$, where
$$a_1 = \begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}, \quad a_2 = \begin{pmatrix} 0 \\ 1 \\ 1 \end{pmatrix}, \quad a_3 = \begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix}.$$
Find the QR factorization of $A$.

[Ans:] We saw in Lecture 22 how to apply Gram-Schmidt to the vectors $a_i$:

We found the resulting orthogonal basis to be

$$v_1 = \begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}, \quad v_2 = \begin{pmatrix} -1/2 \\ 1/2 \\ 1 \end{pmatrix}, \quad v_3 = \begin{pmatrix} 2/3 \\ -2/3 \\ 2/3 \end{pmatrix}.$$

Thus, letting $q_k = \dfrac{v_k}{\|v_k\|}$, the orthonormal basis is
$$q_1 = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix}, \quad q_2 = \sqrt{\frac{2}{3}}\begin{pmatrix} -1/2 \\ 1/2 \\ 1 \end{pmatrix}, \quad q_3 = \begin{pmatrix} 1/\sqrt{3} \\ -1/\sqrt{3} \\ 1/\sqrt{3} \end{pmatrix}.$$

It remains to find $R$.

MATH 4242-Week 10-3 4 Spring 2020

We already have $Q = (q_1\ q_2\ q_3)$. To find $R$:
$$R = \begin{pmatrix} r_{11} & r_{12} & r_{13} \\ 0 & r_{22} & r_{23} \\ 0 & 0 & r_{33} \end{pmatrix}, \qquad r_{jj} = \|v_j\|, \quad r_{ij} = \langle a_j, q_i\rangle.$$
The diagonal entries are
$$r_{11} = \|v_1\| = \sqrt{2}, \qquad r_{22} = \|v_2\| = \sqrt{\tfrac{3}{2}}, \qquad r_{33} = \|v_3\| = \tfrac{2}{\sqrt{3}}.$$

Page 5:

[Example continued:]

§ Using the QR factorization of A to solve the linear system Ax = b

If A = QR, we want to use it to solve the system Ax = b:

From the linear system, we get
$$QRx = b.$$
Since $Q^{-1} = Q^T$ ($Q$ is orthogonal), we have
$$Rx = Q^T b.$$

Therefore, we just need to solve
$$Rx = Q^T b$$
by applying back substitution, since $R$ is an upper triangular matrix.
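A minimal sketch of this procedure in code (the function name `solve_via_qr` is mine; any QR routine, such as the Gram-Schmidt sketch earlier, could replace `numpy.linalg.qr`):

```python
import numpy as np

def solve_via_qr(A, b):
    """Solve A x = b by factoring A = Q R and back-substituting R x = Q^T b."""
    Q, R = np.linalg.qr(A)          # any QR factorization works here
    y = Q.T @ b                     # R x = Q^T b, since Q^{-1} = Q^T
    n = len(y)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):  # back substitution: last row first
        x[i] = (y[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
    return x
```

For the example that follows, `solve_via_qr(A, b)` returns approximately `[0.5, -0.5, 0.5]`, in agreement with the hand computation on the next page.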

Example. Apply this to solve the system
$$\begin{pmatrix} 1 & 0 & 1 \\ 1 & 1 & 0 \\ 0 & 1 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} = b.$$

MATH 4242-Week 10-3 5 Spring 2020

The remaining entries of $R$ are
$$r_{12} = \langle a_2, q_1\rangle = \left\langle \begin{pmatrix} 0\\1\\1\end{pmatrix}, \tfrac{1}{\sqrt{2}}\begin{pmatrix} 1\\1\\0\end{pmatrix}\right\rangle = \tfrac{1}{\sqrt{2}}, \qquad r_{13} = \langle a_3, q_1\rangle = \tfrac{1}{\sqrt{2}}, \qquad r_{23} = \langle a_3, q_2\rangle = \tfrac{1}{\sqrt{6}},$$
so
$$R = \begin{pmatrix} \sqrt{2} & 1/\sqrt{2} & 1/\sqrt{2} \\ 0 & \sqrt{3/2} & 1/\sqrt{6} \\ 0 & 0 & 2/\sqrt{3} \end{pmatrix}.$$
From the previous example, $A = QR$. Then $QR\begin{pmatrix} x\\y\\z\end{pmatrix} = b$ gives
$$R\begin{pmatrix} x\\y\\z\end{pmatrix} = Q^T b.$$

Page 6:

[Example continued:]

MATH 4242-Week 10-3 6 Spring 2020

We solve $R\begin{pmatrix} x\\y\\z\end{pmatrix} = Q^T b$, that is,
$$\begin{pmatrix} \sqrt{2} & 1/\sqrt{2} & 1/\sqrt{2} \\ 0 & \sqrt{3/2} & 1/\sqrt{6} \\ 0 & 0 & 2/\sqrt{3} \end{pmatrix}\begin{pmatrix} x\\y\\z\end{pmatrix} = \begin{pmatrix} 1/\sqrt{2} \\ -1/\sqrt{6} \\ 1/\sqrt{3} \end{pmatrix},$$
using back substitution:

(1) $\frac{2}{\sqrt{3}}\, z = \frac{1}{\sqrt{3}} \Rightarrow z = \frac{1}{2}$.

(2) $\sqrt{\tfrac{3}{2}}\, y + \frac{1}{\sqrt{6}}\, z = -\frac{1}{\sqrt{6}} \Rightarrow y = -\frac{1}{2}$.

(3) $\sqrt{2}\, x + \frac{1}{\sqrt{2}}\, y + \frac{1}{\sqrt{2}}\, z = \frac{1}{\sqrt{2}} \Rightarrow x = \frac{1}{2}$.

So the solution is $(x, y, z) = \left(\tfrac{1}{2}, -\tfrac{1}{2}, \tfrac{1}{2}\right)$.
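As a quick numerical cross-check of this hand computation (my own addition, not part of the notes):

```python
import numpy as np

# The system from the example above.
A = np.array([[1., 0., 1.],
              [1., 1., 0.],
              [0., 1., 1.]])
b = np.array([1., 0., 0.])

x = np.linalg.solve(A, b)          # direct solve, for comparison
print(x)                           # [ 0.5 -0.5  0.5]
assert np.allclose(A @ x, b)       # x = (1/2, -1/2, 1/2) indeed satisfies A x = b
```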

Page 7:

4.4 Orthogonal Projections and Subspaces

§ Orthogonal projections

Definition: A vector $x$ in an inner product space $V$ is said to be orthogonal to the subspace $W$ of $V$ if it is orthogonal to every vector in $W$, that is,
$$\langle x, w\rangle = 0 \quad\text{for all } w \in W.$$

Suppose $w_1, \dots, w_n$ is a basis of $W$. Then
$$x \text{ is orthogonal to } W \iff \langle x, w_i\rangle = 0, \quad i = 1, \dots, n.$$

[To see this:]

Definition: The orthogonal projection of $v$ onto the subspace $W$ of $V$ is the element $w \in W$ such that the difference $z = v - w$ is orthogonal to $W$.

We use the notation
$$z \perp W.$$

The orthogonal projection is “unique”.

Note that such $w$ is the unique vector in $W$ that is "closest to" $v$.

MATH 4242-Week 10-3 7 Spring 2020

[Recall from earlier: a vector $x$ decomposes into its orthogonal projection onto a single vector $w$ plus a component orthogonal to $w$.]

To see the equivalence stated above:

($\Rightarrow$) Since each $w_i \in W$, we get $\langle x, w_i\rangle = 0$ for $1 \le i \le n$.

($\Leftarrow$) Any $y \in W$ can be written as $y = c_1 w_1 + \cdots + c_n w_n$, so
$$\langle x, y\rangle = \langle x, c_1 w_1 + \cdots + c_n w_n\rangle = c_1\langle x, w_1\rangle + \cdots + c_n\langle x, w_n\rangle = 0. \qquad\blacksquare$$

Page 8:

Fact: Suppose $u_1, \dots, u_n$ is an orthonormal basis of a subspace $W$ of $V$. If $w \in W$ is the orthogonal projection of $v \in V$ onto $W$, then
$$w = c_1 u_1 + \cdots + c_n u_n, \quad\text{where}\quad c_j = \langle v, u_j\rangle, \quad j = 1, \dots, n.$$

[To see this:]

Remark: Thus, $v - \sum_{k=1}^{n} \langle v, u_k\rangle u_k$ is orthogonal to $W$, that is,
$$\Big(v - \sum_{k=1}^{n} \langle v, u_k\rangle u_k\Big) \perp W.$$
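A minimal sketch of this projection formula, assuming the columns of `U` hold the orthonormal basis $u_1, \dots, u_n$ of $W$ (the helper name `project_onto` is mine):

```python
import numpy as np

def project_onto(U, v):
    """Orthogonal projection of v onto W = span of the columns of U.

    Assumes the columns u_1, ..., u_n of U are orthonormal, so the projection
    is w = sum_j <v, u_j> u_j and the residual v - w is orthogonal to W.
    """
    c = U.T @ v    # coefficients c_j = <v, u_j>
    return U @ c   # w = c_1 u_1 + ... + c_n u_n

# Example: project v onto the xy-plane, W = span{e_1, e_2} in R^3.
U = np.array([[1., 0.],
              [0., 1.],
              [0., 0.]])
v = np.array([3., 4., 5.])
w = project_onto(U, v)                  # -> [3., 4., 0.]
assert np.allclose(U.T @ (v - w), 0.0)  # v - w is orthogonal to W
```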

MATH 4242-Week 10-3 8 Spring 2020

To see the Fact above:

(1) $w \in W$ can be written as a linear combination of $\{u_1, \dots, u_n\}$, say $w = c_1 u_1 + \cdots + c_n u_n$. Then
$$\langle w, u_j\rangle = \langle c_1 u_1 + \cdots + c_n u_n, u_j\rangle = c_j \langle u_j, u_j\rangle = c_j \|u_j\|^2 = c_j.$$

(2) Since $w$ is the orthogonal projection of $v$ onto $W$, we have $(v - w) \perp W$. So
$$\langle v - w, u_j\rangle = 0 \;\Rightarrow\; c_j = \langle w, u_j\rangle = \langle v, u_j\rangle.$$

(If the $u_i$ were merely orthogonal rather than orthonormal, then $c_j = \langle v, u_j\rangle / \|u_j\|^2$.) $\blacksquare$

Page 9:

§ Orthogonal Subspaces

Definition: Two subspaces $W, Z$ of $V$ are called orthogonal if every vector in $W$ is orthogonal to every vector in $Z$, that is,
$$\langle w, z\rangle = 0 \quad\text{for all } w \in W,\ z \in Z.$$

Immediately, we also have

Fact: If $\{w_1, \dots, w_n\}$ spans $W$ and $\{z_1, \dots, z_k\}$ spans $Z$, then
$$W, Z \text{ are orthogonal} \iff \langle w_i, z_j\rangle = 0 \quad\text{for all } 1 \le i \le n,\ 1 \le j \le k.$$

Definition: If $W$ is a subspace of $V$, its orthogonal complement $W^\perp$ (pronounced "$W$ perp") is the set of all vectors orthogonal to $W$, that is,
$$W^\perp = \{v \in V : \langle v, w\rangle = 0 \text{ for all } w \in W\}.$$

• It can be checked that $W^\perp$ is also a subspace of $V$.

• If $W = \mathrm{span}\{w\}$, we will also denote $W^\perp$ by $w^\perp$.

• Note that the only vector contained in both $W$ and $W^\perp$ is $0$.

MATH 4242-Week 10-3 9 Spring 2020


Page 10:

Example. Find $W^\perp$, the orthogonal complement to $W = \mathrm{span}\{w\}$ in $\mathbb{R}^3$, where
$$w = \begin{pmatrix} 1 \\ -2 \\ 1 \end{pmatrix}.$$

Example. Suppose $W = \mathrm{span}\{w_1, w_2\}$, where
$$w_1 = \begin{pmatrix} 1 \\ 2 \\ 1 \end{pmatrix}, \quad w_2 = \begin{pmatrix} 0 \\ -1 \\ 1 \end{pmatrix}.$$
Find $W^\perp$.

MATH 4242-Week 10-3 10 Spring 2020

[First example:] A vector $x = (x, y, z)^T$ lies in $W^\perp$ exactly when $w^T x = 0$, i.e.,
$$\begin{pmatrix} 1 & -2 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \\ z \end{pmatrix} = 0.$$
Taking $y, z$ as free variables gives $x = 2y - z$, so
$$W^\perp = \left\{ \begin{pmatrix} 2y - z \\ y \\ z \end{pmatrix} : y, z \in \mathbb{R} \right\}.$$

[Second example:] Here $x \in W^\perp$ exactly when $w_1^T x = 0$ and $w_2^T x = 0$, i.e.,
$$\begin{pmatrix} 1 & 2 & 1 \\ 0 & -1 & 1 \end{pmatrix} \begin{pmatrix} x \\ y \\ z \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.$$
(To be continued on Monday.)
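As a small sanity check of the first example (my own addition, not in the notes): the spanning vectors of $W^\perp$ obtained from the free variables are indeed orthogonal to $w$.

```python
import numpy as np

w = np.array([1., -2., 1.])

# Spanning vectors of W-perp read off from x = 2y - z with y, z free.
b1 = np.array([2., 1., 0.])    # y = 1, z = 0
b2 = np.array([-1., 0., 1.])   # y = 0, z = 1

assert np.isclose(w @ b1, 0.0) and np.isclose(w @ b2, 0.0)  # both orthogonal to w
```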