Linear Least Squares QR Factorization
Transcript of Linear Least Squares QR Factorization
Systems of linear equations
Problem to solve: M x = b
Given M x = b : Is there a solution? Is the solution unique?
Systems of linear equations
Find a set of weights x so that the weighted sum of the columns of the matrix M is equal to the right-hand side b:

$$\big[\, M_1 \,\big|\, M_2 \,\big|\, \cdots \,\big|\, M_N \,\big] \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_N \end{bmatrix} = \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_N \end{bmatrix}$$

$$x_1 M_1 + x_2 M_2 + \cdots + x_N M_N = b$$
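A quick numeric sketch of this column-weighting view; the matrix and weights below are made up purely for illustration:

```python
import numpy as np

# Hypothetical 3x3 system, chosen only for illustration.
M = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0]])
x = np.array([2.0, -1.0, 0.5])

# M x is exactly the weighted sum of the columns of M:
b = M @ x
b_cols = x[0] * M[:, 0] + x[1] * M[:, 1] + x[2] * M[:, 2]
assert np.allclose(b, b_cols)
```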
Systems of linear equations - Existence
A solution exists when b is in the span of the columns of M
A solution exists if there exist weights $x_1, \ldots, x_N$ such that:

$$x_1 M_1 + x_2 M_2 + \cdots + x_N M_N = b$$
Systems of linear equations - Uniqueness
A solution is unique only if the columns of M are linearly independent. Suppose instead there exist weights $y_1, \ldots, y_N$, not all zero, such that:

$$y_1 M_1 + y_2 M_2 + \cdots + y_N M_N = 0,$$

i.e. $My = 0$ with $y \neq 0$. Then:

$$Mx = b \;\Rightarrow\; Mx + My = b \;\Rightarrow\; M(x+y) = b,$$

so $x + y$ is a second, distinct solution.
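Both conditions can be checked with matrix ranks; the matrix below is a made-up example with dependent columns:

```python
import numpy as np

# Hypothetical M with linearly dependent columns (second column = 2 * first).
M = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])
b = M @ np.array([1.0, 1.0])       # b constructed to lie in the column span

rank_M = np.linalg.matrix_rank(M)
rank_Mb = np.linalg.matrix_rank(np.column_stack([M, b]))

exists = rank_M == rank_Mb         # b in span of columns -> a solution exists
unique = rank_M == M.shape[1]      # independent columns -> at most one solution
assert exists and not unique       # here: solvable, but not uniquely
```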
QR factorization 1
A matrix Q is said to be orthogonal if its columns are orthonormal, i.e. QT·Q=I.
Orthogonal transformations preserve the Euclidean norm, since

$$\|Qv\|_2^2 = (Qv)^T(Qv) = v^T Q^T Q v = v^T v = \|v\|_2^2.$$

Orthogonal matrices can transform vectors in various ways, such as rotations or reflections, but they do not change the Euclidean length of the vector. Hence, they preserve the solution to a linear least squares problem.
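A small numeric check of this norm-preservation property, using a plane rotation as a concrete orthogonal Q (the angle is arbitrary):

```python
import numpy as np

# A concrete orthogonal matrix: rotation by an arbitrary angle theta.
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
v = np.array([3.0, 4.0])

assert np.allclose(Q.T @ Q, np.eye(2))                       # Q^T Q = I
assert np.isclose(np.linalg.norm(Q @ v), np.linalg.norm(v))  # ||Qv|| = ||v||
```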
QR factorization 2
Any m×n matrix A (with m ≥ n) can be represented as

A = Q·R,

where Q (m×n) has orthonormal columns and R (n×n) is upper triangular:

$$\big[\, a_1 \,\big|\, \cdots \,\big|\, a_n \,\big] = \big[\, q_1 \,\big|\, \cdots \,\big|\, q_n \,\big] \begin{bmatrix} r_{11} & r_{12} & \cdots & r_{1n} \\ 0 & r_{22} & \cdots & r_{2n} \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \cdots & r_{nn} \end{bmatrix}$$
QR factorization 2
Given A, let its QR decomposition be A = Q·R, where Q is an (m×n) matrix with orthonormal columns and R is (n×n) upper triangular. QR factorization transforms the linear least squares problem into a triangular least squares problem:

$$QRx = b$$
$$Rx = Q^T b$$
$$x = R^{-1} Q^T b$$
Matlab Code:
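The slide's MATLAB listing is not preserved in this transcript; a NumPy sketch of the same three steps, using the example system from the Normal Equations slide below, might look like:

```python
import numpy as np

# Example system (the one used on the Normal Equations slide).
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [2.0, 1.0]])
b = np.array([2.0, 2.5, 3.5])

Q, R = np.linalg.qr(A)             # reduced QR: Q is 3x2, R is 2x2
x = np.linalg.solve(R, Q.T @ b)    # solve the triangular system R x = Q^T b
# → x ≈ [1.5, 0.5]
```

In MATLAB the analogous steps would be `[Q,R] = qr(A,0); x = R\(Q'*b);`.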
Normal Equations
Consider the system

$$\begin{bmatrix} 1 & 1 \\ 1 & 2 \\ 2 & 1 \end{bmatrix} x = \begin{bmatrix} 2 \\ 2.5 \\ 3.5 \end{bmatrix}$$

It may be the result of physical measurements, which usually incorporate some error. Since we cannot solve it exactly, we would like to minimize the residual:

$$r = b - Ax$$
$$\|r\|^2 = r^T r = (b-Ax)^T(b-Ax) = b^T b - 2x^T A^T b + x^T A^T A x$$

Setting the gradient with respect to x to zero (a necessary condition for a minimum):

$$-2A^T b + 2A^T A x = 0$$
$$A^T A x = A^T b \quad\text{– Normal Equations}$$
Normal Equations 2
$A^T A x = A^T b$ – Normal Equations

For the example system

$$A = \begin{bmatrix} 1 & 1 \\ 1 & 2 \\ 2 & 1 \end{bmatrix}, \qquad b = \begin{bmatrix} 2 \\ 2.5 \\ 3.5 \end{bmatrix}$$

we get

$$A^T A = \begin{bmatrix} 1 & 1 & 2 \\ 1 & 2 & 1 \end{bmatrix} \begin{bmatrix} 1 & 1 \\ 1 & 2 \\ 2 & 1 \end{bmatrix} = \begin{bmatrix} 6 & 5 \\ 5 & 6 \end{bmatrix}, \qquad A^T b = \begin{bmatrix} 1 & 1 & 2 \\ 1 & 2 & 1 \end{bmatrix} \begin{bmatrix} 2 \\ 2.5 \\ 3.5 \end{bmatrix} = \begin{bmatrix} 11.5 \\ 10.5 \end{bmatrix}$$

Solving

$$\begin{bmatrix} 6 & 5 \\ 5 & 6 \end{bmatrix} x = \begin{bmatrix} 11.5 \\ 10.5 \end{bmatrix}$$

by elimination gives $x_1 = 1.5$, $x_2 = 0.5$.
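These numbers can be verified directly in NumPy:

```python
import numpy as np

# The slide's example, checked numerically via the normal equations.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [2.0, 1.0]])
b = np.array([2.0, 2.5, 3.5])

AtA = A.T @ A                      # [[6, 5], [5, 6]]
Atb = A.T @ b                      # [11.5, 10.5]
x = np.linalg.solve(AtA, Atb)      # least-squares solution [1.5, 0.5]

assert np.allclose(AtA, [[6.0, 5.0], [5.0, 6.0]])
assert np.allclose(Atb, [11.5, 10.5])
assert np.allclose(x, [1.5, 0.5])
```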
Least squares via A=QR decomposition
A(m,n) = Q(m,n)·R(n,n); the columns of Q are orthonormal, therefore QᵀQ = I.

$$QRx = b$$
$$R(n,n)\,x = Q^T(n,m)\,b(m,1) \quad\text{– a well-defined (square) linear system}$$
$$x = R^{-1} Q^T b$$

Q is found by Gram-Schmidt orthogonalization of the columns of A. How to find R?

$$QR = A$$
$$Q^T Q R = Q^T A, \ \text{but } Q^T Q = I, \ \text{therefore:}$$
$$R = Q^T A$$

R is upper triangular, since the orthogonalization procedure uses only $a_1, \ldots, a_k$ (without $a_{k+1}, \ldots$) to produce $q_k$.
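A minimal classical Gram-Schmidt sketch illustrating both points (Q from orthogonalization, R = QᵀA upper triangular); the test matrix is the one used in the worked example on the next slide:

```python
import numpy as np

def gram_schmidt(A):
    """Classical Gram-Schmidt on the columns of A; returns orthonormal Q."""
    m, n = A.shape
    Q = np.zeros((m, n))
    for k in range(n):
        q = A[:, k].astype(float).copy()
        for j in range(k):                  # only a_1..a_k are involved for q_k
            q -= (Q[:, j] @ A[:, k]) * Q[:, j]
        Q[:, k] = q / np.linalg.norm(q)
    return Q

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
Q = gram_schmidt(A)
R = Q.T @ A

assert np.allclose(Q.T @ Q, np.eye(2))      # orthonormal columns
assert np.allclose(np.tril(R, -1), 0.0)     # R is upper triangular
assert np.allclose(Q @ R, A)                # A = Q R
```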
Least squares via A=QR decomposition 2
Let us check the correctness:

$$QRx = b \;\Rightarrow\; Rx = Q^T b \;\Rightarrow\; x = R^{-1} Q^T b$$

For

$$A = \begin{bmatrix} 1 & 0 \\ 1 & 1 \\ 0 & 1 \end{bmatrix},$$

Gram-Schmidt gives

$$Q = \begin{bmatrix} \tfrac{1}{\sqrt{2}} & -\tfrac{1}{\sqrt{6}} \\ \tfrac{1}{\sqrt{2}} & \tfrac{1}{\sqrt{6}} \\ 0 & \tfrac{2}{\sqrt{6}} \end{bmatrix}, \qquad R = Q^T A = \begin{bmatrix} \sqrt{2} & \tfrac{1}{\sqrt{2}} \\ 0 & \sqrt{\tfrac{3}{2}} \end{bmatrix}$$

and indeed

$$QR = \begin{bmatrix} \tfrac{1}{\sqrt{2}} & -\tfrac{1}{\sqrt{6}} \\ \tfrac{1}{\sqrt{2}} & \tfrac{1}{\sqrt{6}} \\ 0 & \tfrac{2}{\sqrt{6}} \end{bmatrix} \begin{bmatrix} \sqrt{2} & \tfrac{1}{\sqrt{2}} \\ 0 & \sqrt{\tfrac{3}{2}} \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 1 & 1 \\ 0 & 1 \end{bmatrix} = A.$$
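The worked example's factors (as reconstructed from this transcript; the columns of Q are only determined up to sign) can be checked numerically:

```python
import numpy as np

# Q and R for the worked example A = [[1,0],[1,1],[0,1]].
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
Q = np.array([[1/np.sqrt(2), -1/np.sqrt(6)],
              [1/np.sqrt(2),  1/np.sqrt(6)],
              [0.0,           2/np.sqrt(6)]])
R = np.array([[np.sqrt(2.0), 1/np.sqrt(2)],
              [0.0,          np.sqrt(1.5)]])

assert np.allclose(Q.T @ Q, np.eye(2))      # Q^T Q = I
assert np.allclose(Q.T @ A, R)              # R = Q^T A
assert np.allclose(Q @ R, A)                # Q R reproduces A
```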
QR Factorization – By picture (last lecture reminder)

$$\big[\, M_1 \,\big|\, M_2 \,\big|\, M_3 \,\big|\, M_4 \,\big] = \big[\, Q_1 \,\big|\, Q_2 \,\big|\, Q_3 \,\big|\, Q_4 \,\big] \begin{bmatrix} r_{11} & r_{12} & r_{13} & r_{14} \\ 0 & r_{22} & r_{23} & r_{24} \\ 0 & 0 & r_{33} & r_{34} \\ 0 & 0 & 0 & r_{44} \end{bmatrix}$$
QR Factorization – Minimization View: Minimization Algorithm

The solution is built up along search directions $p_i$, starting from $p_i = e_i$ and accumulating $x = x + v_i p_i$:

For i = 1 to N  "For each Target Column"
&nbsp;&nbsp;&nbsp;&nbsp;For j = 1 to i-1  "For each Source Column left of target"
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;$r_{ji} = p_j^T M^T M p_i$
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;$p_i \leftarrow p_i - r_{ji} p_j$  "Orthogonalize Search Direction"
&nbsp;&nbsp;&nbsp;&nbsp;end
&nbsp;&nbsp;&nbsp;&nbsp;$r_{ii} = \|M p_i\|$
&nbsp;&nbsp;&nbsp;&nbsp;$p_i \leftarrow \tfrac{1}{r_{ii}} p_i$  "Normalize"
end
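A NumPy sketch of this minimization-view algorithm (the recurrence is read off the pseudocode above; variable names are mine). The search directions start as unit vectors and are orthogonalized in the MᵀM inner product, so the transformed columns M·pᵢ come out orthonormal:

```python
import numpy as np

def qr_minimization(M):
    """Build search directions p_i (columns of P) and coefficients R so that
    Q = M P has orthonormal columns and Q R = M."""
    n = M.shape[1]
    P = np.eye(n)                            # p_i starts as the unit vector e_i
    R = np.zeros((n, n))
    for i in range(n):                       # for each target column
        for j in range(i):                   # each source column left of target
            R[j, i] = P[:, j] @ M.T @ M @ P[:, i]
            P[:, i] -= R[j, i] * P[:, j]     # orthogonalize search direction
        R[i, i] = np.linalg.norm(M @ P[:, i])
        P[:, i] /= R[i, i]                   # normalize
    return P, R

M = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])
P, R = qr_minimization(M)
Q = M @ P
assert np.allclose(Q.T @ Q, np.eye(2))       # M p_i are orthonormal
assert np.allclose(Q @ R, M)                 # same Q R = M factorization
```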