
1  Math for CS, Tutorial 4. Contents:
1. Least squares solution for overcomplete linear systems
2. … via normal equations
3. … via A = QR factorization
4. … via SVD decomposition
5. SVD (Singular Value Decomposition): A = UΣV^T

2  Normal Equations

Consider the system Ax = b, where A is m×n with m > n (more equations than unknowns). Such a system typically comes from physical measurements, which usually incorporate some errors. Since we cannot solve it exactly, we minimize the norm of the residual r = b - Ax:

||r||^2 = r^T r = (b - Ax)^T (b - Ax) = b^T b - 2x^T A^T b + x^T A^T A x

Setting the derivative with respect to x to zero (a necessary condition for a minimum):

-2A^T b + 2A^T A x = 0
A^T A x = A^T b   (the Normal Equations)
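A minimal sketch of this approach in Python/NumPy; the matrix A and data b below are chosen only for illustration (the slide's own numbers are not given here):

```python
import numpy as np

# Illustrative overdetermined system A x = b: 4 noisy measurements, 2 unknowns.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([0.1, 0.9, 2.1, 2.9])

# Normal equations: A^T A x = A^T b
x = np.linalg.solve(A.T @ A, A.T @ b)

# At the minimizer the residual r = b - A x is orthogonal to the columns of A.
r = b - A @ x
print("x =", x)
print("A^T r =", A.T @ r)   # numerically zero
```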

3  Normal Equations 2

A^T A x = A^T b   (the Normal Equations)

4  Least squares via A = QR decomposition

A_(m×n) = Q_(m×n) R_(n×n), where Q has orthonormal columns, so Q^T Q = I.

QRx = b
R_(n×n) x = Q^T_(n×m) b_(m×1)   (a well-defined square linear system)
x = R^(-1) Q^T b

Q is found by Gram-Schmidt orthogonalization of the columns of A. How to find R?
QR = A
Q^T Q R = Q^T A; since Q^T Q = I:
R = Q^T A
R is upper triangular, since the orthogonalization procedure uses only a_1, …, a_k (not a_{k+1}, …) to produce q_k.
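A sketch of the same least-squares problem solved via QR, reusing the illustrative A and b from the previous sketch:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([0.1, 0.9, 2.1, 2.9])

# Reduced QR: Q is m-by-n with orthonormal columns, R is n-by-n upper triangular.
Q, R = np.linalg.qr(A)

# Solve the small square system R x = Q^T b.
x = np.linalg.solve(R, Q.T @ b)
print("x =", x)
```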

5  Least squares via A = QR decomposition 2

Let us check the correctness:
QRx = b
Rx = Q^T b
x = R^(-1) Q^T b

6  Least squares via SVD

Ax = b; A = UΣV^T is the singular value decomposition of A:
UΣV^T x = b
x = V Σ^(-1) U^T b
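A sketch of the SVD route on the same illustrative data; it assumes A has full column rank (in a rank-deficient or ill-conditioned problem the small singular values would be dropped or damped instead of inverted):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([0.1, 0.9, 2.1, 2.9])

# Thin SVD: U is m-by-n, s holds the singular values, Vt is V^T (n-by-n).
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# x = V Sigma^{-1} U^T b
x = Vt.T @ ((U.T @ b) / s)
print("x =", x)
```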

7  Singular Value Decomposition 1

The SVD is based on the fact that for any A there are orthonormal bases v_1, …, v_r for the row space and u_1, …, u_r for the column space such that Av_i = σ_i u_i, with σ_i > 0.

Thus, any matrix can be represented as A = UΣV^T, where U and V are orthogonal and Σ is diagonal.

8  Singular Value Decomposition 2

First we find the matrix V:
A^T A = (UΣV^T)^T (UΣV^T) = V Σ^T U^T U Σ V^T = V Σ^T Σ V^T

This is an ordinary (eigenvector) factorization of a symmetric matrix, therefore V is built from the eigenvectors of A^T A (the eigenvectors of A^T A are the columns of V, i.e. the rows of V^T), and the eigenvalues are the squared singular values σ_i^2. In the same way one can show that U is built from the eigenvectors of AA^T. However, an easier way to find U and Σ is to use the equations Av_i = σ_i u_i.
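A sketch of this construction in NumPy, on the illustrative matrix used earlier; it assumes A has full column rank so every σ_i > 0:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])

# Eigen-decomposition of the symmetric matrix A^T A gives V and sigma_i^2.
eigvals, V = np.linalg.eigh(A.T @ A)        # ascending order
order = np.argsort(eigvals)[::-1]           # reorder descending, as in the SVD
eigvals, V = eigvals[order], V[:, order]
sigma = np.sqrt(eigvals)

# U from A v_i = sigma_i u_i (divide each column of A V by its sigma_i).
U = (A @ V) / sigma

# Check the factorization A = U Sigma V^T.
print(np.allclose(A, U @ np.diag(sigma) @ V.T))   # True
```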

9  SVD Example

Let us find the SVD of the matrix A given on the slide. To find V, we calculate the eigenvectors of A^T A:
(5 - λ)^2 - 9 = 0;  λ^2 - 10λ + 16 = 0;  λ_{1,2} = 8, 2

10  SVD Example 2

Now we obtain U and Σ, so that A = UΣV^T.
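The slide's matrix is not reproduced above; as an illustration only, the sketch below uses A = [[2, 2], [-1, 1]], one matrix whose A^T A = [[5, 3], [3, 5]] matches the characteristic polynomial (5 - λ)^2 - 9 from slide 9, and completes the example numerically:

```python
import numpy as np

# Hypothetical stand-in for the slide's matrix:
# A^T A = [[5, 3], [3, 5]], eigenvalues 8 and 2 as computed above.
A = np.array([[ 2.0, 2.0],
              [-1.0, 1.0]])

# V and sigma_i^2 from the eigen-decomposition of A^T A.
eigvals, V = np.linalg.eigh(A.T @ A)
order = np.argsort(eigvals)[::-1]
eigvals, V = eigvals[order], V[:, order]
sigma = np.sqrt(eigvals)                 # [2*sqrt(2), sqrt(2)]

# U from A v_i = sigma_i u_i, then verify A = U Sigma V^T.
U = (A @ V) / sigma
print("sigma =", sigma)
print(np.allclose(A, U @ np.diag(sigma) @ V.T))   # True
```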

11  Appendix: derivative of x^T A^T Ax
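The slide's own working is not included above; a standard derivation, consistent with the gradient used on slide 2, is:

```latex
% For a square matrix M, the quadratic form x^T M x has gradient (M + M^T) x.
% Taking M = A^T A, which is symmetric:
\frac{\partial}{\partial x}\, x^{T} A^{T} A x
  = \left(A^{T}A + (A^{T}A)^{T}\right) x
  = 2\, A^{T} A x .
```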

