Math for CS, Tutorial 4

Contents:
1. Least squares solution for overdetermined linear systems
2. ... via the normal equations
3. ... via $A = QR$ factorization
4. ... via the SVD
5. SVD, the Singular Value Decomposition: $A = U \Sigma V^T$

Normal Equations

Consider the overdetermined system $Ax = b$, where $A$ is $m \times n$ with $m > n$. It can be the result of physical measurements, which usually incorporate some errors. Since we cannot solve it exactly, we would like to minimize the residual $r = b - Ax$:

$$\|r\|^2 = r^T r = (b - Ax)^T (b - Ax) = b^T b - 2x^T A^T b + x^T A^T A x$$

Setting the derivative with respect to $x$ to zero, a necessary condition for a minimum:

$$-2A^T b + 2A^T A x = 0 \quad\Rightarrow\quad A^T A x = A^T b$$

These are the normal equations.
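A minimal NumPy sketch of this recipe (the system below is illustrative, not from the tutorial):

```python
import numpy as np

# Illustrative overdetermined system: 4 equations, 2 unknowns.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([1.1, 1.9, 3.2, 3.9])

# Normal equations: solve (A^T A) x = A^T b.
x = np.linalg.solve(A.T @ A, A.T @ b)

# Compare with NumPy's built-in least-squares solver.
x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x, x_ref)  # the two solutions should agree
```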

Normal Equations 2

$$A^T A x = A^T b \qquad \text{(the normal equations)}$$

Least Squares via A = QR Decomposition

$A_{m \times n} = Q_{m \times n} R_{n \times n}$, where the columns of $Q$ are orthonormal, so $Q^T Q = I$. Then

$$QRx = b \quad\Rightarrow\quad R_{n \times n}\, x = Q^T_{n \times m} b_{m \times 1}$$

which is a well-defined $n \times n$ linear system, so $x = R^{-1} Q^T b$.

$Q$ is found by Gram-Schmidt orthogonalization of the columns of $A$. How to find $R$? From $QR = A$ we get $Q^T Q R = Q^T A$, and since $Q^T Q = I$:

$$R = Q^T A$$

$R$ is upper triangular, since in the orthogonalization procedure only $a_1, \dots, a_k$ (without $a_{k+1}, \dots$) are used to produce $q_k$.
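The same recipe as a NumPy sketch, reusing the illustrative system from above; `np.linalg.qr` returns the reduced factorization $Q_{m \times n} R_{n \times n}$ described here:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([1.1, 1.9, 3.2, 3.9])

# Reduced QR: Q is m x n with orthonormal columns, R is n x n upper triangular.
Q, R = np.linalg.qr(A)

# Solve R x = Q^T b (R is triangular, so this step is cheap).
x = np.linalg.solve(R, Q.T @ b)
print(x)
```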

Least Squares via A = QR Decomposition 2

Let us check the correctness:

$$QRx = b \quad\Rightarrow\quad Rx = Q^T b \quad\Rightarrow\quad x = R^{-1} Q^T b$$

Least Squares via SVD

$Ax = b$; let $A = U \Sigma V^T$ be the singular value decomposition of $A$. Then

$$U \Sigma V^T x = b \quad\Rightarrow\quad x = V \Sigma^{-1} U^T b$$

(for a rank-deficient $A$, $\Sigma^{-1}$ is replaced by the pseudoinverse $\Sigma^+$, which inverts only the nonzero singular values).
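A corresponding NumPy sketch; `np.linalg.svd` with `full_matrices=False` returns the reduced factors, and dividing by the singular values applies $\Sigma^{-1}$:

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([1.1, 1.9, 3.2, 3.9])

# Reduced SVD: U is m x n, s holds the singular values, Vt is V^T (n x n).
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# x = V Sigma^{-1} U^T b; dividing by s applies Sigma^{-1}.
x = Vt.T @ ((U.T @ b) / s)
print(x)
```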

Singular Value Decomposition 1

The SVD is based on the fact that for any $A$ there are orthonormal bases $v_1, \dots, v_r$ for the row space and $u_1, \dots, u_r$ for the column space such that $A v_i = \sigma_i u_i$ with $\sigma_i > 0$. Thus, any matrix can be represented as

$$A = U \Sigma V^T$$

where $U$ and $V$ are orthogonal and $\Sigma$ is diagonal.

Singular Value Decomposition 2

First we find the matrix $V$:

$$A^T A = (U \Sigma V^T)^T (U \Sigma V^T) = V \Sigma^T U^T U \Sigma V^T = V \Sigma^T \Sigma V^T$$

This is an ordinary (eigenvector) factorization of a symmetric matrix, so $V$ is built of eigenvectors of $A^T A$; the eigenvectors of $A^T A$ are the rows of $V^T$. In the same way one can show that $U$ is built from eigenvectors of $A A^T$. However, an easier way to find $U$ and $\Sigma$ is to use the equations $A v_i = \sigma_i u_i$.
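A sketch of this construction in NumPy, assuming $A$ has full column rank so that every $\sigma_i > 0$ (the matrix below is illustrative); `np.linalg.eigh` returns eigenvalues in ascending order, so they are flipped to the conventional descending order:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Eigendecomposition of the symmetric matrix A^T A gives V and sigma^2.
lam, V = np.linalg.eigh(A.T @ A)   # eigenvalues in ascending order
lam, V = lam[::-1], V[:, ::-1]     # reorder to descending
sigma = np.sqrt(lam)

# A v_i = sigma_i u_i  =>  u_i = A v_i / sigma_i (requires sigma_i > 0).
U = (A @ V) / sigma

# Check: A should equal U Sigma V^T up to rounding.
print(np.allclose(A, U @ np.diag(sigma) @ V.T))  # True
```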

SVD Example

Let us find the SVD of the matrix $A$. In order to find $V$, we calculate the eigenvalues and eigenvectors of $A^T A$:

$$(5 - \lambda)^2 - 9 = 0; \qquad \lambda^2 - 10\lambda + 16 = 0; \qquad \lambda_{1,2} = 8,\, 2$$

SVD Example 2

Now we obtain $U$ and $\Sigma$: from $\lambda_{1,2} = 8,\, 2$ the singular values are $\sigma_1 = \sqrt{8} = 2\sqrt{2}$ and $\sigma_2 = \sqrt{2}$, each $u_i$ follows from $A v_i = \sigma_i u_i$, and together these give $A = U \Sigma V^T$.
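The example matrix itself appeared only as an image; one matrix whose $A^T A = \begin{pmatrix} 5 & 3 \\ 3 & 5 \end{pmatrix}$ matches the characteristic polynomial $(5-\lambda)^2 - 9$ above is $A = \begin{pmatrix} 2 & 2 \\ -1 & 1 \end{pmatrix}$, assumed below purely for illustration. This sketch checks the eigenvalues $8, 2$ and the singular values $2\sqrt{2}, \sqrt{2}$:

```python
import numpy as np

# Hypothetical example matrix: its A^T A = [[5, 3], [3, 5]] matches the slide.
A = np.array([[2.0, 2.0],
              [-1.0, 1.0]])

lam, V = np.linalg.eigh(A.T @ A)            # eigenvalues in ascending order
print(lam)                                   # -> [2. 8.]

U, s, Vt = np.linalg.svd(A)
print(s)                                     # -> [2.828..., 1.414...] = [2*sqrt(2), sqrt(2)]
print(np.allclose(A, U @ np.diag(s) @ Vt))   # True
```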

Appendix: Derivative of $x^T A^T A x$

For any symmetric matrix $M$, $\frac{\partial}{\partial x}\left(x^T M x\right) = (M + M^T)x = 2Mx$. With $M = A^T A$:

$$\frac{\partial}{\partial x}\left(x^T A^T A x\right) = 2 A^T A x$$

which is the derivative used on the normal-equations slide.
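As a numerical sanity check, a small finite-difference sketch comparing the analytic gradient $2 A^T A x$ with central differences (matrix and evaluation point chosen arbitrarily):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([0.7, -1.3])

f = lambda x: x @ A.T @ A @ x        # f(x) = x^T A^T A x

# Central finite differences, one coordinate at a time.
eps = 1e-6
grad_num = np.array([
    (f(x + eps * e) - f(x - eps * e)) / (2 * eps)
    for e in np.eye(len(x))
])

grad_analytic = 2 * A.T @ A @ x
print(np.allclose(grad_num, grad_analytic, atol=1e-5))  # True
```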