Math for CS — Lecture 4: Linear Least Squares Problem. Topics: over-determined systems; the minimization problem and the least-squares norm; normal equations; singular value decomposition.

Presentation transcript:

Slide 1 (title): Linear Least Squares Problem — over-determined systems; minimization problem: the least-squares norm; normal equations; singular value decomposition.

Slide 2: Linear Least Squares: Example. Consider the equation for a stretched beam: Y = x_1 + x_2·T, where x_1 is the original length, T is the applied force, and x_2 is the inverse coefficient of stiffness. Suppose that three measurements (T_i, Y_i) were taken. They correspond to the over-determined system

Y_1 = x_1 + x_2·T_1
Y_2 = x_1 + x_2·T_2
Y_3 = x_1 + x_2·T_3

which in general cannot be satisfied exactly.
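As an illustration, the system can be set up in Matlab as follows (the slide's original measurement values were not preserved; the numbers below are hypothetical):

% Hypothetical measurements (T_i, Y_i) -- not the slide's original data
T = [10; 15; 20];
Y = [11.6; 11.9; 12.3];
A = [ones(3,1), T];   % row i is [1, T_i]
b = Y;
x = A \ b;            % least-squares solution via backslash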

Slide 3: Linear Least Squares: Definition. Problem: given A (m×n) with m ≥ n and b (m×1), find x (n×1) minimizing ||Ax − b||_2. If m > n, we have more equations than unknowns, and there is generally no x satisfying Ax = b exactly: the system is over-determined.

Slide 4: Solution Approaches. There are three standard algorithms for computing the least-squares minimizer: 1. Normal equations (cheap, less accurate). 2. QR decomposition. 3. SVD (expensive, more reliable). The first algorithm is the fastest and the least accurate of the three; the SVD is the slowest and the most accurate.

Slide 5: Normal Equations (1). Minimize the squared Euclidean norm of the residual vector:

min_x ||Ax − b||_2^2 = min_x (Ax − b)^T·(Ax − b)

To minimize, take the derivative with respect to x and set it to zero:

2·A^T·A·x − 2·A^T·b = 0

which reduces to an (n×n) linear system commonly known as the normal equations:

A^T·A·x = A^T·b
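A minimal Matlab sketch of this approach (variable names are assumptions, not from the slide):

% Solve the normal equations A'Ax = A'b
AtA = A' * A;        % (n x n) Gram matrix
Atb = A' * b;
x = AtA \ Atb;       % accurate only when A is well-conditioned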

Slide 6: Normal Equations (2). In matrix form the beam example reads A·x = b, where row i of A is [1  T_i], x = (x_1, x_2)^T, and b_i = Y_i; we seek min_x ||Ax − b||_2.

Slide 7: Normal Equations (3). We must solve the system A^T·A·x = A^T·b for the measured values. The matrix (A^T·A)^{-1}·A^T is called a pseudo-inverse of A.
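In Matlab the pseudo-inverse solution can be sketched as follows (note that pinv computes the pseudo-inverse via the SVD rather than by forming (A'A)^{-1}A', but for full-rank A the result is the same):

x = pinv(A) * b;            % pseudo-inverse solution
x2 = (A' * A) \ (A' * b);   % normal equations; agrees with x when A has full rank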

Slide 8: QR Factorization (1). A matrix Q is said to be orthogonal if its columns are orthonormal, i.e. Q^T·Q = I. Orthogonal transformations preserve the Euclidean norm, since ||Q·x||_2^2 = x^T·Q^T·Q·x = x^T·x = ||x||_2^2. Orthogonal matrices can transform vectors in various ways, such as rotations or reflections, but they do not change the Euclidean length of a vector. Hence they preserve the solution of a linear least-squares problem.

Slide 9: QR Factorization (2). Any matrix A (m×n) can be represented as A = Q·R, where Q (m×n) has orthonormal columns and R (n×n) is upper triangular.

Slide 10: QR Factorization (3). Given A, let its QR decomposition be A = Q·R, where Q is an (m×n) orthonormal matrix and R is upper triangular. The QR factorization transforms the linear least-squares problem into a triangular one:

Q·R·x = b  ⟹  R·x = Q^T·b  ⟹  x = R^{-1}·Q^T·b

Matlab code:
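The code itself was not preserved with the slide; a minimal sketch of the QR-based solve would be:

[Q, R] = qr(A, 0);    % economy-size QR: Q is (m x n), R is (n x n)
x = R \ (Q' * b);     % back-substitution on the triangular system Rx = Q'b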

Slide 11: Singular Value Decomposition. Normal equations and QR decomposition only work for full-rank matrices (i.e., rank(A) = n). If A is rank-deficient, there are infinitely many solutions to the least-squares problem, and we can use algorithms based on the SVD. Given the SVD A = U·Σ·V^T, where U (m×m) and V (n×n) are orthogonal and Σ is an (m×n) diagonal matrix holding the singular values of A, the minimum-norm solution is

x = V·Σ^+·U^T·b

where Σ^+ inverts the nonzero singular values and leaves the zero ones in place.

Slide 12: Singular Value Decomposition. Matlab code:
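As with the QR slide, the original code was not preserved; a minimal sketch of an SVD-based solve that handles rank deficiency (the tolerance is an assumption, mirroring what Matlab's rank() uses):

[U, S, V] = svd(A, 'econ');
s = diag(S);
tol = max(size(A)) * eps(max(s));            % cutoff for "zero" singular values
r = sum(s > tol);                            % numerical rank
x = V(:,1:r) * ((U(:,1:r)' * b) ./ s(1:r));  % minimum-norm least-squares solution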

Slide 13: Linear Algebra Review — SVD. (Figure only; not preserved.)

Slide 14: Approximation by a Low-Rank Matrix. (Figure only; not preserved.)

Slide 15: Geometric Interpretation of the SVD. The image of the unit sphere under any m×n matrix is a hyperellipse. (Figure: unit vectors v_1, v_2 mapped to semiaxes σ·v_1, σ·v_2.)

Slide 16: Left and Right Singular Vectors. We can define the properties of A in terms of the shape of A·S, where S is the unit sphere. The singular values of A are the lengths of the principal semiaxes of A·S, usually written in non-increasing order σ_1 ≥ σ_2 ≥ … ≥ σ_n. The n left singular vectors of A are the unit vectors {u_1, …, u_n} oriented in the directions of the principal semiaxes of A·S, numbered in correspondence with {σ_i}. The n right singular vectors of A are the unit vectors {v_1, …, v_n} of S which are the preimages of the principal semiaxes of A·S: A·v_i = σ_i·u_i. (Figure: S mapped to A·S, with semiaxes σ·u_1, σ·u_2.)

Slide 17: Singular Value Decomposition. The relations A·v_i = σ_i·u_i, 1 ≤ i ≤ n, can be collected in matrix form as A·V = U·Σ, i.e.

A = U·Σ·V^T

where U and V are orthogonal and Σ is diagonal: the singular value decomposition.

Slide 18: Matrices in Diagonal Form. Every matrix is diagonal in an appropriate pair of bases: any vector b (m×1) can be expanded in the basis of left singular vectors of A, {u_i}, and any vector x (n×1) in the basis of right singular vectors of A, {v_i}. Their coordinates in these expansions are

b' = U^T·b,   x' = V^T·x

Then the relation b = A·x can be expressed in terms of b' and x' as b' = Σ·x'.

Slide 19: Rank of A. Let p = min{m, n}, and let r ≤ p denote the number of nonzero singular values of A. Then rank(A) = r. Proof: the rank of a diagonal matrix equals the number of its nonzero entries, and in the decomposition A = U·Σ·V*, U and V are of full rank.
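A Matlab sketch of computing the rank this way (the tolerance is an assumption, matching what rank() uses internally):

s = svd(A);
tol = max(size(A)) * eps(max(s));   % treat tiny singular values as zero
r = sum(s > tol);                   % numerical rank of A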

Slide 20: Determinant of A. For a square A (m×m), |det(A)| = ∏_{i=1}^{m} σ_i. Proof: the determinant of a product of square matrices is the product of their determinants, and the determinant of a unitary matrix has absolute value 1, since U*·U = I. Therefore |det(A)| = |det(U)|·|det(Σ)|·|det(V*)| = det(Σ) = ∏_{i=1}^{m} σ_i.
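A quick numerical check of this identity in Matlab (the random test matrix is only for illustration):

A = randn(4);            % arbitrary square matrix
lhs = abs(det(A));
rhs = prod(svd(A));      % lhs and rhs agree up to rounding error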

Slide 21: A in Terms of Singular Vectors. For A (m×n), A can be written as a sum of r rank-one matrices:

A = Σ_{i=1}^{r} σ_i·u_i·v_i^T    (1)

Proof: if we write Σ as a sum of matrices Σ_i = diag(0, …, σ_i, …, 0), then (1) follows from

A = U·Σ·V^T = Σ_{i=1}^{r} U·Σ_i·V^T = Σ_{i=1}^{r} σ_i·u_i·v_i^T    (2)
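A Matlab sketch verifying the rank-one expansion (1):

[U, S, V] = svd(A);
p = min(size(A));
Ar = zeros(size(A));
for i = 1:p
    Ar = Ar + S(i,i) * U(:,i) * V(:,i)';   % add the term sigma_i u_i v_i'
end
% norm(A - Ar) is at rounding-error level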

Slide 22: Norm of a Matrix. The L_2 norm of a vector v is defined as

||v||_2 = (Σ_i v_i^2)^{1/2}    (1)

The L_2 norm of a matrix is defined as

||A||_2 = max_{||v||_2 = 1} ||A·v||_2

Therefore ||A||_2 = σ_1 = sqrt(max_i λ_i), where the λ_i are the eigenvalues of A^T·A.
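In Matlab all three characterizations can be compared (norm(A) returns the matrix 2-norm by default):

n1 = norm(A);                 % matrix 2-norm
n2 = max(svd(A));             % largest singular value
n3 = sqrt(max(eig(A'*A)));    % via eigenvalues of A'A; all three agree up to rounding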

Slide 23: Matrix Approximation in the SVD Basis. For any ν with 0 ≤ ν ≤ r, define

A_ν = Σ_{i=1}^{ν} σ_i·u_i·v_i^T    (1)

If ν = p = min{m, n}, define σ_{ν+1} = 0. Then

||A − A_ν||_2 = σ_{ν+1}

so A_ν is the best rank-ν approximation of A in the L_2 norm.
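A Matlab sketch of the truncation (nu, the target rank, is a free choice):

[U, S, V] = svd(A);
nu = 2;                                          % example truncation rank
Anu = U(:,1:nu) * S(1:nu,1:nu) * V(:,1:nu)';     % best rank-nu approximation
err = norm(A - Anu);                             % equals sigma_{nu+1}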