Multivariate Analysis: Theory and Geometric Interpretation

David Chelidze

Proper Orthogonal Decomposition (POD)

We consider a scalar field $u(x,t)$, where $x \in \Omega$ and $t \in [0,T]$. POD decomposes it using orthonormal basis functions $\phi_k(x)$ and the corresponding time coordinates $q_k(t)$,
$$u(x,t) = \sum_k q_k(t)\,\phi_k(x), \quad \text{where} \quad \int_\Omega \phi_i(x)\,\phi_j(x)\,dx = \delta_{ij}.$$
This is equivalent to the following maximization problem:
$$\max_{\phi}\ \big\langle (u,\phi)^2 \big\rangle, \quad \text{subject to} \quad \|\phi\| = 1,$$
where $\langle\,\cdot\,\rangle$ denotes averaging over time. This results in the following integral eigenvalue problem:
$$\int_\Omega \big\langle u(x,t)\,u(x',t) \big\rangle\,\phi(x')\,dx' = \lambda\,\phi(x).$$
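To make the eigenvalue problem concrete, here is a minimal numerical sketch (not from the slides; the synthetic field, grid sizes, and variable names are illustrative assumptions). Discretely, the kernel $\langle u(x,t)\,u(x',t)\rangle$ becomes a spatial covariance matrix whose eigenvectors approximate the POMs:

```python
import numpy as np

# Synthetic two-mode scalar field u(x, t) sampled on a grid (illustrative).
rng = np.random.default_rng(0)
n_t, n_x = 500, 32
t = np.linspace(0.0, 10.0, n_t)
x = np.linspace(0.0, 1.0, n_x)
U = (np.outer(np.sin(2 * np.pi * t), np.sin(np.pi * x))
     + 0.3 * np.outer(np.cos(4 * np.pi * t), np.sin(2 * np.pi * x))
     + 0.01 * rng.standard_normal((n_t, n_x)))

# Discrete analogue of the integral eigenvalue problem:
# R approximates the kernel <u(x,t) u(x',t)> on the spatial grid.
R = U.T @ U / n_t
lam, Phi = np.linalg.eigh(R)           # R phi_k = lam_k phi_k
lam, Phi = lam[::-1], Phi[:, ::-1]     # sort by decreasing proper orthogonal value
Q = U @ Phi                            # time coordinates q_k(t)
print("leading proper orthogonal values:", lam[:3])
```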

Smooth Orthogonal Decomposition (SOD)

SOD looks for a projective function $\psi(x)$ such that the projection $q(t) = (u(\cdot,t),\psi)$ has maximal variance subject to its minimal roughness, expressed by the following maximization problem:
$$\max_{\psi}\ \big\langle (u,\psi)^2 \big\rangle, \quad \text{subject to} \quad \big\langle (\dot u,\psi)^2 \big\rangle = 1.$$
This translates into the following generalized integral eigenvalue problem:
$$\int_\Omega \big\langle u(x,t)\,u(x',t) \big\rangle\,\psi(x')\,dx' = \lambda \int_\Omega \big\langle \dot u(x,t)\,\dot u(x',t) \big\rangle\,\psi(x')\,dx'.$$
Using the solution to the above eigenvalue problem, the scalar field can be reconstructed as
$$u(x,t) = \sum_k q_k(t)\,\phi_k(x), \quad \text{where} \quad q_k(t) = (u(\cdot,t),\psi_k).$$
The set of smooth orthogonal modes $\{\phi_k\}$ forms a bi-orthogonal set with the smooth projective modes $\{\psi_k\}$: $(\phi_i,\psi_j) = \delta_{ij}$.
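A matching sketch of the generalized eigenvalue problem, continuing the POD example above (the finite-difference derivative and all names are again illustrative assumptions rather than the authors' code):

```python
from scipy.linalg import eigh, inv

# Finite-difference estimate of the time derivative of the field.
dU = np.gradient(U, t, axis=0)

R = U.T @ U / n_t                      # covariance of the field
Rd = dU.T @ dU / n_t                   # covariance of its derivative (roughness)
sov, Psi = eigh(R, Rd)                 # generalized problem: R psi = lam Rd psi
idx = np.argsort(sov)[::-1]            # largest ratio = smoothest projection
sov, Psi = sov[idx], Psi[:, idx]       # SOVs; columns of Psi are the SPMs
Phi_som = inv(Psi).T                   # SOMs: bi-orthonormal to the SPMs
Q_soc = U @ Psi                        # smooth orthogonal coordinates (SOCs)
```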

Practical Calculations for POD and SOD

When the field is sampled such that $Y_{ij} = u(x_j, t_i)$, where $Y \in \mathbb{R}^{m \times n}$, POD can be solved by the singular value decomposition $Y = U\Sigma\Phi^T$, where the columns of $Q = U\Sigma$ are the time coordinates and $\Phi$ contains the corresponding mode shapes. The corresponding SOD problem can be solved by the generalized singular value decomposition of the pair $(Y, \dot Y)$:
$$Y = U C X^T, \qquad \dot Y = V S X^T,$$
where $U$ and $V$ are unitary matrices, $C$ and $S$ are diagonal matrices, and the columns of $X$ are the smooth orthogonal modes; the columns of $UC$ are the smooth orthogonal coordinates; the columns of $X^{-T}$ are the smooth projective modes; $\sigma_k = C_{kk}^2 / S_{kk}^2$ are the smooth orthogonal values, and $C^T C + S^T S = I$.
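In code, the POD step is a single SVD call. SciPy has, to my knowledge, no dedicated GSVD routine, so a common workaround for the SOD step is the equivalent generalized symmetric eigenproblem from the previous sketch. Continuing the example (names illustrative):

```python
# POD by SVD of the snapshot matrix (here the field samples U from above).
U_svd, s, PhiT = np.linalg.svd(U, full_matrices=False)
Q_pod = U_svd * s                      # time coordinates, Q = U Sigma
Phi_pod = PhiT.T                       # columns are the proper orthogonal modes
# Proper orthogonal values follow from the singular values: lam_k = s_k^2 / n_t.
print("POVs from SVD:", (s ** 2 / n_t)[:3])
# Bi-orthonormality of SOMs and SPMs from the GSVD-equivalent solution:
print("bi-orthonormal:", np.allclose(Phi_som.T @ Psi, np.eye(n_x)))
```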

Proper Orthogonal Decomposition: Geometric Interpretation

Geometric Interpretation of SOD

SOD identifies the subspaces onto which the projection of the scalar field is maximally smooth.
Smooth orthogonal modes (SOMs) are not orthogonal to each other but are linearly independent.
SOMs span the smooth modal subspaces.
Smooth projective modes (SPMs) form a bi-orthonormal set with the SOMs and are used to obtain the smooth orthogonal coordinates (SOCs).
SOCs are orthogonal to each other (i.e., their covariance matrix is diagonal).
SOCs are invariant under invertible linear coordinate transformations (both properties are checked in the sketch below).
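The last two properties are easy to check numerically, continuing the sketches above (a hedged illustration; the random transformation T is an assumption made for the test):

```python
# 1) The SOC covariance matrix is diagonal (the coordinates are orthogonal).
C = Q_soc.T @ Q_soc / n_t
print("diagonal covariance:", np.allclose(C, np.diag(np.diag(C)), atol=1e-8))

# 2) SOCs are invariant under an invertible linear change of coordinates.
#    Build a well-conditioned random T: orthogonal factor times a diagonal.
T = np.linalg.qr(rng.standard_normal((n_x, n_x)))[0] @ np.diag(rng.uniform(0.5, 2.0, n_x))
U2, dU2 = U @ T, dU @ T
sov2, Psi2 = eigh(U2.T @ U2 / n_t, dU2.T @ dU2 / n_t)
Q2 = U2 @ Psi2[:, np.argsort(sov2)[::-1]]
# Compare the two leading (well-separated) SOCs, up to column sign.
print("invariant SOCs:", np.allclose(np.abs(Q2[:, :2]), np.abs(Q_soc[:, :2]), atol=1e-6))
```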

SOD Geometric Interpretation

Given a cloud of trajectory points, we identify its center of mass; a point and its velocity can then be visualized as two vectors in the figure.
We identify the first SPM by maximizing the ratio of the projection of the position to the projection of the velocity onto this mode (illustrated in the sketch below).
A similar maximization follows for the second SPM, and so on.
Smooth orthogonal modes (SOMs) form a bi-orthonormal set with the SPMs and are used to describe the points in the cloud.
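This maximized ratio is exactly the smooth orthogonal value from the earlier sketches, and the first SPM does beat random projection directions, as a quick check suggests (illustrative continuation of the same example):

```python
# Smoothness of a projection direction w: mean-square position projection
# divided by mean-square velocity projection (the ratio described above).
def smoothness(w):
    return np.mean((U @ w) ** 2) / np.mean((dU @ w) ** 2)

print("first SPM            :", smoothness(Psi[:, 0]))   # equals sov[0]
trials = rng.standard_normal((1000, n_x))
print("best random direction:", max(smoothness(w) for w in trials))
```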