Prabhakar G. Vaidya and Swarnali Majumder: A preliminary investigation of the feasibility of using SVD and algebraic topology to study dynamics on a manifold

Global Method and Atlas Method
In global methods we obtain maps or equations for the whole state space. In the atlas method, by contrast, we cover the trajectory with overlapping patches and obtain maps or equations for each patch separately.

Covering the Trajectory by Local Patches
Farmer, J.D. and Sidorowich, J.J., "Predicting Chaotic Time Series", Physical Review Letters 59, 1987.

Role of singular value decomposition in studying algebraic topology
SVD lets us find the local dimension of the manifold where the data resides: the local dimension equals the number of nonzero singular values. Locally we model high-dimensional data by a low-dimensional manifold, and SVD gives us local coordinates of the manifold when it is embedded in a higher dimension.
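This singular-value criterion can be sketched numerically. The data below (a noisy planar patch in 3-D) and the tolerance are illustrative assumptions, not the talk's actual dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical local patch: 100 points lying (up to tiny noise) on a
# 2-dimensional plane embedded in 3-dimensional space.
u, v = rng.uniform(-1, 1, (2, 100))
patch = np.column_stack([u, v, 0.5 * u - 0.2 * v])
patch += 1e-8 * rng.standard_normal(patch.shape)   # tiny noise

# Singular values of the centred local data matrix.
s = np.linalg.svd(patch - patch.mean(axis=0), compute_uv=False)

# Local dimension = number of singular values above a small tolerance.
local_dim = int(np.sum(s > 1e-6 * s[0]))
print(local_dim)  # 2
```

In practice the tolerance must be chosen relative to the largest singular value, since noise and curvature make the trailing singular values small but not exactly zero.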

Let us consider a local patch on the Möbius strip
The Möbius strip is a 2-dimensional manifold, but it is embedded in 3 dimensions, so we get data in 3 dimensions. By SVD we find the local dimension of this patch; SVD is also a natural way of obtaining local coordinates.

We take data from the 3-dimensional differential equation of the Möbius strip.

H is the data matrix of the Möbius strip; it is 100 × 3. We decompose it as H = U W V^T. The number of nonzero diagonal elements in W gives the local dimension; for the Möbius strip it is 2. This relationship gives a one-to-one transformation from 3D to 2D.
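A minimal sketch of this computation, assuming the standard parametrization of the Möbius strip in place of the talk's differential equation (the sample ranges and tolerance are illustrative choices):

```python
import numpy as np

# A small patch of the Möbius strip, sampled from the standard
# parametrization (a stand-in for the talk's differential equation).
rng = np.random.default_rng(1)
t = rng.uniform(0.0, 0.05, 100)     # narrow parameter range -> local patch
w = rng.uniform(-0.05, 0.05, 100)

H = np.column_stack([
    (1 + w * np.cos(t / 2)) * np.cos(t),
    (1 + w * np.cos(t / 2)) * np.sin(t),
    w * np.sin(t / 2),
])                                   # 100 x 3 data matrix
Hc = H - H.mean(axis=0)              # centre the patch

U, s, Vt = np.linalg.svd(Hc, full_matrices=False)  # Hc = U diag(s) Vt
print(s)  # the third singular value is tiny compared with the first two
local_dim = int(np.sum(s > 0.1 * s[0]))
print(local_dim)
```

The third singular value is not exactly zero because the patch is curved, so the count is taken relative to the leading singular value.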

Since the 3rd singular value in W is very small, we keep only the first two columns of UW; call this sU. Similarly, take the first two columns of V and call this sV. We then have a local bijective relation H = sU sV^T.
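The truncation step might look like this in code; the patch data here is a stand-in for the talk's Möbius-strip matrix, generated from the standard parametrization:

```python
import numpy as np

# Hypothetical local patch: 100 nearly planar points in 3D (stand-in
# for the Möbius-strip data matrix; any locally 2-D patch works).
rng = np.random.default_rng(2)
t = rng.uniform(0.0, 0.05, 100)
w = rng.uniform(-0.05, 0.05, 100)
H = np.column_stack([(1 + w * np.cos(t / 2)) * np.cos(t),
                     (1 + w * np.cos(t / 2)) * np.sin(t),
                     w * np.sin(t / 2)])

U, s, Vt = np.linalg.svd(H, full_matrices=False)   # H = U diag(s) Vt

# Keep only the first two components: the third singular value is tiny.
sU = U[:, :2] * s[:2]   # 100 x 2: first two columns of UW = local 2-D coordinates
sV = Vt[:2].T           # 3 x 2:  first two columns of V (tangent-plane frame)

H_approx = sU @ sV.T    # back to 3-D: H ~= sU sV^T
err = np.abs(H - H_approx).max()
print(err)              # small reconstruction error
```

Note that sU = H @ sV, so the same pair of matrices carries points from 3-D down to the 2-D chart coordinates and back again.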

We get a bijection between the 3-dimensional data and the 2-dimensional local coordinates in each local patch.

Non-linear singular value decomposition
When we want to do local approximation over a bigger area, we generalize singular value decomposition: we consider non-linear combinations of x, y, z and perform SVD on the resulting matrix.
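The talk does not specify which non-linear combinations are used; one common choice is the quadratic monomials in x, y, z. As an illustration with stand-in data on the unit sphere, a near-zero singular value of the augmented matrix exposes a non-linear relation that the data satisfies:

```python
import numpy as np

# Stand-in data: 200 points on the unit sphere, x^2 + y^2 + z^2 = 1.
rng = np.random.default_rng(3)
p = rng.standard_normal((200, 3))
p /= np.linalg.norm(p, axis=1, keepdims=True)
x, y, z = p.T

# Augment the data with non-linear combinations: a constant column,
# the coordinates themselves, and all quadratic monomials.
M = np.column_stack([np.ones_like(x), x, y, z,
                     x * x, y * y, z * z, x * y, x * z, y * z])

s = np.linalg.svd(M, compute_uv=False)
print(s[-1])   # ~0: the data satisfies one quadratic relation
print(s[-2])   # clearly nonzero: there is only one such relation
```

The right-singular vector belonging to the near-zero singular value gives the coefficients of the relation, here proportional to x² + y² + z² − 1 = 0.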

We create a global dynamics

The dynamics is created in the lower dimension of each chart, passing to the higher dimension where charts overlap. In each chart we have a transformation from the higher to the lower dimension and also from the lower to the higher dimension.

In a specific patch we get the following dynamics a= , b=.0007, c= , d= , e=.012, f=

We consider the first two columns of U, which are the local coordinates, and use this U to do rectification, aligning two charts together. We continue this alignment for every chart and obtain a low-dimensional manifold. Once we make the identification, it is the covering space of the original manifold.
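The rectification procedure is not spelled out in the slides; one standard way to align two charts on their overlap is orthogonal Procrustes, sketched here with synthetic overlap coordinates (the rotation, shift, and data are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(4)
pts = rng.uniform(-1, 1, (50, 2))             # points in the overlap region

# Chart B sees the same points in a rotated and shifted frame.
theta = 0.7
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
A = pts                                        # coordinates in chart A
B = pts @ R_true.T + np.array([0.3, -0.5])     # coordinates in chart B

# Orthogonal Procrustes: centre both point sets, then take R = U V^T
# from the SVD of B_c^T A_c.  R maps chart-B coordinates to chart A.
Ac, Bc = A - A.mean(0), B - B.mean(0)
U, _, Vt = np.linalg.svd(Bc.T @ Ac)
R = U @ Vt
aligned = (B - B.mean(0)) @ R + A.mean(0)      # chart-B points in chart-A frame

print(np.abs(aligned - A).max())               # ~0: charts agree after alignment
```

Applying such an alignment chart by chart glues the local 2-D coordinate systems into one consistent low-dimensional picture.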

Reference:
1. Farmer, J.D. and Sidorowich, J.J., "Predicting Chaotic Time Series", Physical Review Letters 59, 1987.

Thank You