Dimension Reduction by Pre-Image Curve Method
Laniu (Zhuyin Ren) and S. B. Pope
February 24, 2005

Part B: Dimension Reduction – Manifold Perspective
- Different methods impose different sets of n_u = n_φ − n_r conditions; these conditions determine the corresponding manifold φ^m, which is used to approximate the attracting manifold.
- Given a reduced composition r, the n_u conditions determine the corresponding full composition on the manifold φ^m (see the sketch below).
- What is the attracting slow manifold? It has geometric significance, and it is invariant.
- Can we define a manifold with the same geometric significance and similar properties? Impose n_u conditions.
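A minimal Python sketch of this reconstruction step, treated as root-finding. The representation matrix B, the condition function g, and the initial guess phi_guess are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.optimize import fsolve

def reconstruct(r, B, g, phi_guess):
    """Recover a full composition phi on the manifold from the reduced
    composition r = B^T phi, given n_u method-specific conditions g(phi) = 0.
    B has shape (n_phi, n_r); g maps R^{n_phi} -> R^{n_u}."""
    def residual(phi):
        # n_r representation equations stacked with n_u manifold conditions:
        # together, n_phi equations in n_phi unknowns.
        return np.concatenate([B.T @ phi - r, g(phi)])
    return fsolve(residual, phi_guess)
```

Different methods (QSSA, ILDM, pre-image curve) correspond to different choices of the condition function g.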

Part B: Geometric Significance of Sensitivity Matrices
- The sensitivity matrix is defined as A(t) = ∂φ(t)/∂φ(0): the sensitivity of the composition at time t along a trajectory to perturbations in the initial composition φ(0).
- An initial ball of compositions is squashed into a low-dimensional object, and this low-dimensional object aligns with the attracting manifold.
- The principal subspace U_m should be a good approximation to the tangent space of the attracting manifold at the mapped point.
- The "maximally compressive" subspace of the initial ball is the one spanned by V_c.
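A hedged sketch of how these subspaces can be computed: approximate A(t) by central finite differences on the ODE flow, then take the singular value decomposition A = U Σ Vᵀ, with U_m the leading columns of U and V_c the trailing columns of V. The right-hand side rhs and the reduced dimension n_r are assumptions.

```python
import numpy as np
from scipy.integrate import solve_ivp

def flow(rhs, phi0, t):
    """Integrate d(phi)/dt = rhs(t, phi) from phi0 over [0, t]."""
    sol = solve_ivp(rhs, (0.0, t), phi0, rtol=1e-10, atol=1e-12)
    return sol.y[:, -1]

def sensitivity(rhs, phi0, t, eps=1e-6):
    """Central-difference approximation of A(t) = d phi(t) / d phi(0)."""
    n = len(phi0)
    A = np.empty((n, n))
    for j in range(n):
        d = np.zeros(n)
        d[j] = eps
        A[:, j] = (flow(rhs, phi0 + d, t) - flow(rhs, phi0 - d, t)) / (2 * eps)
    return A

# Usage sketch (rhs, phi0, t, n_r are assumed):
#   A = sensitivity(rhs, phi0, t)
#   U, s, Vt = np.linalg.svd(A)
#   U_m = U[:, :n_r]      # approximates the manifold's tangent space
#   V_c = Vt[n_r:, :].T   # maximally compressive directions of the initial ball
```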

Part B: Manifold
- Given the reduced composition r, find a full composition φ that satisfies the above conditions.
- U_c is obtained from the sensitivity matrix A, i.e., the sensitivity of φ with respect to a point backward along the trajectory.

Part B: Simple Example I
[Figure: the slow attracting manifold compared with the QSSA, ILDM, and global-eigenvalue manifolds]
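For orientation, the manifolds compared in the figure are commonly defined by conditions of the following form. These are standard definitions, not taken from the slide, and reading the global-eigenvalue manifold as one fixed eigendecomposition applied globally is an assumption.

```latex
% S(\phi): chemical source term;  J(\phi) = \partial S / \partial \phi: its Jacobian.
\begin{aligned}
\text{QSSA:}\quad   & S_i(\phi) = 0 \ \text{ for each assumed quasi-steady species } i,\\
\text{ILDM:}\quad   & Z_f(\phi)^{\mathsf T} S(\phi) = 0, \ \text{ with } Z_f \text{ spanning the fast left eigenspace of } J(\phi),\\
\text{global:}\quad & \text{the same condition, with } Z_f \text{ taken from one fixed eigendecomposition of } J.
\end{aligned}
```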

Part B: Simple Example I (Cont'd)
- The tangent plane of the manifold approaches the tangent plane of the slow manifold.
- Hence the manifold approaches invariance.

Part B: Simple Example I (Cont'd)
Comments:
- For this linear system, the ILDM predicts the exact slow manifold, although the ILDM fast subspace seems odd.
- The new manifold approaches the slow manifold, and approaches invariance, as the small parameter approaches zero.
- The most compressive subspace approaches the QSSA species direction.
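A quick numerical illustration of the first comment, on a hypothetical two-variable linear fast-slow system (the matrix J below is my choice, not the slide's example): for a linear system the sensitivity matrix is exp(Jt), and its leading left singular vector aligns with the slow eigenvector, which is exactly the slow manifold that the ILDM recovers.

```python
import numpy as np
from scipy.linalg import expm

# Toy linear system d(phi)/dt = J @ phi, eigenvalues -1 (slow) and -100 (fast).
J = np.array([[-1.0,    0.0],
              [ 1.0, -100.0]])

eigvals, eigvecs = np.linalg.eig(J)
slow = eigvecs[:, np.argmax(eigvals.real)]   # slow eigenvector
slow /= np.linalg.norm(slow)

A = expm(2.0 * J)                            # sensitivity matrix at t = 2
U, s, Vt = np.linalg.svd(A)
print(abs(U[:, 0] @ slow))                   # ~1.0: U_m aligns with the slow
                                             # (ILDM) direction
```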

Part B: Simple Example II
[Figure: the slow attracting manifold compared with the QSSA, ILDM, and global-eigenvalue manifolds]

Part B: Simple Example II (Cont'd)
- The tangent plane of the manifold approaches the tangent plane of the slow manifold.

Part B: Simple Example II (Cont'd)
- The tangent plane of the manifold approaches the tangent plane of the slow manifold.
- The manifold approaches invariance.
- The most compressive subspace approaches the QSSA species direction.

Part B: Dimension Reduction by Pre-Image Curve – Manifold Perspective
- Idea: use the pre-image curve to obtain a good U_m, i.e., a good approximation to the tangent plane of the attracting slow manifold (see the skeleton below).
- Demonstration: H2/air system.
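Putting the pieces together, a hedged skeleton of that idea, reusing the hypothetical `sensitivity` helper sketched earlier. The pre-image point itself comes from the pre-image-curve construction, which is not reproduced here, so `preimage_point` is an assumed input.

```python
import numpy as np

def tangent_subspace(rhs, preimage_point, t_span, n_r, sensitivity):
    """Approximate U_m, the tangent plane of the attracting slow manifold,
    at the image of `preimage_point` after integrating for `t_span`.
    `sensitivity` is the finite-difference helper sketched above."""
    A = sensitivity(rhs, preimage_point, t_span)
    U, _, _ = np.linalg.svd(A)
    return U[:, :n_r]   # leading n_r left singular vectors span U_m
```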

Conclusion and Future Work
- Identified the geometric significance of the sensitivity matrix.
- Identified the principal subspace and the maximally compressive subspace.
- Identified the tangent plane of the pre-image manifold.
- Species reconstruction by the attracting-manifold pre-image curve method has been implemented.
- The manifold perspective on dimension reduction by the pre-image curve method has been discussed.
Thanks to Professor Guckenheimer.