Out-of-Sample Extension and Reconstruction on Manifolds
Bhuwan Dhingra, Final Year (Dual Degree), Dept. of Electrical Engineering

Introduction
An m-dimensional manifold is a topological space which is locally homeomorphic to m-dimensional Euclidean space. In this work we consider manifolds which are:
- Differentiable
- Embedded in a Euclidean space
- Generated from a set of m latent variables via a smooth function f
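As a concrete illustration (not from the original slides), a minimal Python/NumPy sketch of this generative picture, using the Swiss-Roll surface that appears later in the results: two latent variables (t, h) are mapped into 3-dimensional space by a smooth function f. All names here are illustrative.

import numpy as np

def f(t, h):
    # Smooth map f : R^2 -> R^3 that generates the Swiss-Roll surface.
    # (t, h) are the m = 2 latent variables; the output lives in n = 3 dimensions.
    return np.stack([t * np.cos(t), h, t * np.sin(t)], axis=-1)

rng = np.random.default_rng(0)
t = 1.5 * np.pi * (1 + 2 * rng.random(1000))   # "roll" angle
h = 21 * rng.random(1000)                      # height along the roll
X = f(t, h)                                    # 1000 points on a 2-D manifold in R^3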

Introduction
(Figure: the latent space of dimension m is mapped to the data space of dimension n, with n >> m.)

Non-Linear Dimensionality Reduction
In practice we only have a sampling of points on the manifold. The embedding Y is estimated using a Non-Linear Dimensionality Reduction (NLDR) method. Examples of NLDR methods: ISOMAP, LLE, KPCA, etc. However, most non-linear methods only provide the embedding Y and not the mappings f and g.
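A minimal NLDR sketch, assuming scikit-learn is available, illustrating the point above: ISOMAP returns the embedding Y of the training samples, but the maps f and g themselves are not recovered.

from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

X, _ = make_swiss_roll(n_samples=1000, random_state=0)       # samples on the manifold (n = 3)
Y = Isomap(n_neighbors=10, n_components=2).fit_transform(X)  # embedding (m = 2)
# Y holds coordinates only for these training points; no explicit f or g is returned,
# which is what the out-of-sample extension and reconstruction procedures address.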

Problem Statement
(Figure: a new sample x* and its embedding y*, related by the maps g (extension, x* to y*) and f (reconstruction, y* to x*).)

Outline

The tangent plane is estimated from the k-nearest neighbors of p using PCA.

Out-of-Sample Extension
A linear transformation A_e is learnt such that Y = A_e Z. The embedding for a new point is y* = A_e z*.

Out-of-Sample Reconstruction
A linear transformation A_r is learnt such that Z = A_r Y. The projection of the reconstruction onto the tangent plane is z* = A_r y*.
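A minimal sketch of both linear fits, assuming the neighborhood's tangent-plane coordinates Z and their embeddings Y are stored column-wise as in the slide notation Y = A_e Z and Z = A_r Y; the function and variable names are illustrative, not the author's code.

import numpy as np

def fit_linear_maps(Z, Y):
    """Least-squares estimates of A_e (Y = A_e Z) and A_r (Z = A_r Y).

    Z : (m, k) tangent-plane coordinates of the k neighbors.
    Y : (m, k) embedding coordinates of the same neighbors.
    """
    A_e = Y @ np.linalg.pinv(Z)   # extension map: tangent plane -> embedding
    A_r = Z @ np.linalg.pinv(Y)   # reconstruction map: embedding -> tangent plane
    return A_e, A_r

# Out-of-sample use:
#   y_star = A_e @ z_star   # extension: embed a new point's tangent-plane projection
#   z_star = A_r @ y_star   # reconstruction: project a new embedding onto the tangent plane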

Principal Components Analysis
Covariance matrix of the neighborhood: M_k = (1/k) * sum_{i=1}^{k} (x_i - x̄)(x_i - x̄)^T, where x̄ is the neighborhood mean.
Let U and Λ be the eigenvector and eigenvalue matrices of M_k, so that M_k = U Λ U^T.
Denote by U_m the first m columns of U (the top m eigenvectors); then the projection of a point x onto the tangent plane is given by z = U_m^T (x - x̄).
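A sketch of the neighborhood PCA above, assuming the k nearest neighbors of p are stacked in a (k, n) array nbrs; variable names are illustrative.

import numpy as np

def tangent_plane_projection(nbrs, x, m):
    """Project x onto the tangent plane estimated from the (k, n) neighbor array nbrs."""
    x_bar = nbrs.mean(axis=0)        # neighborhood mean
    D = nbrs - x_bar
    M_k = D.T @ D / len(nbrs)        # neighborhood covariance matrix M_k
    _, U = np.linalg.eigh(M_k)       # eigenvectors, with eigenvalues in ascending order
    U_m = U[:, ::-1][:, :m]          # top-m eigenvectors span the tangent plane
    z = U_m.T @ (x - x_bar)          # tangent-plane coordinates of x
    return z, U_m, x_bar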

Linear Transformation

Final Estimates

Error Analysis

Sampling Density

Neighborhood Parameterization

Reconstruction Error
But A_r A_e = I, hence:
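One plausible completion of this step (an assumption, not taken from the slides): composing the extension and reconstruction maps on a new point gives

\hat{z}^{*} = A_r \hat{y}^{*} = A_r A_e z^{*} = z^{*},

so the linear maps contribute no error and the remaining reconstruction error comes from approximating the manifold by the local tangent plane.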

Reconstruction Error

Smoothness of Manifold

Results - Extension
Out-of-sample extension on the Swiss-Roll dataset. Neighborhood size = 10.

Results - Extension
Out-of-sample extension on the Japanese flag dataset. Neighborhood size = 10.

Results - Reconstruction
Reconstructions of the ISOMAP faces dataset (698 images). n = 4096, m = 3. Neighborhood size = 8.

Reconstruction Error vs. Number of Points on Manifold
ISOMAP faces dataset. Number of cross-validation sets = 5. Neighborhood size = [6, 7, 8, 9].