NonLinear Dimensionality Reduction or Unfolding Manifolds
Tenenbaum | de Silva | Langford [Isomap]
Roweis | Saul [Locally Linear Embedding]
Presented by Vikas C. Raykar | University of Maryland, College Park

Dimensionality Reduction
– Need to analyze large amounts of multivariate data: human faces, speech waveforms, global climate patterns, gene distributions.
– Difficult to visualize data in dimensions greater than three.
– Goal: discover compact representations of high-dimensional data.
– Benefits: visualization, compression, better recognition, potentially meaningful dimensions.

Example…

Types of structure in multivariate data
– Clusters: density estimation techniques.
– On or around low-dimensional manifolds: linear (e.g., Principal Component Analysis) or nonlinear.

Concept of Manifolds
– "A manifold is a topological space which is locally Euclidean." In general, any object which is nearly "flat" on small scales is a manifold.
– Euclidean space is the simplest example of a manifold; a related notion is that of a submanifold.
– Manifolds arise naturally whenever there is a smooth variation of parameters [like the pose of the face in the previous example].
– The dimension of a manifold is the minimum number of coordinates necessary to identify each point on it.
– Concept of dimensionality reduction: map data given in a high-dimensional space onto a lower-dimensional manifold.

Manifolds of Perception: the Human Visual System
You never see the same face twice, yet you perceive constancy even when the raw sensory inputs are in flux.

Linear methods: Principal Component Analysis (PCA)
[Figure: data lying along a one-dimensional manifold]
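As a concrete reference point (not from the slides), here is a minimal PCA sketch in NumPy; the function name and the SVD route are my own choices:

```python
import numpy as np

def pca(X, d):
    """Project the N x D data matrix X onto its top-d principal components."""
    Xc = X - X.mean(axis=0)                 # center the data
    # rows of Vt are the principal directions, ordered by singular value
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:d].T                    # N x d embedding
```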

Multidimensional Scaling (MDS)
Here we are given pairwise distances instead of the actual data points.
– First convert the pairwise distance matrix into a dot product (Gram) matrix.
– After that, proceed exactly as in PCA.
If we preserve the pairwise distances, do we preserve the structure?

Example of MDS…

How to get the dot product matrix from the pairwise distance matrix?
[Figure: three points i, j, k and their pairwise distances]
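The standard answer is double centering: with J = I − (1/n)·11ᵀ, the Gram matrix is B = −(1/2) J D⁽²⁾ J, where D⁽²⁾ holds the squared distances. A minimal NumPy sketch of classical MDS built on this (illustrative; the names are mine):

```python
import numpy as np

def classical_mds(D, d):
    """Embed points in d dimensions given only the pairwise distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n     # centering matrix J = I - (1/n) 11^T
    B = -0.5 * J @ (D ** 2) @ J             # double centering: the Gram matrix
    w, V = np.linalg.eigh(B)                # eigenvalues in ascending order
    w, V = w[::-1][:d], V[:, ::-1][:, :d]   # keep the top-d eigenpairs
    return V * np.sqrt(np.maximum(w, 0))    # N x d coordinates
```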

MDS
The embedding is recovered only up to origin and orientation: any one of the points could serve as the origin, and the orientation is arbitrary. In practice the centroid is taken as the origin.

MDS is more general
– Instead of pairwise distances we can use pairwise "dissimilarities", e.g., in face recognition or wine tasting.
– When the distances are Euclidean, MDS is equivalent to PCA.
– MDS can recover the significant cognitive dimensions.

Nonlinear Manifolds
Unroll the manifold: PCA and MDS see only the Euclidean distance, but what matters on a curved manifold is the geodesic distance.

To preserve structure, preserve the geodesic distance rather than the Euclidean distance.

Two methods
– Tenenbaum et al.'s Isomap algorithm: a global approach. In the low-dimensional embedding, nearby points should stay nearby and faraway points should stay faraway.
– Roweis and Saul's Locally Linear Embedding algorithm: a local approach. Only nearby points need to stay nearby.

Isomap
– Estimate the geodesic distance between faraway points.
– For neighboring points, the Euclidean distance is a good approximation to the geodesic distance.
– For faraway points, estimate the distance by a series of short hops between neighboring points, i.e., find shortest paths in a graph with edges connecting neighboring data points.
– Once we have all pairwise geodesic distances, use classical metric MDS.

Floyd’s Algorithm-shortest path XInf 2X0X 3 X0X 4 X

Isomap: Algorithm
1. Determine the neighbors of each point: either all points within a fixed radius, or the K nearest neighbors.
2. Construct a neighborhood graph: each point is connected to another if it is a K nearest neighbor, with edge length equal to the Euclidean distance.
3. Compute the shortest paths between all pairs of nodes (Floyd's algorithm or Dijkstra's algorithm).
4. Construct a lower-dimensional embedding using classical MDS.
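Putting the four steps together, a sketch assuming scikit-learn and SciPy are available (it uses Dijkstra rather than Floyd for speed, and assumes the neighborhood graph is connected; otherwise infinite distances appear):

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph
from scipy.sparse.csgraph import shortest_path

def isomap(X, n_neighbors=10, d=2):
    """Isomap sketch: k-NN graph -> geodesic distances -> classical MDS."""
    # Steps 1-2: neighborhood graph with Euclidean edge lengths
    G = kneighbors_graph(X, n_neighbors, mode='distance')
    # Step 3: geodesic distances as graph shortest paths (Dijkstra)
    D = shortest_path(G, method='D', directed=False)
    # Step 4: classical MDS on the geodesic distance matrix
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J
    w, V = np.linalg.eigh(B)
    w, V = w[::-1][:d], V[:, ::-1][:, :d]
    return V * np.sqrt(np.maximum(w, 0))
```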

Isomap

Residual Variance
[Plots: residual variance vs. embedding dimension for face images, the Swiss roll, and hand images]

Locally Linear Embedding
"A manifold is a topological space which is locally Euclidean."
Fit locally, think globally.

Fit Locally…
We expect each data point and its neighbors to lie on or close to a locally linear patch of the manifold, so each point can be written as a linear combination of its neighbors, with the weights chosen to minimize the reconstruction error (derivation on board; the objective is sketched below).
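For reference, the reconstruction error being minimized over the weights (my notation, following Roweis and Saul):

```latex
E(W) \;=\; \sum_i \Big\lVert \mathbf{x}_i - \sum_j W_{ij}\,\mathbf{x}_j \Big\rVert^2,
\qquad \text{subject to } \sum_j W_{ij} = 1,\;\;
W_{ij} = 0 \text{ if } \mathbf{x}_j \text{ is not a neighbor of } \mathbf{x}_i.
```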

Important property…
– The weights that minimize the reconstruction errors are invariant to rotation, rescaling, and translation of the data points. Invariance to translation is enforced by the constraint that the weights sum to one.
– The same weights that reconstruct a data point in D dimensions should reconstruct it on the manifold in d dimensions: the weights characterize the intrinsic geometric properties of each neighborhood.

Think Globally…
Fix the weights and solve for the low-dimensional coordinates (derivation on board; the cost function is sketched below).
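The global step fixes the weights W and minimizes the embedding cost over the low-dimensional coordinates Y:

```latex
\Phi(Y) \;=\; \sum_i \Big\lVert \mathbf{y}_i - \sum_j W_{ij}\,\mathbf{y}_j \Big\rVert^2
\;=\; \operatorname{tr}\!\left( Y^{\top} M Y \right),
\qquad M = (I - W)^{\top} (I - W).
```

Under the usual centering and unit-covariance constraints on Y, the minimizer is given by the bottom d + 1 eigenvectors of the sparse matrix M, discarding the constant all-ones eigenvector.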

Grolier's Encyclopedia

Summary
ISOMAP:
– Does MDS on the geodesic distance matrix.
– Global approach.
– Dynamic programming (shortest-path) approaches.
– Convergence limited by the manifold curvature and the number of points.
LLE:
– Models local neighborhoods as linear patches and then embeds them in a lower-dimensional manifold.
– Local approach.
– Computationally efficient: sparse matrices.
– Good representational capacity.

Short-circuit problem?
– The only free parameter is the number of neighbors; how should neighborhoods be chosen?
– If the neighborhood is larger than the folds in the manifold, the method is susceptible to short-circuit errors and may be unstable.
– If the neighborhood is too small, we get isolated patches.

Open questions
– Does Isomap work on closed manifolds, or manifolds with holes? LLE may be better there.
– Is there a convergence proof for Isomap?
– How smooth should the manifold be?
– What about noisy data? Sparse data?
– How to choose K?

Conformal & Isometric Embedding

C-Isomap
– Isometric embedding: assumes an intrinsically flat manifold. What are the invariants? Geodesic distances are preserved; the data form a metric space under the geodesic distance.
– Conformal embedding: locally isometric up to a scale factor s(y); estimate s(y) and rescale, giving C-Isomap. The original data should be uniformly dense.
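If I recall de Silva and Tenenbaum's construction correctly (treat this as an assumption, not part of the slides), C-Isomap estimates the scale factor from the local average neighbor distance and rescales each neighborhood edge as

```latex
d'(i, j) \;=\; \frac{d(i, j)}{\sqrt{M(i)\,M(j)}},
```

where M(i) is the mean distance from point i to its k nearest neighbors, so densely sampled regions are stretched and sparsely sampled ones compressed before the shortest-path and MDS steps.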

Thank You! | Questions?