Distance Preserving Embeddings of Low-Dimensional Manifolds
Nakul Verma, UC San Diego

Study Manifold Embeddings Formally

Let's fix a desirable property: preserving geodesic distances. We are interested in the following question.

Given: a sample X from an n-dimensional manifold M, and an embedding procedure.
Define: the embedding is ε-isometric if, for all pairs of points, geodesic distances are preserved up to a multiplicative factor of (1 ± ε).

Questions:
I. Can one come up with an embedding procedure that achieves ε-isometry?
II. How much can one reduce the embedding dimension d and still have ε-isometry?
III. Do we need any restrictions on X or M?
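The ε-isometry criterion is easy to state concretely. A minimal sketch (the helper name and the pairwise-distance-matrix representation are my own, not from the talk):

```python
import numpy as np

def is_eps_isometric(d_orig, d_emb, eps):
    """Check whether embedded distances are within a (1 +/- eps) factor of
    the original geodesic distances (hypothetical helper; both arguments
    are square pairwise-distance matrices over the same point set)."""
    mask = d_orig > 0  # skip the zero diagonal
    ratio = d_emb[mask] / d_orig[mask]
    return bool(np.all((ratio >= 1 - eps) & (ratio <= 1 + eps)))

# A uniform 2% contraction is 0.05-isometric but not 0.01-isometric.
d = np.array([[0.0, 1.0], [1.0, 0.0]])
print(is_eps_isometric(d, 0.98 * d, eps=0.05))  # True
print(is_eps_isometric(d, 0.98 * d, eps=0.01))  # False
```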

Recent progress in this area

An interesting result: let M be an n-dimensional manifold with volume V and curvature bound 1/τ. Then projecting it onto a random linear subspace of dimension growing as n/ε² (with additional factors depending on V and τ) achieves ε-isometry with high probability. This does not need any samples from the underlying manifold! But the 1/ε² dependence is highly undesirable: to have all distances preserved within a factor of 99% (ε = 0.01) requires projection dimension > 10,000!
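A quick numerical illustration of this 1/ε² behavior. This is a toy Johnson–Lindenstrauss-style experiment (a sketch of the phenomenon, not the construction from the cited result; all sizes and the seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Points on a circle (a 1-d manifold) sitting in a high-dimensional space.
ambient, n_pts = 200, 80
t = rng.uniform(0, 2 * np.pi, n_pts)
plane = np.linalg.qr(rng.standard_normal((ambient, 2)))[0]  # random 2-plane
X = np.column_stack([np.cos(t), np.sin(t)]) @ plane.T

def max_distortion(X, d):
    """Worst pairwise-distance distortion under a random Gaussian projection
    to d dimensions (scaled by 1/sqrt(d) so norms are preserved on average)."""
    Y = X @ (rng.standard_normal((X.shape[1], d)) / np.sqrt(d))
    iu = np.triu_indices(len(X), 1)
    orig = np.linalg.norm(X[:, None] - X[None, :], axis=-1)[iu]
    proj = np.linalg.norm(Y[:, None] - Y[None, :], axis=-1)[iu]
    ratio = proj / orig
    return max(ratio.max() - 1, 1 - ratio.min())

# Distortion shrinks roughly like 1/sqrt(d): halving eps needs ~4x the dimension.
print(max_distortion(X, 8), max_distortion(X, 512))
```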

What we show

For any compact n-dimensional manifold M (with or without boundary), we present an algorithm that embeds M, achieving ε-isometry, in a number of dimensions that is independent of ε (using only samples from M). Only the required sample size is a function of ε.

The Algorithm

Embedding stage: find a representation of M in a lower-dimensional space, without worrying about maintaining any distances. Here we can use a random linear projection without penalty.

Correction stage: apply a series of corrections, each corresponding to a different region of the manifold, to restore the distances. This requires a bit of thinking…

Corrections

Say this is the manifold after the embedding stage: different regions are contracted by different amounts. We zoom in on a local region.

Corrections

Zoomed in on a local region: suppose we linearly stretch this local region. We cannot systematically attach the boundary of the stretched region back to the manifold…

Corrections

Zoomed in on a local region: instead, we locally twist the space! This creates the necessary stretch to restore the local distances.
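A toy one-dimensional illustration of why twisting restores length (the actual correction in the talk is more careful; the constants here are illustrative): a curve that was contracted to speed c can spiral into two extra ambient dimensions with amplitude r and frequency ω chosen so that c² + (rω)² = 1, making it unit speed again.

```python
import numpy as np

def arc_length(f, a=0.0, b=1.0, n=20000):
    """Numerical arc length of a curve f: [a, b] -> R^k (chord-sum estimate)."""
    pts = np.array([f(s) for s in np.linspace(a, b, n)])
    return np.linalg.norm(np.diff(pts, axis=0), axis=1).sum()

c = 0.6                         # the embedding contracted this region to 60% speed
omega = 200.0                   # twist frequency (higher -> smaller amplitude)
r = np.sqrt(1 - c**2) / omega   # amplitude chosen so that c^2 + (r*omega)^2 = 1

contracted = lambda t: np.array([c * t])
twisted = lambda t: np.array([c * t,
                              r * np.sin(omega * t),
                              r * np.cos(omega * t)])

print(arc_length(contracted))  # ~0.6: distances were shrunk
print(arc_length(twisted))     # ~1.0: the local twist restores unit length
```

Note the trade-off visible in the constants: a higher twist frequency ω lets the amplitude r shrink, which is why the twist can stay local, but it demands extra ambient dimensions to spiral into.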

Technical hurdles to look out for

Care needs to be taken so as not to create sharp (non-differentiable) edges on the boundary while locally twisting the space. Sufficient ambient space needs to be available to create the local twist. Interference between different corrections at overlapping localities needs to be reconciled.
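One standard tool for the first hurdle is a C-infinity bump function: it fades a local correction out to exactly zero with all derivatives vanishing at the boundary, so no sharp edges are created. (An illustrative sketch of the standard construction; the talk does not specify this exact window.)

```python
import numpy as np

def bump(t):
    """Smooth (C-infinity) bump window: equals 1 at t = 0, exactly 0 for
    |t| >= 1, and all derivatives vanish at the boundary -- so a correction
    faded out with it joins the untouched region without a kink."""
    t = np.asarray(t, dtype=float)
    out = np.zeros_like(t)
    inside = np.abs(t) < 1
    out[inside] = np.exp(1.0 - 1.0 / (1.0 - t[inside] ** 2))
    return out

vals = bump(np.array([0.0, 0.5, 1.0, 2.0]))
print(vals)  # [1.0, ~0.72, 0.0, 0.0]
```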

The algorithm at work

Theoretical Guarantee

Theorem: let M be a compact n-dimensional manifold with volume V and curvature bound 1/τ. For any ε > 0, let X be a sufficiently dense sample from M (with density depending on ε). Then, with high probability, our algorithm (given access to X) embeds any point of M into the target dimension with ε-isometry.

A quick proof overview

The goal is to prove that the geodesic distances between all pairs of points p and q in M are approximately preserved. Recall that the length of any curve γ is given by the expression

len(γ) = ∫ ‖γ′(s)‖ ds,

that is, the length of a curve is the infinitesimal sum of the lengths of the tangent vectors along its path. It therefore suffices to show that our algorithm preserves the lengths of all vectors tangent to M at all points of M.

A quick proof overview

From differential geometry, we know that for any smooth map F, the derivative, i.e. the pushforward map DF, describes how F acts on tangent vectors. We carefully analyze how each correction step of the algorithm changes the corresponding pushforward map.
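The pushforward's action on a tangent vector can be checked numerically. A small sketch (the finite-difference helper and the sphere parametrization are illustrative, not from the talk):

```python
import numpy as np

def pushforward(F, p, v, h=1e-6):
    """Apply the pushforward DF_p to a tangent vector v by central
    differences: DF_p v ~= (F(p + h v) - F(p - h v)) / (2 h)."""
    p, v = np.asarray(p, float), np.asarray(v, float)
    return (F(p + h * v) - F(p - h * v)) / (2 * h)

# F maps (latitude, longitude) coordinates onto the unit sphere in R^3.
F = lambda q: np.array([np.cos(q[0]) * np.cos(q[1]),
                        np.cos(q[0]) * np.sin(q[1]),
                        np.sin(q[0])])

p = np.array([0.3, 1.1])
# The latitude direction keeps unit length; the longitude direction is
# contracted by cos(latitude) -- exactly the kind of tangent-vector length
# change the proof tracks through each correction step.
print(np.linalg.norm(pushforward(F, p, np.array([1.0, 0.0]))))  # ~1.0
print(np.linalg.norm(pushforward(F, p, np.array([0.0, 1.0]))))  # ~cos(0.3)
```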

Conclusion and Implications

We gave the first sample-complexity result for approximately isometric embedding by a manifold learning algorithm. The novel algorithmic and analysis techniques are of independent interest. One can use an existing manifold learning algorithm as the 'embedding' step; the corrections in the second step then enhance the embedding to make it isometric, which makes this a universal procedure.

Questions / Discussion