Non-Isometric Manifold Learning: Analysis and an Algorithm
Piotr Dollár, Vincent Rabaud, Serge Belongie
University of California, San Diego

I. Motivation
1. Extend manifold learning to applications other than embedding
2. Establish a notion of test error and generalization for manifold learning

Linear Manifolds (subspaces)
Typical operations:
- project onto the subspace
- distance to the subspace
- distance between points
- generalize to unseen regions
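For the linear case these operations have simple closed forms. A minimal NumPy sketch (the random subspace and point here are illustrative, not from the talk): with an orthonormal basis U for the subspace, projection is U Uᵀx and the distance to the subspace is the norm of the residual.

```python
import numpy as np

rng = np.random.default_rng(0)
D, d = 5, 2

# Orthonormal basis U (D x d) for a random d-dimensional subspace.
A = rng.standard_normal((D, d))
U, _ = np.linalg.qr(A)

x = rng.standard_normal(D)

# Project onto the subspace: x_proj = U U^T x.
x_proj = U @ (U.T @ x)

# Distance to the subspace: norm of the residual (which is
# orthogonal to the subspace).
dist = np.linalg.norm(x - x_proj)

# Distance between two points: plain Euclidean distance, valid
# everywhere on a linear manifold.
y = rng.standard_normal(D)
print(dist, np.linalg.norm(x - y))
```

Because projection is a linear map, these formulas generalize trivially to unseen regions, which is exactly what breaks down in the nonlinear case below.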

Non-Linear Manifolds
Desired operations:
- project onto the manifold
- distance to the manifold
- geodesic distance
- generalize to unseen regions

II. Locally Smooth Manifold Learning (LSML)
Represent the manifold by its tangent space.
Related work:
- Non-Local Manifold Tangent Learning [Bengio et al., NIPS 2005]
- Learning to Traverse Image Manifolds [Dollár et al., NIPS 2006]

Learning the tangent space
Data lies on a d-dimensional manifold embedded in D-dimensional space.

Learning the tangent space
Learn a function mapping each point on the manifold to a basis for the tangent space at that point.

Loss function
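The loss equation on this slide did not survive transcription. A hedged sketch of an LSML-style objective, under the assumption that the learned function H maps a point to a D×d tangent basis and that the displacement to each neighbor should be well approximated by some linear combination of that basis (the function name and toy data below are illustrative):

```python
import numpy as np

def lsml_style_loss(H, X, neighbor_pairs):
    """Sketch of an LSML-style loss: for each neighbor pair (i, j),
    the displacement x_j - x_i should lie (approximately) in the span
    of the learned tangent basis H(x_i), a D x d matrix. The best
    coefficients are the least-squares fit, and the loss accumulates
    the squared residual of that fit over all pairs."""
    total = 0.0
    for i, j in neighbor_pairs:
        delta = X[j] - X[i]                    # observed displacement
        basis = H(X[i])                        # D x d tangent basis at x_i
        coef, *_ = np.linalg.lstsq(basis, delta, rcond=None)
        total += np.sum((basis @ coef - delta) ** 2)
    return total

# Toy check: points on a line in 3-D (d = 1, D = 3); a function
# returning the true tangent direction should achieve (near) zero loss.
t = np.linspace(0.0, 1.0, 5)
X = np.outer(t, [1.0, 2.0, 3.0])
H = lambda x: np.array([[1.0], [2.0], [3.0]])
pairs = [(i, i + 1) for i in range(4)]
loss = lsml_style_loss(H, X, pairs)
print(loss)  # approximately 0 for the true tangents
```

In the actual method H is a parametric regressor trained by minimizing such a residual over all neighborhoods, which lets tangents be predicted at points never seen during training.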

Optimization procedure

III. Analyzing Manifold Learning Methods
Need an evaluation methodology to:
- objectively compare methods
- control model complexity
- extend to non-isometric manifolds

Evaluation metric
Applicable to manifolds that can be densely sampled.

Finite Sample Performance
- Performance: LSML > ISOMAP > MVU >> LLE
- Applicability: LSML >> LLE > MVU > ISOMAP

Model Complexity
All methods have at least one parameter: the neighborhood size k.
This induces a bias-variance tradeoff.

LSML test error

IV. Using the Tangent Space
- projection
- manifold de-noising
- geodesic distance
- generalization

Projection
x' is the projection of x onto the manifold if it minimizes the distance ||x - x'||² subject to x' lying on the manifold. The equations on this slide were lost in transcription; in outline, gradient descent is performed after initializing x' to some point on the manifold, with each update restricted to the tangent directions at the current x' (the learned tangent basis at the projection acts as a projection matrix for the gradient).
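The procedure above can be sketched as follows. This is a minimal illustration, not the talk's implementation: the function names are invented, the manifold is a unit circle with known tangents, and a `retract` step (renormalization) stands in for staying on the manifold after each tangent move.

```python
import numpy as np

def project_to_manifold(x, x0, tangent, retract, steps=200, lr=0.05):
    """Gradient descent on ||x - x'||^2, starting from a manifold
    point x0. Each step keeps only the component of the gradient in
    the tangent space (tangent(x') returns a D x d orthonormal basis),
    then retracts the iterate back onto the manifold."""
    xp = x0.copy()
    for _ in range(steps):
        U = tangent(xp)                      # D x d tangent basis at x'
        grad = 2.0 * (xp - x)                # gradient of the objective
        xp = retract(xp - lr * (U @ (U.T @ grad)))
    return xp

# Toy manifold: the unit circle in the plane, with known tangents.
def circle_tangent(p):
    t = np.array([-p[1], p[0]])
    return (t / np.linalg.norm(t)).reshape(2, 1)

circle_retract = lambda p: p / np.linalg.norm(p)

x = np.array([2.0, 0.5])                     # point off the manifold
x0 = np.array([0.0, 1.0])                    # initialization on the circle
xp = project_to_manifold(x, x0, circle_tangent, circle_retract)
print(xp)  # should approach x / ||x||, the true projection
```

In the learned setting the tangent basis would come from the trained LSML function rather than a closed form, but the descent loop is the same.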

Manifold de-noising
Goal: recover points on the manifold from points corrupted by noise (the slide showed original and de-noised points color-coded).

Geodesic distance
Shortest path between two points, constrained to lie on the manifold, found by gradient descent on the path length.
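A hedged sketch of this idea: discretize a path between the two endpoints, then iteratively shorten it (a gradient step on segment lengths pulls each interior point toward the midpoint of its neighbors) while retracting every point back onto the manifold. The function name and the unit-circle toy manifold are illustrative assumptions.

```python
import numpy as np

def geodesic_length(a, b, retract, n=50, iters=500, lr=0.1):
    """Estimate geodesic distance by path shortening: initialize with
    the straight line from a to b, retract it onto the manifold, then
    repeatedly smooth the interior points (gradient descent on the sum
    of squared segment lengths) and retract again. Returns the length
    of the resulting discrete path."""
    path = np.linspace(0.0, 1.0, n)[:, None] * (b - a) + a
    path = np.array([retract(p) for p in path])
    for _ in range(iters):
        # Move each interior point toward the midpoint of its neighbors
        # (the negative gradient direction of the path energy).
        interior = path[1:-1] + lr * (path[:-2] + path[2:] - 2 * path[1:-1])
        path[1:-1] = np.array([retract(p) for p in interior])
    return np.sum(np.linalg.norm(np.diff(path, axis=0), axis=1))

# Toy example: on the unit circle, the geodesic from (1, 0) to (0, 1)
# is the quarter arc of length pi/2.
retract = lambda p: p / np.linalg.norm(p)
length = geodesic_length(np.array([1.0, 0.0]), np.array([0.0, 1.0]), retract)
print(length)  # close to pi/2
```

With a learned tangent space, the retraction would instead be a projection step using the predicted tangents, so the same loop applies to manifolds known only through samples.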

Geodesic distance
- embedding of sparse and structured data
- can apply to non-isometric manifolds

Generalization
- Generalization beyond the support of the training data
- Reconstruction within the support of the training data

Thank you!