Manifold Learning Using Geodesic Entropic Graphs Alfred O. Hero and Jose Costa Dept. EECS, Dept. Biomed. Eng., Dept. Statistics University of Michigan - Ann Arbor

Presentation transcript:

Manifold Learning Using Geodesic Entropic Graphs
Alfred O. Hero and Jose Costa
Dept. EECS, Dept. Biomed. Eng., Dept. Statistics, University of Michigan - Ann Arbor
Research supported in part by: ARO-DARPA MURI DAAD
Outline:
1. Manifold Learning and Dimension Reduction
2. Entropic Graphs
3. Examples

1. Dimension Reduction and Pattern Matching
128x128 images of three vehicles (HMMWV, T62, truck) over 1-deg increments of 360-deg azimuth at 0-deg elevation. The 3 x 360 = 1080 images evolve on a lower-dimensional manifold embedded in R^16384. Courtesy of the Center for Imaging Science, JHU.

Land Vehicle Image Manifold
Quantities of interest:
- Entropy of the sampling distribution
- Manifold (intrinsic) dimension: d
- Embedding (extrinsic) dimension: D

Sampling on a Domain Manifold
A statistical sample is drawn from a sampling distribution on a 2-dim domain manifold and carried into the high-dimensional observation space by an embedding. Assumption: the embedding is a conformal mapping.

Background on Manifold Learning
1. Manifold intrinsic dimension estimation
   - Local KLE, Fukunaga, Olsen (1971)
   - Nearest neighbor algorithm, Pettis, Bailey, Jain, Dubes (1979)
   - Fractal measures, Camastra and Vinciarelli (2002)
   - Packing numbers, Kegl (2002)
2. Manifold reconstruction
   - Isomap-MDS, Tenenbaum, de Silva, Langford (2000)
   - Locally Linear Embedding (LLE), Roweis, Saul (2000)
   - Laplacian eigenmaps (LE), Belkin, Niyogi (2002)
   - Hessian eigenmaps (HE), Grimes, Donoho (2003)
3. Characterization of sampling distributions on manifolds
   - Statistics of directional data, Watson (1956), Mardia (1972)
   - Data compression on 3D surfaces, Kolarov, Lynch (1997)
   - Statistics of shape, Kendall (1984), Kent, Mardia (2001)

2. Entropic Graphs
Figure: a planar sample and its Euclidean MST.

MST and Geodesic MST
For a set of points $X_n = \{x_1, \ldots, x_n\}$ in $D$-dimensional Euclidean space, the Euclidean MST with edge power weighting $\gamma$ is defined by the minimal total power-weighted edge length over all spanning trees $T$ of $X_n$:

$L_\gamma(X_n) = \min_T \sum_{e \in T} |e|^\gamma, \quad 0 < \gamma < D.$

When pairwise distances are geodesic distances on the manifold $M$, one obtains the geodesic MST (GMST). For dense samplings the GMST length approaches the Euclidean MST length, since MST edges connect near neighbors, where geodesic and Euclidean distances agree.
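This construction is simple to prototype. Below is a minimal sketch in Python, not the authors' implementation: it computes the power-weighted Euclidean MST length with scipy's minimum_spanning_tree, and approximates geodesic distances Isomap-style by shortest paths through a k-nearest-neighbor graph. The helper names, the default gamma = 1, and the neighborhood size k are assumptions of this sketch; k must be large enough that the k-NN graph is connected, or the shortest-path matrix will contain infinities.

    import numpy as np
    from scipy.sparse.csgraph import minimum_spanning_tree, shortest_path
    from scipy.spatial.distance import pdist, squareform
    from sklearn.neighbors import kneighbors_graph

    def euclidean_mst_length(X, gamma=1.0):
        """Total power-weighted edge length L_gamma of the Euclidean MST of X."""
        D = squareform(pdist(X))          # dense pairwise Euclidean distances
        T = minimum_spanning_tree(D)      # sparse matrix of MST edge lengths
        return float(np.sum(T.data ** gamma))

    def geodesic_mst_length(X, gamma=1.0, k=8):
        """GMST length: MST over k-NN-graph shortest-path (geodesic) distances."""
        G = kneighbors_graph(X, n_neighbors=k, mode="distance")
        D_geo = shortest_path(G, directed=False)  # Isomap-style geodesics
        return float(np.sum(minimum_spanning_tree(D_geo).data ** gamma))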

Convergence of Euclidean MST
Beardwood, Halton, Hammersley Theorem: for $X_n = \{x_1, \ldots, x_n\}$ i.i.d. with density $f$ on $\mathbb{R}^d$, $d \ge 2$, $0 < \gamma < d$,

$\lim_{n \to \infty} \frac{L_\gamma(X_n)}{n^{(d-\gamma)/d}} = \beta_{d,\gamma} \int f^{(d-\gamma)/d}(x)\, dx \quad \text{(a.s.)}$

where $\beta_{d,\gamma}$ is a constant depending only on $d$ and $\gamma$.
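As a quick numerical illustration of this scaling (again a sketch, reusing the assumed euclidean_mst_length helper from above): for uniform samples on the unit square, $d = 2$ and $\gamma = 1$, so $L_\gamma(X_n)/\sqrt{n}$ should stabilize around a constant as n grows.

    import numpy as np

    rng = np.random.default_rng(1)
    for n in [500, 1000, 2000, 4000]:
        X = rng.random((n, 2))                          # uniform on [0,1]^2
        print(n, euclidean_mst_length(X) / np.sqrt(n))  # ratio should flatten out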

Convergence Theorem for GMST
For i.i.d. samples drawn from a density $f$ on a smooth compact $d$-dimensional manifold $M$, the analogous limit holds with the intrinsic dimension $d$ in place of the extrinsic dimension $D$:

$\lim_{n \to \infty} \frac{L_\gamma^{\mathrm{GMST}}(X_n)}{n^{(d-\gamma)/d}} = \beta_{d,\gamma} \int_M f^{(d-\gamma)/d}(x)\, \mu(dx) \quad \text{(a.s.)}$

Ref: Costa & Hero, IEEE T-SP, 2003.

Special Cases
- Isometric embedding (distance preserving)
- Conformal embedding (angle preserving)

Joint Estimation Algorithm
The convergence theorem suggests the log-linear model

$\log L_\gamma(X_n) \approx a \log n + b, \quad a = \frac{d-\gamma}{d}, \quad b = \log \beta_{d,\gamma} + \frac{\gamma}{d} H_\alpha(f), \quad \alpha = \frac{d-\gamma}{d}.$

Use bootstrap resampling to estimate the mean MST length at a sequence of sample sizes, and apply least squares (LS) to jointly estimate the slope and intercept. Extract $d$ from the slope, $\hat d = \gamma/(1-\hat a)$, and the alpha-entropy $H$ from the intercept.
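A hedged sketch of this estimation loop, assuming the mst_length helpers above: subsample at a grid of sizes, average the graph length over bootstrap replicates, fit the log-linear model by LS, and invert the slope and intercept. The constant $\beta_{d,\gamma}$ is only known approximately; log_beta = 0 below is a placeholder, so the entropy estimate is offset by $(d/\gamma)\log\beta_{d,\gamma}$ unless a value is supplied.

    import numpy as np

    def estimate_dim_entropy(X, mst_length, n_grid, gamma=1.0, n_boot=20,
                             log_beta=0.0):
        rng = np.random.default_rng(0)
        mean_log_L = []
        for n in n_grid:
            # bootstrap: average MST length over random size-n subsamples
            L = [mst_length(X[rng.choice(len(X), size=n, replace=False)], gamma)
                 for _ in range(n_boot)]
            mean_log_L.append(np.log(np.mean(L)))
        # LS fit of log L = a log n + b
        a, b = np.polyfit(np.log(n_grid), mean_log_L, 1)
        d_hat = int(round(gamma / (1.0 - a)))     # slope: a = (d - gamma)/d
        H_hat = (d_hat / gamma) * (b - log_beta)  # intercept: b = log beta + (gamma/d) H
        return d_hat, H_hat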

3. Examples
Figure: random samples on the Swiss roll. Ref: Tenenbaum et al. (2000).
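Putting the sketches together on the Swiss roll (make_swiss_roll is scikit-learn's generator for this surface; all other names are the assumed helpers defined above):

    from sklearn.datasets import make_swiss_roll

    X, _ = make_swiss_roll(n_samples=2000, random_state=0)
    d_hat, H_hat = estimate_dim_entropy(X, geodesic_mst_length,
                                        n_grid=[100, 200, 400, 800, 1600])
    print(d_hat)  # expected to land near the true intrinsic dimension d = 2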

Bootstrap Estimates of GMST Length
Figure: bootstrap estimates of GMST length with bootstrap SE bars (83% CI).

Log-Log Linear Fit to GMST Length

Dimension and Entropy Estimates
From the LS fit:
- Intrinsic dimension estimate $\hat d$
- Alpha-entropy estimate $\hat H_\alpha$
- Ground truth: the Swiss roll is a $d = 2$ dimensional manifold

Dimension Estimation Comparisons

Application to Faces
Yale Face Database B:
- Photographic folios of many people's faces
- Each face folio contains images at 585 different illumination/pose conditions
- Subsampled to 64 x 64 pixels (4096 extrinsic dimensions)
Objective: determine the intrinsic dimension and entropy of a typical face folio.

GMST for 3 Face Folios
Figure: GMST over three face folios. Ref: Costa & Hero (2003).

Conclusions: Advantages of Geodesic Entropic Graph Methods
Characterizing high-dimensional sampling distributions:
- Standard techniques (histograms, density estimation) fail due to the curse of dimensionality
- Entropic graphs can be used to construct consistent estimators of entropy and information divergence
- Robustification to outliers via pruning
Manifold learning and model reduction:
- LLE, LE, HE estimate d by finding a local linear representation of the manifold
- Entropic graphs estimate d by global resampling
- Computational complexity of the MST is only O(n log n)

References
- A. O. Hero, B. Ma, O. Michel and J. D. Gorman, "Applications of entropic spanning graphs," IEEE Signal Processing Magazine, Sept. 2002.
- H. Neemuchwala, A. O. Hero and P. Carson, "Entropic graphs for image registration," to appear in European Journal of Signal Processing.
- J. Costa and A. O. Hero, "Manifold learning with geodesic minimal spanning trees," accepted in IEEE T-SP (Special Issue on Machine Learning).
- A. O. Hero, J. Costa and B. Ma, "Convergence rates of minimal graphs with random vertices," submitted to IEEE T-IT.
- J. Costa, A. O. Hero and C. Vignat, "On solutions to multivariate maximum alpha-entropy problems," in Energy Minimization Methods in Computer Vision and Pattern Recognition (EMM-CVPR), Eds. M. Figueiredo, A. Rangarajan, J. Zerubia, Springer-Verlag, 2003.