ISOMAP TRACKING WITH PARTICLE FILTER Presented by Nikhil Rane

Dimensionality Reduction Let x_i be H-dimensional and y_i be L-dimensional; dimensionality reduction solves the problem x_i = f(y_i), where H > L.

Dimensionality Reduction Techniques Linear: PCA – transforms the data into a new coordinate system so that the largest variance lies along the 1st dimension, the 2nd largest along the 2nd dimension, and so on. Classical MDS – preserves Euclidean distances between points. Nonlinear: Isomap – preserves geodesic distances between points. LLE – preserves local configurations in the data.

Face Database

Principal Components Analysis (PCA) 1) Make the mean of the data zero 2) Compute covariance matrix C 3) Compute eigenvalues and eigenvectors of C 4) Choose the principal components 5) Generate low-dimensional points using principal components
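A minimal NumPy sketch of these five steps (the function and variable names are illustrative, not from the slides):

import numpy as np

def pca(X, d):
    # X is an N x H data matrix; returns N x d low-dimensional points
    Xc = X - X.mean(axis=0)               # 1) make the mean of the data zero
    C = np.cov(Xc, rowvar=False)          # 2) H x H covariance matrix C
    evals, evecs = np.linalg.eigh(C)      # 3) eigenvalues and eigenvectors of C
    order = np.argsort(evals)[::-1][:d]   # 4) choose the d principal components
    return Xc @ evecs[:, order]           # 5) project onto the principal components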

Performance of PCA on Face Data

Classical Multidimensional Scaling (MDS) Compute the matrix S of squared pairwise distances. Compute the inner-product matrix B = -0.5 JSJ, where J = I_N – (1/N)11^T. Decompose B into eigenvectors and eigenvalues. Use the top d eigenvectors and eigenvalues to form the d-dimensional embedding.
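A NumPy sketch in the same notation (a minimal implementation, not the presenter's code):

import numpy as np

def classical_mds(S, d):
    # S is an N x N matrix of squared pairwise distances
    N = S.shape[0]
    J = np.eye(N) - np.ones((N, N)) / N        # J = I_N - (1/N)11^T
    B = -0.5 * J @ S @ J                       # inner-product matrix
    evals, evecs = np.linalg.eigh(B)
    order = np.argsort(evals)[::-1][:d]        # top-d eigenpairs
    scale = np.sqrt(np.maximum(evals[order], 0.0))
    return evecs[:, order] * scale             # N x d embedding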

Performance of MDS on Face Data

Locally Linear Embedding (LLE) Find neighbors of each data point Compute weights that best reconstruct each data point from its neighbors Compute low-dimensional vectors best reconstructed by the weights
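A short sketch of these three steps using scikit-learn's implementation (the library choice is ours; the slides do not name one):

import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

X = np.random.rand(500, 100)    # stand-in for 500 H-dimensional data points
lle = LocallyLinearEmbedding(n_neighbors=10, n_components=2)
Y = lle.fit_transform(X)        # 500 x 2 low-dimensional vectors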

Performance of LLE on Face Data

Geodesic Distance Geodesic distance – the length of the shortest curve between two points taken along the surface of a manifold

Isometric Feature Mapping (Isomap) Construct neighborhood graph Compute shortest paths between points Apply classical MDS
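The three steps, sketched with SciPy's shortest-path routine and reusing the classical_mds function from the MDS slide (the neighborhood size k and a connected graph are assumptions):

import numpy as np
from sklearn.neighbors import kneighbors_graph
from scipy.sparse.csgraph import shortest_path

def isomap(X, d, k=7):
    # 1) construct the k-nearest-neighbor graph with Euclidean edge weights
    G = kneighbors_graph(X, n_neighbors=k, mode='distance')
    # 2) all-pairs shortest paths approximate the geodesic distances
    #    (assumes the graph is connected; otherwise D contains inf)
    D = shortest_path(G, method='D', directed=False)   # Dijkstra
    # 3) apply classical MDS to the squared geodesic distances
    return classical_mds(D ** 2, d)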

Performance of Isomap on Face Data

Tracking vs. Detection Detection – locating an object independently of past information. Used when motion is unpredictable, or for reacquisition of a lost target. Tracking – locating an object based on past information. Saves computation time.

Recursive Bayesian Framework Estimate the pdf of the state at time t given the pdf of the state at time t – 1 and the measurement at time t. Predict: predict the state of the system at time t using a system model and the pdf from time t – 1. Update: update the predicted state using the measurement at time t by Bayes' rule.
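In standard notation (x_t the state, z_t the measurement; the symbols are ours, not the slides'), the two steps are:

\begin{aligned}
\text{Predict:}\quad & p(x_t \mid z_{1:t-1}) = \int p(x_t \mid x_{t-1})\, p(x_{t-1} \mid z_{1:t-1})\, dx_{t-1} \\
\text{Update:}\quad & p(x_t \mid z_{1:t}) \propto p(z_t \mid x_t)\, p(x_t \mid z_{1:t-1})
\end{aligned}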

Kalman Filtering vs. Particle Filtering The Kalman filter assumes the pdf of the state to be Gaussian at all times and requires the measurement and process noise to be Gaussian. The particle filter makes no such assumption; it approximates the pdf with a set of weighted samples at every time step.

Resampling
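The slide's figure is not reproduced in the transcript; below is a minimal sketch of one common scheme, systematic resampling (the deck may use a different variant):

import numpy as np

def systematic_resample(particles, weights):
    # draw N indices with probability proportional to the normalized weights
    N = len(particles)
    positions = (np.arange(N) + np.random.uniform()) / N
    cumsum = np.cumsum(weights)
    cumsum[-1] = 1.0                              # guard against round-off
    idx = np.searchsorted(cumsum, positions)
    return particles[idx], np.full(N, 1.0 / N)    # equal weights after resampling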

Condensation algorithm Algorithm – 1) Resample 2) Predict 3) Measure
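One iteration of the loop, sketched around the systematic_resample function above (predict and likelihood are assumed callables: the motion model and the image-based scoring function, respectively):

import numpy as np

def condensation_step(particles, weights, predict, likelihood):
    particles, weights = systematic_resample(particles, weights)  # 1) resample
    particles = predict(particles)                                # 2) predict
    weights = np.array([likelihood(p) for p in particles])        # 3) measure
    return particles, weights / weights.sum()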

Isomap Tracking with Particle Filtering Create training set of a person’s face (off-line) Use Isomap to reduce dimensionality of the training set (off-line) Run particle filter on test sequence to track the person

Training Data

Isomap of Training Data

Isomap Discrepancy Isomap recovered a dimensionality of 2 when the head poses moving upward were removed. The dimensionality of 3 recovered from the full training data can therefore be attributed to the asymmetry of the face about the horizontal axis.

Weighting Particles by SSD
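A sketch of one common way to turn an SSD score into a particle weight (the exponential mapping and sigma are our assumptions; the slide only states that particles are weighted by SSD):

import numpy as np

def ssd_weight(patch, template, sigma=10.0):
    # sum of squared differences between the particle's image patch and the template
    ssd = np.sum((patch.astype(float) - template.astype(float)) ** 2)
    # smaller SSD gives a larger weight; sigma controls the sharpness
    return np.exp(-ssd / (2.0 * sigma ** 2))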

Weighting Particles by Chamfer distance
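A sketch of Chamfer-distance scoring via a distance transform (assuming binary edge maps, with the template edges already shifted to the particle's hypothesized location):

import numpy as np
from scipy.ndimage import distance_transform_edt

def chamfer_distance(template_edges, image_edges):
    # distance from every pixel to the nearest image edge pixel
    dt = distance_transform_edt(~image_edges.astype(bool))
    # average that distance over the template's edge pixels
    return dt[template_edges.astype(bool)].mean()

As with SSD, the distance can then be mapped to a weight, e.g. np.exp(-chamfer_distance(...) / sigma).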

State evolution without resampling

State evolution with resampling

Experimental Results

Videos

Videos Continued

Conclusion and Future Work Isomap provides a good framework for pose estimation. The algorithm can track and estimate a person's pose at the same time. The use of a particle filter allows a parallel implementation. The goal is to be able to build an Isomap on-line so that the particle-filter tracker can learn as it tracks.

Thank You!