Feb. 19, 2002 Kinh Tieu on LLE presentation


Feb. 19, 2002: Kinh Tieu on LLE
I'll talk a little about Bregler & Omohundro; Kinh Tieu will follow with computational experiments.
Next class: Josh Tenenbaum, guest lecturer, with Matt Grimes presenting computational examples.

Comments on the LLE paper
Note the form of the Science paper: two layers, with the main text up front and the details in footnotes. On p. 2325 the comparison with Isomap is very carefully described. The paper ends with a speculative bounce.

Bregler & Omohundro, "Surface Learning with Applications to Lipreading," NIPS 1993

Queries for manifolds
- Completion
- Nearest point
- Interpolation
- Extrapolation
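
These queries are easiest to see on a toy example. Below is a minimal sketch, my own illustration rather than anything from the talk or the paper, answering completion, nearest-point, and interpolation queries against a densely sampled 1-D curve in R^2; extrapolation would mean continuing the curve beyond the sampled range and is not shown.

```python
import numpy as np

# Toy manifold: a densely sampled 1-D curve in R^2, the points (t, sin t).
t = np.linspace(0.0, 2.0 * np.pi, 2000)
M = np.column_stack([t, np.sin(t)])

def nearest_point(q):
    """Nearest-point query: the manifold sample closest to an arbitrary q."""
    return M[np.argmin(np.linalg.norm(M - q, axis=1))]

def complete(x0):
    """Completion query: given only the first coordinate, fill in the rest."""
    return M[np.argmin(np.abs(M[:, 0] - x0))]

def interpolate(a, b, alpha=0.5):
    """Interpolation query: a point a fraction alpha of the way from a to b,
    measured along the curve (by sample index), not along the chord in R^2."""
    ia = np.argmin(np.linalg.norm(M - a, axis=1))
    ib = np.argmin(np.linalg.norm(M - b, axis=1))
    return M[int(round((1 - alpha) * ia + alpha * ib))]

print(nearest_point(np.array([3.0, 1.5])))  # snaps an off-manifold point onto the curve
print(complete(1.0))                        # roughly (1.0, sin 1.0)
```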

Goal of surface learning

Surface learning

Steps in surface learning
1. K-means clustering to find prototype centers.
2. PCA at each prototype, using a specified number of nearest neighbors.
3. Use the local variance information to place a membership Gaussian at each prototype center.
4. Projection onto the manifold surface: refine the model by minimizing the mean squared error between training samples and their nearest surface points, using EM and gradient descent.

A code sketch of these steps follows.
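
Here is a minimal sketch of steps 1, 2, and the nearest-surface-point projection, assuming numpy and scikit-learn; the membership Gaussians of step 3 and the EM/gradient-descent refinement of step 4 are omitted, and all function and parameter names are mine, not Bregler & Omohundro's.

```python
import numpy as np
from sklearn.cluster import KMeans

def fit_patches(X, n_prototypes=10, n_neighbors=20, dim=2):
    """Steps 1-2: k-means gives prototype centers; local PCA at each center
    (via SVD of its n_neighbors nearest samples) gives a linear patch."""
    centers = KMeans(n_clusters=n_prototypes, n_init=10).fit(X).cluster_centers_
    bases = []
    for c in centers:
        idx = np.argsort(np.linalg.norm(X - c, axis=1))[:n_neighbors]
        local = X[idx] - X[idx].mean(axis=0)
        _, _, Vt = np.linalg.svd(local, full_matrices=False)
        bases.append(Vt[:dim])  # top `dim` principal directions at this prototype
    return centers, np.array(bases)

def project(x, centers, bases):
    """Nearest-surface-point query: orthogonally project x onto every local
    linear patch and keep the closest projection (no Gaussian blending)."""
    best, best_d = None, np.inf
    for c, U in zip(centers, bases):
        p = c + U.T @ (U @ (x - c))  # orthogonal projection onto the patch at c
        d = np.linalg.norm(x - p)
        if d < best_d:
            best, best_d = p, d
    return best

# Example: noisy samples of a circle (a 1-D manifold in R^2), then projection.
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, 500)
X = np.column_stack([np.cos(theta), np.sin(theta)]) + 0.05 * rng.normal(size=(500, 2))
centers, bases = fit_patches(X, n_prototypes=8, n_neighbors=30, dim=1)
print(project(np.array([1.5, 0.0]), centers, bases))  # lands near (1, 0)
```

Keeping each patch purely linear makes the projection a closed-form least-squares step; the paper's Gaussian memberships would instead blend neighboring patches smoothly.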

Snakes (active contours), without manifold-based prior models

With manifold-based prior…