1 Learning from Shadows Dimensionality Reduction and its Application in Artificial Intelligence, Signal Processing and Robotics Ali Ghodsi Department of Statistics and Actuarial Science University of Waterloo October 2006

2

3 Dimensionality Reduction

4

5 Manifold and Hidden Variables

6 Data Representation

7

9

10

11

12

13

14

15

16 Hastie, Tibshirani, Friedman 2001

17 The Big Picture

18 Uses of Dimensionality Reduction (Manifold Learning)

19 Denoising Mika et al. 1999; Zhu and Ghodsi 2005

20 Tenenbaum, de Silva, Langford 2000

21 Roweis and Saul 2000

22 Arranging words: each word was initially represented by a high-dimensional vector that counted the number of times it appeared in different encyclopedia articles. Words with similar contexts are collocated. Roweis and Saul 2000
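A minimal sketch of the pipeline this slide describes, assuming scikit-learn is available; the toy corpus, the neighborhood size, and the output dimension are illustrative stand-ins, not the settings used by Roweis and Saul.

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.manifold import LocallyLinearEmbedding

# Toy stand-ins for encyclopedia articles.
articles = [
    "the piano and violin are musical instruments",
    "the violin player joined the orchestra",
    "planets orbit the sun in the solar system",
    "the telescope observed a distant planet",
]

# Count matrix: one row per article, one column per word.
counts = CountVectorizer().fit_transform(articles).toarray()

# The words are the data points, so transpose: one row per word,
# one column per article (the word's context profile).
word_vectors = counts.T.astype(float)

# Embed the word vectors in 2-D with locally linear embedding;
# words with similar context profiles should land near each other.
lle = LocallyLinearEmbedding(n_neighbors=3, n_components=2)
embedding = lle.fit_transform(word_vectors)
```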

23 Hinton and Roweis 2002

24 Embedding of Sparse Music Similarity Graph Platt, 2004

25 Pattern Recognition Ghodsi, Huang, Schuurmans 2004

26 Pattern Recognition

27 Clustering

28 Glasses vs. No Glasses

29 Beard vs. No Beard

30 Beard Distinction Ghodsi, Wilkinson, Southey 2006

31 Glasses Distinction

32 Multiple-Attribute Metric

33 Reinforcement Learning Mahadevan and Maggioni, 2005

34 Semi-supervised Learning Use a graph-based discretization of the manifold to infer missing labels. Build classifiers from the bottom eigenvectors of the graph Laplacian. Belkin & Niyogi, 2004; Zien et al. (Eds.), 2005
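The recipe on this slide as a minimal sketch in the style of Belkin & Niyogi (2004). NumPy and scikit-learn are assumed; the neighborhood size k, the number of eigenvectors m, and the least-squares fit are illustrative choices, not the paper's exact algorithm.

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph

def laplacian_features(X, k=10, m=5):
    """Bottom m eigenvectors of the graph Laplacian of a kNN graph on X."""
    W = kneighbors_graph(X, n_neighbors=k, mode="connectivity")
    W = 0.5 * (W + W.T).toarray()          # symmetrized adjacency
    L = np.diag(W.sum(axis=1)) - W         # unnormalized Laplacian D - W
    _, vecs = np.linalg.eigh(L)            # eigenvalues in ascending order
    return vecs[:, :m]                     # smoothest functions on the graph

def infer_labels(X, y, labeled, k=10, m=5):
    """y holds +/-1 labels at the indices in `labeled`; others are unknown."""
    E = laplacian_features(X, k, m)
    # Least-squares fit of the known labels in the eigenvector basis.
    coef, *_ = np.linalg.lstsq(E[labeled], y[labeled], rcond=None)
    return np.sign(E @ coef)               # inferred labels for every point
```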

35 Learning Correspondences How can we learn manifold structure that is shared across multiple data sets? Ham et al., 2003, 2005
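One hedged reading of how such shared structure can be learned, in the spirit of Ham et al.: embed both data sets with a single joint graph whose extra edges pull known corresponding pairs together. The penalty weight mu, the neighborhood size, and all names below are mine, not the paper's.

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph

def align(X, Y, pairs, k=5, m=2, mu=10.0):
    """pairs: (i, j) tuples stating that X[i] corresponds to Y[j]."""
    def laplacian(Z):
        W = kneighbors_graph(Z, n_neighbors=k, mode="connectivity")
        W = 0.5 * (W + W.T).toarray()
        return np.diag(W.sum(axis=1)) - W

    nx, ny = len(X), len(Y)
    # Within-set smoothness on the diagonal blocks of a joint Laplacian.
    L = np.zeros((nx + ny, nx + ny))
    L[:nx, :nx] = laplacian(X)
    L[nx:, nx:] = laplacian(Y)
    # Quadratic penalty mu * (f_a - f_b)^2 tying corresponding points.
    for i, j in pairs:
        a, b = i, nx + j
        L[a, a] += mu; L[b, b] += mu
        L[a, b] -= mu; L[b, a] -= mu
    _, vecs = np.linalg.eigh(L)
    F = vecs[:, 1:m + 1]          # skip the constant eigenvector
    return F[:nx], F[nx:]         # shared coordinates for X and Y
```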

36 Mapping and Robot Localization Bowling, Ghodsi, Wilkinson 2005; Ham, Lin, Lee 2005

37 Action Respecting Embedding Joint Work with Michael Bowling and Dana Wilkinson

38 Modelling Temporal Data and Actions

39 Outline
Background
– PCA
– Kernel PCA
Action Respecting Embedding (ARE)
– Prediction and Planning
– Probabilistic Actions
Future Work

40 Principal Component Analysis (PCA)

41 Principal Component Analysis (PCA)
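Since PCA anchors the background half of the talk, a minimal sketch via the singular value decomposition of the centered data matrix; the function name and interface are illustrative.

```python
import numpy as np

def pca(X, d):
    """Project the rows of X onto the top d principal components."""
    Xc = X - X.mean(axis=0)                           # center each feature
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:d].T                              # low-dimensional coordinates
```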

42 Kernel Methods

43 Kernel Trick
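A one-assertion illustration of the trick: a polynomial kernel returns the feature-space inner product without ever constructing the feature map. The explicit degree-2 map below is a standard textbook example, not taken from the slides.

```python
import numpy as np

def phi(x):
    """Explicit feature map for the kernel k(x, z) = (x . z)^2 in 2-D."""
    x1, x2 = x
    return np.array([x1**2, x2**2, np.sqrt(2) * x1 * x2])

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])

lhs = phi(x) @ phi(z)      # inner product computed in feature space
rhs = (x @ z) ** 2         # same number from the kernel alone
assert np.isclose(lhs, rhs)
```

The same identity is what lets kernel PCA and the embedding methods in this talk work in very high-dimensional feature spaces at the cost of only an n-by-n kernel matrix.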

44 Observed, Feature and Embedded Spaces

45 Kernel PCA
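A minimal kernel PCA sketch: double-center the kernel matrix so the feature-space data have zero mean, then use scaled eigenvectors as coordinates. The RBF kernel and gamma are placeholder choices.

```python
import numpy as np

def kernel_pca(X, d, gamma=1.0):
    # RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))
    # Double-center: Kc = H K H with H = I - (1/n) 11^T.
    n = len(X)
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    vals, vecs = np.linalg.eigh(Kc)                   # ascending order
    vals, vecs = vals[::-1][:d], vecs[:, ::-1][:, :d]
    return vecs * np.sqrt(np.maximum(vals, 0))        # embedded coordinates
```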

46 Problem

47 Idea

48 Action Respecting Embedding (ARE)

49 Action Respecting Constraint
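Roughly, the action-respecting constraint requires each action to act as a distance-preserving map on the learned embedding y_1, ..., y_n; the notation below is mine, not necessarily the paper's.

```latex
% If the same action was taken at steps i and j, it must preserve distances:
a_i = a_j \;\Longrightarrow\;
\lVert y_{i+1} - y_{j+1} \rVert = \lVert y_i - y_j \rVert ,
% which is linear in the Gram matrix K = Y^\top Y of the embedding:
K_{i+1,\,i+1} - 2K_{i+1,\,j+1} + K_{j+1,\,j+1} = K_{ii} - 2K_{ij} + K_{jj}.
```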

50 Local Distances Constraint Preserve distances between each point and its k nearest neighbors.

51 Local Distances Constraint Preserve local distances.

52 Semidefinite Programming
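A hedged sketch of the kind of semidefinite program slides 49-52 assemble, written with cvxpy (an assumed dependency; any SDP solver would do). Only the centering and local-distances constraints are shown; the action-respecting equalities from slide 49 would be appended to the same constraint list.

```python
import numpy as np
import cvxpy as cp

def unfold(G, nbrs):
    """G: Gram matrix of the observed data; nbrs[i]: k nearest neighbors of i."""
    n = G.shape[0]
    K = cp.Variable((n, n), PSD=True)        # Gram matrix of the embedding
    cons = [cp.sum(K) == 0]                  # center the embedded points
    for i in range(n):
        for j in nbrs[i]:                    # local distances constraint
            cons.append(K[i, i] - 2 * K[i, j] + K[j, j]
                        == G[i, i] - 2 * G[i, j] + G[j, j])
    # Maximize variance: unfold the manifold as far as the constraints allow.
    cp.Problem(cp.Maximize(cp.trace(K)), cons).solve()
    return K.value
```

The low-dimensional coordinates are then read off from the top eigenvectors of the recovered K, exactly as in kernel PCA.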

53 Experiment

54 Experiment 1

55 Experiment 2

56 Experiment 3

57 Experiment 4

58 Experiment 5

59 Planning

60 Planning

61 Planning

62 Experiment

63 Probabilistic Actions

64 Future work

65 Related Papers