1 Learning from Shadows: Dimensionality Reduction and its Application in Artificial Intelligence, Signal Processing and Robotics. Ali Ghodsi, Department of Statistics and Actuarial Science, University of Waterloo. October 2006
3 Dimensionality Reduction
5 Manifold and Hidden Variables
6 Data Representation
8 A 5 × 5 grid of values:
1 1 1 1 1
1 0 1 0 1
1 1 1 1 1
1 0.5 0.5 0.5 1
1 1 1 1 1
10 A 644 × 103 data matrix is factored into a 644 × 2 matrix times a 2 × 103 matrix; each 23 × 28 image (644 pixels) is summarized by a 2 × 1 coordinate vector such as [-2.19, -0.02] or [-3.19, 1.02].
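A minimal sketch of the low-rank factorization these dimensions suggest, assuming a data matrix of 103 images, each 23 × 28 = 644 pixels, and using a rank-2 truncated SVD as a stand-in for the method on the slide (the random data is purely illustrative):

```python
import numpy as np

# Hypothetical stand-in data: 103 images, each 23 x 28 = 644 pixels, as columns.
rng = np.random.default_rng(0)
X = rng.random((644, 103))

# Rank-2 truncated SVD: X is approximated by U2 @ diag(s2) @ Vt2.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
U2, s2, Vt2 = U[:, :2], s[:2], Vt[:2, :]   # 644 x 2, 2, 2 x 103

# Each image is summarized by a 2 x 1 coordinate vector.
codes = np.diag(s2) @ Vt2                  # 2 x 103
X_approx = U2 @ codes                      # 644 x 103 rank-2 approximation
print(codes[:, 0])                         # 2-D code for the first image
```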
16 Hastie, Tibshirani, Friedman 2001
17 The Big Picture
18 Uses of Dimensionality Reduction (Manifold Learning)
19 Denoising. Mika et al. 1999; Zhu and Ghodsi 2005
20 Tenenbaum, de Silva, and Langford 2000
21 Roweis and Saul 2000
22 Arranging words: each word was initially represented by a high-dimensional vector counting how many times it appeared in different encyclopedia articles; words with similar contexts end up collocated in the embedding. Roweis and Saul 2000
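A rough sketch of the representation described above, with scikit-learn's LocallyLinearEmbedding standing in for the embedding step; the toy articles, word list, and parameters are illustrative assumptions, not the slide's actual data:

```python
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.manifold import LocallyLinearEmbedding

# Toy "encyclopedia articles" (illustrative stand-ins for the real corpus).
articles = [
    "music guitar piano melody rhythm",
    "piano violin orchestra melody concert",
    "planet orbit star telescope galaxy",
    "star galaxy telescope astronomy orbit",
    "guitar concert rhythm orchestra violin",
    "astronomy planet star orbit telescope",
]

# Rows of counts.T give each word a vector of counts across articles.
vectorizer = CountVectorizer()
counts = vectorizer.fit_transform(articles).toarray()   # articles x words
word_vectors = counts.T.astype(float)                   # words x articles
words = vectorizer.get_feature_names_out()

# Embed the word vectors in 2-D; words with similar contexts land nearby.
lle = LocallyLinearEmbedding(n_neighbors=4, n_components=2, eigen_solver="dense")
coords = lle.fit_transform(word_vectors)
for w, (x, y) in zip(words, coords):
    print(f"{w:12s} {x:+.3f} {y:+.3f}")
```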
23 Hinton and Roweis 2002
24 Embedding of Sparse Music Similarity Graph Platt, 2004
25 Pattern Recognition Ghodsi, Huang, Schuurmans 2004
26 Pattern Recognition
27 Clustering
28 Glasses vs. No Glasses
29 Beard vs. No Beard
30 Beard Distinction Ghodsi, Wilkinson, Southey 2006
31 Glasses Distinction
32 Multiple-Attribute Metric
33 Reinforcement Learning. Mahadevan and Maggioni, 2005
34 Semi-supervised Learning. Use a graph-based discretization of the manifold to infer missing labels: build classifiers from the bottom eigenvectors of the graph Laplacian. Belkin and Niyogi, 2004; Zien et al., Eds., 2005
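A minimal sketch of that recipe, assuming a k-nearest-neighbor graph, six bottom eigenvectors, and a least-squares fit to the labelled points (all illustrative choices; see Belkin and Niyogi 2004 for the actual formulation):

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph

# Toy data: two noisy clusters, with only a few points labelled (-1 = unknown).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.3, (30, 2)), rng.normal(3, 0.3, (30, 2))])
y = np.full(60, -1)
y[:3], y[30:33] = 0, 1

# Symmetric k-NN adjacency and unnormalized graph Laplacian L = D - W.
W = kneighbors_graph(X, n_neighbors=5, mode="connectivity").toarray()
W = np.maximum(W, W.T)
L = np.diag(W.sum(axis=1)) - W

# Bottom eigenvectors of L are the smoothest functions on the graph.
eigvals, eigvecs = np.linalg.eigh(L)
E = eigvecs[:, :6]                      # 6 smoothest eigenvectors (assumed)

# Fit the labelled points in eigenvector coordinates with least squares,
# then predict the rest by thresholding the fitted values.
labelled = y >= 0
coef, *_ = np.linalg.lstsq(E[labelled], y[labelled].astype(float), rcond=None)
pred = (E @ coef > 0.5).astype(int)
truth = np.repeat([0, 1], 30)
print("accuracy on unlabelled points:", np.mean(pred[~labelled] == truth[~labelled]))
```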
35 Learning Correspondences. How can we learn manifold structure that is shared across multiple data sets? Ham et al., 2003, 2005
36 Mapping and Robot Localization. Bowling, Ghodsi, and Wilkinson 2005; Ham, Lin, and Lee 2005
37 Action Respecting Embedding. Joint work with Michael Bowling and Dana Wilkinson
38 Modelling Temporal Data and Actions
39 Outline
Background
– PCA
– Kernel PCA
Action Respecting Embedding (ARE)
– Prediction and Planning
– Probabilistic Actions
Future Work
40 Principal Component Analysis (PCA)
41 Principal Component Analysis (PCA)
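As a reference point for these slides, a generic PCA sketch via eigendecomposition of the sample covariance (the toy data and the choice of two components are assumptions, not the slides' own example):

```python
import numpy as np

def pca(X, n_components=2):
    """Project rows of X onto the top principal components."""
    X_centered = X - X.mean(axis=0)
    # Eigen-decompose the sample covariance; eigh returns ascending eigenvalues.
    cov = np.cov(X_centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    top = eigvecs[:, ::-1][:, :n_components]   # largest-variance directions
    return X_centered @ top

# Illustrative data: 100 points in 5-D with most variance in two directions.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2)) @ rng.normal(size=(2, 5)) + 0.05 * rng.normal(size=(100, 5))
print(pca(X).shape)   # (100, 2)
```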
42 Kernel Methods
43 Kernel Trick
44 Observed, Feature and Embedded Spaces
45 Kernel PCA
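Similarly, a compact kernel PCA sketch: build a kernel matrix, double-centre it, eigendecompose, and scale the top eigenvectors (the RBF kernel, its width, and the concentric-rings data are illustrative assumptions):

```python
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Kernel PCA with an RBF kernel: centre K, eigendecompose, project."""
    # RBF (Gaussian) kernel matrix.
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-gamma * sq_dists)

    # Double-centre the kernel matrix in feature space.
    n = len(X)
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n

    # Top eigenvectors of the centred kernel give the embedding coordinates.
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    return eigvecs[:, idx] * np.sqrt(np.maximum(eigvals[idx], 0))

# Illustrative data: two concentric rings, which linear PCA cannot separate.
rng = np.random.default_rng(0)
angles = rng.uniform(0, 2 * np.pi, 200)
radii = np.repeat([1.0, 3.0], 100)
X = np.c_[radii * np.cos(angles), radii * np.sin(angles)] + 0.05 * rng.normal(size=(200, 2))
print(kernel_pca(X, n_components=2, gamma=2.0).shape)   # (200, 2)
```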
46 Problem
47 Idea
48 Action Respecting Embedding (ARE)
49 Action Respecting Constraint
50 Local Distances Constraint: preserve distances between each point and its k nearest neighbors.
51 Local Distances Constraint: preserve local distances.
52 Semidefinite Programming
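The local-distance constraint from slides 50 and 51 can be posed as a semidefinite program over the Gram matrix of the embedding. A rough MVU-style sketch using cvxpy is below; the maximize-variance objective, neighborhood size, solver, and toy data are assumptions, and the action-respecting constraints that distinguish ARE are omitted here. Roughly speaking, ARE augments this kind of program with constraints requiring each action to act as a simple distance-preserving transformation in the learned space.

```python
import numpy as np
import cvxpy as cp
from sklearn.neighbors import kneighbors_graph

# Toy data: noisy points along a curve (a stand-in for the image sequence).
rng = np.random.default_rng(0)
t = np.linspace(0, 3, 20)
X = np.c_[np.cos(t), np.sin(t)] + 0.01 * rng.normal(size=(20, 2))
n, k = len(X), 4

# Variable: the Gram matrix K of the embedded points, constrained to be PSD.
K = cp.Variable((n, n), PSD=True)
constraints = [cp.sum(K) == 0]          # centre the embedding

# Local distances constraint: preserve squared distances to k nearest neighbors.
W = kneighbors_graph(X, n_neighbors=k, mode="connectivity").toarray()
for i in range(n):
    for j in range(n):
        if W[i, j] or W[j, i]:
            d2 = np.sum((X[i] - X[j]) ** 2)
            constraints.append(K[i, i] + K[j, j] - 2 * K[i, j] == d2)

# Maximize the variance of the embedding subject to the constraints (MVU-style).
prob = cp.Problem(cp.Maximize(cp.trace(K)), constraints)
prob.solve(solver=cp.SCS)

# Recover a 2-D embedding from the top eigenvectors of the learned Gram matrix.
eigvals, eigvecs = np.linalg.eigh(K.value)
Y = eigvecs[:, -2:] * np.sqrt(np.maximum(eigvals[-2:], 0))
print(Y.shape)   # (20, 2)
```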
53 Experiment
54 Experiment 1
55 Experiment 2
56 Experiment 3
57 Experiment 4
58 Experiment 5
59 Planning
60 Planning
61 Planning
62 Experiment
63 Probabilistic Actions
64 Future work
65 Related Papers