Generalized Locality Preserving Projections
Perform graph construction and dimensionality reduction simultaneously in one single objective function
Limei Zhang
[1] X. He, P. Niyogi, Locality Preserving Projections, Proc. Conf. Advances in Neural Information Processing Systems (NIPS), 2003
Outline
- Motivation
- Locality Preserving Projections (LPP)
  - Adjacency weight matrix is symmetric
  - Adjacency weight matrix is asymmetric
- Generalized Locality Preserving Projections (GLPP)
  - Soft LPP (SLPP)
  - Entropy LPP (ELPP)
- Experiments
- Conclusion
Motivation
There are many graph-based methods:
- Dimensionality reduction: LE, ISOMAP, LLE, LPP, NPE, etc., including their supervised, semi-supervised, and tensor extensions.
- Semi-supervised learning: Manifold Regularization, Manifold Contraction, Label Propagation, Local and Global Consistency, ...
The graph is at the heart of these methods. However, graph construction has not been studied extensively. Existing methods separate graph construction and the learning algorithm into two different steps:
- The graph is fixed during the subsequent learning task and has no direct connection to it.
- The graph is manually predefined, which brings difficulties such as the selection of parameters and the sensitivity to them.
Idea: perform graph construction and the learning task simultaneously in one single objective function.
We choose LPP to demonstrate the idea because of its simplicity and foundational role, yielding a simultaneous learning framework for graph construction and dimensionality reduction: Generalized LPP (GLPP).
LPP
A linear transform that optimally preserves local neighborhood information.
Objective function: minimize sum_ij (a^T x_i - a^T x_j)^2 W_ij, or equivalently minimize a^T X L X^T a subject to a^T X D X^T a = 1, where D_ii = sum_j W_ij and L = D - W.
For generality, assume the weight matrix may be asymmetric.
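As a concrete illustration of the standard (symmetric-weight) LPP pipeline, here is a minimal NumPy/SciPy sketch. The function name `lpp` and the parameter defaults are illustrative choices, not taken from the slides:

```python
import numpy as np
from scipy.linalg import eigh

def lpp(X, n_components=2, k=5, t=1.0):
    """Locality Preserving Projections, standard symmetric-weight form.
    X: (n_samples, n_features). Returns projection matrix A of shape
    (n_features, n_components); embed with Y = X @ A."""
    n = X.shape[0]
    # pairwise squared Euclidean distances
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # k-nearest-neighbour adjacency with heat-kernel weights exp(-d^2/t)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(sq[i])[1:k + 1]        # skip the point itself
        W[i, nbrs] = np.exp(-sq[i, nbrs] / t)
    W = np.maximum(W, W.T)                        # symmetrise the graph
    D = np.diag(W.sum(axis=1))
    L = D - W                                     # graph Laplacian
    # generalized eigenproblem: (X^T L X) a = lam (X^T D X) a,
    # keep eigenvectors of the smallest eigenvalues
    Sl = X.T @ L @ X
    Sd = X.T @ D @ X + 1e-9 * np.eye(X.shape[1])  # regularise for stability
    _, vecs = eigh(Sl, Sd)
    return vecs[:, :n_components]
```

Note the small ridge term added to X^T D X: it keeps the generalized eigenproblem well-posed when the data matrix is rank-deficient.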
Asymmetric LPP: with an asymmetric W, the objective sum_ij W_ij (y_i - y_j)^2 equals y^T (D_r + D_c - W - W^T) y, where D_r and D_c are the diagonal matrices of row sums and column sums of W.
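A quick numerical check of this identity (a self-contained sketch; the random data and variable names are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
W = rng.random((n, n))          # deliberately asymmetric weights
np.fill_diagonal(W, 0.0)        # no self-edges
y = rng.normal(size=n)          # a 1-D embedding of the n points

# direct evaluation of the weighted pairwise objective
direct = sum(W[i, j] * (y[i] - y[j]) ** 2
             for i in range(n) for j in range(n))

# quadratic-form evaluation with the asymmetric Laplacian
Dr = np.diag(W.sum(axis=1))     # row-sum degrees
Dc = np.diag(W.sum(axis=0))     # column-sum degrees
M = Dr + Dc - W - W.T
quad = y @ M @ y

assert np.isclose(direct, quad)
```

When W is symmetric, D_r = D_c = D and the matrix reduces to the familiar 2(D - W), recovering standard LPP up to a constant factor.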
Soft LPP (SLPP)
The joint objective over the projection and the weight matrix is non-convex; it is solved by alternating iterative optimization.
Ⅰ. With the weight matrix fixed, update the projection.
Ⅱ. With the projection fixed, update the weight matrix.
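The alternating scheme can be sketched as follows. This is a hypothetical illustration: step Ⅱ here uses a softmax (entropy-style) weight update purely as one concrete choice; the actual SLPP/ELPP updates follow from the paper's specific regularizers, which the slides do not reproduce.

```python
import numpy as np
from scipy.linalg import eigh

def alternating_glpp(X, n_components=2, alpha=1.0, n_iter=5):
    """Illustrative alternating optimization for a GLPP-style objective.
    Step I:  with W fixed, solve the LPP generalized eigenproblem.
    Step II: with the projection fixed, refit W from projected distances
             (softmax weights here; row-stochastic, asymmetric)."""
    n, d = X.shape
    A = np.eye(d)[:, :n_components]          # simple initial projection
    W = np.full((n, n), 1.0 / n)
    for _ in range(n_iter):
        # step II: weights from squared distances in the projected space
        Y = X @ A
        sq = ((Y[:, None] - Y[None, :]) ** 2).sum(-1)
        np.fill_diagonal(sq, np.inf)         # exclude self-edges
        W = np.exp(-sq / alpha)
        W /= W.sum(axis=1, keepdims=True)    # each row sums to 1
        # step I: LPP eigenproblem with the symmetrised current W
        S = 0.5 * (W + W.T)
        D = np.diag(S.sum(axis=1))
        L = D - S
        Sl = X.T @ L @ X
        Sd = X.T @ D @ X + 1e-9 * np.eye(d)
        _, vecs = eigh(Sl, Sd)
        A = vecs[:, :n_components]
    return A, W
```

The key structural point is that the graph W is re-estimated from the current embedding at every iteration rather than fixed once up front.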
Soft LPP and Entropy LPP add different regularization terms on the weight matrix (for example, an entropy regularizer, or constraints toward a doubly stochastic matrix).
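For concreteness, a standard entropy regularizer yields a closed-form row-wise weight update. This derivation is a sketch under the assumption that each row of W sums to one and that d_ij denotes the squared distance between points i and j in the projected space; the slides do not show the exact ELPP objective.

```latex
% Row-wise subproblem for entropy-regularized weights (illustrative):
%   min_{w_i}  \sum_j w_{ij} d_{ij} + \alpha \sum_j w_{ij} \ln w_{ij}
%   s.t.       \sum_j w_{ij} = 1,  w_{ij} \ge 0
\begin{align*}
\mathcal{L} &= \sum_j w_{ij} d_{ij}
             + \alpha \sum_j w_{ij}\ln w_{ij}
             + \lambda\Big(\sum_j w_{ij} - 1\Big), \\
0 = \frac{\partial \mathcal{L}}{\partial w_{ij}}
            &= d_{ij} + \alpha(\ln w_{ij} + 1) + \lambda
\quad\Longrightarrow\quad
w_{ij} = \frac{\exp(-d_{ij}/\alpha)}{\sum_k \exp(-d_{ik}/\alpha)}.
\end{align*}
```

A larger alpha spreads weight over more (including faraway) neighbors, making the graph nonlocal; as alpha approaches 0 the update degenerates to a hard nearest-neighbor assignment.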
Advantages of the optimized weight over the predefined neighbor weight
The heart of SLPP and ELPP is the optimized weight matrix. Compared with the neighbor weight, it is:
- gradually updated vs. predefined (more locality-preserving power);
- nonlocal (subsuming the local case) vs. local;
- asymmetric vs. symmetric.
The optimized weight forms of SLPP and ELPP are similar.
Experiments
Ⅰ. How does GLPP work?
Ⅱ. How effective is GLPP compared with LPP?
Ⅰ. How does GLPP work? 2-D data visualization on the Wine dataset: the initial LPP embedding, followed by the SLPP embeddings after the first through fifth iterations.
Soft LPP is insensitive to the initial value and to the parameter m (illustrated on the Wine and Yale datasets).
Ⅱ. How effective is GLPP compared with LPP? Classification accuracy on the benchmark datasets; the parameters used are shown in parentheses.

Dataset      LPP(dim, k, t)   ELPP(dim, alpha)   SLPP(dim, m)
wine         (11, 2, 1)       (13, 0.8)          (12, 2)
AR           (136, 1, 1)      (87, )             (118, 3)
Yale(3/8)    (44, 1, 1)       (40, )             (42, 3)
Yale(5/6)    (89, 1, 1)       (82, )             (89, 3)
usps         (27, 1, 1)       (27, 9.88)         (33, 3)
Isolet       (146, 5, 1)      (86, )             (58, 3)
iris         (4, 2, 1)        (1, 0.0988)        (4, 3)
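The slides do not state the evaluation protocol. Assuming a 1-nearest-neighbor classifier in the projected space, a common choice for comparing dimensionality-reduction methods, accuracy could be computed with a sketch like this (the function name `nn_accuracy` is illustrative):

```python
import numpy as np

def nn_accuracy(Y_train, y_train, Y_test, y_test):
    """1-NN classification accuracy in the projected space.
    Evaluation protocol assumed for illustration; the slides do not
    specify the classifier actually used."""
    correct = 0
    for point, label in zip(Y_test, y_test):
        d = ((Y_train - point) ** 2).sum(axis=1)   # squared distances
        correct += int(y_train[int(np.argmin(d))] == label)
    return correct / len(y_test)
```

Here Y_train and Y_test would be the embeddings X @ A produced by LPP, SLPP, or ELPP at the listed dimensions.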
Conclusion
We propose a novel method of graph construction. The advantages of GLPP (Soft LPP, Entropy LPP):
- performs graph construction and the learning task simultaneously in one single objective function;
- parameter selection is easier, and the algorithm is insensitive to the parameters;
- the optimized graph is nonlocal: it can link faraway points, mitigates the curse of dimensionality, and is insensitive to noise and outliers;
- the algorithm converges fast and is insensitive to the initial parameter value;
- the proposed algorithm is essentially a unified framework.
Open question: trace-ratio vs. ratio-trace formulations. The trace-ratio formulation does not converge.
Approach one: the ratio-trace formulation.
Convergent, using the same alternating steps Ⅰ and Ⅱ.
Approach two: orthogonal constraints.
Soft Orthogonal LPP (Soft OLPP), based on Orthogonal LPP: convergent, using the same alternating steps Ⅰ and Ⅱ.
Results on the YaleB dataset (ratio-trace formulation).
Thanks! Questions?