Ranking Projection
Zhi-Sheng Chen, 2010/02/03
Multi-Media Information Lab, NTHU
Introduction
Ranking is everywhere: retrieval for music, images, video, sound, etc.; scoring for speech, multimedia, etc.
Goal: find a projection that preserves the given order of the data and reduces its dimensionality.
The Basic Criteria of Linear Ranking Projection
Given the ranking order (c1, c2, c3, c4), we impose a criterion on the projected data, where d(·,·) is the distance measure between two classes; in our case we use the difference of the class means.
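As a minimal Python sketch of the distance measure described here (the helper names and the use of a Euclidean norm are assumptions, not taken from the slides):

```python
import numpy as np

def class_mean(X):
    """Mean vector of one class; X has shape (n_samples, n_features)."""
    return X.mean(axis=0)

def class_distance(X_i, X_j):
    """d(c_i, c_j): the difference of the two class means, reported here
    as a Euclidean norm (an assumption; the slide only says 'difference
    of the means')."""
    return np.linalg.norm(class_mean(X_i) - class_mean(X_j))
```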
The Basic Criteria of Linear Ranking Projection
Introducing a projection vector, the previous criterion can be rewritten in terms of it.
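Writing the projection vector as w and the class means as mu_i (notation assumed here), one plausible form of the projected between-class distance, consistent with the "difference of the means" measure, is:

```latex
% hypothetical reconstruction, not copied from the slides
d_w(c_i, c_j) \;=\; \bigl|\, w^\top \mu_i - w^\top \mu_j \,\bigr|
            \;=\; \bigl|\, w^\top (\mu_i - \mu_j) \,\bigr|,
\qquad \lVert w \rVert = 1 .
```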
The Ordinal Weights
Roughly speaking, these distance terms have different importance according to their position in the ranking: some pairwise distances clearly matter more than others, while for other pairs the comparison is less obvious.
Instead of finding precise rules for the ordinal weights, we use a rough ordinal weighting rule.
The Ordinal Weights
Given a ranking order, we assign a score to each term: the largest score marks the top of the order and the smallest score marks the last term.
From these scores we define a simple ordinal weight function, and the criterion becomes a weighted one.
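One hypothetical instance of such a rough ordinal weighting, assuming scores decrease linearly with rank and a pair's weight is the sum of its two scores so that pairs near the top of the order count more:

```python
import numpy as np

def ordinal_scores(num_classes):
    """Largest score for the top of the ranking, smallest for the last
    term, e.g. [4., 3., 2., 1.] for four ranked classes."""
    return np.arange(num_classes, 0, -1, dtype=float)

def ordinal_weight(scores, i, j):
    """Hypothetical weight for the distance term d(c_i, c_j): summing
    the two scores makes pairs near the top of the order weigh more."""
    return scores[i] + scores[j]
```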
Some Results for the Weighted Criterion
Order: (c1, c2, c3, c4)
Some Results for the Weighted Criterion
Order: (c3, c1, c4, c2)
For projections onto more than one dimension, the solution is to select the k eigenvectors associated with the k smallest eigenvalues.
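A minimal sketch of just this eigenvector-selection step, assuming the weighted criterion has already been assembled into a symmetric matrix S (how S is built is not reproduced here):

```python
import numpy as np

def smallest_eig_projection(S, out_dim):
    """Given the symmetric matrix S of the (weighted) ranking criterion,
    return the eigenvectors belonging to the out_dim smallest eigenvalues
    as the projection directions, as stated above."""
    eigvals, eigvecs = np.linalg.eigh(S)   # eigenvalues in ascending order
    return eigvecs[:, :out_dim]            # columns = projection vectors
```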
Class with several groups
We may not care about the order of some groups of data points within a class.
Grouped Classes
For the case above, let the order be (c1, c2, c3); the criterion is then modified accordingly.
Grouped Classes
Result
Reweighting function
Consider the following case: the projection found by the criterion is not the proper one, so we have a problem here.
Reweighting function
This is solved by reweighting: every group in a class is weighted by its distance from the class mean, so farther groups receive larger weights. The criterion is modified accordingly.
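A minimal sketch of this reweighting rule, assuming a Euclidean distance and a sum-to-one normalization (both assumptions; group_means and class_mean are hypothetical names):

```python
import numpy as np

def group_reweights(group_means, class_mean):
    """Weight every group in a class by its distance from the class
    mean, so that farther groups receive larger weights."""
    dists = np.array([np.linalg.norm(g - class_mean) for g in group_means])
    return dists / dists.sum()   # normalization assumed, not from the slides
```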
Reweighting function
Non-linear Ranking Projection
For the configuration shown, it is impossible to find a linear projection that produces the order (c3, c2, c1, c4).
General Idea of the Kernel
Transform the data into a high-dimensional feature space through a mapping, and do the ranking projection in that space.
The projection algorithm can be carried out using only dot products in that space; hence we can define a kernel function, and the matrix of pairwise kernel values is called the Gram matrix (the discussion of kernel validity is skipped here).
Several kernels: polynomial kernel, Gaussian kernel, radial basis kernel, etc.
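A minimal sketch of the Gram matrix and two of the listed kernels (the parameters degree, coef0, and sigma are illustrative defaults, not values from the experiments):

```python
import numpy as np

def polynomial_kernel(x, y, degree=2, coef0=1.0):
    return (np.dot(x, y) + coef0) ** degree

def gaussian_kernel(x, y, sigma=1.0):
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def gram_matrix(X, kernel):
    """K[i, j] = k(x_i, x_j); the kernelized steps only need K."""
    n = X.shape[0]
    K = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            K[i, j] = kernel(X[i], X[j])
    return K
```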
Non-linear Ranking Projection
We use the "kernelized" approach to find a non-linear projection. Consider the criterion of the basic linear case: similar to kernelized LDA (KDA), we can write the projection vector as a linear combination of the mapped training points.
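With the feature map written as phi and the expansion coefficients as alpha (notation assumed here), the standard KDA-style expansion reads:

```latex
w \;=\; \sum_{i=1}^{n} \alpha_i \,\phi(x_i)
\qquad\Longrightarrow\qquad
w^\top \phi(x) \;=\; \sum_{i=1}^{n} \alpha_i \, k(x_i, x).
```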
Non-linear Ranking Projection
Substituting this expansion, the quantities in the criterion can be expressed through kernel values alone.
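A plausible reconstruction of this step, following standard kernel discriminant analysis: the projected class mean in feature space can be written through kernel values as

```latex
% plausible reconstruction under the expansion above
w^\top \mu_c^{\phi}
  \;=\; \frac{1}{|c|} \sum_{x \in c} \sum_{i=1}^{n} \alpha_i \, k(x_i, x)
  \;=\; \alpha^\top m_c,
\qquad
(m_c)_i \;=\; \frac{1}{|c|} \sum_{x \in c} k(x_i, x).
```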
Non-linear Ranking Projection
This yields the kernelized criterion. Extending it to ordinal weighting and grouped classes is straightforward; extending it to re-weighting is more delicate.
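Under the reconstruction above, a plausible form of the kernelized distance term replaces w^T(mu_i - mu_j) with alpha^T(m_i - m_j), so the same smallest-eigenvalue solution can be applied to the coefficient vector alpha:

```latex
d_\alpha(c_i, c_j) \;=\; \bigl|\, \alpha^\top (m_i - m_j) \,\bigr| .
```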
Results: Experiment 1
Order: (c3, c1, c4, c2)
Polynomial kernel, degree = 2
Polynomial kernel, degree = 3
Results
Order: (c3, c1, c4, c2)
Gaussian kernel
Results: Experiment 2
Order: (c3, c2, c1)
Gaussian kernel
Polynomial kernel, degree = 2
Results: Experiment 3
Order: (c3, c2, c1)
Polynomial kernel, degree = 2
Gaussian kernel
Results: Experiment 4
Order: (c3, c2, c1)
Polynomial kernel, degree = 2
Gaussian kernel
Results: Airplane dataset
214 data points, feature dimension 13, scores from 1 to 7
Results
Linear ranking projection
Results
Polynomial kernel, degree = 2
Polynomial kernel, degree = 5
Polynomial kernel, degree = 10
Results
Gaussian kernel
The data points are projected onto the same points due to limited computer precision, yet the order is preserved well.
Future Work
Work that still needs to be done:
Grouped classes are time-consuming; "kernelized" k-means clustering could be used to reduce the number of data points.
The re-weighting function in the high-dimensional space (kernel approach) has not been worked out yet.
The precision problem in the kernelized approach remains.
Potential work:
Derive a probabilistic model?
How to cope with "missing" data (i.e., some feature dimensions are missing)?
Which kernel is appropriate?
Questions?