
Graph Embedding (GE) & Marginal Fisher Analysis (MFA) - 吳沛勳, 劉冠成, 韓仁智


1 Graph Embedding (GE) & Marginal Fisher Analysis (MFA) 吳沛勳 seanpsw@gmail.com 劉冠成 zeter168@gmail.com 韓仁智 jchan@visionatics.com.tw

2 Outline 1. Introduction 2. System Flowchart 3. Dimensionality Reduction - Graph Embedding 3.1 Cost Function - Intrinsic Graph/Penalty Graph 3.2 Linearization 3.3 Example: LDA 4. Marginal Fisher Analysis 5. Experiment Results

3 Outline 1. Introduction 2. System Flowchart 3. Dimensionality Reduction - Graph Embedding 3.1 Cost Function - Intrinsic Graph/Penalty Graph 3.2 Linearization 3.3 Example: LDA 4. Marginal Fisher Analysis 5. Experiment Results

4 1. Introduction We present a general framework called Graph Embedding (GE). In graph embedding, the underlying merits and shortcomings of different dimensionality reduction schemes, existing or new, are revealed by differences in the design of their intrinsic and penalty graphs and their types of embedding. We also introduce a novel dimensionality reduction algorithm, Marginal Fisher Analysis (MFA).

5 Outline 1. Introduction 2. System Flowchart 3. Dimensionality Reduction - Graph Embedding 3.1 Cost Function - Intrinsic Graph/Penalty Graph 3.2 Linearization 3.3 Example: LDA 4. Marginal Fisher Analysis 5. Experiment Results

6 2. Face Recognition Flowchart Notation: N: number of images (200; 20 images per person); N_c: number of people (10); m: image size (24x24); w: unitary linear projection vector; k: number of nearest neighbors for classification; k1: k-NN for the intrinsic graph; k2: k-NN for the penalty graph (k1 and k2 influence the MFA space creation). Training process: 1. training image set X; 2.1 MFA space creation: w; 2.2 projection to MFA space: Y; 3. k-NN classification. Test process: 1. test image x_test; 2. projection to MFA space: y_test; the k-NN classifier then gives the classification result.
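
A minimal sketch of the test-process classification step above, in Python with numpy; the MFA projection matrix is assumed to be already learned (mfa_fit below is a hypothetical placeholder for the method of Section 4):

    import numpy as np

    def knn_classify(y_test, Y_train, train_labels, k=3):
        # Distances from the projected test sample to all projected training samples.
        train_labels = np.asarray(train_labels)
        dists = np.linalg.norm(Y_train - y_test, axis=1)
        nearest = train_labels[np.argsort(dists)[:k]]
        # Majority vote among the k nearest neighbors.
        values, counts = np.unique(nearest, return_counts=True)
        return values[np.argmax(counts)]

    # Training process (hypothetical): W = mfa_fit(X_train, train_labels, k1=5, k2=20)
    # Y_train = X_train @ W;  y_test = x_test @ W;  label = knn_classify(y_test, Y_train, train_labels)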

7 Outline 1. Introduction 2. System Flowchart 3. Dimensionality Reduction - Graph Embedding 3.1 Cost Function - Intrinsic Graph/Penalty Graph 3.2 Linearization 3.3 Example: LDA 4. Marginal Fisher Analysis 5. Experiment Results

8 3. Graph Embedding For a dimensionality reduction problem, we require an intrinsic graph G and, optionally, a penalty graph as input. We now introduce the dimensionality reduction problem from the new point of view of graph embedding. Let G={X,W} be an undirected weighted graph with vertex set X (N nodes) and similarity (weight) matrix W, where W is 1. symmetric and 2. possibly negative in some entries.

9 3. Graph Embedding: Laplacian Matrix For G={X,W}, where W is the weight matrix (also called the similarity matrix), the Laplacian is L = D - W (degree matrix minus adjacency matrix), with the diagonal degree matrix D_ii = sum_{j!=i} W_ij. (2)
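
A small numpy sketch of Eq. (2), building L = D - W from a given symmetric weight matrix W (a sketch under that assumption, not the authors' code):

    import numpy as np

    def graph_laplacian(W):
        # Diagonal degree matrix: D_ii = sum_j W_ij.
        D = np.diag(W.sum(axis=1))
        return D - W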

10 3.1 Cost Function (1/2) Our graph-preserving criterion is given as follows: y* = argmin_{y^T B y = d} sum_{i!=j} ||y_i - y_j||^2 W_ij = argmin_{y^T B y = d} y^T L y. (3) For sample pairs with larger (positive) similarity W_ij, y_i and y_j are pulled closer together; for pairs with smaller (negative) similarity, they are pushed farther apart. B typically is the Laplacian matrix of a penalty graph and acts as the constraint; in (3), y is the unknown while W (intrinsic graph) and B (penalty graph) are known. Using Lagrange multipliers to solve (3), the optimum satisfies L y = lambda B y, so y must be an eigenvector of B^{-1}L.
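
The condition L y = lambda B y is a generalized eigenvalue problem; a minimal sketch using scipy, assuming B is symmetric positive definite so the solver applies:

    import numpy as np
    from scipy.linalg import eigh

    def graph_embedding(L, B, d=1):
        # Generalized eigenproblem L y = lambda B y; eigenvalues come back in
        # ascending order, so the first d eigenvectors minimize y^T L y / y^T B y.
        eigvals, eigvecs = eigh(L, B)
        return eigvecs[:, :d]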

11 3.1 Cost Function: Intrinsic Graph in LDA Example: 6 images of 2 people. The LDA intrinsic-graph weight is W_ij = 1/n_c if x_i and x_j belong to the same class c, and 0 otherwise. The diagonal W_ii is not important, because in GE we always have i != j.
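
A sketch of the LDA intrinsic-graph weights under this formulation (W_ij = 1/n_c for same-class pairs with i != j, 0 otherwise); the diagonal is simply left at zero:

    import numpy as np

    def lda_intrinsic_W(labels):
        labels = np.asarray(labels)
        N = len(labels)
        W = np.zeros((N, N))
        for c in np.unique(labels):
            idx = np.where(labels == c)[0]
            W[np.ix_(idx, idx)] = 1.0 / len(idx)   # same-class pairs get 1/n_c
        np.fill_diagonal(W, 0.0)                   # i = j is never used in GE
        return W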

12 3.1 Cost Function: Penalty Graph We define the intrinsic graph to be the graph G itself. The penalty graph is: 1. a graph whose vertices X are the same as those of G; 2. a graph whose edge-weight matrix corresponds to the similarity characteristic that is to be suppressed in the dimension-reduced feature space; 3. in effect, a constraint on the embedding.

13 3.1 Cost Function: Penalty Graph in LDA Example: 6 images of 2 people. Maximizing the between-class covariance is equivalent to maximizing the total data covariance, so the LDA penalty graph is the same as the PCA intrinsic graph: it considers only the between-class scatter.
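
Accordingly, a sketch of the LDA penalty-graph weights read as the PCA intrinsic graph (every sample pair connected with weight 1/N); this is my reading of the slide, offered as an illustration rather than verified code:

    import numpy as np

    def lda_penalty_W(N):
        W_p = np.full((N, N), 1.0 / N)   # all sample pairs weighted 1/N
        np.fill_diagonal(W_p, 0.0)
        return W_p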

14 3.2 Linearization Assuming that the low-dimensional vector representations of the vertices can be obtained from a linear projection y_i = w^T x_i (i.e., y = X^T w), where w is the unitary projection vector, the objective function in (3) becomes w* = argmin_{w^T X B X^T w = d} w^T X L X^T w. (4) Solution: w is obtained from the generalized eigenvalue decomposition X L X^T w = lambda X B X^T w.
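
A sketch of solving the linearized objective (4) as the generalized eigenproblem X L X^T w = lambda X B X^T w; a small regularizer is added because X B X^T may be singular (an assumption of this sketch, not part of the slide):

    import numpy as np
    from scipy.linalg import eigh

    def linear_graph_embedding(X, L, B, d=1, reg=1e-6):
        # X is the m x N data matrix whose columns are samples.
        A  = X @ L @ X.T
        Bm = X @ B @ X.T + reg * np.eye(X.shape[0])
        eigvals, eigvecs = eigh(A, Bm)       # ascending eigenvalues
        return eigvecs[:, :d]                # d projection directions w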

15 Outline 1. Introduction 2. System Flowchart 3. Dimensionality Reduction - Graph Embedding 3.1 Cost Function - Intrinsic Graph/Penalty Graph 3.2 Linearization 3.3 Example: LDA 4. Marginal Fisher Analysis (MFA) 5. Experiment Results

16 4.1 Marginal Fisher Analysis (MFA) (1/3) Fig. 4. The adjacency relationships of the intrinsic and penalty graphs for the Marginal Fisher Analysis algorithm. Intrinsic graph (within-class): each sample is connected to its k1 = 5 nearest neighbors in the same class. Penalty graph (between-class): the marginal pairs, i.e., the k2 = 2 nearest neighbor pairs from different classes.

17 4.1 Marginal Fisher Analysis (MFA) (2/3) Cost function (15): minimize the within-class compactness (intrinsic graph) while maximizing the between-class separability (penalty graph), w* = argmin_w [w^T X (D - W) X^T w] / [w^T X (D^p - W^p) X^T w]. Intrinsic graph (13): W_ij = 1 if i is in N_{k1}(j) or j is in N_{k1}(i), and 0 otherwise, where N_{k1}(i) indicates the index set of the k1 nearest neighbors of the sample x_i in the same class (within-class k-NN, k1 = 5); L = D - W.
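
A sketch of the intrinsic-graph construction in (13), connecting each sample to its k1 nearest same-class neighbors (Euclidean distance assumed):

    import numpy as np

    def mfa_intrinsic_W(X, labels, k1=5):
        # X: N x m matrix with one sample per row; labels: length-N class labels.
        labels = np.asarray(labels)
        N = X.shape[0]
        W = np.zeros((N, N))
        for i in range(N):
            same = np.where(labels == labels[i])[0]
            same = same[same != i]
            dists = np.linalg.norm(X[same] - X[i], axis=1)
            for j in same[np.argsort(dists)[:k1]]:
                W[i, j] = W[j, i] = 1.0   # symmetric: i in N_k1(j) or j in N_k1(i)
        return W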

18 4.1 Marginal Fisher Analysis (MFA) (3/3) Penalty graph (14): W^p_ij = 1 if (i, j) is in P_{k2}(c_i) or P_{k2}(c_j), and 0 otherwise, where P_{k2}(c) is the set of the k2 nearest data pairs among {(i, j): i in pi_c, j not in pi_c} and pi_c denotes the index set belonging to the c-th class; B = D^p - W^p. How to decide the k1- and k2-nearest neighbors is discussed next.
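
A sketch of the penalty-graph construction in (14): for each class c, connect the k2 nearest between-class pairs (one endpoint inside the class, the other outside), again assuming Euclidean distance:

    import numpy as np

    def mfa_penalty_W(X, labels, k2=20):
        labels = np.asarray(labels)
        N = X.shape[0]
        W_p = np.zeros((N, N))
        for c in np.unique(labels):
            inside  = np.where(labels == c)[0]
            outside = np.where(labels != c)[0]
            # Distances of every (inside, outside) pair for class c.
            d = np.linalg.norm(X[inside][:, None, :] - X[outside][None, :, :], axis=2)
            rows, cols = np.unravel_index(np.argsort(d, axis=None)[:k2], d.shape)
            for i, j in zip(inside[rows], outside[cols]):
                W_p[i, j] = W_p[j, i] = 1.0   # k2 nearest marginal pairs
        return W_p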

19 4.1 Q&A (1/2) Q1: MFA: how to decide k1 and k2? A1: k1 (within-class): sample five values between 2 and min_c(n_c - 1), and choose the value with the best MFA performance, where n_c is the number of images per class (subject). (We directly use 5.) k2 (between-class): choose the best k2 between 20 and 8N_c at sampled intervals of 20, where N_c is the number of classes (subjects). (We directly use 20.)
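
A sketch of the k1 search just described, sampling five values between 2 and min_c(n_c - 1) and keeping the best one; evaluate_mfa is a hypothetical helper that trains MFA with the given k1 and returns a recognition rate:

    import numpy as np

    def pick_k1(labels, evaluate_mfa):
        # labels are assumed to be integer class ids 0..C-1.
        n_min = np.bincount(np.asarray(labels)).min()              # min_c n_c
        candidates = np.unique(np.linspace(2, n_min - 1, 5).astype(int))
        scores = {int(k1): evaluate_mfa(int(k1)) for k1 in candidates}
        return max(scores, key=scores.get)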

20 4.1 Q&A (2/2) MFA: Comparison with LDA. Advantages: 1. The number of available projection directions (axes) is much larger than that of LDA, so MFA finds more significant axes and gives better classification results: MFA: rank(B^{-1}L); LDA: N_c - 1. 2. There is no assumption on the data distribution, so MFA is more general for discriminant analysis, whereas LDA assumes the data are approximately Gaussian distributed. Data distribution: MFA: non-linear; LDA: linear. 3. The inter-class margin characterizes the separability of different classes better than the inter-class scatter in LDA. MFA: maximize the margin between positive (same-class) and negative (different-class) neighbors; LDA: difference between class means. Disadvantage: LDA has an incremental extension (incremental LDA), while no incremental version of MFA is available.

21 Outline 1. Introduction 2. System Flowchart 3. Dimensionality Reduction - Graph Embedding 3.1 Cost Function - Intrinsic Graph/Penalty Graph 3.2 Linearization 3.3 Example: LDA 4. Marginal Fisher Analysis 5. Experiment Results

22 5. Experiments: Database (1/2) Database: Yale B. People: 10. Images: 30 per person; we randomly select 20 images per person for training and keep the remaining 10 for testing (G30/P20). Image size: 24x24. Variations: variable illumination, cropped faces.

23 5. Experiment Results k-NN classification error rate (mean ± standard deviation over 100 runs for each k): k=1: 11.72% ± 3.58; k=3: 11.04% ± 3.66; k=5: 11.10% ± 3.27.

24 References 1. P. Belhumeur, J. Hespanha, and D. Kriegman, “Eigenfaces vs. Fisherfaces: Recognition Using Class Specific Linear Projection,” IEEE Trans. on Pattern Analysis and Machine Intelligence, Vol. 19, No. 7, pp. 711–720, 1997. 2. T.K. Kim, S.F. Wong, B. Stenger, J. Kittler, and R. Cipolla, “Incremental Linear Discriminant Analysis Using Sufficient Spanning Set Approximations,” CVPR, pp. 1–8, 2007. 3. S. Yan, D. Xu, B. Zhang, H. Zhang, Q. Yang, and S. Lin, “Graph Embedding and Extensions: A General Framework for Dimensionality Reduction,” IEEE Trans. on Pattern Analysis and Machine Intelligence, Vol. 29, No. 1, pp. 40–51, 2007.

