Feature extraction using fuzzy complete linear discriminant analysis
Reporter: Cui Yan, 2012.4.26
Outline
1. The fuzzy K-nearest neighbor classifier (FKNN)
2. The fuzzy complete linear discriminant analysis
3. Experiments
The Fuzzy K-nearest neighbor classifier (FKNN)
The K-nearest neighbor classifier (KNN)
Each sample should be classified similarly to its surrounding samples; therefore, an unknown sample can be predicted from the classifications of its nearest neighbor samples.
KNN classifies an unknown sample by a majority vote among its k nearest neighbors with known class labels.
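As a minimal illustration (my own naming, not part of the original slides), plain KNN can be written as:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training samples."""
    dists = np.linalg.norm(X_train - x, axis=1)   # L2 distance to every training sample
    nearest = y_train[np.argsort(dists)[:k]]      # labels of the k closest samples
    labels, counts = np.unique(nearest, return_counts=True)
    return labels[np.argmax(counts)]              # the most frequent label wins
```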
FKNN
Given a sample set, a fuzzy M-class partition of these vectors specifies the membership degree of each sample corresponding to each class. The membership degree of a training vector to each of the M classes is computed by the following steps:
Step 1: Compute the distance matrix between pairs of feature vectors in the training set.
Step 2: Set the diagonal elements of this matrix to infinity (practically, place large numeric values there).
Step 3: Sort the distance matrix (treating each of its columns separately) in ascending order. Collect the class labels of the patterns located in the closest neighborhood of the pattern under consideration (as we are concerned with k neighbors, this returns a list of k integers).
Step 4: Compute the membership grade to class i for the j-th pattern using the expression proposed in [1].
[1] J.M. Keller, M.R. Gray, J.A. Givens, A fuzzy k-nearest neighbor algorithm, IEEE Trans. Syst. Man Cybern. 1985, 15(4): 580-585.
An example for FKNN
Set k=3
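The example's figures do not survive this export, but the four steps with k = 3 can be sketched in Python (a minimal illustration with my own function and variable names, not the authors' code):

```python
import numpy as np

def fknn_membership(X, y, k=3):
    """Fuzzy k-NN membership degrees, following Keller et al. [1]:
    u_ij = 0.51 + 0.49 * (n_ij / k) if sample j belongs to class i,
    and 0.49 * (n_ij / k) otherwise, where n_ij counts class-i samples
    among the k nearest neighbors of sample j."""
    X, y = np.asarray(X, float), np.asarray(y)
    n, M = len(y), int(y.max()) + 1
    # Step 1: pairwise distance matrix between all training vectors.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Step 2: put "infinity" on the diagonal so no sample is its own neighbor.
    np.fill_diagonal(D, np.inf)
    U = np.zeros((n, M))
    for j in range(n):
        # Step 3: class labels of the k nearest neighbors of sample j.
        nbr_labels = y[np.argsort(D[j])[:k]]
        for i in range(M):
            n_ij = np.count_nonzero(nbr_labels == i)
            # Step 4: Keller's membership expression.
            U[j, i] = 0.49 * n_ij / k + (0.51 if y[j] == i else 0.0)
    return U
```

Each row of U sums to 1, and every sample keeps a membership of at least 0.51 in its own class.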
The fuzzy complete linear discriminant analysis
For the training set, we define the i-th class mean by combining the fuzzy membership degrees as
$\bar{x}_i = \frac{\sum_{j=1}^{N} u_{ij} x_j}{\sum_{j=1}^{N} u_{ij}}$  (1)
and the total mean as
$\bar{x} = \frac{1}{N} \sum_{j=1}^{N} x_j$  (2)
Incorporating the fuzzy membership degrees, the between-class, within-class, and total class fuzzy scatter matrices of the samples can be defined as
$S_{fb} = \sum_{i=1}^{M} \sum_{j=1}^{N} u_{ij} (\bar{x}_i - \bar{x})(\bar{x}_i - \bar{x})^T$
$S_{fw} = \sum_{i=1}^{M} \sum_{j=1}^{N} u_{ij} (x_j - \bar{x}_i)(x_j - \bar{x}_i)^T$
$S_{ft} = S_{fb} + S_{fw}$  (3)
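In code, the fuzzy means and scatter matrices can be formed as follows (a minimal sketch assuming a membership matrix U from FKNN; the names are my own):

```python
import numpy as np

def fuzzy_scatter_matrices(X, U):
    """Fuzzy between-class, within-class, and total scatter matrices.
    X: (N, d) training samples; U: (N, M) fuzzy membership degrees.
    The cross terms vanish because each fuzzy class mean is the
    u-weighted average of the samples, so S_t = S_b + S_w."""
    X, U = np.asarray(X, float), np.asarray(U, float)
    N, d = X.shape
    mean = X.mean(axis=0)                               # total mean, Eq. (2)
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for i in range(U.shape[1]):
        u = U[:, i]
        m_i = (u[:, None] * X).sum(axis=0) / u.sum()    # fuzzy class mean, Eq. (1)
        Sb += u.sum() * np.outer(m_i - mean, m_i - mean)
        diff = X - m_i
        Sw += (u[:, None] * diff).T @ diff              # diff^T diag(u) diff
    return Sb, Sw, Sb + Sw
```

With crisp (one-hot) memberships these reduce to the classical scatter matrices.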
Algorithm of the fuzzy complete linear discriminant analysis
Step 1: Calculate the membership degree matrix U by the FKNN algorithm.
Step 2: According to Eqs. (1)-(3), work out the between-class, within-class, and total class fuzzy scatter matrices.
Step 3: Work out the orthogonal eigenvectors p1, ..., pl of the total class fuzzy scatter matrix corresponding to its positive eigenvalues.
Step 4: Let P = (p1, ..., pl) and form $\tilde{S}_{fw} = P^T S_{fw} P$; work out the orthogonal eigenvectors g1, ..., gr of $\tilde{S}_{fw}$ corresponding to the zero eigenvalues.
Step 5: Let P1 = (g1, ..., gr) and form $\hat{S}_{fb} = P_1^T P^T S_{fb} P P_1$; work out the orthogonal eigenvectors v1, ..., vr of $\hat{S}_{fb}$, and calculate the irregular discriminant vectors by $P P_1 (v_1, ..., v_r)$.
Step 6: Work out the orthogonal eigenvectors q1, ..., qs of $\tilde{S}_{fw}$ corresponding to the non-zero eigenvalues.
Step 7: Let P2 = (q1, ..., qs); work out the optimal discriminant vectors vr+1, ..., vr+s by Fisher LDA in this subspace, and calculate the regular discriminant vectors by $P P_2 (v_{r+1}, ..., v_{r+s})$.
Step 8 (Recognition): Project all samples onto the obtained optimal discriminant vectors and classify.
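Putting steps 1-7 together, one possible end-to-end sketch (my own compact variable names; the eigenvalue thresholds and the SciPy generalized eigensolver are implementation choices, not from the slides):

```python
import numpy as np
from scipy.linalg import eigh as generalized_eigh

def fuzzy_complete_lda(X, y, k=3, tol=1e-8):
    """Sketch of the complete fuzzy LDA pipeline (steps 1-7)."""
    X, y = np.asarray(X, float), np.asarray(y)
    n, d = X.shape
    M = int(y.max()) + 1
    # Step 1: FKNN membership degrees (Keller et al.).
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)
    nbrs = y[np.argsort(D, axis=1)[:, :k]]
    U = np.stack([0.49 * (nbrs == i).sum(axis=1) / k for i in range(M)], axis=1)
    U[np.arange(n), y] += 0.51
    # Step 2: fuzzy scatter matrices.
    mean = X.mean(axis=0)
    Sb, Sw = np.zeros((d, d)), np.zeros((d, d))
    for i in range(M):
        u = U[:, i]
        m_i = (u[:, None] * X).sum(axis=0) / u.sum()
        Sb += u.sum() * np.outer(m_i - mean, m_i - mean)
        diff = X - m_i
        Sw += (u[:, None] * diff).T @ diff
    St = Sb + Sw
    # Step 3: eigenvectors of S_t with positive eigenvalues.
    w, V = np.linalg.eigh(St)
    P = V[:, w > tol * max(w.max(), 1.0)]
    # Steps 4 and 6: split P^T S_w P into its null space and range space.
    w, G = np.linalg.eigh(P.T @ Sw @ P)
    cut = tol * max(w.max(), 1.0)
    P1, P2 = G[:, w <= cut], G[:, w > cut]
    # Step 5: irregular discriminant vectors from S_b inside the null space.
    w, Vb = np.linalg.eigh(P1.T @ (P.T @ Sb @ P) @ P1)
    irregular = P @ P1 @ Vb[:, np.argsort(w)[::-1]]
    # Step 7: regular discriminant vectors via Fisher LDA in the range space.
    Sb2 = P2.T @ (P.T @ Sb @ P) @ P2
    Sw2 = P2.T @ (P.T @ Sw @ P) @ P2
    w, Vf = generalized_eigh(Sb2, Sw2)          # solves S_b v = lambda S_w v
    regular = P @ P2 @ Vf[:, np.argsort(w)[::-1]]
    # Step 8 projects samples onto these vectors before classification.
    return np.hstack([irregular, regular])
```

The null space of the within-class scatter (step 5) only exists in the small-sample-size case, i.e. when there are fewer training samples than feature dimensions.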
Experiments
We compare Fuzzy-CLDA with CLDA, UWLDA, FLDA, Fuzzy Fisherface, and FIFDA on 3 different data sets from the UCI repository. The characteristics of the three data sets can be found at http://archive.ics.uci.edu/ml/datasets. All data sets are randomly split into a training set and a test set with the ratio 1:4. Experiments are repeated 25 times to obtain the mean prediction error rate as a performance measure. NCC is adopted to classify the test samples using the L2 norm.
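The evaluation protocol can be sketched as follows (my own helper names; I read NCC as the nearest-class-center rule under the L2 norm):

```python
import numpy as np

def ncc_error_rate(X_tr, y_tr, X_te, y_te):
    """Nearest class center (NCC) classification error under the L2 norm."""
    classes = np.unique(y_tr)
    centers = np.stack([X_tr[y_tr == c].mean(axis=0) for c in classes])
    dists = np.linalg.norm(X_te[:, None, :] - centers[None, :, :], axis=-1)
    pred = classes[np.argmin(dists, axis=1)]
    return float(np.mean(pred != y_te))

def repeated_split_error(X, y, n_repeats=25, train_frac=0.2, seed=0):
    """Mean error over random train/test splits; ratio 1:4 means train_frac=0.2."""
    rng = np.random.default_rng(seed)
    n = len(y)
    n_tr = int(round(train_frac * n))
    errs = []
    for _ in range(n_repeats):
        idx = rng.permutation(n)
        tr, te = idx[:n_tr], idx[n_tr:]
        errs.append(ncc_error_rate(X[tr], y[tr], X[te], y[te]))
    return float(np.mean(errs))
```

In the full pipeline, both the training and test samples would first be projected onto the discriminant vectors before NCC is applied.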
Thanks! 2012.4.26