
1 Improving Support Vector Machine through Parameter Optimized
Rujiang Bai, Junhua Liao
Shandong University of Technology Library, Zibo 255049, China
{brj, ljhbrj}@sdut.edu.cn

2 Outline
Motivation and Background
RGSC (Rough sets and Genetic algorithms for SVM classifier)
Experiment
Conclusion

3 Motivation and Background
Three problems:
(1) The high dimensionality of the input feature vectors slows down classification.
(2) The kernel parameter settings used when training the SVM affect classification accuracy.
(3) Feature selection also affects classification accuracy.

4 Motivation and Background
Goal: The objective of this work is to reduce the dimensionality of the feature vectors and to optimize the parameters so as to improve SVM classification accuracy and speed.
Method: To improve classification speed, we apply rough set theory to reduce the feature vector space. To improve classification accuracy, we present a genetic algorithm approach for feature selection and parameter optimization.


6 RGSC
(1) Preprocessing: remove HTML tags, segment words, and construct the Vector Space Model.
(2) Feature reduction by rough sets: the objective is to find a reduct with a minimal number of attributes, as described in Alg. 1.
(3) Converting genotype to phenotype: convert each parameter and feature chromosome from its genotype into a phenotype.
(4) Feature subset: after the genetic operations, and after converting each feature-subset chromosome from genotype into phenotype, a feature subset can be determined.
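The slides give no code for step (1); the sketch below is a minimal Python illustration, and the regex-based tag stripping, TF-IDF weighting for the Vector Space Model, and scikit-learn tooling are assumptions rather than the authors' actual implementation (which ran in YALE).

```python
# Illustrative sketch of step (1): strip HTML, tokenize, and build a Vector
# Space Model. TF-IDF weighting is an assumption; the slides do not specify it.
import re
from sklearn.feature_extraction.text import TfidfVectorizer

def strip_html(doc: str) -> str:
    """Remove HTML tags with a simple regular expression."""
    return re.sub(r"<[^>]+>", " ", doc)

docs = [
    "<html><body>Support vector machines classify documents.</body></html>",
    "<p>Rough sets reduce the feature space.</p>",
]
cleaned = [strip_html(d) for d in docs]

# Each document becomes a weighted term vector (the Vector Space Model).
vectorizer = TfidfVectorizer(lowercase=True, token_pattern=r"[A-Za-z]+")
X = vectorizer.fit_transform(cleaned)   # document-term matrix
print(X.shape)
```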

7 RGSC
(5) Fitness evaluation: for each chromosome representing C, γ, and the selected features, the training dataset is used to train the SVM classifier, while the testing dataset is used to calculate classification accuracy. Once the classification accuracy is obtained, each chromosome is evaluated by the fitness function, formula (8).
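A minimal sketch of this evaluation step, assuming dense NumPy feature matrices and scikit-learn's SVC; the function name evaluate_chromosome and the masking scheme are illustrative, not from the slides.

```python
# Illustrative sketch of step (5): train an RBF-kernel SVM with the chromosome's
# C and gamma on the selected feature columns, then score it on held-out data.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

def evaluate_chromosome(C, gamma, feature_mask, X_train, y_train, X_test, y_test):
    cols = np.flatnonzero(feature_mask)            # indices of selected features
    clf = SVC(C=C, gamma=gamma, kernel="rbf")
    clf.fit(X_train[:, cols], y_train)
    return accuracy_score(y_test, clf.predict(X_test[:, cols]))
```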

8 RGSC
(6) Termination criteria: when the termination criteria are satisfied, the process ends; otherwise, we proceed to the next generation.
(7) Genetic operations: the system searches for better solutions by genetic operations, including selection, crossover, mutation, and replacement.
(8) Input the preprocessed data sets into the resulting optimized SVM classifier.
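The slides do not specify the genetic operators; the sketch below shows one plausible generation step on binary chromosomes, using tournament selection, one-point crossover, and bit-flip mutation. The operator choices and rates are assumptions.

```python
# Illustrative sketch of step (7): produce the next generation of binary
# chromosomes via tournament selection, one-point crossover, and bit-flip mutation.
import random

def next_generation(population, fitness, cx_rate=0.8, mut_rate=0.02):
    def tournament(k=3):
        return max(random.sample(population, k), key=fitness)

    offspring = []
    while len(offspring) < len(population):
        p1, p2 = tournament(), tournament()
        if random.random() < cx_rate:              # one-point crossover
            cut = random.randrange(1, len(p1))
            c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
        else:
            c1, c2 = p1[:], p2[:]
        for child in (c1, c2):                     # bit-flip mutation
            offspring.append([g ^ 1 if random.random() < mut_rate else g
                              for g in child])
    return offspring[:len(population)]
```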

9 RGSC: Algorithm of RST-based feature reduction

10 Algorithm: Rough Set Attribute Reduction
Input: a decision table T. Output: a reduct of T, denoted Redu.
1. Construct the binary discernibility matrix M of T.
2. Delete the rows in M which are all 0's; set Redu = ∅. /* delete pairs of inconsistent objects */
3. while M is not empty do
4.   (1) Select an attribute ci in M with the highest discernibility degree (if several attributes share the highest discernibility degree, choose one of them at random);
5.   (2) Redu = Redu ∪ {ci};
6.   (3) Remove from M the rows which have a '1' in the column of ci;
7.   (4) Remove the column of ci from M;
   endwhile
/* the following steps remove redundant attributes from Redu */
8. Suppose that Redu contains k attributes, sorted by the order in which they entered Redu, from the first attribute chosen to the last one chosen.
9. Construct the binary discernibility matrix MR of the decision table TR restricted to the attributes in Redu;
10. Delete the rows in MR which are all 0's;
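A compact sketch of the greedy loop above (steps 1-7), assuming discretized attribute values in a NumPy array; the redundant-attribute removal of steps 8-10 is omitted, and the function names are illustrative.

```python
# Illustrative sketch of the discernibility-matrix reduction loop (steps 1-7).
import numpy as np

def discernibility_matrix(X, y):
    """One row per pair of objects with different decisions; entry j is 1 if
    attribute j distinguishes the pair. Assumes discretized attribute values."""
    rows = [(X[i] != X[j]).astype(int)
            for i in range(len(y)) for j in range(i + 1, len(y)) if y[i] != y[j]]
    return np.array(rows)

def greedy_reduct(X, y):
    M = discernibility_matrix(X, y)
    M = M[M.any(axis=1)]                   # drop all-zero rows (inconsistent pairs)
    redu, remaining = [], list(range(X.shape[1]))
    while M.size:
        degree = M.sum(axis=0)             # discernibility degree of each column
        best = int(np.argmax(degree))      # attribute with the highest degree
        redu.append(remaining[best])
        M = np.delete(M[M[:, best] == 0], best, axis=1)  # drop covered rows, then the column
        del remaining[best]
    return redu
```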

11 RGSC: Chromosome design
To implement our proposed approach, this research used the RBF kernel function for the SVM classifier. The chromosome comprises three parts: C, γ, and the feature mask.
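A possible binary layout and its genotype-to-phenotype decoding (step 3) is sketched below; the bit lengths and the search ranges for C and γ are assumptions, since the slide does not state them.

```python
# Illustrative sketch of the chromosome (C bits | gamma bits | feature mask)
# and its decoding from genotype (bits) to phenotype (real-valued C, gamma, mask).
import random

N_C, N_G = 16, 16              # assumed bit lengths for C and gamma
C_MIN, C_MAX = 0.01, 32768.0   # assumed search range for C
G_MIN, G_MAX = 1e-5, 8.0       # assumed search range for gamma

def random_chromosome(n_features):
    return [random.randint(0, 1) for _ in range(N_C + N_G + n_features)]

def decode(chrom):
    def bits_to_real(bits, lo, hi):
        value = int("".join(map(str, bits)), 2)
        return lo + (hi - lo) * value / (2 ** len(bits) - 1)
    C = bits_to_real(chrom[:N_C], C_MIN, C_MAX)
    gamma = bits_to_real(chrom[N_C:N_C + N_G], G_MIN, G_MAX)
    mask = chrom[N_C + N_G:]   # '1' means the corresponding feature is selected
    return C, gamma, mask
```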

12 RGSC: Fitness function
The fitness function, formula (8), combines classification accuracy with the cost of the selected features, where:
WA: weight for SVM classification accuracy
SVM_accuracy: SVM classification accuracy
WF: weight for the number of features
Ci: cost of feature i
Fi: '1' means feature i is selected; '0' means feature i is not selected
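Formula (8) itself appears only as an image on the slide; the sketch below assumes the widely used GA-SVM fitness form, fitness = WA × SVM_accuracy + WF × (Σ Ci·Fi)⁻¹, with illustrative weights, so it should be read as a guess at the formula rather than the paper's exact definition.

```python
# Illustrative guess at formula (8): weighted accuracy plus a reward for
# selecting few / cheap features. Weights WA=0.8, WF=0.2 are assumed defaults.
def fitness(svm_accuracy, feature_mask, feature_costs, w_a=0.8, w_f=0.2):
    selected_cost = sum(c for c, f in zip(feature_costs, feature_mask) if f == 1)
    penalty_term = 1.0 / selected_cost if selected_cost > 0 else 1.0
    return w_a * svm_accuracy + w_f * penalty_term
```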

13 Experiments
Experiment environment: Our implementation was carried out in the YALE (Yet Another Learning Environment) 3.3 development environment (available at http://rapid-i.com/). Feature reduction by rough set theory was carried out in ROSETTA (available at http://rosetta.sourceforge.net/). The empirical evaluation was performed on an Intel Pentium IV CPU running at 3.0 GHz with 1 GB of RAM.


18 Conclusion
In this paper, we have proposed a document classification method using an SVM based on rough set theory and genetic algorithms. The feature vectors are reduced by rough set theory, and feature selection and parameter optimization are carried out by genetic algorithms. The experimental results show that the proposed RGSC yields the best result among the three methods compared.

19 Thank you!

