1 Feature Selection with Kernel Class Separability
Advisor: 王振興
Students (Institute of Electrical Engineering): N28961523 林哲偉, N26974164 曾信輝, N26974172 吳俐瑩
Date: 2009.01.14
Reference: Lei Wang, "Feature selection with kernel class separability," IEEE Trans. Pattern Analysis and Machine Intelligence, vol. 30, no. 9, pp. 1534-1546, 2008.

2 Outline
Introduction
Feature Selection
Feature Selection Criterion
Characteristic Analysis
Experimental Results
Conclusions
Future Work

3 Introduction
Classification can often benefit from efficient feature selection.
A class separability criterion is developed in a high-dimensional kernel space.
The criterion is applied to a variety of selection modes using different search strategies.

4 Feature Selection
Feature selection typically consists of a selection criterion and a search strategy.
This paper compares five selection criteria and three search strategies, executing 30 trials for each combination; a minimal sketch of one common search strategy follows below.
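As an illustration of how a criterion and a search strategy combine, here is a minimal greedy forward-selection sketch in Python. The function and the `criterion` callback are our own illustration, not the paper's code:

```python
def forward_selection(X, y, criterion, n_select):
    """Greedy forward search: repeatedly add the one feature whose
    inclusion yields the highest value of the selection criterion."""
    selected = []
    remaining = list(range(X.shape[1]))
    while len(selected) < n_select and remaining:
        # Score every candidate subset formed by adding one more feature.
        scores = [(criterion(X[:, selected + [f]], y), f) for f in remaining]
        best_score, best_f = max(scores)
        selected.append(best_f)
        remaining.remove(best_f)
    return selected
```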

5 Flow Chart
(Figure: flow chart of the experimental procedure; axis ticks 10 to 50; 30 randomly chosen data.)

6 Feature Selection Criterion
Correlation coefficient: a higher value indicates higher relevance; cannot handle linearly nonseparable data.
Kolmogorov-Smirnov test: a smaller p-value (higher test statistic) indicates higher relevance; needs a sufficient number of samples.
A sketch of both scores appears below.
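As a minimal sketch of how these two per-feature scores can be computed (our own implementation, assuming a two-class label vector `y`):

```python
import numpy as np
from scipy.stats import ks_2samp

def correlation_scores(X, y):
    """Absolute Pearson correlation of each feature with the label;
    a higher score means higher relevance."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    num = Xc.T @ yc
    den = np.sqrt((Xc ** 2).sum(axis=0) * (yc ** 2).sum())
    return np.abs(num / den)

def ks_scores(X, y):
    """Two-sample Kolmogorov-Smirnov statistic per feature; a higher
    statistic (smaller p-value) means the class-conditional
    distributions differ more."""
    a, b = np.unique(y)  # assumes exactly two classes
    return np.array([ks_2samp(X[y == a, j], X[y == b, j]).statistic
                     for j in range(X.shape[1])])
```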

7 Feature Selection Criterion (cont.)
Class separability (non-kernel): simple; cannot handle linearly nonseparable data.
Radius-margin bound: handles linearly nonseparable data well; not computationally efficient.
Kernel class separability: better performance than the above.

8 Characteristic Analysis
In the class separability approach, the criterion is tr(S_B)/tr(S_W), where tr(·) denotes the trace of a matrix.
In the kernel class separability approach, the criterion becomes T^Φ = tr(S_B^Φ)/tr(S_W^Φ), evaluated with a Gaussian kernel function, and the kernel parameter is chosen to maximize it: T* = max T^Φ.
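This ratio can be evaluated without ever mapping into the kernel-induced space, using the standard kernel-trick identities tr(S_B^Φ) = Σ_c (1/n_c)·1_c'K1_c − (1/n)·1'K1 and tr(S_W^Φ) = tr(K) − Σ_c (1/n_c)·1_c'K1_c. A minimal sketch with a Gaussian kernel (function names and the σ grid are our own, not the paper's code):

```python
import numpy as np

def gaussian_kernel(X, sigma):
    # Pairwise squared Euclidean distances -> Gaussian (RBF) kernel matrix.
    sq = (X ** 2).sum(axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def kernel_class_separability(X, y, sigma):
    """T^Phi = tr(S_B^Phi) / tr(S_W^Phi), computed from the kernel matrix."""
    K = gaussian_kernel(X, sigma)
    n = len(y)
    between = -K.sum() / n          # starts at -(1/n) 1' K 1
    within = np.trace(K)            # starts at sum_j k(x_j, x_j)
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        class_term = K[np.ix_(idx, idx)].sum() / len(idx)  # (1/n_c) 1_c' K 1_c
        between += class_term
        within -= class_term
    return between / within

# T* = maximum over the kernel parameter, as on the slide, e.g.:
# T_star = max(kernel_class_separability(X, y, s) for s in (0.1, 1.0, 10.0))
```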

9 (figure-only slide)

10 (figure-only slide)

11 Experimental Results
Synthetic dataset: 600 data points, 52 features, 2 classes (a comparable generator is sketched below).
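The slide lists only the dataset's dimensions. A comparable synthetic set can be generated as follows; the split into informative and noise features is our assumption, not the paper's exact construction:

```python
from sklearn.datasets import make_classification

# 600 points, 52 features, 2 classes; we assume a handful of informative
# features with the remainder acting as noise, as in typical
# feature-selection benchmarks.
X, y = make_classification(n_samples=600, n_features=52, n_informative=4,
                           n_redundant=0, n_classes=2, random_state=0)
```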

12 Implementation

13 Time Cost

14 SVM Classifier
SVM test error is used to evaluate the significance of KCSM (the kernel class separability measure) and RMB (the radius-margin bound); a sketch of this step follows below.
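A sketch of that evaluation step, with scikit-learn's SVC used for brevity; the paper's exact SVM configuration may differ:

```python
from sklearn.svm import SVC

def svm_test_error(X_train, y_train, X_test, y_test, selected):
    """Train an RBF-kernel SVM on the selected features and
    return the error rate on the held-out test set."""
    clf = SVC(kernel="rbf")
    clf.fit(X_train[:, selected], y_train)
    return 1.0 - clf.score(X_test[:, selected], y_test)
```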

15 (figure-only slide)

16 Conclusions and Discussions
From our simulation results, the proposed kernel-based class separability measure is the best of the five measures for feature selection.
However, its time cost increases dramatically as the number of data points grows.

17 Future Work
US Postal Service (USPS) dataset: 7,291 training samples and 2,007 test samples; each sample is characterized by 256 features.
We will apply the method to the USPS dataset for further investigation.

