Variations of Minimax Probability Machine Huang, Kaizhu 2003-09-16.


1 Variations of Minimax Probability Machine Huang, Kaizhu 2003-09-16

2 Overview Classification – types, problems Minimax Probability Machine Main work – Biased Minimax Probability Machine – Minimum Error Minimax Probability Machine Experiments Future work

3 Classification

4 Types of Classifiers Generative Classifiers Discriminative Classifiers

5 Classification—Generative Classifier A generative model assumes specific distributions for the two classes of data and uses these distributions to construct the classification boundary.

6 Problems of the Generative Model "All models are wrong, but some are useful." – Box The distributional assumptions lack generality and are often invalid in real cases. This suggests that a generative model should not assume a specific model for the data.

7 Classification—Discriminative Classifier: SVM support vectors

8 Problems of SVM The SVM is determined only by the support vectors; this suggests that the SVM should also consider the distribution of the data.

9 SVM vs. GM A generative model should not assume a specific model for the data; an SVM should consider the distribution of the data.

10 Minimax Probability Machine (MPM) Features: –With distribution considerations –With no specific distribution assumption

11 Minimax Probability Machine With distribution considerations – assume the mean and covariance estimated directly from the data reliably represent the real mean and covariance Without a specific distribution assumption – directly construct the classifier from the data

12 Minimax Probability Machine (Formulation) Objective
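The objective formula on this slide was an image in the original deck; a standard statement of the MPM objective (following Lanckriet et al.; here $\mathbf{a}$, $b$ define the separating hyperplane and $(\bar{\mathbf{x}},\Sigma_x)$, $(\bar{\mathbf{y}},\Sigma_y)$ are the class means and covariances) is:

```latex
\max_{\alpha,\;\mathbf{a}\neq\mathbf{0},\;b}\;\alpha
\quad\text{s.t.}\quad
\inf_{\mathbf{x}\sim(\bar{\mathbf{x}},\,\Sigma_x)} \Pr\{\mathbf{a}^{\top}\mathbf{x}\ge b\}\ge\alpha,
\qquad
\inf_{\mathbf{y}\sim(\bar{\mathbf{y}},\,\Sigma_y)} \Pr\{\mathbf{a}^{\top}\mathbf{y}\le b\}\ge\alpha,
```

where each infimum ranges over all distributions with the given mean and covariance, so $\alpha$ is a worst-case lower bound on the accuracy for both classes.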

13 Minimax Probability Machine (Cont’d) The MPM problem leads to a second-order cone program (SOCP) Dual problem Geometric interpretation
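The reduction to an SOCP (shown as images on this slide) can be sketched as follows; this is the standard derivation from Lanckriet et al., reconstructed here for reference. The worst-case probability constraint admits a closed form via a multivariate Chebyshev bound:

```latex
\inf_{\mathbf{x}\sim(\bar{\mathbf{x}},\,\Sigma_x)} \Pr\{\mathbf{a}^{\top}\mathbf{x}\ge b\}\ge\alpha
\;\Longleftrightarrow\;
\mathbf{a}^{\top}\bar{\mathbf{x}} - b \ge \kappa(\alpha)\sqrt{\mathbf{a}^{\top}\Sigma_x\mathbf{a}},
\qquad
\kappa(\alpha)=\sqrt{\frac{\alpha}{1-\alpha}}.
```

Eliminating $b$ and maximizing $\alpha$ yields the second-order cone program

```latex
\min_{\mathbf{a}}\;\sqrt{\mathbf{a}^{\top}\Sigma_x\mathbf{a}}+\sqrt{\mathbf{a}^{\top}\Sigma_y\mathbf{a}}
\quad\text{s.t.}\quad
\mathbf{a}^{\top}(\bar{\mathbf{x}}-\bar{\mathbf{y}})=1,
```

with optimal bound $\alpha^{*}=\kappa_{*}^{2}/(1+\kappa_{*}^{2})$, where $\kappa_{*}$ is the reciprocal of the optimal objective value.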

14 Minimax Probability Machine (Cont’d) Summary – Distribution-free – In the general case, the worst-case accuracy of classifying future data is lower-bounded by α – Demonstrated to achieve performance comparable to the SVM
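The summary above can be made concrete with a minimal NumPy sketch of the linear MPM. This is an illustrative implementation, not the deck's code: it solves the equivalent problem min √(aᵀΣₓa) + √(aᵀΣᵧa) s.t. aᵀ(x̄ − ȳ) = 1 by a simple iteratively reweighted quadratic scheme (each step has a closed form), then recovers the threshold b and the worst-case bound α.

```python
import numpy as np

def linear_mpm(X, Y, iters=50):
    """Linear minimax probability machine (illustrative sketch).

    X, Y : (n, d) arrays of samples for the two classes.
    Returns (a, b, alpha): decide class X when a @ z >= b;
    alpha is the worst-case accuracy lower bound.
    """
    xbar, ybar = X.mean(axis=0), Y.mean(axis=0)
    Sx = np.cov(X, rowvar=False)
    Sy = np.cov(Y, rowvar=False)
    diff = xbar - ybar

    # Minimize sqrt(a'Sx a) + sqrt(a'Sy a) s.t. a'(xbar - ybar) = 1
    # by alternating closed-form reweighted quadratic steps.
    a = diff / (diff @ diff)                 # feasible starting point
    for _ in range(iters):
        wx = 1.0 / max(np.sqrt(a @ Sx @ a), 1e-12)
        wy = 1.0 / max(np.sqrt(a @ Sy @ a), 1e-12)
        a = np.linalg.solve(wx * Sx + wy * Sy, diff)
        a /= a @ diff                        # re-impose the constraint

    # kappa* = 1 / optimal objective; alpha = kappa^2 / (1 + kappa^2)
    kappa = 1.0 / (np.sqrt(a @ Sx @ a) + np.sqrt(a @ Sy @ a))
    alpha = kappa**2 / (1.0 + kappa**2)
    b = a @ xbar - kappa * np.sqrt(a @ Sx @ a)
    return a, b, alpha
```

On two well-separated Gaussian clouds this recovers a hyperplane between the class means with a bound α close to 1; the reweighting loop is a simplification of the iterative least-squares approach the deck mentions later.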

15 Problems of MPM 1. In real cases the two classes are not always equally important, which implies that the lower bound α need not be the same for both classes. – Motivates the Biased Minimax Probability Machine 2. On the other hand, no reason exists to require the two bounds to be equal; the derived model is thus non-optimal in this sense. – Motivates the Minimum Error Minimax Probability Machine

16 Biased Minimax Probability Machine Observation: in diagnosing a severe epidemic disease, misclassifying the positive class has more serious consequences than misclassifying the negative class. A typical setting: as long as the classification accuracy of the less important class stays at an acceptable level (specified by practitioners), the classification accuracy of the important class should be as high as possible.

17 Biased Minimax Probability Machine (BMPM) Objective: the bound for the important class has the same meaning as before; the bound for the less important class is fixed at an acceptable accuracy level.
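The BMPM objective on this slide was an image; a hedged reconstruction following Huang et al. (with $\beta_0$ denoting the acceptable accuracy level for the less important class $\mathbf{y}$) is:

```latex
\max_{\alpha,\;\mathbf{a}\neq\mathbf{0},\;b}\;\alpha
\quad\text{s.t.}\quad
\inf_{\mathbf{x}\sim(\bar{\mathbf{x}},\,\Sigma_x)} \Pr\{\mathbf{a}^{\top}\mathbf{x}\ge b\}\ge\alpha,
\qquad
\inf_{\mathbf{y}\sim(\bar{\mathbf{y}},\,\Sigma_y)} \Pr\{\mathbf{a}^{\top}\mathbf{y}\le b\}\ge\beta_0.
```

Only the important class's bound α is maximized; the other class is merely kept above the prescribed floor $\beta_0$.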

18 BMPM (Cont’d) Objective, with equivalent reformulations.

19 Parametric Method 1. Find the solution of the subproblem 2. Update Equivalently: a least-squares approach

20 Biased Minimax Probability Machine MPM vs. BMPM at an acceptable accuracy level

21 Minimum Error Minimax Probability Machine MPM vs. MEMPM The MEMPM achieves the distribution-free Bayes optimal hyperplane in the worst-case setting.

22 Minimum Error Minimax Probability Machine MEMPM achieves the Bayes optimal hyperplane when we assume a specific distribution, e.g. a Gaussian distribution, on the data. Lemma: if the distribution of the normalized random variable is independent of a, the classifier derived by MEMPM will exactly represent the real Bayes optimal hyperplane.

23 MEMPM (Cont’d) Objective, with an equivalent reformulation.
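The MEMPM objective on this slide was an image; a hedged reconstruction following Huang et al. (with $\theta\in[0,1]$ the prior probability of class $\mathbf{x}$) is:

```latex
\max_{\alpha,\;\beta,\;\mathbf{a}\neq\mathbf{0},\;b}\;\theta\alpha+(1-\theta)\beta
\quad\text{s.t.}\quad
\inf_{\mathbf{x}\sim(\bar{\mathbf{x}},\,\Sigma_x)} \Pr\{\mathbf{a}^{\top}\mathbf{x}\ge b\}\ge\alpha,
\qquad
\inf_{\mathbf{y}\sim(\bar{\mathbf{y}},\,\Sigma_y)} \Pr\{\mathbf{a}^{\top}\mathbf{y}\le b\}\ge\beta.
```

Maximizing the weighted sum of the two worst-case accuracies is equivalent to minimizing the worst-case error rate; the original MPM is recovered as the special case α = β.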

24 MEMPM (Cont’d) Objective: line search + sequential BMPM method
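The "line search + sequential BMPM" strategy on this slide can be sketched in a few lines: fix the less important class's bound β, solve a BMPM to obtain the best α for that β, and search over β for the best weighted sum. In this illustrative sketch, `bmpm_alpha` is a hypothetical stand-in for one BMPM solve (β ↦ optimal α), not the deck's actual solver.

```python
import numpy as np

def mempm_line_search(bmpm_alpha, theta, betas):
    """Outer line search for MEMPM (illustrative sketch).

    bmpm_alpha : callable beta -> alpha, the optimal worst-case
                 accuracy for class x with class y's bound fixed
                 at beta (one BMPM solve in the real algorithm).
    theta      : prior probability of class x.
    betas      : candidate beta values to search over.
    """
    best_val, best_beta = -np.inf, None
    for beta in betas:
        alpha = bmpm_alpha(beta)
        val = theta * alpha + (1.0 - theta) * beta
        if val > best_val:
            best_val, best_beta = val, beta
    return best_beta, best_val
```

A finer search (e.g. golden-section on a concave trade-off curve) would replace the simple grid here; the point is only the decomposition into an outer 1-D search and inner BMPM solves.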

25 Kernelized Version Kernelized BMPM

26 Kernelized Version (Cont’d) Kernelized BMPM
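The kernelized formulation replaces inner products between data points with kernel evaluations. As a small concrete piece (illustrative, not the deck's code), the Gaussian kernel Gram matrix used by the experiments later can be computed as:

```python
import numpy as np

def rbf_gram(A, B, sigma=1.0):
    """Gram matrix K[i, j] = exp(-||A_i - B_j||^2 / (2 sigma^2)).

    A : (m, d) array, B : (n, d) array -> (m, n) kernel matrix.
    """
    # Expand ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b for vectorization.
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-np.maximum(sq, 0.0) / (2.0 * sigma**2))
```

With such a Gram matrix, the kernelized BMPM optimizes over expansion coefficients of the training points instead of a weight vector in input space.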

27 Illustration of kernel methods Linear Kernel

28 Experimental results (BMPM) Five benchmark datasets – Twonorm, Breast, Ionosphere, Pima, Sonar Procedure – 5-fold cross validation – Linear kernel – Gaussian kernel Parameter settings – Pima – others
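The k-fold cross-validation procedure used in these experiments can be sketched as follows (an illustrative helper, not the deck's evaluation code): shuffle the indices once, split them into k folds, and hold each fold out in turn.

```python
import numpy as np

def kfold_indices(n, k=5, seed=0):
    """Yield (train_idx, test_idx) pairs for k-fold cross validation."""
    idx = np.random.default_rng(seed).permutation(n)
    for fold in np.array_split(idx, k):
        yield np.setdiff1d(idx, fold), fold
```

Each sample appears in exactly one test fold, so the k fold accuracies average to an estimate of generalization accuracy.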

29 Experimental results

30 Experiments for MEMPM Six benchmark datasets –Twonorm, Breast, Ionosphere, Pima, Heart, Vote Procedure – 10-fold cross validation –Linear –Gaussian Kernel

31 Results for MEMPM

32 Experiments for MEMPM Six benchmark datasets –Twonorm, Breast, Ionosphere, Pima, Heart, Vote Procedure – 10-fold cross validation –Linear –Gaussian Kernel

33 Results for MEMPM

34 Conclusions and Future Work Conclusions – The first quantitative method to analyze the biased classification task – Minimizes the classification error rate in the worst case Future work – Improve the efficiency of the algorithm, especially the kernelized version (any decomposition method?) – Robust estimation – Relation between the VC bound in the Support Vector Machine and the bound in MEMPM – A regression model?

35 References
Popescu, I. and Bertsimas, D. (2001). Optimal inequalities in probability theory: A convex optimization approach. Technical Report TM62, INSEAD.
Lanckriet, G. R. G., El Ghaoui, L., and Jordan, M. I. (2002). Minimax probability machine. In Advances in Neural Information Processing Systems (NIPS) 14. Cambridge, MA: MIT Press.
Huang, K., Yang, H., King, I., Lyu, M. R., and Chan, L. (2003). Biased minimax probability machine.
Huang, K., Yang, H., King, I., Lyu, M. R., and Chan, L. (2003). Minimum error minimax probability machine.

