Learning Larger Margin Machine Locally and Globally
Kaizhu Huang
Dept. of Computer Science and Engineering
The Chinese University of Hong Kong, Shatin, NT, Hong Kong
February 9, 2004
Contributions
– Theory: a unified model of the Support Vector Machine (SVM), the Minimax Probability Machine (MPM), and Linear Discriminant Analysis (LDA).
– Practice: a practical solving method based on a sequence of conic programming problems.
Outline
– Background and Motivation
– Maxi-Min Margin Machine (M4)
  – Model Definition
  – Geometrical Interpretation
  – Solving Methods
  – Connections with Other Models
– M4: Non-separable Case
– Experimental Results
– Future Work
– Conclusion
Background: Classifier
Background: SVM
(Figure: SVM chooses a more reasonable decision plane, determined by the support vectors.)
Maxi-Min Margin Machine (M4)
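The slide's own formulas are not preserved in this transcript. For reference, a sketch of the M4 formulation as published in the associated papers: each class must clear the margin ρ measured relative to its own covariance. Here x_i and y_j are the training points of the two classes, Σ_x and Σ_y their covariance matrices, and w^T z + b = 0 the decision hyperplane.

\[
\max_{\rho,\ \mathbf{w}\neq\mathbf{0},\ b}\ \rho
\quad\text{s.t.}\quad
\frac{\mathbf{w}^{T}\mathbf{x}_i + b}{\sqrt{\mathbf{w}^{T}\Sigma_{\mathbf{x}}\mathbf{w}}}\ \ge\ \rho,\ \ i=1,\dots,N_x,
\qquad
\frac{-(\mathbf{w}^{T}\mathbf{y}_j + b)}{\sqrt{\mathbf{w}^{T}\Sigma_{\mathbf{y}}\mathbf{w}}}\ \ge\ \rho,\ \ j=1,\dots,N_y.
\]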
M4: Geometrical Interpretation
M4: Solving Method
Basic technique: divide and conquer.
– If we fix ρ to a specific value, the problem reduces to checking whether the constraints above can be satisfied for that ρ.
– If yes, we increase ρ; otherwise, we decrease it.
Each such check is a Second Order Cone Programming problem.
M4: Solving Method (Continued)
Iterate the following two steps to solve M4: check whether the constraints are satisfiable for the current ρ, then update ρ accordingly.
M4: Solving Method (Continued)
(Flow chart: for the current ρ, can the constraints be satisfied? Yes: increase ρ. No: decrease ρ. Repeat.)
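As an illustration of the procedure on the three slides above, here is a minimal sketch, not the author's implementation, that bisects over ρ and phrases each feasibility check as a second-order cone program using the cvxpy modeling library. The ridge added to the covariance estimates, the norm bound that rules out w = 0, and the upper bound rho_max are assumptions of this sketch.

```python
# Minimal sketch of the M4 divide-and-conquer strategy (not the author's code):
# bisect over the margin rho; each feasibility check is a second-order cone program.
import numpy as np
import cvxpy as cp


def _feasible(rho, X, Y, Lx, Ly, eps=1e-6):
    """Can some (w, b) satisfy the M4 constraints with margin at least rho?"""
    d = X.shape[1]
    w, b, tau = cp.Variable(d), cp.Variable(), cp.Variable()
    constraints = [
        X @ w + b - rho * cp.norm(Lx.T @ w, 2) >= tau,     # class-x constraints
        -(Y @ w + b) - rho * cp.norm(Ly.T @ w, 2) >= tau,  # class-y constraints
        cp.norm(w, 2) <= 1,                                # rules out the trivial w = 0
    ]
    problem = cp.Problem(cp.Maximize(tau), constraints)
    problem.solve()
    ok = problem.status == cp.OPTIMAL and tau.value is not None and tau.value > eps
    return ok, w.value, b.value


def m4_fit(X, Y, iters=30, rho_max=10.0):
    """X, Y: (n_samples, d) arrays, one per class. Returns (rho, w, b)."""
    d = X.shape[1]
    # Cholesky factors so that sqrt(w' Sigma w) = ||L' w||_2; small ridge for stability.
    Lx = np.linalg.cholesky(np.cov(X, rowvar=False) + 1e-8 * np.eye(d))
    Ly = np.linalg.cholesky(np.cov(Y, rowvar=False) + 1e-8 * np.eye(d))
    lo, hi, best = 0.0, rho_max, (None, None)  # rho_max: heuristic upper bound on rho
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        ok, w, b = _feasible(mid, X, Y, Lx, Ly)
        if ok:
            lo, best = mid, (w, b)   # constraints satisfiable: increase rho
        else:
            hi = mid                 # not satisfiable: decrease rho
    return lo, best[0], best[1]
```

With the returned (w, b), a new point z would be classified by the sign of w^T z + b.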
Connection with MPM
Sum the M4 constraints over all the data points of each class: what remains depends only on the class means and covariances, and is exactly the MPM optimization problem.
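A short worked version of this step, using the M4 constraints sketched earlier and writing x̄ and ȳ for the class means: averaging the class-x constraints over i (and the class-y constraints over j) gives

\[
\frac{1}{N_x}\sum_{i=1}^{N_x}\left(\mathbf{w}^{T}\mathbf{x}_i + b\right)\ \ge\ \rho\sqrt{\mathbf{w}^{T}\Sigma_{\mathbf{x}}\mathbf{w}}
\ \Longrightarrow\
\mathbf{w}^{T}\bar{\mathbf{x}} + b\ \ge\ \rho\sqrt{\mathbf{w}^{T}\Sigma_{\mathbf{x}}\mathbf{w}},
\qquad
-(\mathbf{w}^{T}\bar{\mathbf{y}} + b)\ \ge\ \rho\sqrt{\mathbf{w}^{T}\Sigma_{\mathbf{y}}\mathbf{w}}.
\]

Maximizing ρ subject to these two mean-and-covariance constraints is precisely the MPM optimization problem.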
Connection with MPM
Remarks:
– The procedure is not reversible: MPM is a special case of M4.
– MPM builds its decision boundary GLOBALLY, i.e., it depends exclusively on the means and covariances. However, the means and covariances may not be estimated accurately.
(Figure: comparison of the MPM and M4 decision boundaries.)
Connection with SVM
The magnitude of w can be scaled up without influencing the optimization. SVM corresponds to M4 with a further assumption.
Connection with SVM
(Figure: comparison of the M4 and SVM solutions.)
Connection with SVM
SVM assumes:
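The assumption itself did not survive the transcript. In the published M4 papers the statement is that SVM replaces both class covariances with the identity matrix, Σ_x = Σ_y = I; take the exact wording here as a reconstruction rather than the slide's text. Under that assumption the M4 constraints become

\[
\mathbf{w}^{T}\mathbf{x}_i + b\ \ge\ \rho\,\lVert\mathbf{w}\rVert_{2},
\qquad
-(\mathbf{w}^{T}\mathbf{y}_j + b)\ \ge\ \rho\,\lVert\mathbf{w}\rVert_{2},
\]

and because w can be rescaled freely (as noted two slides earlier), one may fix ρ‖w‖₂ = 1, which recovers the standard SVM problem of minimizing ‖w‖ subject to unit-margin constraints.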
Link with LDA
Perform a procedure similar to the one used for MPM; the result is LDA.
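One way to make this step concrete, under the assumption (mine, for this sketch) that both covariances in the M4 constraints are replaced by the pooled matrix Σ_x + Σ_y before averaging as in the MPM case:

\[
\mathbf{w}^{T}\bar{\mathbf{x}} + b\ \ge\ \rho\sqrt{\mathbf{w}^{T}(\Sigma_{\mathbf{x}}+\Sigma_{\mathbf{y}})\mathbf{w}},
\qquad
-(\mathbf{w}^{T}\bar{\mathbf{y}} + b)\ \ge\ \rho\sqrt{\mathbf{w}^{T}(\Sigma_{\mathbf{x}}+\Sigma_{\mathbf{y}})\mathbf{w}}.
\]

Adding the two inequalities eliminates b and shows that maximizing ρ amounts to maximizing w^T(x̄ − ȳ) / √(w^T(Σ_x + Σ_y)w), which is Fisher's LDA criterion.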
Link with LDA
Non-separable Case
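The slide's formulation is not preserved in this transcript. As a sketch only, an SVM-style slack-variable extension of the M4 problem, with slacks ξ_i, η_j and a trade-off parameter C; treat the exact form as an assumption of this sketch:

\[
\max_{\rho,\ \mathbf{w}\neq\mathbf{0},\ b,\ \boldsymbol{\xi},\ \boldsymbol{\eta}}\ \
\rho \;-\; C\Bigl(\sum_{i}\xi_i + \sum_{j}\eta_j\Bigr)
\quad\text{s.t.}\quad
\mathbf{w}^{T}\mathbf{x}_i + b\ \ge\ \rho\sqrt{\mathbf{w}^{T}\Sigma_{\mathbf{x}}\mathbf{w}} - \xi_i,
\qquad
-(\mathbf{w}^{T}\mathbf{y}_j + b)\ \ge\ \rho\sqrt{\mathbf{w}^{T}\Sigma_{\mathbf{y}}\mathbf{w}} - \eta_j,
\qquad
\xi_i,\ \eta_j\ \ge\ 0.
\]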
Experimental Results: Synthetic Toy Example
Experimental Results: Benchmark Datasets
Future Work
– Kernelization? A nonlinear extension of M4.
– Speed-up algorithms? Critical for large-scale applications.
– Generalization error bound? Both SVM and MPM have error bounds.
– Extension to multi-way classification?
Conclusion
– Proposed a unified model of MPM and SVM.
– Proposed feasible solving methods based on sequential Second Order Cone Programming.