1
Chapter 10: The Support Vector Method for Estimating Indicator Functions
jpzhang@fudan.edu.cn
Intelligent Information Processing Laboratory, Fudan University
2
Optimal hyperplane: The optimal hyperplane has remarkable statistical properties. These properties are used to construct a new class of learning machines: support vector machines.
3
The optimal hyperplane
The optimal hyperplane for nonseparable sets
Statistical properties of the optimal hyperplane
Proof of the theorems
The Idea of the Support Vector Machine
4
One More Approach to the Support Vector Method
Selection of SV Machine Using Bounds
Examples of SV Machines For Pattern Recognition
SV Method for transductive inference
Multiclass classification
Remarks on generalization of the SV method
20
Properties: The objective function does not depend explicitly on the dimensionality of the vector x; it depends only on inner products of pairs of vectors. This allows us to construct separating hyperplanes in a high-dimensional space.
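For reference, this is the standard dual form of the optimal-hyperplane problem for separable data, with ℓ training pairs (x_i, y_i), y_i ∈ {−1, +1}, and Lagrange multipliers α_i:

\[
\max_{\alpha}\; W(\alpha) = \sum_{i=1}^{\ell} \alpha_i - \frac{1}{2}\sum_{i=1}^{\ell}\sum_{j=1}^{\ell} \alpha_i \alpha_j\, y_i y_j\, (x_i \cdot x_j)
\qquad \text{subject to } \sum_{i=1}^{\ell} \alpha_i y_i = 0,\;\; \alpha_i \ge 0 .
\]

The training vectors enter only through the inner products (x_i · x_j), which is exactly the property the slide refers to.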
21
The optimal hyperplane for nonseparable sets
29
The Basic Solution: Soft Margin Generalization
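As a reminder, the standard soft-margin formulation introduces slack variables ξ_i and a user-chosen cost parameter C:

\[
\min_{w,\,b,\,\xi}\; \frac{1}{2}\|w\|^2 + C\sum_{i=1}^{\ell} \xi_i
\qquad \text{subject to } y_i\,(w \cdot x_i + b) \ge 1 - \xi_i,\;\; \xi_i \ge 0 .
\]

In the dual, the only change from the separable case is that the multipliers become box-constrained: 0 ≤ α_i ≤ C.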
30
Statistical properties of the optimal hyperplane
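Two of the classical leave-one-out bounds on the expected error of the optimal hyperplane, stated loosely (ℓ is the training-set size, D the diameter of the smallest sphere containing the training points, ρ the margin; the left-hand expectations are over training sets of size ℓ − 1, the right-hand ones over size ℓ):

\[
E\left[P_{\text{error}}\right] \le \frac{E[\text{number of support vectors}]}{\ell},
\qquad
E\left[P_{\text{error}}\right] \le \frac{E\left[(D/\rho)^2\right]}{\ell}.
\]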
39
Proof of the theorems (omitted)
40
The Idea of the Support Vector Machine: The support vector machine maps the input vectors x into a high-dimensional feature space Z through some nonlinear mapping, chosen a priori. In this space, an optimal separating hyperplane is constructed.
41
Problem: Two obstacles arise.
Conceptual: how to find a separating hyperplane that generalizes well, given that the dimensionality of the feature space is huge and not every separating hyperplane generalizes well.
Technical: how to treat such high-dimensional spaces computationally, i.e., how to avoid the curse of dimensionality.
42
Generalization in high-dimensional space.
Conceptual: use the optimal hyperplane; its generalization ability is high even if the feature space has a high dimensionality.
Technical: one does not need to consider the feature space in explicit form; it suffices to calculate the inner products between support vectors and the vectors of the feature space.
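A minimal sketch of this technical point, using an explicit degree-2 polynomial feature map on R² (the function names and the test vectors are illustrative, not from the slides). The inner product in the 3-dimensional feature space equals a kernel evaluated directly in input space:

```python
import numpy as np

def phi(x):
    # Explicit degree-2 feature map for x in R^2:
    # phi(x) = (x1^2, x2^2, sqrt(2)*x1*x2) in R^3.
    return np.array([x[0]**2, x[1]**2, np.sqrt(2) * x[0] * x[1]])

def k(x, z):
    # The same inner product computed in input space: K(x, z) = (x . z)^2.
    return np.dot(x, z) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])
print(np.dot(phi(x), phi(z)))  # 16.0 -- inner product in feature space
print(k(x, z))                 # 16.0 -- kernel in input space, no phi needed
```

In higher degrees and dimensions the feature space grows combinatorially, while the kernel evaluation stays cheap; this is what makes the approach computationally viable.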
43
Mercer Theorem
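For completeness, Mercer's condition as it is usually stated: a continuous symmetric function K(u, v) admits an expansion with positive coefficients,

\[
K(u, v) = \sum_{k} a_k\, \psi_k(u)\, \psi_k(v), \qquad a_k > 0
\]

(i.e., K is an inner product in some feature space), if and only if

\[
\iint K(u, v)\, g(u)\, g(v)\, du\, dv \ge 0
\]

for every function g with ∫ g²(u) du < ∞.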
45
Constructing SV Machines
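A minimal sketch of the resulting decision rule, assuming the multipliers alpha, labels y, support vectors sv, and bias b have already been obtained by solving the dual problem (the variable names and the RBF kernel choice are illustrative):

```python
import numpy as np

def rbf_kernel(x, z, gamma=0.5):
    # One admissible Mercer kernel: K(x, z) = exp(-gamma * ||x - z||^2).
    return np.exp(-gamma * np.sum((x - z) ** 2))

def decide(x, sv, y, alpha, b, kernel=rbf_kernel):
    # SV machine decision rule:
    #   f(x) = sign( sum_i alpha_i * y_i * K(sv_i, x) + b )
    s = sum(a_i * y_i * kernel(x_i, x) for a_i, y_i, x_i in zip(alpha, y, sv))
    return np.sign(s + b)
```

Only the support vectors (the points with α_i > 0) appear in the sum, so the machine is fully described by them.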
46
One More Approach to the Support Vector Method
Minimizing the Number of Support Vectors
Generalization for the Nonseparable Case
Linear Optimization Method for SV Machines
47
Minimizing the Number of Support Vectors: The optimal hyperplane has an expansion on the support vectors. If a method of constructing the hyperplane has a unique solution, then the generalization ability of the constructed hyperplane depends on the number of support vectors.
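This is what motivates the linear optimization method listed in the outline: minimizing the sum of the expansion coefficients, a standard surrogate for minimizing the number of support vectors, turns training into a linear program. A hedged sketch for the separable case:

\[
\min_{\alpha,\, b}\; \sum_{i=1}^{\ell} \alpha_i
\qquad \text{subject to } y_i\Big(\sum_{j=1}^{\ell} \alpha_j\, y_j\, K(x_j, x_i) + b\Big) \ge 1,\;\; \alpha_j \ge 0 .
\]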
48
Selection of SV Machine Using Bounds
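A minimal illustration of this idea, assuming scikit-learn is available and using the leave-one-out bound E[error] ≤ E[#SV]/ℓ as the selection criterion; the RBF parameter grid is illustrative:

```python
from sklearn.svm import SVC

def select_by_sv_bound(X, y, gammas=(0.01, 0.1, 1.0, 10.0)):
    # Pick the RBF width whose trained machine uses the fewest support
    # vectors: the leave-one-out bound  #SV / ell  is then smallest.
    best_gamma, best_bound = None, float("inf")
    for gamma in gammas:
        clf = SVC(kernel="rbf", gamma=gamma, C=1.0).fit(X, y)
        bound = len(clf.support_) / len(X)
        if bound < best_bound:
            best_gamma, best_bound = gamma, bound
    return best_gamma, best_bound
```

The same selection scheme works with the (D/ρ)² bound; the point is that such bounds are computable from the training set alone, without a held-out validation set.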
49
Examples of SV Machines For Pattern Recognition
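The classical kernel families used as examples here (d, σ, v, and c are user-chosen parameters):

\[
K(x, x') = \big((x \cdot x') + 1\big)^d \quad \text{(polynomial machines)},
\]
\[
K(x, x') = \exp\!\big(-\|x - x'\|^2 / \sigma^2\big) \quad \text{(radial basis function machines)},
\]
\[
K(x, x') = \tanh\big(v\,(x \cdot x') + c\big) \quad \text{(two-layer neural network machines; a Mercer kernel only for suitable } v, c\text{)}.
\]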
50
SV Method for transductive inference
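In outline, the transductive setting optimizes over the labels of the given test points as well: given training pairs (x_1, y_1), …, (x_ℓ, y_ℓ) and unlabeled working points x*_1, …, x*_k, one searches jointly over hyperplanes and labelings y*_j for the separation with the largest margin. A hedged sketch of the separable case:

\[
\min_{y^*_1, \dots, y^*_k}\; \min_{w,\, b}\; \frac{1}{2}\|w\|^2
\qquad \text{subject to } y_i (w \cdot x_i + b) \ge 1,\;\; y^*_j (w \cdot x^*_j + b) \ge 1 .
\]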
51
Multiclass classification
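One standard reduction of the multiclass problem to the two-class machines above is one-against-all: train one binary machine per class and predict the class whose machine produces the largest real-valued output (before taking the sign). A minimal sketch; the (score function, label) pairing is an illustrative interface, not a fixed API:

```python
import numpy as np

def one_vs_rest_predict(x, machines):
    # machines: list of (score_fn, class_label) pairs, one binary SV
    # machine per class, each trained to separate that class from the rest.
    # score_fn(x) should return  sum_i alpha_i y_i K(x_i, x) + b.
    scores = [score_fn(x) for score_fn, _ in machines]
    return machines[int(np.argmax(scores))][1]
```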
52
Remarks on generalization of the SV method