1
Support Vector Machine - Linear (www.mecharithm.com)
2
Support Vector Machine: separable case. The problem min f(x) s.t. g(x) <= 0 attains its optimum at a stationary point of the Lagrangian provided that both f(x) and g(x) are convex; otherwise the function may only have a saddle point (example: y = x^3). Checking convexity: one method uses the Hessian matrix H. If all eigenvalues of H are positive, then H is positive definite and f is convex.
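The Hessian eigenvalue test above can be illustrated with a tiny sketch (the function and helper names below are ours, not from the slides):

```python
# Convexity check via the Hessian: if all eigenvalues of the Hessian are
# positive, the Hessian is positive definite and the function is convex there.

def hessian_eigenvalues_2x2(a, b, c, d):
    """Eigenvalues of the 2x2 matrix [[a, b], [c, d]] in closed form."""
    tr, det = a + d, a * d - b * c
    disc = (tr * tr / 4 - det) ** 0.5
    return tr / 2 - disc, tr / 2 + disc

# f(x, y) = x^2 + y^2 has constant Hessian [[2, 0], [0, 2]]: convex everywhere.
lo, hi = hessian_eigenvalues_2x2(2, 0, 0, 2)
print(lo > 0 and hi > 0)   # True -> positive definite -> convex

# g(x) = x^3 has second derivative 6x, which vanishes at x = 0,
# so the test fails there -- matching the saddle-point example above.
print(6 * 0 > 0)           # False
```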
3
The optimal solution of (2.11) satisfies the following Karush-Kuhn-Tucker (KKT) conditions.
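The conditions themselves did not survive extraction. As a sketch, assuming (2.11) is the usual hard-margin primal (minimize (1/2)||w||^2 subject to y_i(w^T x_i + b) >= 1), the standard KKT conditions read:

```latex
\frac{\partial L}{\partial \mathbf{w}} = 0 \;\Rightarrow\; \mathbf{w} = \sum_{i=1}^{M} \alpha_i y_i \mathbf{x}_i,
\qquad
\frac{\partial L}{\partial b} = 0 \;\Rightarrow\; \sum_{i=1}^{M} \alpha_i y_i = 0,
\]
\[
\alpha_i \ge 0, \qquad
y_i(\mathbf{w}^\top \mathbf{x}_i + b) - 1 \ge 0, \qquad
\alpha_i \left[ y_i(\mathbf{w}^\top \mathbf{x}_i + b) - 1 \right] = 0 .
```

The last (complementarity) condition is what makes only the points on the margin have nonzero multipliers, i.e. the support vectors.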
8
L1 Soft-Margin Support Vector Machines
9
We consider minimizing
10
We select the value of p as either 1 or 2 and call the resulting hyperplane the soft-margin hyperplane. When p = 1, we call the support vector machine the L1 soft-margin support vector machine, or L1 support vector machine (L1 SVM) for short; when p = 2, the L2 soft-margin support vector machine, or L2 support vector machine (L2 SVM).
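The minimized objective did not survive extraction. For reference, a standard form of the soft-margin primal consistent with the p = 1 / p = 2 discussion above is:

```latex
\min_{\mathbf{w},\, b,\, \boldsymbol{\xi}} \;\;
\frac{1}{2}\|\mathbf{w}\|^2 + \frac{C}{p} \sum_{i=1}^{M} \xi_i^{\,p}
\quad \text{s.t.} \quad
y_i(\mathbf{w}^\top \mathbf{x}_i + b) \ge 1 - \xi_i, \qquad \xi_i \ge 0,
```

where the slack variables xi_i allow margin violations and C trades off margin width against training errors.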
11
Using (2.41), we reduce (2.42) to (2.44). The only difference between the L1 soft-margin support vector machine and the hard-margin support vector machine is that αi cannot exceed C.
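A toy sketch of how that box constraint 0 <= alpha_i <= C enters training. This is plain dual coordinate ascent on a bias-free linear L1 SVM (dropping b removes the equality constraint, a simplification of ours; all names below are ours, not from the slides):

```python
# Dual coordinate ascent for a bias-free linear L1 SVM on a toy 2-D set.
# The only soft-margin twist versus the hard-margin case is the clip of
# each alpha_i into [0, C].

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def train_l1_svm(X, y, C, epochs=200):
    n = len(X)
    alpha = [0.0] * n
    K = [[dot(X[i], X[j]) for j in range(n)] for i in range(n)]
    for _ in range(epochs):
        for i in range(n):
            # gradient of the dual objective w.r.t. alpha_i
            g = 1.0 - y[i] * sum(alpha[j] * y[j] * K[i][j] for j in range(n))
            # exact coordinate step, then projection onto the box [0, C]
            alpha[i] = min(C, max(0.0, alpha[i] + g / K[i][i]))
    # recover the weight vector w = sum_i alpha_i y_i x_i
    d = len(X[0])
    w = [sum(alpha[i] * y[i] * X[i][k] for i in range(n)) for k in range(d)]
    return w, alpha

X = [[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]]
y = [1, 1, -1, -1]
w, alpha = train_l1_svm(X, y, C=10.0)
print(all((1 if dot(w, x) > 0 else -1) == t for x, t in zip(X, y)))  # True
```

With a small C, some alpha_i hit the upper bound C and the corresponding points are allowed inside the margin, which is exactly the effect described above.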
12
14
Two-Class Least Squares Support Vector Machines. The LS support vector machine is trained by solving a set of linear equations instead of a quadratic programming problem.
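That claim can be made concrete with a small sketch. Assuming the standard LS-SVM formulation, training reduces to one symmetric linear system (variable names and the toy data below are ours):

```python
# LS-SVM training as a single linear system (standard formulation):
#   [ 0        y^T     ] [  b  ]   [0]
#   [ y   Omega + I/C  ] [alpha] = [1],   Omega_ij = y_i y_j (x_i . x_j)

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def solve(A, rhs):
    """Gaussian elimination with partial pivoting (works on copies)."""
    n = len(A)
    M = [row[:] + [r] for row, r in zip(A, rhs)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def train_ls_svm(X, y, C):
    n = len(X)
    A = [[0.0] * (n + 1) for _ in range(n + 1)]
    rhs = [0.0] + [1.0] * n
    for i in range(n):
        A[0][i + 1] = A[i + 1][0] = y[i]
        for j in range(n):
            A[i + 1][j + 1] = y[i] * y[j] * dot(X[i], X[j])
        A[i + 1][i + 1] += 1.0 / C        # ridge term from the C/2 * xi^2 penalty
    sol = solve(A, rhs)
    return sol[0], sol[1:]               # bias b, multipliers alpha

X = [[1.0, 1.0], [2.0, 2.0], [-1.0, -1.0], [-2.0, -2.0]]
y = [1, 1, -1, -1]
b, alpha = train_ls_svm(X, y, C=10.0)
pred = [1 if sum(a * t * dot(x, z) for a, t, x in zip(alpha, y, X)) + b > 0 else -1
        for z in X]
print(pred == y)  # True
```

Note that, unlike the L1 SVM, the multipliers here are not confined to [0, C]: the LS-SVM constraints are equalities, so every training point typically gets a nonzero alpha_i.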
15
For the linear case: g(x) = x. KKT:
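The KKT system itself was lost in extraction. As a sketch, taking the standard LS-SVM Lagrangian L = (1/2)||w||^2 + (C/2) Σ ξ_i^2 − Σ α_i [y_i(w^T x_i + b) − 1 + ξ_i], stationarity gives:

```latex
\frac{\partial L}{\partial \mathbf{w}} = 0 \;\Rightarrow\; \mathbf{w} = \sum_i \alpha_i y_i \mathbf{x}_i,
\qquad
\frac{\partial L}{\partial b} = 0 \;\Rightarrow\; \sum_i \alpha_i y_i = 0,
\]
\[
\frac{\partial L}{\partial \xi_i} = 0 \;\Rightarrow\; \alpha_i = C\,\xi_i,
\qquad
y_i(\mathbf{w}^\top \mathbf{x}_i + b) = 1 - \xi_i .
```

Eliminating w and xi_i from these four conditions yields exactly the linear system in b and alpha mentioned above.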
16
Now, using (4.4), the weights can be obtained from the Lagrange multipliers found above.