Published by Hong Pan. Modified over 5 years ago.
Pattern Recognition → Machine Learning → Data Analytics
Types of learning:
Supervised learning
Unsupervised learning
Semi-supervised learning
Reinforcement learning
Bayes Theorem
Conditional probability: P(A|B), P(B|A)
Marginal probability: P(A), P(B)
Joint probability: P(A,B), also written P(AB)
Bayes theorem: P(B|A) = P(A|B) P(B) / P(A), so
P(A), P(B), P(A|B) → P(B|A)
P(A), P(B), P(B|A) → P(A|B)
Example
Prior: P(C) = 0.01 (1%)
Sensitivity: P(pos|C) = 0.90 (90%) → the test is positive 90% of the time if you have C
Specificity: P(neg|~C) = 0.90 (90%) → the test is negative 90% of the time if you don't have C
Question: if the test is positive, what is the probability of having C? P(C|pos) = ?
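The question above follows directly from Bayes theorem; a minimal sketch using the slide's numbers:

```python
# Bayes theorem applied to the slide's diagnostic-test example.
p_c = 0.01                # prior: P(C)
p_pos_given_c = 0.90      # sensitivity: P(pos|C)
p_neg_given_not_c = 0.90  # specificity: P(neg|~C)

# Total probability of a positive test (the normalizer):
p_pos = p_pos_given_c * p_c + (1 - p_neg_given_not_c) * (1 - p_c)

# Posterior: P(C|pos) = P(pos|C) * P(C) / P(pos)
p_c_given_pos = p_pos_given_c * p_c / p_pos
print(round(p_c_given_pos, 4))  # 0.0833
```

Even with a positive test, the probability of having C is only about 8.3%, because the 1% prior is so small.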
(Figure: Venn diagram of all people, with the subset C.)
Naïve Bayes (Example: Text learning)
Word probabilities for each author:
          Life   Work   Family
  Chris   0.1    0.8    0.1
  Sara    0.5    0.2    0.3
Priors: P(Chris) = 0.5, P(Sara) = 0.5
Message "Life Family": Chris or Sara?
Message "Life Work": Chris or Sara?
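The two quiz questions can be answered by scoring each author as prior times the product of word probabilities; a small sketch, assuming the table reconstruction above (the slide's row/column mapping is garbled, so the numbers are an assumption):

```python
# Naive Bayes scoring: P(author | words) is proportional to
# P(author) * product of P(word | author).
# Word-probability table reconstructed from the slide (mapping assumed).
probs = {
    "Chris": {"Life": 0.1, "Work": 0.8, "Family": 0.1},
    "Sara":  {"Life": 0.5, "Work": 0.2, "Family": 0.3},
}
prior = {"Chris": 0.5, "Sara": 0.5}

def score(author, words):
    s = prior[author]
    for w in words:
        s *= probs[author][w]
    return s

for msg in (["Life", "Family"], ["Life", "Work"]):
    scores = {a: score(a, msg) for a in probs}
    print(msg, scores, "->", max(scores, key=scores.get))
```

Under this assumed table, "Life Family" scores 0.005 for Chris vs 0.075 for Sara, and "Life Work" scores 0.04 vs 0.05, so Sara wins both.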
Naïve Bayes
Prior: P(y1), P(y2), …, P(ym)
Conditional probability: P(x1, x2, …, xn | y1), …, P(x1, x2, …, xn | ym)
Solution: argmax over j of P(yj | x1, x2, …, xn)
Naïve assumption: x1, x2, …, xn are independent given the class, so P(x1, …, xn | yj) = P(x1 | yj) · P(x2 | yj) · … · P(xn | yj)
Gaussian Naïve Bayes
Conditional probability: P(xi | y) ~ N(µy, σy)
Implementation: scikit-learn (sklearn)
Gaussian Naïve Bayes: self-driving car dataset
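A minimal GaussianNB sketch with scikit-learn, as named on the slide; the toy blobs here stand in for the slide's self-driving-car dataset, which is not reproduced in this dump:

```python
# Gaussian Naive Bayes with scikit-learn on two synthetic Gaussian blobs.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)),   # class 0 around (0, 0)
               rng.normal(3, 1, (50, 2))])  # class 1 around (3, 3)
y = np.array([0] * 50 + [1] * 50)

# Fit estimates a per-class mean and variance for each feature.
clf = GaussianNB().fit(X, y)
print(clf.predict([[0, 0], [3, 3]]))  # [0 1]
```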
Support Vector Machine (SVM): basic idea
Support Vector Machine (SVM): basic idea
Maximize the distance to the nearest point, i.e. maximize the margin.
The points nearest the boundary are the support vectors.
Support Vector Machine (SVM): basic idea
Maximizing the margin maximizes the robustness of the classifier.
Support Vector Machine (SVM): basic idea
1. First, minimize the classification error.
2. Then, maximize the margin.
Support Vector Machine (SVM): outliers
Support Vector Machine (SVM)
Will SVMs work here? 1. Yes 2. No
Support Vector Machine (SVM): feature trick
Features x, y → SVM → label
Support Vector Machine (SVM)
Is this linearly separable? Add a new input feature x² + y²:
Features x, y, x² + y² → SVM → label
Support Vector Machine (SVM)
With the new feature z = x² + y²: features x, y, z → SVM → label
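The feature trick on this slide can be checked numerically; a sketch with toy circular data (the data here is an assumption, chosen to match the classic inside-vs-outside-a-circle picture):

```python
# Points inside a small circle vs inside an outer ring are not linearly
# separable in (x, y), but the added feature z = x^2 + y^2 separates the
# classes with a single threshold.
import numpy as np

rng = np.random.default_rng(1)
angles = rng.uniform(0, 2 * np.pi, 100)
inner_r = rng.uniform(0.0, 1.0, 50)   # class 0: radius < 1
outer_r = rng.uniform(2.0, 3.0, 50)   # class 1: radius in [2, 3]
r = np.concatenate([inner_r, outer_r])
x, y = r * np.cos(angles), r * np.sin(angles)
z = x**2 + y**2                       # the added feature
labels = np.array([0] * 50 + [1] * 50)

# A single threshold on z classifies every point correctly:
sep = ((z > 2.0) == labels.astype(bool)).mean()
print(sep)  # 1.0
```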
Support Vector Machine (SVM)
Which feature would you add to make the data linearly separable?
1. x² + y²
2. |x|
3. |y|
Support Vector Machine (SVM): kernel trick
Add one or more non-linear features to make the data linearly separable.
Kernel trick: a kernel maps the inputs (x, y) into a higher-dimensional feature space (x1, x2, x3, x4, x5). Data that is not separable in the original space becomes separable there, and the linear boundary found in the feature space maps back to a non-linear boundary in the original space.
Support Vector Machine (SVM): nonlinear kernels, e.g. the sigmoid function
Implementation (sklearn SVM)
Results: Naïve Bayes accuracy 0.884; linear SVM accuracy 0.92; RBF SVM accuracy 0.94
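The comparison above can be reproduced in spirit with a few lines of sklearn; since the slide's self-driving dataset is not available here, this sketch uses a synthetic two-moons dataset, so the accuracies will differ from the 0.884 / 0.92 / 0.94 reported on the slide:

```python
# Compare Gaussian NB, linear SVM, and RBF SVM on a synthetic
# (non-linearly-separable) dataset.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

X, y = make_moons(n_samples=500, noise=0.25, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

accs = {}
for name, clf in [("Naive Bayes", GaussianNB()),
                  ("Linear SVM", SVC(kernel="linear")),
                  ("RBF SVM", SVC(kernel="rbf"))]:
    clf.fit(X_tr, y_tr)
    accs[name] = clf.score(X_te, y_te)
    print(name, round(accs[name], 3))
```

As on the slide, the RBF kernel typically wins here, because the moons boundary is non-linear.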