1
Classifiers
Fujinaga
2
Bayes (optimal) Classifier (1)
A priori probabilities: $P(\omega_1)$ and $P(\omega_2)$.
Decision rule: given $P(\omega_1)$ and $P(\omega_2)$, decide $\omega_1$ if $P(\omega_1) > P(\omega_2)$, otherwise decide $\omega_2$; the probability of error is $P(\text{error}) = \min[P(\omega_1), P(\omega_2)]$.
Let $x$ be the feature(s). Let $p(x \mid \omega_j)$ be the class (state)-conditional probability density function (pdf) for $x$; i.e., the pdf for $x$ given that the state of nature is $\omega_j$.
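A minimal Python sketch of this prior-only rule (the prior values 0.7 and 0.3 are illustrative, not from the slides): pick the class with the larger prior; the error probability is then the smaller prior.

```python
# Prior-only Bayes decision: with no feature observed, always pick the
# class with the larger prior; the error probability is the smaller prior.
# The numeric priors below are illustrative, not from the slides.
priors = {"omega_1": 0.7, "omega_2": 0.3}

decision = max(priors, key=priors.get)   # class with the larger prior
p_error = min(priors.values())           # P(error) = min[P(w1), P(w2)]

print(f"decide {decision}, P(error) = {p_error:.2f}")
```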
3
Bayes (optimal) Classifier (2)
Assume we know $P(\omega_j)$ and $p(x \mid \omega_j)$, and that we also measure the value of $x$.
Using Bayes' rule:
$$P(\omega_j \mid x) = \frac{p(x \mid \omega_j)\,P(\omega_j)}{p(x)}, \qquad p(x) = \sum_j p(x \mid \omega_j)\,P(\omega_j)$$
Decide $\omega_1$ if $P(\omega_1 \mid x) > P(\omega_2 \mid x)$, otherwise decide $\omega_2$. With equal priors this reduces to comparing the likelihoods $p(x \mid \omega_1)$ and $p(x \mid \omega_2)$ (maximum likelihood).
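A short Python sketch of this rule, assuming univariate Gaussian class-conditional densities; the means, standard deviations, priors, and the test point are made-up values for illustration.

```python
import math

# Bayes decision rule for two classes, assuming Gaussian class-conditional
# densities; the parameters and priors below are illustrative only.

def gaussian_pdf(x, mean, std):
    """Univariate Gaussian density p(x | omega_j)."""
    return math.exp(-0.5 * ((x - mean) / std) ** 2) / (std * math.sqrt(2 * math.pi))

def bayes_decide(x, params, priors):
    """Return (decision, posteriors) for feature value x."""
    likelihoods = {c: gaussian_pdf(x, *params[c]) for c in params}
    evidence = sum(likelihoods[c] * priors[c] for c in params)   # p(x)
    posteriors = {c: likelihoods[c] * priors[c] / evidence for c in params}
    return max(posteriors, key=posteriors.get), posteriors

params = {"omega_1": (2.0, 1.0), "omega_2": (4.0, 1.0)}   # (mean, std) per class
priors = {"omega_1": 0.5, "omega_2": 0.5}

decision, post = bayes_decide(3.2, params, priors)
print(decision, post)   # the posteriors sum to 1 at every x
```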
4
Bayes (optimal) Classifier (3)
A posteriori probabilities for a two-class decision problem. The red region on the x axis depicts values of x for which you would decide 'apple', and the orange region is for 'orange'. At every x, the posteriors must sum to 1.
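A sketch of how such a posterior plot can be produced, again assuming Gaussian class-conditionals with made-up parameters; it also checks numerically that the posteriors sum to 1 at every x.

```python
import numpy as np
import matplotlib.pyplot as plt

# Posterior curves for a two-class problem, assuming Gaussian
# class-conditionals; the parameters are illustrative, not from the slide.
def gaussian(x, mean, std):
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

x = np.linspace(-2, 8, 400)
priors = {"apple": 0.5, "orange": 0.5}
likelihoods = {"apple": gaussian(x, 2.0, 1.0), "orange": gaussian(x, 4.0, 1.2)}

evidence = sum(likelihoods[c] * priors[c] for c in priors)           # p(x)
posteriors = {c: likelihoods[c] * priors[c] / evidence for c in priors}

assert np.allclose(posteriors["apple"] + posteriors["orange"], 1.0)  # sum to 1 at every x

for c in posteriors:
    plt.plot(x, posteriors[c], label=f"P({c} | x)")
plt.xlabel("x")
plt.ylabel("posterior probability")
plt.legend()
plt.show()
```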
5
Fisher’s Linear Discriminant
If Petal Width < c × Petal Length, then Versicolor; if Petal Width > c × Petal Length, then Virginica, where c is the slope of the discriminant line separating the two classes.
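A sketch of Fisher's linear discriminant for this two-class case: compute w proportional to S_W^{-1}(m1 - m2), project each sample onto w, and threshold at the midpoint of the projected class means. The sample points below are invented for illustration, not the actual Iris measurements.

```python
import numpy as np

# Fisher's linear discriminant for two classes in the
# (petal length, petal width) plane; the sample points are made up.
versicolor = np.array([[4.5, 1.4], [4.0, 1.3], [4.7, 1.5], [4.4, 1.2]])
virginica  = np.array([[5.5, 2.0], [6.0, 2.2], [5.8, 1.9], [6.1, 2.3]])

m1, m2 = versicolor.mean(axis=0), virginica.mean(axis=0)

# Within-class scatter matrix S_W = S_1 + S_2
S_w = ((versicolor - m1).T @ (versicolor - m1)
       + (virginica - m2).T @ (virginica - m2))

# Fisher direction: w proportional to S_W^{-1} (m1 - m2)
w = np.linalg.solve(S_w, m1 - m2)

# Decision threshold: midpoint of the projected class means
threshold = w @ (m1 + m2) / 2.0

def classify(petal_length, petal_width):
    """Project onto w and compare with the threshold."""
    score = w @ np.array([petal_length, petal_width])
    return "Versicolor" if score > threshold else "Virginica"

print(classify(4.3, 1.3), classify(5.9, 2.1))
```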
6
Decision Tree
If Petal Length < 2.65, then Setosa
If Petal Length > 4.95, then Virginica
If 2.65 < Petal Length < 4.95, then:
if Petal Width < 1.65, then Versicolor
if Petal Width > 1.65, then Virginica
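The tree above translates directly into code; a minimal Python sketch (the function name classify_iris and the handling of values exactly at a threshold are my own choices, not specified on the slide):

```python
def classify_iris(petal_length, petal_width):
    """Apply the decision-tree rules from the slide.
    Values exactly at a threshold fall through to the next branch by convention."""
    if petal_length < 2.65:
        return "Setosa"
    if petal_length > 4.95:
        return "Virginica"
    # 2.65 <= Petal Length <= 4.95: split on petal width
    return "Versicolor" if petal_width < 1.65 else "Virginica"

print(classify_iris(1.4, 0.2))   # Setosa
print(classify_iris(4.5, 1.4))   # Versicolor
print(classify_iris(5.8, 2.1))   # Virginica
```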