Slide 1: Generative classifiers: The Gaussian classifier
Ata Kaban, School of Computer Science, University of Birmingham
Slide 2: Outline
We have already seen how Bayes' rule can be turned into a classifier. In all our examples so far, the attributes were discrete-valued (e.g. in {'sunny', 'rainy'} or {+, -}). Today we learn how to do this when the attributes are continuous-valued.
Slide 3: Example
Task: predict the gender of individuals based on their heights, given 100 height examples of women and 100 height examples of men.
[Figure: height histograms for the two classes; axes: height (meters) vs. frequency]
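A toy dataset like the one on this slide can be simulated. This is a minimal sketch; the means and standard deviation below are illustrative assumptions, not values from the lecture:

```python
import numpy as np

# Hypothetical population parameters (assumptions, not from the slides):
rng = np.random.default_rng(0)
heights_women = rng.normal(loc=1.65, scale=0.07, size=100)  # 100 samples, class "woman"
heights_men = rng.normal(loc=1.78, scale=0.07, size=100)    # 100 samples, class "man"
```

Plotting histograms of the two arrays reproduces the kind of overlapping frequency plot shown on the slide.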
Slide 4: Class priors
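The equation on this slide was not captured. The standard estimate of a class prior is the fraction of training examples in that class:

```latex
P(C_k) \;=\; \frac{N_k}{N},
\qquad N_k = \text{number of training examples in class } C_k,
\quad N = \sum_k N_k .
```

With 100 examples of each gender, both priors are $100/200 = 0.5$.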
Slide 5: Class-conditional likelihood
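The equation on this slide was not captured. For a Gaussian classifier, the class-conditional likelihood models each class's attribute values with its own Gaussian:

```latex
p(x \mid C_k) \;=\; \mathcal{N}(x;\, \mu_k, \sigma_k^2),
```

where $\mu_k$ and $\sigma_k^2$ are the mean and variance of class $C_k$, estimated from the training examples of that class.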
Slide 6: Class posterior
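The equation on this slide was not captured. The class posterior combines prior and likelihood via Bayes' rule:

```latex
P(C_k \mid x) \;=\; \frac{p(x \mid C_k)\, P(C_k)}{\sum_j p(x \mid C_j)\, P(C_j)} .
```

The denominator is the same for every class, so it does not affect which class has the highest posterior.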
Slide 7: Discriminant function
[Figure: discriminant function plotted over the height histograms; axes: height (meters) vs. frequency]
Slide 8: Discriminant function
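The slide's formula was not captured. A standard choice of discriminant function for this classifier takes the log of the (unnormalised) posterior:

```latex
g_k(x) \;=\; \log p(x \mid C_k) + \log P(C_k),
```

and the predicted class is $\arg\max_k g_k(x)$. For two classes, one can equivalently use the single function $g(x) = g_1(x) - g_2(x)$ and predict class 1 whenever $g(x) > 0$.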
Slide 10: How do we compute it?
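To use the classifier we must estimate each class's Gaussian parameters from data. A minimal sketch of the maximum-likelihood estimates in the univariate case (the sample values are illustrative):

```python
import numpy as np

def fit_gaussian(x):
    """Maximum-likelihood estimates of a univariate Gaussian's mean and variance."""
    mu = x.mean()
    var = x.var()  # ML estimate divides by N (not N-1)
    return mu, var

# Illustrative heights (meters) for one class:
x = np.array([1.60, 1.62, 1.65, 1.70, 1.68])
mu, var = fit_gaussian(x)
```

Fitting one Gaussian per class, plus the class priors, gives everything the discriminant function needs.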
Slide 11: Illustration – our 1D example
[Figure: illustration on the 1D height data; axes: height (meters) vs. frequency]
Slide 12: Gaussian – univariate
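The univariate Gaussian density, $\mathcal{N}(x;\mu,\sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\big(-\frac{(x-\mu)^2}{2\sigma^2}\big)$, is straightforward to evaluate directly; a minimal sketch:

```python
import math

def gauss_pdf(x, mu, var):
    """Univariate Gaussian density N(x; mu, var)."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)
```

For example, the standard normal (mu=0, var=1) evaluated at x=0 gives 1/sqrt(2*pi), roughly 0.3989.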
Slide 13: Gaussian – multivariate
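The formula on this slide was not captured. The multivariate Gaussian density in $d$ dimensions, with mean vector $\boldsymbol{\mu}$ and covariance matrix $\Sigma$, is:

```latex
\mathcal{N}(\mathbf{x};\, \boldsymbol{\mu}, \Sigma)
\;=\;
\frac{1}{(2\pi)^{d/2}\,|\Sigma|^{1/2}}
\exp\!\left(-\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu})^{\top}\Sigma^{-1}(\mathbf{x}-\boldsymbol{\mu})\right).
```

For $d=1$ this reduces to the univariate density of the previous slide.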
Slide 15: 2D example with 2 classes
[Figure: scatter plot of two classes; axes: attribute 1 vs. attribute 2]
Slide 16: Naïve Bayes
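The content of this slide was not captured. Naïve Bayes assumes the attributes are conditionally independent given the class, so the joint class-conditional likelihood factorises as $p(\mathbf{x} \mid C_k) = \prod_j p(x_j \mid C_k)$; for Gaussian attributes this is equivalent to a diagonal covariance matrix. A minimal sketch (the attribute values and parameters below are illustrative assumptions):

```python
import math

def gauss_pdf(x, mu, var):
    """Univariate Gaussian density N(x; mu, var)."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Hypothetical example: attributes (height in m, weight in kg) for one class.
x = [1.70, 65.0]        # observation
mu = [1.78, 80.0]       # per-attribute class means (assumed)
var = [0.005, 25.0]     # per-attribute class variances (assumed)

# Naive Bayes: the joint likelihood is the product of per-attribute likelihoods.
likelihood = 1.0
for xi, mi, vi in zip(x, mu, var):
    likelihood *= gauss_pdf(xi, mi, vi)
```

This sidesteps estimating (and inverting) a full covariance matrix, which helps when there are many attributes or little data.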
Slide 17: Are we done?
Slide 18: Multi-class classification
We may have more than two classes, e.g. 'healthy', 'disease type 1', 'disease type 2'. Our Gaussian classifier is easy to use in multi-class problems: we compute the posterior probability for each of the classes and predict the class whose posterior probability is highest.
Slide 19: Summing up
This type of classifier is called 'generative' because it rests on the assumption that the cloud of points in each class can be seen as generated by some distribution, e.g. a Gaussian, and it works out its decisions by estimating these distributions. One could instead model the discriminant function directly; that type of classifier is called 'discriminative'.
For the brave: try to work out the form of the discriminant function by plugging in the form of the Gaussian class-conditional densities. You will get a quadratic function of x in general. When does it reduce to a linear function?
Recommended reading: Rogers & Girolami, Chapter 5.
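One standard way to work the "for the brave" exercise: substitute the multivariate Gaussian density into $g_k(\mathbf{x}) = \log p(\mathbf{x} \mid C_k) + \log P(C_k)$ and drop terms that are constant across classes:

```latex
g_k(\mathbf{x})
= -\tfrac{1}{2}\mathbf{x}^{\top}\Sigma_k^{-1}\mathbf{x}
  + \boldsymbol{\mu}_k^{\top}\Sigma_k^{-1}\mathbf{x}
  - \tfrac{1}{2}\boldsymbol{\mu}_k^{\top}\Sigma_k^{-1}\boldsymbol{\mu}_k
  - \tfrac{1}{2}\log|\Sigma_k|
  + \log P(C_k).
```

This is quadratic in $\mathbf{x}$ through the first term. If all classes share the same covariance, $\Sigma_k = \Sigma$, that quadratic term is identical for every class and cancels when discriminants are compared, leaving a linear function of $\mathbf{x}$:

```latex
g_k(\mathbf{x})
= \boldsymbol{\mu}_k^{\top}\Sigma^{-1}\mathbf{x}
  - \tfrac{1}{2}\boldsymbol{\mu}_k^{\top}\Sigma^{-1}\boldsymbol{\mu}_k
  + \log P(C_k).
```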