Tutorial 1: Optimal Bayes Classification
Theory Review

We assume all variables are random variables, with known distributions.

Notation:
- $\Omega = \{\omega_1, \dots, \omega_c\}$: a finite set of classes (categories).
- $\mathcal{X}$: the input space (patterns).
- $h : \mathcal{X} \to \Omega$: a classifier, maps $\mathcal{X}$ to $\Omega$.
Basic Assumption

The following distributions are known:
- The prior probabilities $P(\omega_i)$ for each class $\omega_i \in \Omega$.
- The conditional probability $P(x \mid \omega_i)$ of the input $x$, given that the class is $\omega_i$. If $\mathcal{X}$ is a continuous space, $p(x \mid \omega_i)$ denotes the probability density.
Reminder: Fish Classification

- Two classes, $\omega_1$ and $\omega_2$.
- The prior probability $P(\omega_i)$ can be estimated from the relative frequency of each class in a labeled sample.
- The class-conditional probability $p(x \mid \omega_i)$ can be estimated by a frequency histogram of the feature values within each class.
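As a rough illustration of both estimates, the sketch below draws a hypothetical labeled sample of a single scalar feature (the numbers are made up, not data from the fish example):

```python
import numpy as np

# Hypothetical labeled sample: one scalar feature per example, labels 0/1.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(40.0, 5.0, 300), rng.normal(60.0, 8.0, 200)])
y = np.concatenate([np.zeros(300, dtype=int), np.ones(200, dtype=int)])

# Prior: relative frequency of each class in the sample.
priors = np.bincount(y) / len(y)

# Class-conditional density: normalized frequency histogram per class.
bins = np.linspace(x.min(), x.max(), 21)
cond_hist = [np.histogram(x[y == c], bins=bins, density=True)[0] for c in (0, 1)]

print("estimated priors:", priors)               # ~ [0.6, 0.4]
print("p(x | class 0), per bin:", cond_hist[0].round(3))
```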
Optimal Bayes Classifier

Bayes rule:
$$P(\omega_i \mid x) = \frac{p(x \mid \omega_i)\, P(\omega_i)}{p(x)}, \qquad p(x) = \sum_{j=1}^{c} p(x \mid \omega_j)\, P(\omega_j).$$

Optimal Bayes classifier:
$$h^*(x) = \arg\max_{\omega_i \in \Omega} P(\omega_i \mid x) = \arg\max_{\omega_i \in \Omega} p(x \mid \omega_i)\, P(\omega_i).$$

This classifier minimizes the conditional error probability $P(\mathrm{error} \mid x)$ and, therefore, the average error probability $P(\mathrm{error})$.
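A minimal sketch of this rule, assuming the priors and class-conditional densities are supplied as Python callables (the two-class Gaussian example below is hypothetical):

```python
import numpy as np
from scipy.stats import norm

def bayes_classify(x, priors, cond_densities):
    """Return argmax_i p(x | w_i) P(w_i), which equals argmax_i P(w_i | x)
    because the evidence p(x) is the same for every class."""
    scores = [p_i * f_i(x) for p_i, f_i in zip(priors, cond_densities)]
    return int(np.argmax(scores))

# Hypothetical two-class problem with 1-D Gaussian class-conditionals.
priors = [0.5, 0.5]
cond = [norm(loc=0.0, scale=1.0).pdf, norm(loc=2.0, scale=1.0).pdf]
print(bayes_classify(0.3, priors, cond))  # 0: closer to the mean of class 0
print(bayes_classify(1.7, priors, cond))  # 1
```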
Exercise 1

It is given that $\mathcal{X} = \mathbb{R}^2$ and $\Omega = \{\omega_1, \dots, \omega_c\}$. The prior probability is uniform:
$$P(\omega_i) = \frac{1}{c}, \qquad i = 1, \dots, c.$$
And the class-conditional probability is Gaussian:
$$p(x \mid \omega_i) = \frac{1}{2\pi \sqrt{|\Sigma_i|}} \exp\!\left(-\frac{1}{2}(x - \mu_i)^\top \Sigma_i^{-1} (x - \mu_i)\right),$$
where $\mu_i \in \mathbb{R}^2$ is the mean and $\Sigma_i$ the covariance matrix of class $\omega_i$.

- What is the Bayes optimal decision rule?
- What are the decision boundaries in the plane?
- Does the decision boundary for the Gaussian case always have the same form? What does it depend on?
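For the last question, a small numerical check can suggest the answer; the means and covariance matrices below are hypothetical choices, not values from the exercise. With equal covariance matrices the log-likelihood-ratio discriminant is linear in $x$ (a straight-line boundary); with unequal covariances it is quadratic (a conic boundary).

```python
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical means in the plane; with a uniform prior the rule is
# "decide w1 iff g(x) = log p(x|w1) - log p(x|w2) > 0".
mu1, mu2 = np.array([0.0, 0.0]), np.array([2.0, 0.0])

def g(x, cov1, cov2):
    """Log-likelihood-ratio discriminant; the boundary is the set g(x) = 0."""
    return (multivariate_normal(mu1, cov1).logpdf(x)
            - multivariate_normal(mu2, cov2).logpdf(x))

# Equal covariances: g is linear in x, so the boundary is the straight
# line x1 = 1, the perpendicular bisector of the two means.
for x in ([1.0, 0.0], [1.0, 5.0], [1.0, -5.0]):
    print(x, round(g(np.array(x), np.eye(2), np.eye(2)), 6))   # all 0.0

# Unequal covariances: g is quadratic in x, so the boundary is a conic;
# the same three points now give different values.
for x in ([1.0, 0.0], [1.0, 5.0], [1.0, -5.0]):
    print(x, round(g(np.array(x), np.eye(2), 4 * np.eye(2)), 3))
```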
Exercise 2

It is given that the input space is binary vectors of length $d$, meaning that $\mathcal{X} = \{0, 1\}^d$. The output space is $\Omega = \{\omega_1, \omega_2\}$, with a general prior $P(\omega_1), P(\omega_2)$. We define the per-coordinate class-conditional probabilities:
$$p_i = P(x_i = 1 \mid \omega_1), \qquad q_i = P(x_i = 1 \mid \omega_2), \qquad i = 1, \dots, d.$$
In addition, given the class, each coordinate is statistically independent of all other coordinates.

- What is the optimal decision rule?
- What happens if $p_i = q_i$ for some $i$?
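Under these assumptions the posterior log-odds are linear in $x$, which the following sketch makes concrete (all parameter values are hypothetical):

```python
import numpy as np

# Hypothetical per-coordinate parameters: p[i] = P(x_i = 1 | w1),
# q[i] = P(x_i = 1 | w2), plus a general (non-uniform) prior.
p = np.array([0.9, 0.7, 0.5, 0.2])
q = np.array([0.4, 0.3, 0.5, 0.8])
prior = np.array([0.6, 0.4])

def decide(x):
    """Decide w1 iff the posterior log-odds are positive. By conditional
    independence the log-odds are linear: sum_i w_i x_i + w0 > 0."""
    w = np.log(p * (1 - q) / (q * (1 - p)))          # per-coordinate weights
    w0 = np.log((1 - p) / (1 - q)).sum() + np.log(prior[0] / prior[1])
    return 1 if w @ x + w0 > 0 else 2

print(decide(np.array([1, 1, 0, 0])))  # -> 1
print(decide(np.array([0, 0, 1, 1])))  # -> 2

# Coordinate i = 2 has p_i = q_i = 0.5, so its weight is exactly 0:
# such a coordinate is uninformative and never affects the decision.
print(np.log(p * (1 - q) / (q * (1 - p)))[2])  # 0.0
```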
Exercise 3

Two classes $\Omega = \{\omega_1, \omega_2\}$ are given, with uniform prior $P(\omega_1) = P(\omega_2) = \frac{1}{2}$. In addition, the class-conditional probabilities $p(x \mid \omega_1)$ and $p(x \mid \omega_2)$ are given.

- What is the Bayes optimal decision rule?
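Hint: with a uniform prior the prior factor is the same for both classes, so the posterior comparison reduces to a likelihood comparison:
$$h^*(x) = \arg\max_{i \in \{1, 2\}} p(x \mid \omega_i)\, P(\omega_i) = \arg\max_{i \in \{1, 2\}} p(x \mid \omega_i),$$
i.e., decide $\omega_1$ iff $p(x \mid \omega_1) > p(x \mid \omega_2)$ (ties may be broken arbitrarily).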