ECE 471/571 – Lecture 3: Discriminant Function and Normal Density (08/27/15)
2 Different Approaches - More Detail

Pattern Classification
- Statistical Approach
  - Supervised
    - Basic concepts: Bayesian decision rule (MPP, LR, discriminant functions)
    - Parametric learning (ML, BL)
    - Non-parametric learning (kNN)
    - NN (Perceptron, BP)
  - Unsupervised
    - Basic concepts: distance, agglomerative method
    - k-means
    - Winner-take-all
    - Kohonen maps
- Non-Statistical Approach
- Dimensionality Reduction: Fisher's linear discriminant, K-L transform (PCA)
- Performance Evaluation: ROC curve; TP, TN, FN, FP
- Stochastic Methods: local optimization (GD), global optimization (SA, GA)
3 Bayes Decision Rule: Maximum Posterior Probability (MPP); Maximum Likelihood (ML)
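The equations on this slide did not survive text extraction; a sketch of the standard forms they refer to:

```latex
% Posterior probability via Bayes' formula:
\[
P(\omega_j \mid x) = \frac{p(x \mid \omega_j)\,P(\omega_j)}{p(x)},
\qquad
p(x) = \sum_{j=1}^{c} p(x \mid \omega_j)\,P(\omega_j).
\]
% Maximum posterior probability (MPP), two-class: decide omega_1 if
\[
P(\omega_1 \mid x) > P(\omega_2 \mid x)
\;\Longleftrightarrow\;
\underbrace{\frac{p(x \mid \omega_1)}{p(x \mid \omega_2)}}_{\text{likelihood ratio (LR)}}
> \frac{P(\omega_2)}{P(\omega_1)}.
\]
% Maximum likelihood (ML): the equal-prior special case,
% decide omega_1 if p(x | omega_1) > p(x | omega_2).
```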
4 Discriminant Function. One way to represent a pattern classifier is to use discriminant functions gi(x); for the two-class case a single function suffices, as sketched below.
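A reconstruction of the standard definitions behind this slide (the equations themselves are missing from the extracted text):

```latex
% Assign x to the class whose discriminant is largest:
\[
\text{decide } \omega_i \text{ if } g_i(x) > g_j(x) \;\; \forall j \neq i,
\qquad \text{e.g. } g_i(x) = \ln p(x \mid \omega_i) + \ln P(\omega_i).
\]
% Two-class case: a single discriminant suffices.
\[
g(x) = g_1(x) - g_2(x), \qquad \text{decide } \omega_1 \text{ if } g(x) > 0.
\]
```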
5 Normal/Gaussian Density. The rule.
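The univariate density in standard form; reading "the rule" as the usual 68-95-99.7 rule is an assumption, since the slide's figure did not extract:

```latex
% Univariate normal density:
\[
p(x) = \frac{1}{\sqrt{2\pi}\,\sigma}
\exp\!\left[-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^{2}\right].
\]
% The rule: roughly 68% of the probability mass lies within mu +/- sigma,
% 95% within mu +/- 2 sigma, and 99.7% within mu +/- 3 sigma.
```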
6 Multivariate Normal Density
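The density this slide refers to, in its standard form:

```latex
% d-dimensional multivariate normal density:
\[
p(x) = \frac{1}{(2\pi)^{d/2}\,|\Sigma|^{1/2}}
\exp\!\left[-\frac{1}{2}(x-\mu)^{T}\Sigma^{-1}(x-\mu)\right],
\]
% where mu = E[x] is the mean vector and
% Sigma = E[(x - mu)(x - mu)^T] is the d x d covariance matrix.
```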
7 Discriminant Function for Normal Density
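Reconstructing the slide's discriminant (the standard form obtained by substituting the normal density into gi(x) = ln p(x|ωi) + ln P(ωi)):

```latex
\[
g_i(x) = -\frac{1}{2}(x-\mu_i)^{T}\Sigma_i^{-1}(x-\mu_i)
- \frac{d}{2}\ln 2\pi - \frac{1}{2}\ln|\Sigma_i| + \ln P(\omega_i).
\]
% The term (d/2) ln(2 pi) is the same for every class and can be dropped.
```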
8 Case 1: Σi = σ²I. The features are statistically independent and have the same variance σ². Geometrically, the samples fall in equal-size hyperspherical clusters. Decision boundary: a hyperplane of dimension d-1.
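A sketch of the standard simplification under the Case 1 assumption:

```latex
% With Sigma_i = sigma^2 I, both ln|Sigma_i| and the quadratic term
% x^T x are class-independent, so the discriminant reduces to
\[
g_i(x) = -\frac{\lVert x - \mu_i \rVert^{2}}{2\sigma^{2}} + \ln P(\omega_i)
= w_i^{T}x + w_{i0},
\]
\[
w_i = \frac{\mu_i}{\sigma^{2}},
\qquad
w_{i0} = -\frac{\mu_i^{T}\mu_i}{2\sigma^{2}} + \ln P(\omega_i).
\]
```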
9 Linear Discriminant Function and Linear Machine
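A linear machine is a classifier whose discriminants are all linear. For Case 1 the between-class boundary has a closed form (the standard result, as in Duda, Hart & Stork):

```latex
% Decision boundary between omega_i and omega_j: the hyperplane
% w^T (x - x_0) = 0 with
\[
w = \mu_i - \mu_j,
\qquad
x_0 = \frac{1}{2}(\mu_i + \mu_j)
- \frac{\sigma^{2}}{\lVert \mu_i - \mu_j \rVert^{2}}
\ln\frac{P(\omega_i)}{P(\omega_j)}\,(\mu_i - \mu_j).
\]
% Equal priors put x_0 halfway between the means; unequal priors
% shift it away from the more probable class.
```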
10 Minimum-Distance Classifier. When the priors P(ωi) are the same for all c classes, the discriminant function reduces to the distance from x to each of the c mean vectors: x is assigned to the class of the nearest mean.
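A minimal runnable sketch of this minimum-distance classifier, assuming equal priors and Σi = σ²I; the class means below are hypothetical stand-ins:

```python
import numpy as np

# Hypothetical class means (one row per class), d = 2 features
means = np.array([[0.0, 0.0],
                  [3.0, 0.0],
                  [0.0, 3.0]])

def classify(x, means):
    """Assign x to the class with the nearest mean (Euclidean distance).

    Equivalent to maximizing g_i(x) = -||x - mu_i||^2 when the priors
    P(omega_i) are equal and Sigma_i = sigma^2 * I.
    """
    d2 = np.sum((means - x) ** 2, axis=1)  # squared distance to each mean
    return int(np.argmin(d2))

print(classify(np.array([2.5, 0.5]), means))  # -> 1 (closest to [3, 0])
```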
11 Case 2: Σi = Σ. The covariance matrices for all the classes are identical but not a scalar multiple of the identity matrix. Geometrically, the samples fall in hyperellipsoidal clusters of equal size and shape. Decision boundary: a hyperplane of dimension d-1. The natural metric here is the squared Mahalanobis distance.
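The Case 2 simplification, reconstructed in standard form:

```latex
% Dropping class-independent terms (ln|Sigma| and (d/2) ln 2pi):
\[
g_i(x) = -\frac{1}{2}
\underbrace{(x-\mu_i)^{T}\Sigma^{-1}(x-\mu_i)}_{\text{squared Mahalanobis distance}}
+ \ln P(\omega_i).
\]
% Expanding, x^T Sigma^{-1} x is the same for every class, so the
% machine is again linear:
\[
g_i(x) = w_i^{T}x + w_{i0},
\quad
w_i = \Sigma^{-1}\mu_i,
\quad
w_{i0} = -\frac{1}{2}\mu_i^{T}\Sigma^{-1}\mu_i + \ln P(\omega_i).
\]
```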
12 Case 3: Σi arbitrary. The covariance matrices differ from category to category, giving a quadratic classifier. Decision boundary: a hyperquadric; for 2-D Gaussians, a conic section (line, circle, ellipse, parabola, or hyperbola).
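The full quadratic discriminant for Case 3, in its standard form:

```latex
% Nothing cancels across classes; the discriminant stays quadratic:
\[
g_i(x) = x^{T}W_i x + w_i^{T}x + w_{i0},
\]
\[
W_i = -\frac{1}{2}\Sigma_i^{-1},
\quad
w_i = \Sigma_i^{-1}\mu_i,
\quad
w_{i0} = -\frac{1}{2}\mu_i^{T}\Sigma_i^{-1}\mu_i
- \frac{1}{2}\ln|\Sigma_i| + \ln P(\omega_i).
\]
```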
13 Case Study. [Figure: 2-D scatter of labeled training samples from three classes (a, b, c) and several unlabeled points (u).] Calculate μi for each class. Calculate Σi for each class. Derive the discriminant function gi(x).
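A sketch of the whole case-study pipeline in Python; the arrays `samples`, the priors, and the unknown point `u` are hypothetical stand-ins for the slide's a/b/c scatter, not the actual data:

```python
import numpy as np

# Hypothetical 2-D training samples for three classes a, b, c
samples = {
    "a": np.array([[0.5, 0.8], [0.9, 1.1], [0.7, 0.6], [1.2, 1.0]]),
    "b": np.array([[4.0, 4.2], [4.5, 3.8], [3.9, 4.4], [4.3, 4.1]]),
    "c": np.array([[0.8, 4.0], [1.1, 4.3], [0.6, 3.9], [1.0, 4.1]]),
}
priors = {k: 1.0 / len(samples) for k in samples}  # assume equal priors

# Step 1: estimate mu_i and Sigma_i for each class from its samples
params = {}
for label, X in samples.items():
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False)  # unbiased (n-1) sample covariance
    params[label] = (mu, Sigma)

# Step 2: discriminant g_i(x) = -1/2 (x-mu)^T Sigma^{-1} (x-mu)
#                               - 1/2 ln|Sigma| + ln P(omega_i)
def g(x, mu, Sigma, prior):
    diff = x - mu
    maha2 = diff @ np.linalg.inv(Sigma) @ diff  # squared Mahalanobis distance
    return -0.5 * maha2 - 0.5 * np.log(np.linalg.det(Sigma)) + np.log(prior)

# Step 3: classify an unknown point u by the largest g_i
u = np.array([1.0, 3.8])
scores = {lab: g(u, *params[lab], priors[lab]) for lab in samples}
print(max(scores, key=scores.get))  # -> 'c' for this u
```

Since each class here gets its own Σi, this is the general Case 3 (quadratic) classifier; Cases 1 and 2 fall out as special cases of the same code.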