ECE471-571 – Pattern Recognition Lecture 3 – Discriminant Function and Normal Density
Hairong Qi, Gonzalez Family Professor Electrical Engineering and Computer Science University of Tennessee, Knoxville
Pattern Classification
Statistical approach
  Supervised
    Basic concepts: Bayesian decision rule (MPP, LR, discriminant functions)
    Parameter estimation (ML, BL)
    Non-parametric learning (kNN)
    LDF (Perceptron)
    NN (BP)
    Support Vector Machine
    Deep Learning (DL)
  Unsupervised
    Basic concepts: distance
    Agglomerative method
    k-means
    Winner-takes-all
    Kohonen maps
    Mean-shift
Non-statistical approach
  Decision tree
  Syntactic approach
Dimensionality reduction: FLD, PCA
Performance evaluation: ROC curve (TP, TN, FN, FP), cross validation
Stochastic methods: local optimization (GD), global optimization (SA, GA)
Classifier fusion: majority voting, NB, BKS
Bayes Decision Rule
Maximum posterior probability (MPP): if P(\omega_1|x) > P(\omega_2|x), then x belongs to class 1; otherwise, class 2.
Likelihood ratio: equivalently, decide class 1 if p(x|\omega_1)/p(x|\omega_2) > P(\omega_2)/P(\omega_1).
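The two forms of the rule are algebraically identical. A minimal sketch with 1-D Gaussian class-conditional densities (the means, variance, and priors below are made-up illustration values, not from the slides):

```python
import math

def gaussian(x, mu, sigma):
    """1-D normal density p(x | mu, sigma)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Hypothetical class-conditional densities and priors (illustration only).
mu1, mu2, sigma = 0.0, 2.0, 1.0
P1, P2 = 0.6, 0.4

def decide_mpp(x):
    """MPP rule: pick the class with the larger posterior P(w_i|x).
    p(x|w_i)P(w_i) is proportional to the posterior, so p(x) can be skipped."""
    p1 = gaussian(x, mu1, sigma) * P1
    p2 = gaussian(x, mu2, sigma) * P2
    return 1 if p1 > p2 else 2

def decide_lr(x):
    """Likelihood-ratio rule: p(x|w1)/p(x|w2) > P(w2)/P(w1) -> class 1."""
    ratio = gaussian(x, mu1, sigma) / gaussian(x, mu2, sigma)
    return 1 if ratio > P2 / P1 else 2
```

Because one rule is just the other with both sides divided by p(x|\omega_2)P(\omega_1), the two functions always agree.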
Discriminant Function
One way to represent a pattern classifier is with discriminant functions g_i(x): assign x to class \omega_i if g_i(x) > g_j(x) for all j \neq i. For minimum-error-rate classification, any of the following equivalent forms can be used:
g_i(x) = P(\omega_i|x)
g_i(x) = p(x|\omega_i)P(\omega_i)
g_i(x) = \ln p(x|\omega_i) + \ln P(\omega_i)
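The three forms give the same decision because each is a monotone transform of the previous one (divide by p(x) > 0, then take the log). A small numerical check with made-up likelihood and prior values:

```python
import math

# Hypothetical two-class setup: likelihoods p(x|w_i) at some fixed x, and priors.
px_w = [0.30, 0.05]      # made-up values for p(x|w1), p(x|w2)
prior = [0.4, 0.6]
px = sum(l * p for l, p in zip(px_w, prior))   # evidence p(x)

g_posterior = [l * p / px for l, p in zip(px_w, prior)]   # P(w_i|x)
g_product   = [l * p for l, p in zip(px_w, prior)]        # p(x|w_i)P(w_i)
g_log       = [math.log(l) + math.log(p) for l, p in zip(px_w, prior)]

# All three discriminant functions pick the same class index.
best = [max(range(2), key=g.__getitem__) for g in (g_posterior, g_product, g_log)]
```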
Multivariate Normal Density
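The density equation on this slide was rendered as an image in the original transcript; the standard d-variate normal density it presumably showed is:

```latex
p(\mathbf{x}) = \frac{1}{(2\pi)^{d/2}\,|\Sigma|^{1/2}}
\exp\!\left[-\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu})^{T}\Sigma^{-1}(\mathbf{x}-\boldsymbol{\mu})\right]
```

Here \boldsymbol{\mu} is the d-dimensional mean vector and \Sigma the d \times d covariance matrix.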
Discriminant Function for Normal Density
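Substituting the multivariate normal density into g_i(x) = \ln p(x|\omega_i) + \ln P(\omega_i) gives the general form that the three cases on the following slides simplify:

```latex
g_i(\mathbf{x}) = -\tfrac{1}{2}(\mathbf{x}-\boldsymbol{\mu}_i)^{T}\Sigma_i^{-1}(\mathbf{x}-\boldsymbol{\mu}_i)
- \tfrac{d}{2}\ln 2\pi - \tfrac{1}{2}\ln|\Sigma_i| + \ln P(\omega_i)
```

The term -(d/2)\ln 2\pi is the same for every class and can always be dropped.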
Case 1: \Sigma_i = \sigma^2 I
The features are statistically independent and have the same variance \sigma^2.
Geometrically, the samples fall in equal-size hyperspherical clusters.
Decision boundary: a hyperplane of dimension d-1.
Linear Discriminant Function and Linear Machine
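Under case 1, after dropping terms that are identical for every class, the discriminant becomes linear in x, which is what makes the classifier a "linear machine". A sketch of the standard algebra:

```latex
g_i(\mathbf{x}) = -\frac{\|\mathbf{x}-\boldsymbol{\mu}_i\|^{2}}{2\sigma^{2}} + \ln P(\omega_i)
= \mathbf{w}_i^{T}\mathbf{x} + w_{i0},
\qquad
\mathbf{w}_i = \frac{\boldsymbol{\mu}_i}{\sigma^{2}},
\quad
w_{i0} = -\frac{\boldsymbol{\mu}_i^{T}\boldsymbol{\mu}_i}{2\sigma^{2}} + \ln P(\omega_i)
```

(the quadratic term \mathbf{x}^{T}\mathbf{x}/2\sigma^{2} is the same for all classes and is dropped).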
Minimum-Distance Classifier
When the priors P(\omega_i) are equal for all c classes, the discriminant function reduces to measuring the Euclidean distance from x to each of the c mean vectors; x is assigned to the class of the nearest mean.
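A minimal sketch of the minimum-distance classifier (the 2-D class means below are hypothetical illustration values):

```python
def nearest_mean(x, means):
    """Assign x to the class whose mean vector is closest in Euclidean
    distance. Valid for case 1 (Sigma_i = sigma^2 I) with equal priors;
    squared distance is used since the square root does not change argmin."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(range(len(means)), key=lambda i: dist2(x, means[i]))

# Hypothetical 2-D class means (illustration only).
means = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
```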
Case 2: \Sigma_i = \Sigma
The covariance matrices of all classes are identical but not a scalar multiple of the identity matrix.
Geometrically, the samples fall in hyperellipsoidal clusters of equal size and shape.
Decision boundary: a hyperplane of dimension d-1.
With equal priors, classification reduces to minimizing the squared Mahalanobis distance (x - \mu_i)^T \Sigma^{-1} (x - \mu_i).
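A sketch of the squared Mahalanobis distance for 2-D features, written with an explicit 2x2 matrix inverse to stay dependency-free (the shared covariance values are made up):

```python
def mahalanobis2(x, mu, cov):
    """Squared Mahalanobis distance (x-mu)^T cov^{-1} (x-mu)
    for 2-D x, mu and a 2x2 covariance matrix cov."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = ((d / det, -b / det), (-c / det, a / det))   # 2x2 inverse
    dx = (x[0] - mu[0], x[1] - mu[1])
    y0 = inv[0][0] * dx[0] + inv[0][1] * dx[1]
    y1 = inv[1][0] * dx[0] + inv[1][1] * dx[1]
    return dx[0] * y0 + dx[1] * y1

cov = ((2.0, 0.0), (0.0, 0.5))   # hypothetical shared covariance
```

With this covariance, a displacement of 2 along the high-variance axis counts the same as a displacement of 1 along the low-variance axis, which is exactly the scaling the Mahalanobis distance provides.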
Case 3: \Sigma_i = arbitrary
The covariance matrices differ from one category to another.
This gives a quadratic classifier.
Decision boundary: a hyperquadric; for 2-D Gaussians it can be a line, circle, ellipse, parabola, or hyperbola.
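A sketch of the case-3 quadratic discriminant for 2-D Gaussians, restricted to diagonal covariances to keep the arithmetic readable (all parameter values are hypothetical):

```python
import math

def g(x, mu, var, prior):
    """g_i(x) = -1/2 (x-mu)^T S^-1 (x-mu) - 1/2 ln|S| + ln P(w_i),
    with S = diag(var); the constant -(d/2) ln 2pi is dropped."""
    quad = sum((xi - mi) ** 2 / vi for xi, mi, vi in zip(x, mu, var))
    logdet = sum(math.log(vi) for vi in var)
    return -0.5 * quad - 0.5 * logdet + math.log(prior)

# Hypothetical classes: (mean, per-feature variances, prior).
classes = [((0.0, 0.0), (1.0, 1.0), 0.5),
           ((3.0, 3.0), (4.0, 0.25), 0.5)]

def classify(x):
    return max(range(len(classes)), key=lambda i: g(x, *classes[i]))
```

Because the -1/2 ln|\Sigma_i| term and the quadratic term now depend on the class, neither can be dropped, and the boundary between the two classes is a quadric rather than a hyperplane.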
Case Study
[Figure: 2-D scatter of labeled training samples from classes a, b, and c, plus unlabeled test samples u]
Calculate the mean vector \mu_i for each class
Calculate the covariance matrix \Sigma_i for each class
Derive the discriminant function g_i(x)
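The first two case-study steps can be sketched as follows; the 2-D samples below are made-up stand-ins for one class, since the figure's actual coordinates are not recoverable from the transcript:

```python
# Estimate the mean vector and the maximum-likelihood covariance matrix
# (normalized by n, not n-1) from hypothetical 2-D samples of one class.
samples = [(1.0, 2.0), (2.0, 3.0), (3.0, 3.0), (2.0, 4.0)]
n = len(samples)

# Mean vector: average each feature over the samples.
mu = tuple(sum(s[k] for s in samples) / n for k in range(2))

# Covariance: Sigma[j][k] = (1/n) * sum (x_j - mu_j)(x_k - mu_k)
sigma = [[sum((s[j] - mu[j]) * (s[k] - mu[k]) for s in samples) / n
          for k in range(2)] for j in range(2)]
```

Repeating this per class yields the \mu_i and \Sigma_i needed to plug into g_i(x) and classify the unknown samples.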