November 30, PATTERN RECOGNITION
TEXTURE CLASSIFICATION PROJECT
Characterize each texture so as to differentiate it from the others. Probably should use 2nd-order co-occurrence matrices and their respective properties. Examine what neighborhood size is good to work with and how many grey levels to use when computing the co-occurrence matrix. Might want to examine eigenvalue invariants of the symmetric co-occurrence matrix, but this is not absolutely necessary. May have to look at the ranges of property values that differentiate the class for each texture (see the sketch below).
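A minimal NumPy sketch of the kind of computation meant here, not a prescribed implementation: the function names, the quantization scheme, and the particular properties (contrast, energy, homogeneity) are illustrative choices, and the neighborhood size and number of grey levels are exactly the parameters the project asks you to study.

```python
import numpy as np

def cooccurrence_matrix(img, levels=8, offset=(0, 1)):
    """2nd-order co-occurrence matrix for one pixel displacement.

    img    : 2-D array of grey values
    levels : number of grey levels after quantization
    offset : (row, col) displacement between pixel pairs
    """
    # Quantize grey values down to the requested number of levels.
    q = np.floor(img.astype(float) / (img.max() + 1) * levels).astype(int)
    dr, dc = offset
    rows, cols = q.shape
    glcm = np.zeros((levels, levels), dtype=float)
    # Count co-occurrences of grey-level pairs at the given offset.
    for r in range(max(0, -dr), min(rows, rows - dr)):
        for c in range(max(0, -dc), min(cols, cols - dc)):
            glcm[q[r, c], q[r + dr, c + dc]] += 1
    glcm += glcm.T          # symmetrize
    glcm /= glcm.sum()      # normalize to joint probabilities
    return glcm

def texture_features(glcm):
    """A few standard co-occurrence properties usable as class features."""
    i, j = np.indices(glcm.shape)
    return {
        "contrast":    float(((i - j) ** 2 * glcm).sum()),
        "energy":      float((glcm ** 2).sum()),
        "homogeneity": float((glcm / (1.0 + np.abs(i - j))).sum()),
    }

# Example: features for one 64x64 neighborhood (random data stands in for a texture patch).
patch = np.random.randint(0, 256, (64, 64))
print(texture_features(cooccurrence_matrix(patch, levels=8, offset=(0, 1))))
```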
A LOW-DIMENSIONAL EXAMPLE
[Figure: scatter plot in a 2-D RED-GREEN color space, showing one cluster of samples for APPLES and another for ORANGES.]
STATISTICS
For each class, compute the mean and standard deviation of its feature values; the class-separation distance measures how far apart the class clusters lie, e.g., the distance between the class means relative to their spread (see the sketch below).
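A short sketch of these statistics for the two-class example above. The sample values are made up for illustration, and the particular separation measure (distance between the class means divided by the summed spreads) is an assumption, since the slide does not define it.

```python
import numpy as np

# Hypothetical training samples in the 2-D red-green color space of the
# previous slide (values are illustrative only).
apples  = np.array([[200.,  40.], [190.,  55.], [210.,  35.], [205.,  60.]])
oranges = np.array([[230., 140.], [220., 150.], [240., 135.], [235., 160.]])

# Per-class statistics: mean and standard deviation along each feature axis.
mu_a, sigma_a = apples.mean(axis=0),  apples.std(axis=0)
mu_o, sigma_o = oranges.mean(axis=0), oranges.std(axis=0)

# One common (assumed) class-separation measure: distance between the class
# means normalized by the spread of the two classes.
separation = np.linalg.norm(mu_a - mu_o) / (
    np.linalg.norm(sigma_a) + np.linalg.norm(sigma_o))

print("apple  mean/std:", mu_a, sigma_a)
print("orange mean/std:", mu_o, sigma_o)
print("class-separation distance:", separation)
```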
MAXIMUM-LIKELIHOOD CLASSIFIER
Notation: Ci is some object class; x is a feature (e.g., size, color, etc.).
A priori probability: P(Ci), the probability that class Ci occurs.
Conditional probability: P(x | Ci), the probability that x occurs given the occurrence of Ci.
A posteriori probability: P(Ci | x), the probability that Ci occurs given the occurrence of x.
MAXIMUM-LIKELIHOOD CLASSIFIER
Bayes' Rule: P(Ci | x) = P(x | Ci) P(Ci) / P(x).
In general: P(x) = Σj P(x | Cj) P(Cj), so P(Ci | x) = P(x | Ci) P(Ci) / Σj P(x | Cj) P(Cj).
MAXIMUM-LIKELIHOOD CLASSIFIER
Bayes' Rule (continued): assign x to the class Ci with the largest a posteriori probability P(Ci | x); since P(x) is the same for every class, this is equivalent to maximizing P(x | Ci) P(Ci).
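A minimal sketch of such a classifier, under the assumption (not stated on the slides) that each class-conditional density P(x | Ci) is modelled as a Gaussian fitted to that class's training samples; the toy apple/orange data are made up for illustration.

```python
import numpy as np

def gaussian_pdf(x, mu, cov):
    """Multivariate Gaussian density, used here to model P(x | Ci)."""
    d = len(mu)
    diff = x - mu
    norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))
    return np.exp(-0.5 * diff @ np.linalg.inv(cov) @ diff) / norm

def train(class_samples):
    """Fit the mean and covariance of each class from its training samples."""
    return [(s.mean(axis=0), np.cov(s, rowvar=False)) for s in class_samples]

def classify(x, models, priors):
    """Pick the class maximizing the a posteriori probability.

    P(Ci | x) is proportional to P(x | Ci) P(Ci); the common factor P(x)
    can be ignored when comparing classes.
    """
    scores = [gaussian_pdf(x, mu, cov) * p
              for (mu, cov), p in zip(models, priors)]
    return int(np.argmax(scores))

# Two toy classes in the 2-D red-green space (illustrative values only).
apples  = np.array([[200.,  40.], [190.,  55.], [210.,  35.], [205.,  60.]])
oranges = np.array([[230., 140.], [220., 150.], [240., 135.], [235., 160.]])
models = train([apples, oranges])
print(classify(np.array([228., 142.]), models, priors=[0.5, 0.5]))  # -> 1 (orange)
```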
FACE RECOGNITION
The feature space is the set of grey values of an image or subimage, so the dimension of the feature space is the number of pixels in the image or subimage. For computational purposes, the feature space can be reduced to a dimension on the order of the training-sample size.
FACE RECOGNITION
Training samples F1, F2, …, Fn, where each Fi is a vector in feature space. The "mean face" is given by F̄ = (1/n) Σi Fi. Consider now the difference vectors Φi = Fi − F̄. Consider the matrix A = [Φ1 Φ2 … Φn] whose columns are the difference vectors. Consider the covariance matrix C = A A^T = Σi Φi Φi^T (scaling by 1/n does not change its eigenvectors).
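A minimal sketch of this construction, assuming the standard "eigenfaces"-style reduction hinted at on the previous slide (eigenvectors of the small n x n matrix A^T A instead of the huge pixel-by-pixel covariance); the image size, number of faces, and random data are placeholders.

```python
import numpy as np

n_pixels, n_faces = 64 * 64, 20
F = np.random.rand(n_pixels, n_faces)        # columns F1..Fn: training faces

mean_face = F.mean(axis=1, keepdims=True)    # "mean face"
A = F - mean_face                            # columns are difference vectors Phi_i

# The covariance matrix C = A A^T is n_pixels x n_pixels, which is huge.
# Its nonzero eigenvectors can instead be obtained from the small n x n
# matrix A^T A: if (A^T A) v = lambda v, then C (A v) = lambda (A v).
small = A.T @ A                              # n_faces x n_faces
eigvals, V = np.linalg.eigh(small)
eigenfaces = A @ V                           # map eigenvectors back to pixel space
eigenfaces /= np.linalg.norm(eigenfaces, axis=0)  # normalize each eigenface

# A face is then represented by its projection onto the leading eigenfaces.
weights = eigenfaces.T @ (F[:, [0]] - mean_face)
print(weights.shape)                         # (n_faces, 1): reduced representation
```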