IMAGE PROCESSING RECOGNITION AND CLASSIFICATION By DR. FERDA ERNAWAN Faculty of Computer Systems & Software Engineering ferda@ump.edu.my
Contents This lecture will cover: Introduction to Patterns and Recognition; Classification vs Clustering; Features; Classification Types; Template Matching. Learning Outcomes: To introduce pattern recognition and its applications
Pattern Pattern recognition covers representation of patterns, clustering and classification (Murty and Devi, 2015)
Recognition Recognition is the identification of a pattern. It can be done by clustering (unsupervised: the system learns the categories) or by classification (supervised: the categories are known in advance). Unsupervised learning finds patterns (clusters) in the data; supervised learning is provided with labels, trains on labelled data, and then classifies or regresses on new (test) data.
Unsupervised Learning The system has to learn the classifier from unlabeled data; it should estimate the probability density function of the data. Unsupervised learning approaches cover: clustering (e.g. mixture models, hierarchical clustering, K-means) and techniques for dimensionality reduction (e.g. singular value decomposition, matrix factorization, principal component analysis)
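As a minimal sketch of the dimensionality-reduction idea, principal component analysis can be computed via the singular value decomposition of the centered data matrix; the toy data and function name below are illustrative, not part of the lecture:

```python
import numpy as np

def pca(X, n_components):
    """Project data onto its top principal components via SVD.

    X: (n_samples, n_features) data matrix.
    Returns the (n_samples, n_components) projection.
    """
    Xc = X - X.mean(axis=0)          # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T  # project onto the top components

# Toy data: 2-D points that vary mostly along one direction,
# so a single principal component captures almost all the variance.
rng = np.random.default_rng(0)
t = rng.normal(size=(100, 1))
X = np.hstack([t, 2 * t + 0.01 * rng.normal(size=(100, 1))])
Z = pca(X, 1)
print(Z.shape)  # (100, 1)
```

The rows of Vt are the principal directions (eigenvectors of the covariance matrix), ordered by decreasing variance.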
K-means Clustering K-means clustering takes a set of n-dimensional vectors, where k denotes the number of desired clusters. The K-means algorithm partitions the vectors into k clusters so as to minimize the sum, over all clusters, of the within-cluster sums of point-to-cluster-centroid distances.
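The partitioning described above can be sketched with Lloyd's algorithm; this is an illustrative implementation (the farthest-point initialization and toy two-blob data are assumptions for the demo, not from the lecture):

```python
import numpy as np

def kmeans(X, k, n_iter=100):
    """Lloyd's algorithm: partition the rows of X into k clusters,
    minimizing the within-cluster sum of point-to-centroid distances.
    (Assumes no cluster becomes empty during the iterations.)"""
    # Farthest-point initialization: start from X[0], then repeatedly
    # add the point farthest from all centroids chosen so far.
    centroids = [X[0]]
    for _ in range(k - 1):
        dist = np.min([np.linalg.norm(X - c, axis=1) for c in centroids], axis=0)
        centroids.append(X[dist.argmax()])
    centroids = np.array(centroids)

    for _ in range(n_iter):
        # Assignment step: each point goes to its nearest centroid.
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Update step: each centroid moves to the mean of its cluster.
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centroids):
            break  # converged
        centroids = new
    return labels, centroids

# Two well-separated blobs around (0, 0) and (10, 10).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(10, 1, (50, 2))])
labels, centroids = kmeans(X, 2)
```

On this data the two recovered clusters coincide with the two blobs.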
Supervised Pattern Recognition The system has important prior knowledge: a known description of each class, or a set of labelled samples for each class. A classifier assigns each input feature vector to one of a set of designated classes, or to the reject class. Examples: decision tree, nearest class mean.
Classification vs. Clustering In classification, learning is supervised: a set of patterns whose classes are already known, called the training set, drives the learning strategy. Learning can also be unsupervised (clustering): there is no training set, and the system establishes the classes itself based on regularities in the patterns. [Figure: labelled patterns assigned to Category “A” and Category “B” (classification) vs. unlabelled patterns grouped by similarity (clustering)]
Classification Classification groups items (objects/patterns/image regions/pixels) based on their similarity.
Pattern Recognition Given an input pattern, make a decision about the “category” or “class” of the pattern
Approaches Statistical Pattern Recognition: the patterns are generated by a probabilistic system, and pattern classes are represented by statistical measures. Structural (or Syntactic) Pattern Recognition: the process is based on the structural interrelationships of features, and pattern classes are represented by means of formal structures (e.g. strings, automata, grammars).
Features Features are the individual measurable properties being observed. A set of features used for pattern recognition is called a feature vector. The number of features used is called the dimensionality of the feature vector; n-dimensional feature vectors span an n-dimensional feature space.
Features [Figure: scatter plots of height vs. weight showing Class 1 and Class 2 samples]
Features Feature extraction aims to create discriminative features for classification. Criteria for good features: objects from the same class have similar feature values, and objects from different classes have different feature values. [Figure: good features separate the classes; bad features overlap]
Features Use as few features as possible, and use features that differentiate the classes. Character recognition example. Good features: aspect ratio, presence of loops. Bad features: number of connected components, number of black pixels.
Classifier A classifier identifies the class of a given pattern.
Classifier Nearest-neighbor classification: compute the distances between feature vectors, find the training sample closest to the input pattern, and assign the input to the same class as that closest sample.
Classifier K-Nearest-Neighbor classifier: use the k nearest neighbors instead of only one, and assign the pattern to the majority class among them. [Figure: a query point classified by its k nearest neighbors from Class 1 and Class 2]
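The k-nearest-neighbor rule above can be sketched in a few lines; the toy two-class training set below is an illustrative assumption:

```python
import numpy as np
from collections import Counter

def knn_classify(x, train_X, train_y, k=3):
    """Assign x the majority class among its k nearest training samples."""
    dists = np.linalg.norm(train_X - x, axis=1)  # Euclidean distances
    nearest = np.argsort(dists)[:k]              # indices of the k closest
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Toy 2-D training set: class 1 near the origin, class 2 near (5, 5).
train_X = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]])
train_y = np.array([1, 1, 1, 2, 2, 2])
print(knn_classify(np.array([0.5, 0.5]), train_X, train_y, k=3))  # 1
print(knn_classify(np.array([5.5, 5.5]), train_X, train_y, k=3))  # 2
```

With k = 1 this reduces to the nearest-neighbor classifier of the previous slide.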
Classifier A classifier partitions the feature space X into class-labelled regions R1, ..., Rc such that X = R1 ∪ ... ∪ Rc and the regions do not overlap. The classification process determines which region the feature vector x belongs to. Borders between regions are called decision boundaries.
A Pattern Recognition System Two modes: training (learning) and classification (testing).
Classification Types
Simple Classification: Threshold
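A minimal sketch of threshold classification on a single feature. The lightness feature, the class names, and the threshold value 0.5 are illustrative assumptions (anticipating the fish-sorting example later in the lecture); in practice the threshold would be chosen from training data:

```python
THRESHOLD = 0.5  # hypothetical value, assumed chosen from training data

def classify_by_threshold(lightness, threshold=THRESHOLD):
    """Classify a pattern from a single feature by comparing it
    to a fixed threshold."""
    return "salmon" if lightness < threshold else "sea bass"

print(classify_by_threshold(0.3))  # salmon
print(classify_by_threshold(0.8))  # sea bass
```

The threshold is the simplest possible decision boundary: a single point splitting a one-dimensional feature space into two regions.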
Pixel and Object Classification Pixel-wise classification considers pixel-level characteristics: it uses color, spectral information, texture, intensity, etc. Object-wise classification considers the physical aspect of objects: it uses mean intensity, mean color, size, shape, etc. to describe patterns.
Object-wise and Pixel-wise Classification
Pixel-wise Classification The pixels in the image have not yet been classified. Extract features for each pixel (e.g. temporal changes, gray-level representation of texture, color). Train the classifier. New input data is then classified by the trained classifier.
Pixel-wise Classification Given a color image with 256x256 pixels. 3 features (blue, green and red components). 4 classes (stalk, stamen, background and leaf).
Object-wise Classification The image is segmented into regions, and the regions are labelled; these are the patterns to be classified. Extract (calculate) features for each pattern. Train a classifier with class-labelled samples to determine discriminant functions in the feature space. For new input data, determine the class using the discriminant functions.
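The training and classification steps above can be sketched with a nearest-class-mean classifier (one of the classifiers named earlier in the lecture); the region features and class labels below are hypothetical placeholders for the output of a segmentation stage:

```python
import numpy as np

def train_class_means(features, labels):
    """Compute the mean feature vector (the class prototype used
    as the discriminant) for each class from the training patterns."""
    classes = np.unique(labels)
    return {c: features[labels == c].mean(axis=0) for c in classes}

def nearest_class_mean(x, class_means):
    """Assign x to the class whose mean feature vector is closest."""
    return min(class_means, key=lambda c: np.linalg.norm(x - class_means[c]))

# Hypothetical per-region features: [mean intensity, size in pixels].
features = np.array([[200.0, 120], [210.0, 100], [50.0, 400], [60.0, 380]])
labels = np.array([0, 0, 1, 1])  # 0 = bright small regions, 1 = dark large
means = train_class_means(features, labels)
print(nearest_class_mean(np.array([205.0, 110]), means))  # 0
```

Each class mean acts as a prototype; the decision boundary between two classes is the perpendicular bisector of the segment joining their means.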
Object-wise Classification
Classification - Example Sorting Fish based on species. Determine Salmon or Sea Bass.
Classification - Example Features extracted from the fish include: lightness, position of the mouth, length, number and shape of fins, and width.
Classification - Example
Select Features Select the lightness as a possible feature for classification. The length of fish is a poor feature for classification.
Classification - Example Feature vector x = [x1, x2], where x1 = lightness and x2 = width of the fish.
Classification - Example
Classification - Example Other features can be added to the classification process, provided they do not reduce performance (e.g. noisy features). A good decision boundary should provide optimal performance.
Classification - Example Generalization?
Template Matching
Template Matching Matches are found using normalized correlation (NC).
Pseudocode for Template Matching
function [location, d] = template_match(image, template)
  [nx, ny] = size(image)
  [tx, ty] = size(template)
  for i = 1:(nx - tx + 1)
    for j = 1:(ny - ty + 1)
      % distance between the template and the image window at (i, j)
      d(i, j) = distance(template, window(image, i, j))
    end
  end
  location = index_of_minimum(d)  % position of the best (minimum-distance) match
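The pseudocode above scores windows by a generic distance; a runnable sketch using the normalized correlation mentioned on the previous slide (maximizing the NC score rather than minimizing a distance) might look like this, with the random test image being an illustrative assumption:

```python
import numpy as np

def match_template(image, template):
    """Slide the template over the image and return the top-left
    position with the highest normalized correlation (NC) score."""
    nx, ny = image.shape
    tx, ty = template.shape
    t = template - template.mean()
    best_score, best_pos = -np.inf, (0, 0)
    for i in range(nx - tx + 1):
        for j in range(ny - ty + 1):
            w = image[i:i + tx, j:j + ty]
            w = w - w.mean()
            denom = np.sqrt((t ** 2).sum() * (w ** 2).sum())
            if denom == 0:
                continue  # flat window: correlation undefined
            score = (t * w).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (i, j)
    return best_pos, best_score

# Cut the template out of a larger random image and recover its location.
rng = np.random.default_rng(0)
image = rng.random((30, 30))
template = image[12:17, 8:13].copy()
pos, score = match_template(image, template)
print(pos)  # (12, 8)
```

An exact match yields an NC score of 1; subtracting the means makes the score insensitive to uniform brightness changes.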
Example of a “Nose Template” t2= i2straight(2,1).image(35:45,27:37);
References
M.N. Murty and V.S. Devi. Introduction to Pattern Recognition and Machine Learning. World Scientific Publishing, 2015. ISBN 978-9814335454.
R.O. Duda, P.E. Hart and D.G. Stork. Pattern Classification, 2nd Edition. John Wiley & Sons, 2001. ISBN 0-471-05669-3.