IMAGE PROCESSING RECOGNITION AND CLASSIFICATION
By DR. FERDA ERNAWAN Faculty of Computer Systems & Software Engineering
Contents
This lecture will cover:
- Introduction to patterns and recognition
- Classification vs. clustering
- Features
- Classification types
- Template matching
Learning outcome: to introduce applications of pattern recognition.
Pattern
Pattern recognition covers the representation of patterns, clustering, and classification (Murty and Devi, 2015).
Recognition
Recognition is the identification of a pattern. Two settings:
- Clustering (unsupervised: categories are learned): the system finds patterns in unlabeled data and groups them into clusters.
- Classification (supervised: categories are known): labels are provided; a model is trained on labeled data and then applied to new data to classify or regress.
Unsupervised Learning
The system has to learn the classifier from unlabeled data, so it should estimate the probability density function of the data. Unsupervised learning covers:
- Clustering (e.g. mixture models, hierarchical clustering, K-means)
- Techniques for dimensionality reduction (e.g. singular value decomposition, matrix factorization, principal component analysis)
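As a sketch of the dimensionality-reduction side, principal component analysis can be computed from the SVD of centered data; the toy matrix `X` and helper `pca` below are made up for illustration.

```python
import numpy as np

# Toy data: 6 samples in 3-D that mostly vary along one direction.
X = np.array([[2.0, 0.1, 0.0],
              [4.0, -0.1, 0.1],
              [6.0, 0.2, -0.1],
              [8.0, 0.0, 0.0],
              [10.0, -0.2, 0.1],
              [12.0, 0.1, -0.1]])

def pca(X, k):
    """Project X onto its top-k principal components via SVD."""
    Xc = X - X.mean(axis=0)                          # center the data
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                             # k-dimensional representation

Z = pca(X, 1)   # 6 samples reduced from 3 features to 1
```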
K-means Clustering
K-means clustering takes a set of n-dimensional vectors, where k denotes the desired number of clusters. The K-means algorithm partitions the vectors into k clusters so that the sum, over all clusters, of the within-cluster point-to-centroid distances is minimized.
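The K-means procedure described above can be sketched in a few lines; the two toy blobs and the helper `kmeans` below are illustrative, not from the lecture.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal K-means: returns (centroids, labels)."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each point to its nearest centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

# Two well-separated blobs of 5 points each
X = np.vstack([np.zeros((5, 2)), np.ones((5, 2)) * 10])
centroids, labels = kmeans(X, 2)
```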
Supervised Pattern Recognition
The system has a known description of each class, or a set of labeled samples for each class. A classifier assigns an input feature vector to one of the designated classes, or to a reject class. Examples: decision tree, nearest class mean.
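A minimal sketch of the nearest-class-mean classifier named above; the toy training data and helper names are made up, and no reject class is implemented here.

```python
import numpy as np

def fit_class_means(X, y):
    """Compute the mean feature vector of each class (labels assumed 0..c-1)."""
    return np.array([X[y == c].mean(axis=0) for c in np.unique(y)])

def nearest_class_mean(x, means):
    """Assign x to the class whose mean is closest."""
    return int(np.linalg.norm(means - x, axis=1).argmin())

# Toy training set: two well-separated classes
X = np.array([[0.0, 0.0], [1.0, 0.0], [9.0, 9.0], [10.0, 10.0]])
y = np.array([0, 0, 1, 1])
means = fit_class_means(X, y)
pred = nearest_class_mean(np.array([0.5, 0.2]), means)   # → 0
```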
Classification vs. Clustering
In classification, learning is supervised: a set of patterns that has already been labeled, called the training set, drives the learning strategy. Learning can also be unsupervised; in that case there is no training set, and the classes themselves are formed from the patterns (e.g. grouping samples into category "A" and category "B" by clustering).
Classification
Classification groups items (objects/patterns/image regions/pixels) based on their similarity.
Pattern Recognition Given an input pattern, make a decision about the “category” or “class” of the pattern
Approaches
Statistical pattern recognition: the patterns are generated by a probabilistic system, and pattern classes are represented by statistical measures.
Structural (or syntactic) pattern recognition: the process is based on the structural interrelationships of features, and pattern classes are represented by formal structures (e.g. strings, automata, grammars).
Features
Features are the individual measurable properties being observed. A set of features used for pattern recognition is called a feature vector. The number of features is the dimensionality of the feature vector; n-dimensional feature vectors live in an n-dimensional feature space.
Features
(Figure: samples of Class 1 and Class 2 plotted in a height vs. weight feature space.)
Features
Feature extraction aims to create discriminative features for classification. Criteria for good features:
- Objects from the same class have similar feature values.
- Objects from different classes have different feature values.
(Figure: examples of good features vs. bad features.)
Features
Use as few features as possible, and choose features that differentiate the classes. Character recognition example:
- Good features: aspect ratio, presence of loops.
- Bad features: number of connected components, number of black pixels.
Classifier
A classifier identifies the class of a given pattern.
Classifier
Nearest-neighbor classification: compute distances between feature vectors, find the training pattern closest to the input pattern, and assign the input to the same class as that nearest neighbor.
Classifier
k-nearest-neighbor classifier: use the k nearest neighbors, instead of only the single nearest one, and assign the pattern to the majority class among them (e.g. Class 1 vs. Class 2).
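The k-nearest-neighbor rule above can be sketched as follows; the toy training set and the helper `knn_classify` are illustrative, not from the lecture.

```python
import numpy as np
from collections import Counter

def knn_classify(x, X_train, y_train, k=3):
    """Classify x by majority vote among its k nearest training patterns."""
    d = np.linalg.norm(X_train - x, axis=1)          # distances to all samples
    nearest = y_train[np.argsort(d)[:k]]             # labels of the k closest
    return Counter(nearest.tolist()).most_common(1)[0][0]

# Toy training set: Class 1 near the origin, Class 2 near (5, 5)
X_train = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]], dtype=float)
y_train = np.array([1, 1, 1, 2, 2, 2])
label = knn_classify(np.array([0.2, 0.3]), X_train, y_train, k=3)   # → 1
```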
Classifier
A classifier partitions the feature space into class-labeled regions R1, ..., Rc. The classification process determines which region a feature vector x belongs to. Borders between regions are called decision boundaries.
A Pattern Recognition System
Two modes: training (learning the classifier from labeled samples) and classification (testing on new data).
Classification Types
Simple Classification: Threshold
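Threshold classification can be sketched as a per-pixel rule; the threshold value and the toy grayscale image below are arbitrary choices for illustration.

```python
import numpy as np

def threshold_classify(image, t=128):
    """Label each pixel: 1 (object) if its intensity exceeds t, else 0 (background)."""
    return (image > t).astype(np.uint8)

gray = np.array([[10, 200],
                 [130, 90]])
mask = threshold_classify(gray)   # → [[0, 1], [1, 0]]
```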
Pixel and Object Classification
Pixel-wise classification considers the characteristics of individual pixels, using color, spectral information, texture, intensity, etc.
Object-wise classification considers the physical aspects of whole objects, using mean intensity, mean color, size, shape, etc. to describe the patterns.
Object-wise and Pixel-wise Classification
Pixel-wise Classification
1. Start from the unclassified pixels in the image.
2. Extract features (e.g. temporal changes, gray-level representation of texture, color).
3. Train the classifier.
4. Classify new input data with the trained classifier.
Pixel-wise Classification
Example: given a color image of 256x256 pixels, use 3 features per pixel (the red, green, and blue components) and 4 classes (stalk, stamen, background, and leaf).
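A sketch of pixel-wise classification for this setup, assigning each RGB pixel to the class with the nearest mean color; the class means below are invented for illustration and are not taken from the lecture's flower image.

```python
import numpy as np

# Hypothetical mean colors per class in RGB feature space (made-up values).
class_means = {
    "stalk":      np.array([40, 120, 40], dtype=float),
    "stamen":     np.array([230, 200, 40], dtype=float),
    "background": np.array([20, 20, 20], dtype=float),
    "leaf":       np.array([60, 160, 60], dtype=float),
}

def classify_pixel(rgb):
    """Assign an (R, G, B) pixel to the class with the nearest mean color."""
    rgb = np.asarray(rgb, dtype=float)
    return min(class_means, key=lambda c: np.linalg.norm(class_means[c] - rgb))

label = classify_pixel((25, 18, 22))   # → "background"
```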
Object-wise Classification
1. Segment the image into regions and label them; these regions are the patterns to be classified.
2. Extract (calculate) features for each pattern.
3. Train a classifier on class-labeled samples to determine a discriminant function in the feature space.
4. For new input data, determine the class using the discriminant function.
Object-wise Classification
Classification - Example
Sorting fish by species: determine whether a fish is a salmon or a sea bass.
Classification - Example
Features extracted from each fish include: lightness, position of the mouth, length, number and shape of fins, and width.
Classification - Example
Select Features
Select lightness as a possible feature for classification. The length of the fish is a poor feature for classification.
Classification - Example
The fish is represented by the feature vector x = [x1, x2], where x1 is the lightness and x2 is the width.
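As an illustration of classifying with the feature vector x = [lightness, width], a toy linear discriminant can be sketched; the weights and the sign convention below are arbitrary, not learned from real fish data.

```python
def classify_fish(x1_lightness, x2_width):
    """Toy linear discriminant g(x) = w1*x1 + w2*x2; weights are made up."""
    score = 0.8 * x1_lightness - 1.0 * x2_width
    # Arbitrary convention: positive score → salmon, otherwise sea bass.
    return "salmon" if score > 0 else "sea bass"

label = classify_fish(5.0, 2.0)   # → "salmon"
```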
Classification - Example
Classification - Example
Other features can be added to the classification process, provided they do not reduce performance (e.g. noisy features). A good decision boundary should provide optimal performance.
Classification - Example
Generalization? A decision boundary that perfectly separates the training samples may overfit; a simpler boundary often generalizes better to new data.
Template Matching
Template Matching
Matches are found by computing the normalized correlation (NC) between the template and each window of the image.
Pseudocode for Template Matching
function [location, d] = template_match(image, template)
  [nx, ny] = size(image)
  [tx, ty] = size(template)
  for i = 1:(nx - tx + 1)
    for j = 1:(ny - ty + 1)
      d(i, j) = distance(template, window(image, i, j, tx, ty))
    end
  end
  location = index_of_minimum(d)   % locate the minimum distance
end
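The pseudocode above can be realized in Python using normalized correlation (maximizing NC rather than minimizing a distance); this is a sketch under that assumption, not the lecture's implementation.

```python
import numpy as np

def normalized_correlation(a, b):
    """NC between two equal-size patches (zero-mean, unit-norm)."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom else 0.0

def match_template(image, template):
    """Slide the template over the image; return the top-left corner of the best match."""
    tx, ty = template.shape
    nx, ny = image.shape
    best, loc = -np.inf, (0, 0)
    for i in range(nx - tx + 1):
        for j in range(ny - ty + 1):
            nc = normalized_correlation(image[i:i+tx, j:j+ty], template)
            if nc > best:
                best, loc = nc, (i, j)
    return loc

# Toy image with the template pasted at row 3, column 4
image = np.zeros((8, 8))
image[3:5, 4:6] = np.array([[1, 2], [3, 4]])
template = np.array([[1, 2], [3, 4]], dtype=float)
loc = match_template(image, template)   # → (3, 4)
```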
Example of a "Nose Template"
t2 = i2straight(2,1).image(35:45, 27:37);   % MATLAB: crop rows 35-45, columns 27-37 as the template
References
M.N. Murty and V.S. Devi, Introduction to Pattern Recognition and Machine Learning, World Scientific Publishing, 2015.
R.O. Duda, P.E. Hart, and D.G. Stork, Pattern Classification, 2nd edition, John Wiley & Sons, 2001.