IMAGE PROCESSING RECOGNITION AND CLASSIFICATION


By DR. FERDA ERNAWAN, Faculty of Computer Systems & Software Engineering, ferda@ump.edu.my

Contents
This lecture will cover:
- Introduction to patterns and recognition
- Classification vs. clustering
- Features
- Classification types
- Template matching
Learning outcome: to expose students to pattern recognition applications.

Pattern
Pattern recognition covers the representation of patterns, and their clustering and classification (Murty and Devi, 2015).

Recognition
Recognition is the identification of a pattern. It takes two forms:
- Clustering (unsupervised: categories are learned): find patterns in the data and group them into clusters.
- Classification (supervised: categories are known): labels are provided for the training data; after training and testing, new data can be classified or regressed.

Unsupervised Learning
The system has to learn the classifier from unlabeled data; it should estimate the probability density function of the data. Unsupervised learning approaches cover:
- Clustering (e.g. mixture models, hierarchical clustering, K-means)
- Techniques for dimensionality reduction (e.g. singular value decomposition, matrix factorization, principal component analysis)
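As an illustration of the dimensionality reduction techniques listed above, principal component analysis can be sketched with NumPy's SVD. The helper name `pca` and the toy data are illustrative, not from the lecture:

```python
import numpy as np

def pca(X, n_components):
    """Project rows of X onto the top principal components via SVD."""
    X_centered = X - X.mean(axis=0)          # PCA requires centered data
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    return X_centered @ Vt[:n_components].T  # scores in the reduced space

# Toy example: 5 samples with 3 features, reduced to 2 dimensions
X = np.array([[2.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 2.0, 1.0],
              [1.5, 0.5, 0.5],
              [0.5, 1.5, 1.5]])
Z = pca(X, 2)
print(Z.shape)  # → (5, 2)
```

The rows of Vt are the principal directions; keeping the first n_components of them retains the directions of greatest variance.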

K-means Clustering
K-means clustering takes a set of n-dimensional vectors; k denotes the number of desired clusters. The K-means algorithm partitions the vectors into k clusters such that the sum, over all clusters, of the within-cluster sums of point-to-cluster-centroid distances is minimized.
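The K-means algorithm can be sketched in Python as below. The toy data, iteration cap, and initialization with the first k points are illustrative simplifications; real implementations typically use random or k-means++ initialization:

```python
import numpy as np

def kmeans(X, k, n_iters=100):
    """Minimal K-means: alternate nearest-centroid assignment and centroid update."""
    centroids = X[:k].copy()  # simplistic initialization with the first k points
    for _ in range(n_iters):
        # Assign each point to its nearest centroid
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each centroid as the mean of its assigned points
        new_centroids = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                  else centroids[j] for j in range(k)])
        if np.allclose(new_centroids, centroids):  # converged
            break
        centroids = new_centroids
    return labels, centroids

# Two well-separated 2-D blobs; k = 2 should recover them
X = np.array([[0.0, 0.0], [5.0, 5.0], [0.1, 0.2],
              [0.2, 0.1], [5.1, 5.2], [4.9, 5.1]])
labels, centroids = kmeans(X, k=2)
print(labels)  # → [0 1 0 0 1 1]
```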

Supervised Pattern Recognition
The system has important prior knowledge: a known description of each class, or a set of labeled samples for each class. A classifier assigns each input feature vector to one of a set of designated classes, or to a reject class. Examples: decision tree, nearest class mean.
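A minimal sketch of the nearest-class-mean classifier mentioned above; the class labels and toy data are made up for illustration:

```python
import numpy as np

class NearestClassMean:
    """Nearest-class-mean classifier: each class is represented by the mean
    of its training samples; an input goes to the class with the closest mean."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.means_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, X):
        # Distance from every input to every class mean
        dists = np.linalg.norm(X[:, None, :] - self.means_[None, :, :], axis=2)
        return self.classes_[dists.argmin(axis=1)]

# Toy 2-D training data for two classes
X = np.array([[1.0, 1.0], [1.2, 0.8], [4.0, 4.0], [3.8, 4.2]])
y = np.array([0, 0, 1, 1])
clf = NearestClassMean().fit(X, y)
preds = clf.predict(np.array([[1.1, 0.9], [4.1, 3.9]]))
print(preds)  # → [0 1]
```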

Classification vs. Clustering
In classification, a set of patterns that has already been classified is available; this set is called the training set, and the learning strategy is called supervised learning.
Learning can also be unsupervised: in this case there is no training set, and the system forms the classes itself based on regularities among the patterns.
(Figure: patterns labeled Category "A" and Category "B" illustrate classification; unlabeled groupings illustrate clustering.)

Classification
Classification groups items (objects, patterns, image regions, pixels) based on their similarity.

Pattern Recognition Given an input pattern, make a decision about the “category” or “class” of the pattern

Approaches
Statistical pattern recognition: the patterns are generated by a probabilistic system, and pattern classes are represented by statistical measures.
Structural (or syntactic) pattern recognition: based on the structural interrelationships of features; pattern classes are represented by means of formal structures (e.g. strings, automata, grammars).

Features
Features are the individual measurable properties being observed. A set of features used for pattern recognition is called a feature vector. The number of features used is called the dimensionality of the feature vector; n-dimensional feature vectors live in an n-dimensional feature space.

Features
(Figure: scatter plot of height vs. weight features for samples of Class 1 and Class 2.)

Features
Feature extraction aims to create discriminative features for classification. Criteria for good features:
- Objects from the same class have similar feature values.
- Objects from different classes have different feature values.
(Figure: well-separated feature clusters illustrate good features; overlapping clusters illustrate bad features.)

Features
Use as few features as possible, and choose features that differentiate the classes. Character recognition example:
- Good features: aspect ratio, presence of loops.
- Bad features: number of connected components, number of black pixels.

Classifier
A classifier identifies the class of a given pattern.

Classifier
Nearest neighbor classification: among the training feature vectors, find the one closest (by distance) to the input pattern, and assign the input to the same class as that closest vector.

Classifier
K-nearest-neighbor classifier: use the k nearest neighbors, instead of only one, to classify a pattern, e.g. by majority vote among the neighbors' classes.
(Figure: a query point surrounded by Class 1 and Class 2 training points.)
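A minimal k-NN sketch in Python, assuming Euclidean distance and majority voting; the toy data are illustrative:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points."""
    dists = np.linalg.norm(X_train - x, axis=1)  # Euclidean distance to each sample
    nearest = np.argsort(dists)[:k]              # indices of the k closest points
    votes = Counter(y_train[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Toy data: two 2-D classes
X_train = np.array([[0.0, 0.0], [0.2, 0.1], [0.1, 0.3],
                    [3.0, 3.0], [3.1, 2.9], [2.9, 3.2]])
y_train = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X_train, y_train, np.array([0.1, 0.1]), k=3))  # → 0
print(knn_predict(X_train, y_train, np.array([3.0, 3.1]), k=3))  # → 1
```

With k = 1 this reduces to the nearest neighbor rule from the previous slide; larger k makes the decision more robust to noisy samples.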

Classifier
A classifier partitions the feature space into class-labeled regions R1, ..., Rk. The classification process determines the region to which a feature vector x belongs. Borders between regions are called decision boundaries.

A Pattern Recognition System
A pattern recognition system operates in two modes: training (learning) and classification (testing).

Classification Types

Simple Classification: Threshold
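A minimal sketch of threshold-based classification, assuming a single gray-level threshold; the value 128 and the tiny sample image are illustrative:

```python
import numpy as np

def threshold_classify(image, threshold=128):
    """Return a binary mask: 1 where the pixel value exceeds the threshold."""
    return (image > threshold).astype(np.uint8)

# Tiny 2x2 grayscale image: dark and bright pixels
image = np.array([[10, 200],
                  [130, 90]], dtype=np.uint8)
mask = threshold_classify(image)
print(mask)  # → [[0 1]
             #    [1 0]]
```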

Pixel and Object Classification
Pixel-wise classification considers characteristics of individual pixels: color, spectral information, texture, intensity, etc.
Object-wise classification considers physical aspects of segmented objects: mean intensity, mean color, size, shape, etc. to describe patterns.

Object-wise and Pixel-wise Classification

Pixel-wise Classification
Pixels in the image are not yet classified.
Extract features for each pixel (e.g. temporal changes, gray-level representation of texture, color).
Train the classifier.
Classify new input data with the trained classifier.

Pixel-wise Classification
Example: a color image of 256x256 pixels, with 3 features per pixel (the blue, green, and red components) and 4 classes (stalk, stamen, background, and leaf).
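A sketch of this pixel-wise setup, assigning each pixel to the class with the nearest mean color. The class means, their ordering, and the tiny sample image are made-up illustrations, not values from the lecture:

```python
import numpy as np

# Hypothetical per-class mean RGB colors (illustrative values only)
class_means = np.array([[60, 120, 40],    # class 0: leaf (greenish)
                        [200, 180, 60],   # class 1: stamen (yellowish)
                        [120, 60, 80],    # class 2: stalk
                        [20, 20, 30]])    # class 3: background (dark)

def classify_pixels(image, means):
    """image: (H, W, 3) array; returns an (H, W) array of class labels."""
    # Distance from every pixel's color to every class mean
    dists = np.linalg.norm(image[:, :, None, :].astype(float)
                           - means[None, None, :, :], axis=3)
    return dists.argmin(axis=2)

# Tiny 2x2 "image" with one pixel near each class mean
image = np.array([[[55, 125, 45], [25, 15, 35]],
                  [[195, 185, 55], [115, 65, 75]]], dtype=np.uint8)
labels = classify_pixels(image, class_means)
print(labels)  # → [[0 3]
               #    [1 2]]
```

In practice the class means would be estimated from labeled training pixels, as described in the previous slide.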

Object-wise Classification
The image is segmented into regions, and the regions are labeled; these regions are the patterns to be classified.
Extract (calculate) features for each pattern.
Train a classifier with class-labeled samples to determine discriminant functions in the feature space.
For new input data, determine the class using the discriminant functions.

Object-wise Classification

Classification - Example
Sorting fish by species: determine whether each fish is a salmon or a sea bass.

Classification - Example
Features extracted from each fish include:
- Lightness
- Position of the mouth
- Length
- Number and shape of fins
- Width

Classification - Example

Select Features
Lightness is selected as a possible feature for classification; the length of the fish turns out to be a poor feature for discriminating the two species.

Classification - Example
The feature vector for a fish is x = [x1, x2], where x1 is the lightness and x2 is the width.

Classification - Example

Classification - Example
Other features can be added to the classification process, provided they do not reduce performance (noisy features, for example, can hurt). A good decision boundary should provide optimal performance.

Classification - Example
Generalization: a highly complex decision boundary may separate the training samples perfectly yet perform poorly on new fish; the goal is a boundary that generalizes to unseen data.

Template Matching

Template Matching
Matches are found by computing the normalized correlation (NC) between the template and each image window.

Pseudocode for Template Matching

function [location, d] = template_match(image, template)
  [nx, ny] = size(image);
  [tx, ty] = size(template);
  for i = tx:(nx-tx)
    for j = ty:(ny-ty)
      % distance between the template and the image window at (i, j)
      d(i, j) = distance(template, window(image, i, j));
    end
  end
  location = index_of_minimum(d);  % locate the minimum distance
end
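A runnable Python counterpart of the pseudocode, using normalized correlation as the similarity measure; the helper name `match_template` and the toy image are illustrative, and note that it maximizes correlation rather than minimizing a distance:

```python
import numpy as np

def match_template(image, template):
    """Slide the template over the image and return the top-left location of
    the window with the highest normalized correlation, plus that score."""
    tx, ty = template.shape
    nx, ny = image.shape
    t = template - template.mean()  # zero-mean template
    best_score, best_loc = -np.inf, (0, 0)
    for i in range(nx - tx + 1):
        for j in range(ny - ty + 1):
            w = image[i:i+tx, j:j+ty] - image[i:i+tx, j:j+ty].mean()
            denom = np.sqrt((w ** 2).sum() * (t ** 2).sum())
            score = (w * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_loc = score, (i, j)
    return best_loc, best_score

# Embed a known 2x2 patch in a blank 8x8 image, then find it
image = np.zeros((8, 8))
template = np.array([[1.0, 2.0], [3.0, 4.0]])
image[3:5, 4:6] = template
loc, score = match_template(image, template)
print(loc)  # → (3, 4)
```

A perfect match yields a correlation of 1.0; the mean subtraction and normalization make the score invariant to brightness and contrast changes in the window.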

Example of a "Nose Template"
t2 = i2straight(2,1).image(35:45,27:37);
(MATLAB indexing: this extracts an 11x11 nose region, rows 35-45 and columns 27-37, from a stored face image.)

References
M.N. Murty and V.S. Devi. Introduction to Pattern Recognition and Machine Learning. World Scientific Publishing, 2015. ISBN: 978-9814335454.
R.O. Duda, P.E. Hart, and D.G. Stork. Pattern Classification. Second Edition, John Wiley & Sons, 2001. ISBN: 0-471-05669-3.