1
VC 18/19 – TP14: Pattern Recognition
Miguel Tavares Coimbra
Master's in Computer Science / Integrated Master's in Network and Computer Systems Engineering
2
Outline
Introduction to Pattern Recognition
Statistical Pattern Recognition
Classifiers
3
Topic: Introduction to Pattern Recognition
4
This is a horse http://www.flickr.com/photos/kimbar/2027234083/
5
This is a horse http://www.flickr.com/photos/genewolf/2031802050/
6
This is a... Horse?
7
Decisions
I can manipulate images. I want to make decisions!
Classify / identify features. Recognize patterns.
8
One definition
Pattern recognition: "the act of taking in raw data and taking an action based on the category of the data" (Wikipedia).
How do I do this so well? How can I make machines do this?
9
The problem
Do you ‘see’ a horse? Compare this with what a computer sees.
10
Mathematics
We only deal with numbers.
How do we represent knowledge? How do we represent visual features? How do we classify them?
A very complex problem!! Let’s break it into smaller ones...
11
Typical PR system
Sensor: gathers the observations to be classified or described.
Feature Extraction: computes numeric or symbolic information from the observations.
Classifier: does the actual job of classifying or describing observations, relying on the extracted features.
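A minimal sketch of this three-stage pipeline in Python; the synthetic image, the toy mean/variance features and the brightness rule are illustrative assumptions, not part of the slides:

```python
import numpy as np

def sensor():
    """Gather one observation: here, a synthetic 8x8 grayscale 'image'."""
    return np.random.randint(0, 256, size=(8, 8))

def extract_features(image):
    """Compute numeric information from the observation (toy features: mean and variance of intensities)."""
    return np.array([image.mean(), image.var()])

def classify(features, threshold=128.0):
    """Describe the observation based on the extracted features (toy rule on mean intensity)."""
    return "bright" if features[0] > threshold else "dark"

observation = sensor()
print(classify(extract_features(observation)))
```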
12
Sensor
In our specific case: the image acquisition mechanism.
Let’s assume we don’t control it.
One observation = one image. Video = multiple observations.
13
Feature Extraction
What exactly are features? Colour, texture, shape, etc. Animal with 4 legs. Horse. Horse jumping.
These vary a lot! Some imply some sort of ‘recognition’, e.g. how do I know the horse is jumping?
14
Broad classification of features
Low-level: colour, texture.
Middle-level: object with head and four legs; object moving up; horse.
High-level: horse jumping; horse competition.
15
Low-level features
Objective: they directly reflect specific image and video features.
Colour, texture, shape, motion, etc.
16
Middle-level features
Some degree of subjectivity: they are typically one solution to a problem with multiple possible solutions.
Examples: segmentation, optical flow, identification, etc.
17
High-level features
Semantic: interpretation, knowledge, context.
Examples: This person suffers from epilepsy. The virus attacks the cell with some degree of intelligence. This person is running from that one.
How do humans do this so well?
18
The semantic gap
A fundamental problem of current research!
Low-level: colour, texture, shape, …
High-level: interpretation, decision, understanding, …
Now what? How do I cross this bridge?
19
Features & Decisions
Low-Level Features → Middle-Level Features → High-Level Features → Decision.
There are various possible solutions at each step, but only one final decision. How do I decide?
20
Classification
Middle-level features (horse, upward motion, rider) → Classifier → High-level features (horse jumping, competition).
M inputs, N outputs.
21
Layers of classification
First layer of classification: low-level features (brown, four legs, head, ...) → middle-level features (horse, upward motion, rider).
Second layer of classification: middle-level features → high-level features (horse jumping, competition).
22
Classifiers
How do I map my M inputs to my N outputs?
Mathematical tools: distance-based classifiers, rule-based classifiers, neural networks, support vector machines, ...
23
Types of PR methods
Statistical pattern recognition: based on statistical characterizations of patterns, assuming that the patterns are generated by a probabilistic system.
Syntactical (or structural) pattern recognition: based on the structural interrelationships of features.
24
Topic: Statistical Pattern Recognition
25
Is Porto in Portugal?
26
Porto is in Portugal
I want to make decisions: is Porto in Portugal?
I know certain things: a world map including cities and countries.
I can make this decision: Porto is in Portugal!
I had enough a priori knowledge to make this decision.
27
What if I don’t have a map?
I still want to make this decision.
I observe: Amarante has coordinates (x1, y1) and is in Portugal. Viseu has coordinates (x2, y2) and is in Portugal. Vigo has coordinates (x3, y3) and is in Spain.
I classify: Porto is close to Amarante and Viseu, so Porto is in Portugal.
What if I try to classify Valença?
28
Statistical PR
I used statistics to make a decision: I can make decisions even when I don’t have full a priori knowledge of the whole process, and I can make mistakes.
What pattern? How did I recognize this pattern?
I learned from previous observations where I knew the classification result, and then classified a new observation.
29
Back to the Features
Feature Fi with N values: Fi = {fi1, fi2, ..., fiN}.
Feature vector F with M features: F = {F1, F2, ..., FM}.
Naming conventions: elements of a feature vector are called coefficients; features may have one or more coefficients; feature vectors may have one or more features.
30
Back to our Porto example
I’ve classified that Porto is in Portugal. What feature did I use? Spatial location.
Let’s get more formal: I’ve defined a feature vector F with one feature F1, which has two coefficients f1x, f1y.
31
Feature Space
Feature vector: two total coefficients; it can be seen as a feature ‘space’ with two orthogonal axes.
Feature space: a hyper-space with N dimensions, where N is the total number of coefficients of my feature vector.
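A minimal illustration of this two-coefficient feature space in Python; the city coordinates are invented values, purely for illustration:

```python
import numpy as np

# Feature vector F with one feature F1 = (f1x, f1y): the city's spatial location.
# Each city is a point in a 2-D feature space (coordinates are invented).
cities = {
    "Amarante": np.array([41.3, -8.1]),   # known: Portugal
    "Viseu":    np.array([40.7, -7.9]),   # known: Portugal
    "Vigo":     np.array([42.2, -8.7]),   # known: Spain
}
porto = np.array([41.1, -8.6])            # new observation to classify

for name, coords in cities.items():
    print(name, np.linalg.norm(porto - coords))  # distances in the feature space
```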
32
A Priori Knowledge
I have a precise model of my feature space based on a priori knowledge: I know where the border is. City is in Spain if F1Y > 23. Porto is in Portugal!
Great models = great classifications. But F1Y(London) = 100, so London is in Spain (??)
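A sketch of this hand-made rule in Python; the threshold 23 and F1Y(London) = 100 come from the slide, while the Porto value and the function name are illustrative assumptions:

```python
def classify_city(f1y):
    """A priori model from the slide: a city is in Spain if its F1Y coefficient exceeds 23."""
    return "Spain" if f1y > 23 else "Portugal"

print(classify_city(20))    # e.g. Porto (invented F1Y value) -> Portugal
print(classify_city(100))   # F1Y(London) = 100 -> Spain (??), the model breaks down
```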
33
What if I don’t have a model?
I need to learn from observations: either derive a model or classify directly.
Training stage: learn the model parameters. Classification: apply the ‘learned’ model.
34
Classes
In our example, cities can belong to Portugal or Spain: I have two classes of cities.
A class represents a sub-space of my feature space.
35
Classifiers
A classifier C maps a class into the feature space.
Various types of classifiers: nearest-neighbours, Bayesian, soft-computing machines, etc.
36
Topic: Classifiers
37
Distance to Mean
I can represent a class by its mean feature vector.
To classify a new object, I choose the class with the closest mean feature vector.
Different distance measures can be used, e.g. the Euclidean distance.
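A minimal sketch of a distance-to-mean classifier in Python; the training points and labels are invented for illustration:

```python
import numpy as np

def fit_class_means(X, y):
    """Represent each class by the mean of its training feature vectors."""
    return {label: X[y == label].mean(axis=0) for label in np.unique(y)}

def classify(x, class_means):
    """Choose the class whose mean feature vector is closest (Euclidean distance)."""
    return min(class_means, key=lambda label: np.linalg.norm(x - class_means[label]))

X = np.array([[41.3, -8.1], [40.7, -7.9], [42.2, -8.7], [42.9, -8.5]])  # invented coordinates
y = np.array(["Portugal", "Portugal", "Spain", "Spain"])
print(classify(np.array([41.1, -8.6]), fit_class_means(X, y)))
```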
38
Possible Distance Measures
Euclidean (L2) distance: d(a, b) = sqrt( Σᵢ (aᵢ − bᵢ)² )
L1 or Taxicab distance: d(a, b) = Σᵢ |aᵢ − bᵢ|
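The two measures side by side in Python, on arbitrary example vectors:

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 0.0, 3.0])

l2 = np.sqrt(np.sum((a - b) ** 2))   # Euclidean (L2) distance
l1 = np.sum(np.abs(a - b))           # L1 / taxicab distance
print(l1, l2)                        # 5.0, ~3.61
```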
39
Gaussian Distribution
Defined by two parameters: mean μ and variance σ².
A great approximation to the distribution of many phenomena (Central Limit Theorem).
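A small sketch of the univariate Gaussian density in Python; the μ and σ² values are arbitrary:

```python
import numpy as np

def gaussian_pdf(x, mu, sigma2):
    """Univariate Gaussian density: exp(-(x - mu)^2 / (2*sigma2)) / sqrt(2*pi*sigma2)."""
    return np.exp(-(x - mu) ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)

print(gaussian_pdf(0.0, mu=0.0, sigma2=1.0))   # peak of the standard normal, ~0.3989
```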
40
Multivariate Distribution
For N dimensions:
Mean feature vector: μ = E[x]
Covariance matrix: Σ = E[(x − μ)(x − μ)ᵀ]
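A short sketch of estimating both quantities from data with NumPy; the 2-D sample data are invented:

```python
import numpy as np

# Toy data: 100 observations of a 2-D feature vector (invented values).
rng = np.random.default_rng(0)
X = rng.normal(loc=[1.0, -2.0], scale=[1.0, 0.5], size=(100, 2))

mu = X.mean(axis=0)            # mean feature vector
cov = np.cov(X, rowvar=False)  # covariance matrix (features in columns)
print(mu, cov, sep="\n")
```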
41
Mahalanobis Distance
Based on the covariance of the coefficients: d(x, μ) = sqrt( (x − μ)ᵀ Σ⁻¹ (x − μ) )
Superior to the Euclidean distance when coefficients are correlated or have different scales, since it takes the covariance into account.
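A minimal sketch in Python; the sample data used to estimate μ and Σ are invented:

```python
import numpy as np

def mahalanobis(x, mu, cov):
    """Mahalanobis distance: sqrt((x - mu)^T  cov^-1  (x - mu))."""
    diff = x - mu
    return np.sqrt(diff @ np.linalg.inv(cov) @ diff)

# Invented 2-D data: estimate the class mean and covariance from samples.
rng = np.random.default_rng(1)
X = rng.normal(loc=[0.0, 0.0], scale=[1.0, 3.0], size=(200, 2))
print(mahalanobis(np.array([1.0, 1.0]), X.mean(axis=0), np.cov(X, rowvar=False)))
```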
42
K-Nearest Neighbours
Algorithm: choose the K closest neighbours to a new observation; classify the new object based on the class of these K objects.
Characteristics: assumes no model; does not scale very well...
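A minimal k-NN sketch in Python; the training coordinates and labels are invented, continuing the Portugal/Spain example:

```python
import numpy as np
from collections import Counter

def knn_classify(x, X_train, y_train, k=3):
    """Majority vote among the k training samples closest to x (Euclidean distance)."""
    nearest = np.argsort(np.linalg.norm(X_train - x, axis=1))[:k]
    return Counter(y_train[nearest]).most_common(1)[0][0]

# Invented 2-D training data (spatial location feature).
X_train = np.array([[41.3, -8.1], [40.7, -7.9], [38.7, -9.1], [42.2, -8.7], [42.9, -8.5]])
y_train = np.array(["Portugal", "Portugal", "Portugal", "Spain", "Spain"])
print(knn_classify(np.array([41.1, -8.6]), X_train, y_train, k=3))
```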
43
Resources
Gonzalez & Woods, Digital Image Processing, 3rd Ed., Chapter 12.
Andrew Moore, Statistical Data Mining Tutorials.