Pattern Recognition and Training

Pattern Recognition and Training

A pattern is a set of values of known size that describes an object. The general problem is to decide which class an unknown pattern belongs to. Approaches to the decision-making process:
1. Simple comparison
2. Common property
3. Clusters (using the distance measure |x1-u1| + |x2-u2| + … + |xn-un|)
4. A combination of 1, 2 and 3

06/07/62 240-373 Image Processing, Lecture #13
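The cluster-distance measure in approach 3 (the sum of absolute differences, also known as the city-block distance) can be sketched in a few lines of Python; the function name is illustrative, not from the course:

```python
def city_block_distance(x, u):
    """City-block distance |x1-u1| + |x2-u2| + ... + |xn-un| between a
    pattern vector x and a cluster center u."""
    return sum(abs(xi - ui) for xi, ui in zip(x, u))

# e.g. distance from pattern [3, 5, 7, 0] to center [3.33, 6.67, 6.67, 1.33]
d = city_block_distance([3, 5, 7, 0], [3.33, 6.67, 6.67, 1.33])   # ≈ 3.66
```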

Decision Functions

A decision function is defined by a weight vector w = (w1, w2, w3, …, wn, wn+1). If the augmented pattern vector is x = [x1, x2, x3, …, xn, 1]T, then:
- the unknown pattern is in group B if wTx > 0
- the unknown pattern is in group A if wTx <= 0

Example: with w = [1.5, -1.0, -3.5], the point (8, 4) is in group B because wTx = [1.5, -1.0, -3.5][8, 4, 1]T = 1.5×8 - 1.0×4 - 3.5 = 4.5, and 4.5 > 0. How about (4, 4)?
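A minimal sketch of this decision rule in Python (the helper name is mine, not from the course); it also works out the answer for (4, 4):

```python
def classify(w, point):
    """Augment the point with a constant 1 and classify by the sign of w·x."""
    x = list(point) + [1]
    wx = sum(wi * xi for wi, xi in zip(w, x))
    return 'B' if wx > 0 else 'A'

w = [1.5, -1.0, -3.5]
classify(w, (8, 4))   # 1.5*8 - 1.0*4 - 3.5 =  4.5 > 0  -> group B
classify(w, (4, 4))   # 1.5*4 - 1.0*4 - 3.5 = -1.5 <= 0 -> group A
```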

Decision Functions (Cont'd)

The number of groups can be more than two. With two decision functions w1 and w2, the decision table is:

  Result of w1Tx   Result of w2Tx   Implication
  < 0              < 0              no group
  < 0              > 0              group A
  > 0              < 0              group C
  > 0              > 0              group B

The decision function need not be a linear function.
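The decision table above can be sketched as follows; the weight vectors in the usage example are hypothetical, chosen only to exercise all four rows of the table (a result of exactly zero is treated as positive here):

```python
def classify_3way(w1, w2, point):
    """Map the signs of w1·x and w2·x to a group per the decision table."""
    x = list(point) + [1]
    r1 = sum(a * b for a, b in zip(w1, x))
    r2 = sum(a * b for a, b in zip(w2, x))
    if r1 < 0 and r2 < 0:
        return None          # no group
    if r1 < 0:
        return 'A'           # r1 < 0, r2 > 0
    if r2 < 0:
        return 'C'           # r1 > 0, r2 < 0
    return 'B'               # r1 > 0, r2 > 0

w1, w2 = [1, 0, 0], [0, 1, 0]          # hypothetical weight vectors
classify_3way(w1, w2, (1, 1))          # -> 'B'
classify_3way(w1, w2, (-1, -1))        # -> None
```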

Cluster Means

If a cluster consists of [3,4,8,2], [2,9,5,1] and [5,7,7,1], then its mean is [3.33, 6.67, 6.67, 1.33]. This represents the center of the cluster in four-dimensional space. The Euclidean distance from the center to a new pattern, say [3, 5, 7, 0], is

  sqrt((3-3.33)² + (5-6.67)² + (7-6.67)² + (0-1.33)²) = sqrt(4.78) ≈ 2.19
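Both computations can be sketched with the standard library alone (function names are illustrative):

```python
import math

def cluster_mean(vectors):
    """Component-wise mean of a list of equal-length pattern vectors."""
    return [sum(col) / len(vectors) for col in zip(*vectors)]

def euclidean_distance(x, u):
    """Euclidean distance between a pattern x and a cluster center u."""
    return math.sqrt(sum((xi - ui) ** 2 for xi, ui in zip(x, u)))

m = cluster_mean([[3, 4, 8, 2], [2, 9, 5, 1], [5, 7, 7, 1]])   # ≈ [3.33, 6.67, 6.67, 1.33]
d = euclidean_distance([3, 5, 7, 0], m)                        # sqrt(4.78) ≈ 2.19
```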

Automatic Clustering

Technique 1: K-means clustering
USE: To automatically find the best groupings and means of K clusters.
OPERATION:
1. The pattern vectors X1, …, Xn are given to the system, which must classify them into K groups as best it can (without knowing which vector belongs to which item).
2. Take the first K points as the initial estimate of the cluster means: M1 = X1, M2 = X2, …, MK = XK.
3. Allocate each pattern vector to the nearest group (minimum distance to the mean).
4. Calculate the new cluster centers. If they are the same as the old centers, STOP; otherwise go back to step 3.

K-means clustering example

Take six pattern vectors with K = 2 and initial means M1 = (2, 5.0) and M2 = (2, 5.5). Allocating each pattern vector to the nearest center gives:

  1  (2, 5.0)  group 1
  2  (2, 5.5)  group 2
  3  (6, 2.5)  group 1
  4  (7, 2.0)  group 1
  5  (7, 3.0)  group 1
  6  (3, 4.5)  group 1

The group means now become
  group 1: M1 = (5, 3.4)
  group 2: M2 = (2, 5.5)

This gives new groupings as follows:

  1  (2, 5.0)  group 2
  2  (2, 5.5)  group 2
  3  (6, 2.5)  group 1
  4  (7, 2.0)  group 1
  5  (7, 3.0)  group 1
  6  (3, 4.5)  group 2

and the group means become
  group 1: M1 = (6.67, 2.5)
  group 2: M2 = (2.33, 5.0)

The groupings now stay the same, so the process stops.
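The whole procedure can be sketched in plain Python and run on the six points above; this toy version assumes no cluster ever becomes empty, and it reproduces the final means (6.67, 2.5) and (2.33, 5.0):

```python
def kmeans(points, k):
    """Plain K-means: seed with the first k points, then iterate until the
    cluster means stop changing. Assumes no cluster becomes empty."""
    means = [list(p) for p in points[:k]]
    while True:
        groups = [[] for _ in range(k)]
        for p in points:
            d2 = [sum((a - b) ** 2 for a, b in zip(p, m)) for m in means]
            groups[d2.index(min(d2))].append(p)   # nearest mean wins
        new_means = [[sum(c) / len(g) for c in zip(*g)] for g in groups]
        if new_means == means:
            return means, groups
        means = new_means

pts = [(2, 5.0), (2, 5.5), (6, 2.5), (7, 2.0), (7, 3.0), (3, 4.5)]
means, groups = kmeans(pts, 2)   # means ≈ [(6.67, 2.5), (2.33, 5.0)]
```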

Optical Character Recognition

Technique: Isolating a character in an OCR document
USE: To create a window containing only one character onto an array containing a text image.
OPERATION:
1. Assume that the image is correctly oriented and the text is dark on a white background.
2. Calculate the row sums of the pixel gray-level values. High row sums indicate a space between rows of text.
3. Calculate the column sums of the pixel gray-level values. High column sums indicate a space between columns (i.e. between characters).
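A toy sketch of steps 2 and 3: in the 4x4 test image below (255 = white paper, 0 = ink), the second row and second column are blank, so they have the highest sums. Names and data are illustrative:

```python
def projection_sums(image):
    """Row and column sums of a 2-D gray-level image given as a list of rows."""
    row_sums = [sum(row) for row in image]
    col_sums = [sum(col) for col in zip(*image)]
    return row_sums, col_sums

# Toy image: 255 = white background, 0 = ink; row 1 and column 1 are blank.
img = [[  0, 255,   0, 255],
       [255, 255, 255, 255],
       [  0, 255, 255,   0],
       [  0, 255,   0, 255]]
rows, cols = projection_sums(img)   # the blank row/column have the maximum sum
```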

Technique: Creating the pattern vector (feature extraction)
USE: To create the pattern vector for a character so that it can be compared with the library.
OPERATION:
1. Assume that the character has been isolated.
2. Place a 4x4 grid over the image and count the number of "ink" pixels in each grid cell.
3. Divide each count by the total number of pixels in the cell.
4. Compare the resulting numbers with the library.
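Steps 2 and 3 can be sketched for a binary character image (1 = ink); this assumes the image dimensions divide evenly by the grid size. The comparison in step 4 could then be, for example, a nearest-neighbour search over the library's feature vectors:

```python
def grid_features(binary, grid=4):
    """Fraction of ink pixels (1s) in each cell of a grid x grid overlay,
    giving a grid*grid-element pattern vector (row-major order)."""
    ch, cw = len(binary) // grid, len(binary[0]) // grid   # cell height/width
    feats = []
    for gy in range(grid):
        for gx in range(grid):
            ink = sum(binary[y][x]
                      for y in range(gy * ch, (gy + 1) * ch)
                      for x in range(gx * cw, (gx + 1) * cw))
            feats.append(ink / (ch * cw))
    return feats

feats = grid_features([[1] * 8 for _ in range(8)])   # fully inked -> sixteen 1.0s
```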