COMP 527 Pattern Recognition
Lecture #1

Slide 1: Pattern Recognition
Why pattern recognition? To provide machines with perception and cognition capabilities so that they can interact independently with their environments.
Pattern recognition is a natural human ability: we recognize an object based on some description of it, and such a description is termed a pattern.

Slide 2: Patterns and Pattern Classes
Almost anything within the reach of our five senses can be chosen as a pattern:
– Sensory patterns: speech, odors, tastes
– Spatial patterns: characters, fingerprints, pictures
– Temporal patterns: waveforms, electrocardiograms, movies
– Conceptual recognition for abstract items
(We will limit ourselves to physical objects and events, NOT abstract entities such as concepts.)
A pattern class is a group of patterns that share certain common characteristics.

Slide 3: Pattern Recognition
Pattern recognition is the science of assigning an object or event of interest to one of several pre-specified categories (classes) based on certain measurements or observations.
Measurements are usually problem dependent, e.g. weight or height to separate basketball players from jockeys, or color to separate apples from oranges.
Feature vectors represent the measurements as coordinates of points in a vector space, called the feature space.
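
As a concrete illustration of feature vectors (not part of the original slides; the numbers are invented), the sketch below encodes the apples/oranges example with two hypothetical measurements, weight and a color score, using NumPy:

```python
import numpy as np

# Each object is described by a feature vector (weight in grams, color score in [0, 1]).
# The specific values are made up purely for illustration.
apple  = np.array([150.0, 0.85])   # heavier, red-ish
orange = np.array([130.0, 0.35])   # lighter, orange-ish

# A data set is then a matrix whose rows are points in the 2-D feature space.
X = np.stack([apple, orange])
print(X.shape)   # (2, 2): two patterns, two features each
```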

Slide 4: Pattern Recognition Systems (figure-only slide)

Slide 5: Statistical Pattern Recognition
Draws on the vast body of knowledge in statistics to provide a formal treatment of PR.
Observations are assumed to be generated by a state of nature:
– the data can be described by a statistical model,
– the model by a set of probability functions.
Strength: many powerful mathematical tools from the theory of probability and statistics.
Shortcoming: it is usually impossible to design (statistically) error-free systems.
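
A minimal sketch (an assumed example, not taken from the slides) of what "describing the data by probability functions" can look like: each class is modeled by a univariate Gaussian, and a new observation is assigned to the class with the largest posterior score.

```python
import numpy as np

def gaussian_pdf(x, mean, var):
    """Univariate Gaussian density, used as a class-conditional model p(x | class)."""
    return np.exp(-0.5 * (x - mean) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

# Hypothetical class models and priors (all numbers invented for illustration).
classes = {
    "jockey":     {"mean": 55.0, "var": 25.0, "prior": 0.5},   # weight in kg
    "basketball": {"mean": 95.0, "var": 64.0, "prior": 0.5},
}

def classify(x):
    # Bayes decision rule: pick the class maximizing prior * likelihood
    # (the evidence p(x) is the same for every class and can be ignored).
    scores = {c: m["prior"] * gaussian_pdf(x, m["mean"], m["var"])
              for c, m in classes.items()}
    return max(scores, key=scores.get)

print(classify(60.0))    # -> 'jockey'
print(classify(100.0))   # -> 'basketball'
```

Even with perfect models, the class densities overlap, which is why error-free decisions are usually impossible.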

Slide 6: Example: OCR (figure-only slide)

Slide 7: Major Steps (figure-only slide)

Slide 8: Raw Features: Example (figure-only slide)

Slide 9: Feature Extraction: OCR Example (figure-only slide)

Slide 10: Feature Extraction
Objective: to remove irrelevant information and extract distinctive, representative information about the objects. Desirable properties:
– discriminative
– invariant
– data compression => dimension reduction
It is not easy!
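
The slide does not name a specific technique, but principal component analysis (PCA) is one common way to realize the "data compression => dimension reduction" point; here is a minimal NumPy sketch (the data are random, for illustration only):

```python
import numpy as np

def pca_reduce(X, k):
    """Project the rows of X onto the k directions of largest variance (PCA via SVD)."""
    Xc = X - X.mean(axis=0)                  # center the data
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                     # (n_samples, k) reduced features

# Hypothetical 100 patterns with 20 raw features, compressed to 3 features.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
Z = pca_reduce(X, k=3)
print(Z.shape)   # (100, 3)
```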

Slide 11: Data Modeling
Goal: to build statistical models that describe the data.
Parametric models:
– single probability density function, e.g. Gaussian
– mixture density function, e.g. Gaussian mixture model (GMM)
– hidden Markov model (HMM), which can cope with data of different duration/length
Non-parametric models:
– k-nearest neighbor
– Parzen window
– neural network
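
As an illustration of one of the non-parametric models listed above, here is a bare-bones k-nearest-neighbor classifier in NumPy (the training data are invented toy points):

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training patterns (Euclidean distance)."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    labels, counts = np.unique(y_train[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Toy 2-D training set: class 0 clusters near (0, 0), class 1 near (5, 5).
X_train = np.array([[0.1, 0.2], [0.3, -0.1], [-0.2, 0.0],
                    [5.1, 4.9], [4.8, 5.2], [5.0, 5.0]])
y_train = np.array([0, 0, 0, 1, 1, 1])

print(knn_predict(X_train, y_train, np.array([0.0, 0.5])))  # -> 0
print(knn_predict(X_train, y_train, np.array([4.5, 5.5])))  # -> 1
```

Note that no densities are estimated and no parameters are fitted: the training data themselves are the model, which is what makes the method non-parametric.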

Slide 12: Training
Training data: the model is "learned" from a set of training data. Data collection should cover the various regions of the pattern space. But do you know the whole pattern space?
Training algorithm: can be iterative. When should training stop?
Generalization: models trained on a finite set of data should also generalize well to unseen data. How can we ensure that?
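
The slide leaves "when to stop training?" as an open question; one common practical answer (not the only one, and not prescribed by the slides) is to monitor error on a held-out validation set and stop when it no longer improves. A hypothetical sketch, where train_step and val_error stand in for whatever model is being trained:

```python
def train_with_validation(train_step, val_error, max_epochs=100, patience=5):
    """Iterative training that stops once validation error has not improved for `patience` epochs.

    train_step() runs one training pass; val_error() returns the current error on held-out data.
    Both are placeholders for an actual model's training and evaluation routines.
    """
    best_err, best_epoch = float("inf"), 0
    for epoch in range(max_epochs):
        train_step()
        err = val_error()
        if err < best_err:
            best_err, best_epoch = err, epoch
        elif epoch - best_epoch >= patience:
            break   # validation error stopped improving: assume overfitting is starting
    return best_err

# Toy usage with dummy callables: validation error decreases, then plateaus and rises.
errs = iter([1.0, 0.8, 0.6, 0.55, 0.55, 0.56, 0.57, 0.58, 0.59, 0.60])
print(train_with_validation(lambda: None, lambda: next(errs)))   # -> 0.55
```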

Slide 13: Supervised vs. Unsupervised
Supervised PR: representative patterns from each pattern class under consideration are available (supervised learning).
Unsupervised PR: a set of training patterns of unknown classification is given (unsupervised learning).
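
For contrast with the supervised examples earlier, the following sketch shows one standard unsupervised method, k-means clustering (k-means itself is not mentioned on the slide; it is used here only as an illustration): patterns are grouped without any class labels.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means: alternate assigning points to the nearest center and recomputing centers."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return labels, centers

# Unlabeled toy data drawn around two made-up cluster centers.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([0, 0], 0.5, size=(20, 2)),
               rng.normal([5, 5], 0.5, size=(20, 2))])
labels, centers = kmeans(X, k=2)
print(centers.round(1))   # should be close to the two generating means (in some order)
```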

Slide 14: Classification
Classification into N classes can be thought of as partitioning the feature space into N regions, as non-overlapping as possible, so that each region represents one of the N classes. This is often called the decision-theoretic approach.
Decision boundaries: the boundaries between the class regions in the feature space.
Discriminant functions: mathematical functions that describe the decision boundaries (see the sketch after this list).
Types of classifiers: depending on the functional form of the decision boundary, classifiers may be categorized into:
– linear classifiers
– quadratic classifiers
– piecewise classifiers
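
A minimal sketch of a two-class linear discriminant function of the kind listed above, g(x) = w·x + b, whose decision boundary is the set g(x) = 0 (the weights here are chosen by hand for illustration, not learned):

```python
import numpy as np

# Hand-picked weight vector and bias for a 2-D, two-class problem (illustrative only;
# in practice w and b would be learned from training data).
w = np.array([1.0, 1.0])
b = -5.0

def g(x):
    """Linear discriminant function; g(x) = 0 is the decision boundary (a line in 2-D)."""
    return w @ x + b

def classify(x):
    return 1 if g(x) > 0 else 0   # class 1 on one side of the boundary, class 0 on the other

print(classify(np.array([1.0, 1.0])))   # g = -3 -> class 0
print(classify(np.array([4.0, 3.0])))   # g = +2 -> class 1
```

A quadratic classifier would simply use a quadratic form of x in g(x), giving curved decision boundaries.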

Slide 15: Decision Boundary (figure-only slide)

Slide 16: Summary
Three main components: features, data model, and recognition algorithm.
– Make sure you find a good set of features to work with before you build data models.
– Data modeling requires knowledge of statistics and optimization.
– Recognition requires classifier design (i.e. the discriminant functions), search, and algorithm design.
– Evaluation involves testing on unseen test data, which must be large enough to claim statistical significance.
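
To make the evaluation point concrete, here is a small sketch (synthetic data, and a simple nearest-class-mean classifier chosen only for brevity): the data are split into training and test sets, and accuracy is reported only on patterns the model never saw.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic two-class data set (all values invented for illustration).
X = np.vstack([rng.normal([0, 0], 1.0, size=(100, 2)),
               rng.normal([4, 4], 1.0, size=(100, 2))])
y = np.array([0] * 100 + [1] * 100)

# Split into training data and unseen test data.
perm = rng.permutation(len(X))
train, test = perm[:150], perm[150:]

# "Train" a minimum-distance (nearest class mean) classifier on the training split only.
means = np.array([X[train][y[train] == c].mean(axis=0) for c in (0, 1)])

# Evaluate on the held-out test split.
pred = np.argmin(np.linalg.norm(X[test][:, None] - means[None], axis=2), axis=1)
accuracy = (pred == y[test]).mean()
print(f"test accuracy: {accuracy:.2f}")   # high, since the two classes are well separated
```

With only 50 held-out patterns, the accuracy estimate carries noticeable uncertainty, which is why the slide stresses that the test set must be large enough to support claims of statistical significance.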