Pattern Recognition with N-Tuple Systems
Simon Lucas, Computer Science Dept, Essex University

Overview
Standard binary n-tuple
Dealing with grey levels
– Continuous n-tuple
– Bit-plane decomposition
Dealing with sequences
– Scanning n-tuple
Future directions

N-Tuple Systems
Bledsoe and Browning (late 1950s)
Sample a pattern at m sets of n points per set
Use each sample set to form a memory address
Have an n-tuple "bank" for each pattern class
Simple training:
– Note address occurrences for each class
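
A minimal sketch of this training scheme; the tuple count, tuple size, image size, and function names are illustrative assumptions, not the original implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

N_TUPLES = 30     # m: number of sample sets (illustrative value)
TUPLE_SIZE = 8    # n: points per set, so each bank has 2**n addresses
N_PIXELS = 256    # e.g. a flattened 16x16 binary image

# Each tuple is a fixed random choice of n pixel positions.
tuples = [rng.choice(N_PIXELS, TUPLE_SIZE, replace=False)
          for _ in range(N_TUPLES)]

def addresses(image):
    """Pack the n sampled bits of each tuple into one memory address."""
    powers = 1 << np.arange(TUPLE_SIZE)
    return [int(image[t] @ powers) for t in tuples]

def train(images, labels, n_classes):
    """One bank per (class, tuple): note address occurrences for each class."""
    banks = np.zeros((n_classes, N_TUPLES, 2 ** TUPLE_SIZE), dtype=np.int32)
    for image, label in zip(images, labels):
        for i, a in enumerate(addresses(image)):
            banks[label, i, a] += 1
    return banks

def classify(banks, image):
    """1-bit scoring: count how many addresses each class has seen before."""
    addrs = addresses(image)
    scores = [sum(int(banks[c, i, a] > 0) for i, a in enumerate(addrs))
              for c in range(banks.shape[0])]
    return int(np.argmax(scores))
```

The scoring here is the 1-bit variant discussed on the next slide: a class gets a point for every address it has seen at least once during training.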

What to Store
Various options:
– 1-bit: whether the address occurred or not
– Frequency-weighted: count the number of occurrences
– Probabilistic: use the counts to estimate probabilities
The 1-bit version saturates
Usually better to use the probabilistic version (the ML estimate)

N-Tuple Architecture

Standard N-Tuple Features
Superfast training
– As fast as you can read the data in!
Superfast recognition (ditto)
Simple
Applicable to binary images

Grey-level

Threshold?

Niblack?

Beavis?

Continuous N-Tuple
Samples the grey-level image directly
Pre-compiles the samples into look-up tables (LUTs)
Fills each LUT entry with the absolute distance to the closest sampled point
Recognition speed is not compromised
BUT: slower to train
Memory problems…
Not probabilistic
– Sensitive to spurious training data!
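
A sketch of the LUT pre-compilation under stated assumptions: grey values quantised to a few levels per sampled pixel (so the tables stay small) and an L1 distance; neither detail is confirmed by the slide.

```python
import itertools
import numpy as np

LEVELS = 4        # grey quantisation per sampled pixel (assumed)
TUPLE_SIZE = 3    # n points per tuple (illustrative)

def build_lut(class_samples):
    """Pre-compile one tuple's LUT for one class.

    class_samples: (num_examples, TUPLE_SIZE) array of quantised grey
    values this tuple sampled from the class's training images.  Every
    possible address is filled with the absolute distance to the closest
    sampled point, so recognition stays a single lookup per tuple.
    """
    lut = np.empty((LEVELS,) * TUPLE_SIZE)
    for addr in itertools.product(range(LEVELS), repeat=TUPLE_SIZE):
        lut[addr] = np.abs(class_samples - np.array(addr)).sum(axis=1).min()
    return lut

# Classification sums the looked-up distances across tuples; the class
# with the smallest total wins.  Enumerating LEVELS**TUPLE_SIZE entries
# per table is where the training-time and memory costs come from.
```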

Continuous N-Tuple Results

Bit-Plane Decomposition
An alternative to the continuous n-tuple
Uses a combination of binary n-tuple classifiers
One for each bit plane (so 8 for 256 grey levels)
Good results reported
Some speed sacrificed
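
A minimal sketch of the decomposition step, assuming an 8-bit numpy image; the combination rule in the comment is an assumption:

```python
import numpy as np

def bit_planes(image):
    """Split an 8-bit grey-level image into 8 binary bit-plane images."""
    return [((image >> b) & 1).astype(np.uint8) for b in range(8)]

# Each plane feeds its own binary n-tuple classifier; the eight per-class
# scores are then combined (e.g. summed) to make the final decision.
```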

Scanning N-Tuple Classifier (SNT)
Introduced in 1995 (Lucas; Lucas and Amiri)
Since investigated by other research groups (IBM, KAIST, Kent, Athens)
In a recent study it was one of the best classifiers on the UNIPEN dataset
A simple modification of the n-gram model: an n-gram with gaps!

Scanning N-Tuple
Chain-code the image (e.g. the code 0232…)
Scan the sampler along the chain code
Estimate weights from the address occurrences
Classify by summing the weights for each class
Softmax function → posterior probability
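
A sketch of the scanning sampler; n, the gap size, and the 8-direction chain-code alphabet are illustrative assumptions (the slide does not specify them):

```python
def snt_addresses(chain, n=3, gap=2, alphabet=8):
    """Scan an n-point sampler along a chain-code sequence.

    chain: direction codes in range(alphabet), e.g. [0, 2, 3, 2, ...].
    Each step samples positions i, i+gap, ..., i+(n-1)*gap -- an n-gram
    with gaps -- and packs the sample into a single table address.
    """
    span = (n - 1) * gap
    for i in range(len(chain) - span):
        addr = 0
        for j in range(n):
            addr = addr * alphabet + chain[i + j * gap]
        yield addr

def activation(chain, weights):
    """Class score: sum that class's per-address weights over the scan."""
    return sum(weights[a] for a in snt_addresses(chain))
```

Training fills one weight table per class from the address counts (see the ML estimate slide below); classification picks the class whose summed weights give the highest softmax posterior.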

Recent Work
Extensive evaluation (IBM)
Directional and bit-plane decomposition for smaller tables (Kent)
Mixture models for table compression (IBM, KAIST)
Clustering (Athens)
Discriminative training (Essex)
– Better accuracy (why?)

Terminology
m – frequency counts
l – log-likelihood weights
a – class activation vector
y – output vector (posterior probabilities)
t – target vector

Likelihood Score for Class k given Sequence s
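
The equation itself was an image on the original slide; a plausible reconstruction in the Terminology slide's symbols, where u_i is the i-th address the scanner extracts from s:

a_k(s) = \sum_i l_k[u_i]

i.e. the activation for class k sums that class's log-likelihood weights over every sampled address.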

Softmax Function
Interpret the output as the posterior probability y_k
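
The slide presumably shows the standard softmax mapping the activation vector a to posteriors:

y_k = \frac{\exp(a_k)}{\sum_j \exp(a_j)}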

Maximum Likelihood Estimate
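
A plausible reconstruction: the standard ML estimate of the weights from the frequency counts m (any smoothing used on the original slide is not recoverable):

l_k[u] = \log \frac{m_k[u]}{\sum_{u'} m_k[u']}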

Discriminative Training
Maximise the probability of correct classification
Minimise the cross-entropy

Cross Entropy Error Term
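
Presumably the standard cross-entropy between the target vector t and the output vector y:

E = -\sum_k t_k \log y_k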

Weight Update Rule
The update depends on whether k is the true class
Apply the weight updates
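
The slide's equations are not recoverable, but for a softmax output with cross-entropy error the standard gradient is \partial E / \partial a_k = y_k - t_k, so a gradient-descent update with learning rate \eta adds, for each address u occurring in the training sequence:

l_k[u] \leftarrow l_k[u] + \eta \, (t_k - y_k)

with t_k = 1 when k is the true class and 0 otherwise, which matches the slide's case split.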

Cross-Entropy v. ML

Design Process

MNIST Results

Future Work
Improve accuracy further
– Mixture models
– Training-data deformation models
Better understanding of discriminative v. ML training
Sparse (e.g. trie-based) SNT
Optimal (all-)threshold version for colour / grey-level images

Why Mixture?
To tell A from B!

Why Opti-Thresh?

Global Mean Threshold

Optimally Thresholded Image

Conclusions
N-tuple classifiers offer fantastic speed
A high degree of design skill is needed to make them work well
They compete with much more complex systems
Interesting future work to be done!

Further Reading
Continuous n-tuple:
– Simon M. Lucas, "Face Recognition with the Continuous N-Tuple Classifier," Proceedings of the British Machine Vision Conference (1997).
Scanning n-tuple:
– Simon M. Lucas and A. Amiri, "Statistical Syntactic Methods for High Performance OCR," IEE Proceedings on Vision, Image and Signal Processing, vol. 143 (1996).
– Simon M. Lucas, "Discriminative Training of the Scanning N-Tuple Classifier," International Workshop on Artificial Neural Networks (2003).
Plus many more references in those papers.
Search for "n-tuple" and "scanning n-tuple".