Christopher M. Bishop
Object Recognition: A Statistical Learning Perspective
Microsoft Research, Cambridge
Sicily, 2003



Object Recognition Workshop, Sicily – Christopher M. Bishop

Question 1

"Will visual category recognition be solved by an architecture based on classification of feature vectors using advanced learning algorithms?"

No:
– large number of classes
– many degrees of freedom of variability (geometric, photometric, ...)
– transformations are highly non-linear in the pixel values (objects live on non-linear manifolds)
– occlusion
– expensive to provide detailed labelling of training data

Question 2

"If we want to achieve a human-like capacity to recognise 1000s of visual categories, learning from a few examples, what will move us forward most significantly?"

Large training sets:
– algorithms which can effectively utilize lots of unlabelled/partially labelled data

But: should the models be generative or discriminative?

Generative vs. Discriminative Models

Generative approach:
– separately model the class-conditional densities p(x|C_k) and the priors p(C_k)
– then evaluate the posterior probabilities p(C_k|x) using Bayes' theorem

Discriminative approaches:
1. model the posterior probabilities p(C_k|x) directly
2. just predict the class label (no inference stage)
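As a minimal sketch of the generative route, the following toy 1-D example fits p(x|C_k) and p(C_k) and then evaluates p(C_1|x) with Bayes' theorem. All data, class means, and priors here are illustrative assumptions, not from the talk.

```python
import math
import random

# Hypothetical 1-D, two-class toy data (all numbers are illustrative).
random.seed(0)
class0 = [random.gauss(-1.0, 1.0) for _ in range(100)]
class1 = [random.gauss(+2.0, 1.0) for _ in range(300)]

def fit_gaussian(xs):
    """Maximum-likelihood mean and variance of a 1-D Gaussian."""
    mu = sum(xs) / len(xs)
    var = sum((x - mu) ** 2 for x in xs) / len(xs)
    return mu, var

def gauss_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Generative step 1: separately model class-conditional densities and priors.
mu0, var0 = fit_gaussian(class0)
mu1, var1 = fit_gaussian(class1)
prior1 = len(class1) / (len(class0) + len(class1))
prior0 = 1.0 - prior1

# Generative step 2: evaluate posteriors via Bayes' theorem.
def posterior_c1(x):
    """p(C_1|x) = p(x|C_1) p(C_1) / p(x), with p(x) from the sum rule."""
    joint0 = gauss_pdf(x, mu0, var0) * prior0
    joint1 = gauss_pdf(x, mu1, var1) * prior1
    return joint1 / (joint0 + joint1)

print(round(posterior_c1(-1.0), 3))  # small: x is well inside class 0
print(round(posterior_c1(2.0), 3))   # large: x is well inside class 1
```

A discriminative model would instead parameterise p(C_1|x) directly (for example with logistic regression) and never commit to a model of p(x) at all.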

Generative vs. Discriminative

[Figure slide: image not preserved in this transcript.]

Advantages of Knowing Posterior Probabilities

– No re-training if the loss matrix changes: inference is hard, but the decision stage is easy
– Reject option: don't make a decision when the largest posterior probability is below a threshold
– Compensating for skewed class priors
– Combining models, e.g. for independent measurements x_a and x_b: p(C_k|x_a, x_b) ∝ p(C_k|x_a) p(C_k|x_b) / p(C_k)
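Two of these advantages, the reject option and the combination of independent measurements, can be sketched directly from posteriors. The class names, threshold, and posterior values below are hypothetical, chosen only to illustrate the two rules.

```python
def decide(posteriors, reject_threshold=0.8):
    """Reject option: abstain when the largest posterior is below a threshold."""
    best = max(posteriors, key=posteriors.get)
    if posteriors[best] < reject_threshold:
        return "reject"
    return best

def combine_independent(post_a, post_b, priors):
    """Combine two models with independent inputs x_a, x_b:
    p(C_k|x_a, x_b) ∝ p(C_k|x_a) p(C_k|x_b) / p(C_k), then normalise."""
    unnorm = {k: post_a[k] * post_b[k] / priors[k] for k in priors}
    z = sum(unnorm.values())
    return {k: v / z for k, v in unnorm.items()}

priors = {"face": 0.5, "background": 0.5}
post_a = {"face": 0.7, "background": 0.3}  # e.g. from one measurement
post_b = {"face": 0.9, "background": 0.1}  # e.g. from an independent one

combined = combine_independent(post_a, post_b, priors)
print(decide(post_a))    # 0.7 < 0.8, so the single model abstains: "reject"
print(decide(combined))  # the combined evidence is confident: "face"
```

Note that neither rule requires re-training: both operate purely on the posteriors at decision time, which is exactly the separation of inference and decision the slide emphasises.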

Unlabelled Data

[Figure: two labelled classes ("Class 1", "Class 2") and a test point; image not preserved in this transcript.]

Unlabelled Data

[Figure slide: image not preserved in this transcript.]

Generative Methods

Strengths:
– relatively straightforward to characterize invariances
– they can handle partially labelled data

Weaknesses:
– they wastefully model variability which is unimportant for classification
– they scale badly with the number of classes and the number of invariant transformations (slow on test data)
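The claim that generative models can exploit partially labelled data can be illustrated with a toy semi-supervised EM fit of a two-component 1-D Gaussian mixture: labelled points get hard responsibilities, unlabelled points get soft ones. All data and numbers below are illustrative assumptions, not from the talk.

```python
import math
import random

# Hypothetical toy data: a few labelled points, many unlabelled ones.
random.seed(1)
labelled = [(random.gauss(-2.0, 1.0), 0) for _ in range(5)] + \
           [(random.gauss(+2.0, 1.0), 1) for _ in range(5)]
unlabelled = [random.gauss(-2.0, 1.0) for _ in range(200)] + \
             [random.gauss(+2.0, 1.0) for _ in range(200)]

def pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

# Initialise from the labelled points alone, then refine with EM on all points.
mus = [sum(x for x, y in labelled if y == k) / 5 for k in (0, 1)]
vars_ = [1.0, 1.0]
pis = [0.5, 0.5]

for _ in range(20):
    # E-step: labelled points keep hard responsibilities, unlabelled get soft ones.
    resp = [(x, [1.0 - y, float(y)]) for x, y in labelled]
    for x in unlabelled:
        j = [pis[k] * pdf(x, mus[k], vars_[k]) for k in (0, 1)]
        s = j[0] + j[1]
        resp.append((x, [j[0] / s, j[1] / s]))
    # M-step: responsibility-weighted maximum-likelihood updates.
    for k in (0, 1):
        nk = sum(r[k] for _, r in resp)
        mus[k] = sum(r[k] * x for x, r in resp) / nk
        vars_[k] = sum(r[k] * (x - mus[k]) ** 2 for x, r in resp) / nk
        pis[k] = nk / len(resp)

print(round(mus[0], 1), round(mus[1], 1))  # recovered means, near -2 and +2
```

Because the mixture models p(x), the 400 unlabelled points sharpen the estimates far beyond what the 10 labels alone support; a purely discriminative model has no comparable way to use them.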

Discriminative Methods

Strengths:
– they use the flexibility of the model in the relevant regions of input space
– they can be extremely fast once trained

Weaknesses:
– they interpolate between training examples, and hence can fail when novel inputs are presented
– they don't easily handle compositionality (e.g. faces can have glasses and/or moustaches and/or hats)

Hybrid Approaches

Generatively inspired models, trained discriminatively:
– state of the art in speech recognition
– hidden Markov models handle time-warp invariances
– parameters determined by maximum mutual information, not maximum likelihood