CSSE463: Image Recognition Day 31


CSSE463: Image Recognition Day 31
- Today: Bayesian classifiers
- Tomorrow: project meetings
- Questions?

Bayesian classifiers
- Use training data: assume that you know the probability distribution of each feature for each class.
- Setup for 2 classes:
  - Classes w1 and w2 (say, circles vs. non-circles)
  - A single feature, x
  - Both classes equally likely
  - Both types of errors equally bad
- Where should we set the threshold between classes? Here? Where in the graph are the 2 types of errors?
[Figure: the class-conditional densities P(x|w1) (circles) and P(x|w2) (non-circles) plotted against x, with the region beyond the threshold marked "Detected as circles".]
Q1-4
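A minimal sketch of this two-class, single-feature setup, assuming hypothetical Gaussian class-conditional densities (the lecture's actual densities would come from training data): with equal priors and equal error costs, the best threshold is where the two densities cross.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical class-conditional densities (not from the lecture):
# circles: x ~ N(mu1, sigma1); non-circles: x ~ N(mu2, sigma2)
mu1, sigma1 = 2.0, 1.0   # P(x | w1), circles
mu2, sigma2 = 5.0, 1.5   # P(x | w2), non-circles

def classify(x, p_w1=0.5, p_w2=0.5):
    """Decide 'circle' iff P(x|w1) P(w1) > P(x|w2) P(w2)."""
    score1 = norm.pdf(x, mu1, sigma1) * p_w1
    score2 = norm.pdf(x, mu2, sigma2) * p_w2
    return "circle" if score1 > score2 else "non-circle"

# With equal priors, the decision boundary is where the densities cross.
xs = np.linspace(mu1, mu2, 10001)
diff = norm.pdf(xs, mu1, sigma1) - norm.pdf(xs, mu2, sigma2)
threshold = xs[np.argmin(np.abs(diff))]
print(f"threshold ~ {threshold:.2f}")   # values below it are called circles
print(classify(3.0))                    # -> 'circle'
```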

What if we have prior information?
- Bayesian reasoning says that if we expect only 10% of the objects to be circles, that prior should affect our classification.
Q5-8
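In equation form, using the 10% figure above: with priors P(w1) = 0.1 (circle) and P(w2) = 0.9 (non-circle), the decision rule becomes

decide circle iff P(x|w1) P(w1) > P(x|w2) P(w2), i.e., P(x|w1) / P(x|w2) > P(w2) / P(w1) = 0.9 / 0.1 = 9.

So the likelihood of "circle" must be 9 times larger than that of "non-circle" before we call x a circle: the threshold shifts away from the rare class.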

Bayesian classifier in general
- Bayes rule: P(wi|x) = P(x|wi) P(wi) / P(x). Verify with example.
- For classifiers:
  - x = feature(s)
  - wi = class
  - P(w|x) = posterior probability
  - P(w) = prior (learned from the training set, or left out if unknown)
  - P(x|w) = likelihood (learned from examples, e.g., as a histogram)
  - P(x) = unconditional probability (fixed for a given x)
- Find the best class by the maximum a posteriori (MAP) principle: find the class i that maximizes P(wi|x). The denominator doesn't affect the calculation.
- Example: indoor/outdoor classification.
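One way to verify the rule on a small hypothetical example: let P(w1) = 0.4, P(w2) = 0.6, P(x|w1) = 0.8, and P(x|w2) = 0.3 for some feature value x. Then P(x) = 0.8(0.4) + 0.3(0.6) = 0.50, so Bayes rule gives P(w1|x) = 0.32 / 0.50 = 0.64 and P(w2|x) = 0.18 / 0.50 = 0.36. The posteriors sum to 1 as required, and MAP picks w1 even though w2 has the larger prior, because the likelihood for w1 dominates.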

Bayes rule is used in prediction of disease
- http://www.youtube.com/watch?v=D8VZqxcu0I0
- Can you verify the approximation they found?
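The video's exact numbers aren't reproduced here, but the calculation has this shape (the values below are hypothetical): suppose a disease has prevalence P(D) = 0.01, the test has sensitivity P(+|D) = 0.99, and the false-positive rate is P(+|no D) = 0.05. Then

P(D|+) = P(+|D) P(D) / [P(+|D) P(D) + P(+|no D) P(no D)] = 0.0099 / (0.0099 + 0.0495) = 0.17 (approx.).

Even with a 99%-sensitive test, a positive result implies only about a 17% chance of disease when the disease is rare: the prior dominates.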

Indoor vs. outdoor classification
- We can use low-level image info (color, texture, etc.).
- But there's another source of really helpful info!

Camera metadata distributions
[Figure: per-class distributions of camera metadata: subject distance, flash, scene brightness, exposure time.]

Why we need Bayes rule
- Problem: we know conditional probabilities like P(flash was on | indoor).
- We want conditional probabilities like P(indoor | flash was on, exposure time = 0.017 s, subject distance = 8 ft, SVM output).
- Let w = class of image and x = all the evidence. More generally, we know P(x|w) from the training set (why?), but we want P(w|x).

Using Bayes rule
- P(w|x) = P(x|w) P(w) / P(x)
- The denominator is constant for a given image, so P(w|x) = a P(x|w) P(w).
Q9

Using Bayes rule (continued)
- P(w|x) = P(x|w) P(w) / P(x) = a P(x|w) P(w), since the denominator is constant for a given image.
- We have two types of features: image metadata (M) and low-level features like color (L).
- Conditional independence means P(x|w) = P(M|w) P(L|w), so P(w|x) = a P(M|w) P(L|w) P(w), where P(w) is the prior (initial bias), P(M|w) comes from the histograms, and P(L|w) comes from the SVM.
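A minimal sketch of this combination (the probability values below are illustrative, not taken from the paper): the posterior is proportional to the prior times the two independent likelihood terms, and normalization supplies the constant a.

```python
import numpy as np

classes = ["indoor", "outdoor"]
prior = np.array([0.5, 0.5])        # P(w): the initial bias, assumed uniform here

# P(M|w): metadata likelihoods for this image, read off the learned histograms
# (hypothetical values: flash fired, short exposure, small subject distance)
p_meta = np.array([0.30, 0.05])

# P(L|w): low-level evidence, derived from the SVM output
p_lowlevel = np.array([0.40, 0.60])

# Conditional independence: P(x|w) = P(M|w) * P(L|w)
unnorm = prior * p_meta * p_lowlevel      # a * P(M|w) P(L|w) P(w), up to a
posterior = unnorm / unnorm.sum()         # normalizing plays the role of a
for c, p in zip(classes, posterior):
    print(f"P({c} | evidence) = {p:.2f}") # indoor 0.80, outdoor 0.20
print("MAP class:", classes[int(np.argmax(posterior))])
```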

Bayesian network
- An efficient way to encode conditional probability distributions and calculate marginals.
- Use it for classification by putting the classification node at the root.
- Examples: indoor/outdoor classification; automatic image orientation detection.
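A small sketch of the mechanics (the structure and table values below are made up, not the networks from the papers): for a tree with the class node at the root and observed children, the root's posterior marginal is its prior times the product of each edge's conditional probability for the observed child value, normalized.

```python
# Hypothetical tree-shaped network: class -> {flash, color_label}
prior = {"indoor": 0.5, "outdoor": 0.5}

# One conditional probability table per edge: P(child value | class)
cpt_flash = {"indoor": {"on": 0.7, "off": 0.3},
             "outdoor": {"on": 0.1, "off": 0.9}}
cpt_color = {"indoor": {"warm": 0.6, "cool": 0.4},
             "outdoor": {"warm": 0.3, "cool": 0.7}}

def root_marginal(flash, color):
    """P(class | observed leaves), by enumerating the root's values."""
    scores = {c: prior[c] * cpt_flash[c][flash] * cpt_color[c][color]
              for c in prior}
    z = sum(scores.values())
    return {c: s / z for c, s in scores.items()}

print(root_marginal("on", "warm"))  # indoor wins: {'indoor': ~0.93, ...}
```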

Indoor vs. outdoor classification
- Each edge in the graph has an associated matrix of conditional probabilities.
[Figure: network with the class node connected to color features and texture features (each via an SVM) and to the EXIF header (via KL divergence).]

Effects of image capture context
- Recall for a class C is the fraction of examples of C that are classified correctly: recall(C) = (# correctly classified as C) / (# examples of C).

Orientation detection
- See the IEEE TPAMI paper (hardcopy or posted).
- Also uses a single-feature Bayesian classifier (the answer to Q1-4).
- Keys:
  - 4-class problem (North, South, East, West)
  - Priors really helped here!
- You should be able to understand the two papers (both posted).