Cognitive Computer Vision

Cognitive Computer Vision
Kingsley Sage (khs20@sussex.ac.uk) and Hilary Buxton (hilaryb@sussex.ac.uk)
Prepared under ECVision Specific Action 8-3
http://www.ecvision.org

Lecture 6: Inference in Bayesian networks
- Predictive inference
- Diagnostic inference
- Combined inference
- Intercausal inference
- General approaches for inference
- Bayesian inference tools

So why is Bayesian inference relevant to Cognitive CV?
- It provides a well-founded methodology for reasoning with uncertainty
- These methods are the basis for our model of perception guided by expectation
- We can develop well-founded methods of learning rather than just being stuck with hand-coded models

Inference
Inference: calculating a probability over a set of nodes given the values of other nodes.
Four modes of inference:
- PREDICTIVE (from root to leaf)
- DIAGNOSTIC (from leaf to root)
- COMBINED (predictive and diagnostic)
- INTERCAUSAL (between the parents of a common child)

Inference
- Also called conditioning or belief updating
- We will have values for some nodes (evidence nodes) and want to establish others (query nodes)
- Don’t confuse priors with evidence:
  - priors are statistical statements of how likely something is to “happen” (the frequentist view)
  - evidence means that you know it has happened

A vision example
All nodes are discrete. A and B are the parents of O; C and N are its children.
- A and B are feature detectors for some area in an image (perhaps A is colour based and B is shape based)
- O is an object detector that bases its decision solely on A and B
- N determines how likely another object is to be found nearby when the object detector finds its object
- C represents an action context that is relevant when the object detector finds its object

A vision example (continued)
A detects red areas, B detects the cup shape, and O detects the cup of tea; the potential nearby object is a saucer, and the action context is someone picking up the tea to drink it!

A vision example
These priors are established during a training process:

  p(a=detect) = 0.2
  p(b=detect) = 0.1

This table specifies the performance of the object detector, where T = detected and F = not detected:

  a=   b=   p(o=T|A,B)
  T    T    0.95
  T    F    0.6
  F    T    0.5
  F    F    0.01

These tables give the dependence of C and N on O:

  o=   p(c=T|O)        o=   p(n=T|O)
  T    0.7             T    0.7
  F    0.1             F    0.2

The context is “will be picked up” if c=T; the saucer object is nearby if n=T.
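
To make the worked queries on the following slides easy to check, here is a minimal sketch of exact inference by brute-force enumeration over this network in Python. Note that the orientation of the flattened p(o=T|A,B) table above had to be reconstructed, so the 0.6 / 0.5 assignment to the two single-detector cases is an assumption, and the infer() helper is an illustration rather than anything from the original lecture.

from itertools import product

# Priors and CPTs for the cup network (numbers as tabulated above; the
# 0.6 / 0.5 split between the single-detector cases is an assumption)
P_A = 0.2                                    # p(a=T): colour detector fires
P_B = 0.1                                    # p(b=T): shape detector fires
P_O = {(True, True): 0.95, (True, False): 0.6,
       (False, True): 0.5, (False, False): 0.01}     # p(o=T | a, b)
P_C = {True: 0.7, False: 0.1}                # p(c=T | o)
P_N = {True: 0.7, False: 0.2}                # p(n=T | o)

def bern(p_true, value):
    """Probability that a binary variable with p(T) = p_true takes `value`."""
    return p_true if value else 1.0 - p_true

def joint(a, b, o, c, n):
    """Joint probability via the factorisation p(a)p(b)p(o|a,b)p(c|o)p(n|o)."""
    return (bern(P_A, a) * bern(P_B, b) * bern(P_O[(a, b)], o) *
            bern(P_C[o], c) * bern(P_N[o], n))

def infer(query_var, query_val, evidence):
    """p(query_var = query_val | evidence), by summing the joint over all
    worlds consistent with the evidence and normalising."""
    names = ("a", "b", "o", "c", "n")
    num = den = 0.0
    for values in product([True, False], repeat=5):
        world = dict(zip(names, values))
        if any(world[k] != v for k, v in evidence.items()):
            continue                         # inconsistent with the evidence
        p = joint(*values)
        den += p
        if world[query_var] == query_val:
            num += p
    return num / den

With these numbers, infer("o", True, {}) evaluates to about 0.174; the next slides derive the same quantities by hand.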

Predictive inference
Let’s see this applied to our example. We use marginalisation to evaluate our queries, based on the evidence we have observed (if we have any).

Predictive inference
In the absence of any observed evidence, we marginalise over both parents of O:

  p(o=T) = Σ_a Σ_b p(o=T|a,b) p(a) p(b)
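
Under the CPT assignment assumed above, this marginal works out as below (a hand-check sketch; the value would change if the 0.6 / 0.5 cells were swapped):

# p(o=T) = sum over a, b of p(o=T|a,b) p(a) p(b)
p_o = (0.95 * 0.2 * 0.1 +    # a=T, b=T
       0.60 * 0.2 * 0.9 +    # a=T, b=F
       0.50 * 0.8 * 0.1 +    # a=F, b=T
       0.01 * 0.8 * 0.9)     # a=F, b=F
print(p_o)                   # 0.1742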

Predictive inference
Let’s say we now have evidence that a=T; we then only marginalise over B:

  p(o=T|a=T) = Σ_b p(o=T|a=T,b) p(b)

And if a=T and b=T, no marginalisation is left to do: p(o=T|a=T,b=T) is read directly from the CPT.
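
Again assuming the reconstructed CPT, the two queries reduce to:

# p(o=T | a=T) = sum over b of p(o=T|a=T,b) p(b)
p_o_given_a = 0.95 * 0.1 + 0.60 * 0.9        # = 0.635
# p(o=T | a=T, b=T): no marginalisation, read straight off the CPT
p_o_given_ab = 0.95
print(p_o_given_a, p_o_given_ab)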

Diagnostic inference
Reasoning from a leaf upwards to the root nodes, we use Bayes’ rule. For example:

  p(o=T|n=T) = p(n=T|o=T) p(o=T) / p(n=T)
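
A worked sketch of this Bayes-rule computation with the assumed numbers, using p(o=T) ≈ 0.1742 from the predictive slide:

# p(o=T|n=T) = p(n=T|o=T) p(o=T) / p(n=T)
p_o = 0.1742                                 # prior marginal from earlier
p_n = 0.7 * p_o + 0.2 * (1 - p_o)            # p(n=T) by total probability
p_o_given_n = 0.7 * p_o / p_n
print(p_o_given_n)                           # ~0.425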

Diagnostic inference
If there had been a link from another node (say X) into N, we would have needed to normalise our expression over the additional node.

Combined inference
Here you have evidence from, say, N and B, and form a query on an intermediate node: e.g. use diagnostic inference to determine p(o=T|n=?) and then use predictive inference to determine p(o=T) given the evidence. We can compute, for example, p(o=T|n=T,b=T).
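
One way to sketch p(o=T|n=T,b=T) with the assumed numbers: fold in the predictive evidence b=T first, then apply Bayes’ rule with the diagnostic evidence n=T (valid because N depends on B only through O in this network):

# Predictive step: p(o=T | b=T) = sum over a of p(o=T|a,b=T) p(a)
p_o_b = 0.95 * 0.2 + 0.5 * 0.8               # = 0.59
# Diagnostic step: Bayes rule with the evidence n=T
num = 0.7 * p_o_b                            # p(n=T|o=T) p(o=T|b=T)
den = num + 0.2 * (1 - p_o_b)                # + p(n=T|o=F) p(o=F|b=T)
print(num / den)                             # ~0.834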

Intercausal inference (“explaining away”)
- A and B are independent, but A is dependent on B given O
- If, for example, p(a=T|o=T) > p(a=T|o=T,b=T), then observing b=T reduces our belief that a=T caused o=T
- We say that O is “explained away”
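
With the assumed numbers, the explaining-away inequality can be checked directly:

# p(a=T | o=T): Bayes rule, using p(o=T|a=T) = 0.635 and p(o=T) = 0.1742
p_a_given_o = 0.635 * 0.2 / 0.1742                        # ~0.729
# p(a=T | o=T, b=T): with b=T also observed, B suffices to explain o=T
p_a_given_ob = (0.95 * 0.2) / (0.95 * 0.2 + 0.5 * 0.8)    # = 0.19/0.59 ~0.322
print(p_a_given_o > p_a_given_ob)            # True: b=T "explains away" A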

General approach to inference
- Efficient schemes exist for global computation of probabilities using local message passing (e.g. Jensen and Lauritzen 1990; Lauritzen and Spiegelhalter 1988), having their origins in Pearl’s work on junction trees (“Probabilistic Reasoning in Intelligent Systems”, Pearl 1988)
- Beyond the scope of this course, but …

Bayesian inference tools
There are a number of packages out there to do the work for you!
- http://www.cs.ubc.ca/~murphyk/Software : Kevin Murphy’s BNT (Bayes Net Toolbox)
- http://www.csse.monash.edu.au/bai/book/appendix_b.pdf : an excellent summary of various packages and their capabilities

Summary
- Bayesian inference allows the values of evidence nodes to be used systematically to update query nodes
- We can distinguish four modes of inference: predictive, diagnostic, combined and intercausal (“explaining away”)
- Large Bayesian networks can be evaluated efficiently using Bayesian inference toolkits available on the Internet

Next time …
Gaussian mixtures. A lot of excellent reference material on Bayesian reasoning can be found at:
- http://www.csse.monash.edu.au/bai
- http://www.dcs.qmw.ac.uk/~norman/BBNs/idxlist.htm