Cognitive Computer Vision
Kingsley Sage and Hilary Buxton
Prepared under ECVision Specific Action 8-3

Lecture 5
– Reminder of probability theory
– Bayes rule
– Bayesian networks

So why is Bayes rule relevant to Cognitive CV?
– It provides a well-founded methodology for reasoning with uncertainty
– These methods are the basis for our model of perception guided by expectation
– We can develop well-founded methods of learning rather than being stuck with hand-coded models

Bayes rule: dealing with uncertainty
[Portrait: Rev. Thomas Bayes]
Sources of uncertainty, e.g.:
– ignorance
– complexity
– physical randomness
– vagueness
Use probability theory to reason about uncertainty. Be careful to understand what you mean by probability and use it consistently:
– frequency analysis
– belief

Probability theory – reminder
p(x): a single value in the range [0,1]. Think of it either as "x is true in 0.7 of cases" (frequentist) or as "I believe x = true with probability 0.7" (belief).
P(X): often (but not always) used to denote a distribution over a set of values, e.g. if X is discrete {x=true, x=false} then P(X) encompasses knowledge of both values. p(x=true) is then a single value.

Probability theory – reminder
Joint probability: $p(x, y) = p(x \mid y)\, p(y) = p(y \mid x)\, p(x)$
Conditional probability: $p(x \mid y) = \dfrac{p(x, y)}{p(y)}$

Probability theory – reminder
Conditional independence: x is conditionally independent of y given z when $p(x \mid y, z) = p(x \mid z)$
Marginalising: $p(x) = \sum_{y} p(x, y)$

Bayes rule – the basics
$$p(Y \mid X) = \frac{p(X \mid Y)\, p(Y)}{p(X)}$$

Bayes rule – the basics
As an illustration, let's look at the conditional probability of a hypothesis H given some evidence E:
$$p(H \mid E) = \frac{p(E \mid H)\, p(H)}{p(E)}$$

Bayes rule – example
– Consider a vision system used to detect zebra in static images
– It has a "stripey area" operator to help it do this (the evidence E)
– Let p(h=zebra present) = 0.02 (prior established during training)
– Assume the "stripey area" operator is discrete valued (true/false)
– Let p(e=true|h=true) = 0.8 (it's a fairly good detector)
– Let p(e=true|h=false) = 0.1 (there are non-zebra items with stripes in the data set – like the gate)
– Given e, we can establish p(h=true|e=true) …

Bayes rule – example
$$p(h{=}\text{true} \mid e{=}\text{true}) = \frac{p(e{=}\text{true} \mid h{=}\text{true})\, p(h{=}\text{true})}{p(e{=}\text{true})} = \frac{0.8 \times 0.02}{0.8 \times 0.02 + 0.1 \times 0.98} = \frac{0.016}{0.114} \approx 0.14$$
Note that this is an increase over the prior of 0.02 due to the evidence e.
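As a quick check, the same calculation can be written out in a few lines of Python (a minimal sketch; the variable names are ours, not from the lecture):

```python
# Zebra detector example: posterior from Bayes rule.
p_h = 0.02             # prior: p(h=true), zebra present
p_e_given_h = 0.8      # likelihood: p(e=true | h=true)
p_e_given_not_h = 0.1  # false-positive rate: p(e=true | h=false)

# Evidence term by marginalising over h: p(e) = sum_h p(e|h) p(h)
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Bayes rule: p(h|e) = p(e|h) p(h) / p(e)
posterior = p_e_given_h * p_h / p_e
print(f"p(h=true | e=true) = {posterior:.3f}")  # approximately 0.140
```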

Interpretation
– Despite our intuition, our detector does not seem very "good": even with positive evidence, the posterior probability of a zebra is only about 0.14
– Remember, only 1 in 50 images contains a zebra
– That means 49 out of 50 do not contain a zebra, and the detector is not 100% reliable, so some of those images will be incorrectly judged to contain a zebra
– Failing to account properly for "negative" evidence is a typical failing of human intuitive reasoning

Moving on …
– Human intuition is not very Bayesian (e.g. Kahneman et al., 1982), so be sure to apply Bayes theory correctly
– Bayesian networks help us to organise our thinking clearly
– Causality and Bayesian networks are related

Bayesian networks
[Figure: example network with nodes A, B, C, D, E]
– A compact representation of the joint probability over a set of variables
– Each variable is represented as a node; each variable can be discrete or continuous
– Conditional independence assumptions are encoded using a set of arcs
– The set of nodes and arcs is referred to as a graph
– No arc between two nodes implies they are conditionally independent of each other
– Different types of graph exist; the one shown is a Directed Acyclic Graph (DAG)

Bayesian networks – terminology
[Figure: the same example network]
– A is called a root node and has a prior only
– B, D, and E are called leaf nodes
– A "causes" B and "causes" C, so the value of A influences the values of B and C
– A is the parent node of B and C; B and C are child nodes of A
– To determine E, you need only know C: E is conditionally independent of A given C (one possible structure is sketched in code below)
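To make the factorisation concrete, here is a small Python sketch. The arc set A→B, A→C, C→D, C→E is our assumption, chosen to be consistent with the description above (A is the root; B, D and E are leaves; E depends on A only through C):

```python
# A Bayesian network as a parent map: node -> list of parents.
# Structure assumed: A -> B, A -> C, C -> D, C -> E.
parents = {
    "A": [],
    "B": ["A"],
    "C": ["A"],
    "D": ["C"],
    "E": ["C"],
}

# The DAG factorises the joint distribution as a product of
# "node given its parents" terms:
#   p(A,B,C,D,E) = p(A) p(B|A) p(C|A) p(D|C) p(E|C)
factorisation = " ".join(
    f"p({node})" if not ps else f"p({node}|{','.join(ps)})"
    for node, ps in parents.items()
)
print(f"p(A,B,C,D,E) = {factorisation}")
```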

Encoding conditional independence
For a chain A → B → C, the network encodes that C is conditionally independent of A given B, giving the factored representation:
$$p(A, B, C) = p(A)\, p(B \mid A)\, p(C \mid B)$$

Specifying the Conditional Probability Terms (1)
For a discrete node C with discrete parents A ∈ {red, green, blue} and B ∈ {true, false}, the conditional probability term P(C|A,B) can be represented as a value table:

a      b      p(c=true|A,B)
red    true   0.20
red    false  0.10
green  true   0.60
green  false  0.30
blue   true   0.99
blue   false  0.05
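One natural way to hold such a table in code is a dictionary keyed by the parent values (a sketch, using the numbers from the table above):

```python
# Conditional probability table for p(c=true | A, B),
# keyed by the parent assignment (a, b).
cpt_c = {
    ("red", True): 0.2,
    ("red", False): 0.1,
    ("green", True): 0.6,
    ("green", False): 0.3,
    ("blue", True): 0.99,
    ("blue", False): 0.05,
}

def p_c_given_ab(c: bool, a: str, b: bool) -> float:
    """Look up p(c | a, b); the false case is one minus the table entry."""
    p_true = cpt_c[(a, b)]
    return p_true if c else 1.0 - p_true

print(p_c_given_ab(True, "blue", True))   # 0.99
print(p_c_given_ab(False, "blue", True))  # 0.01
```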

Specifying the Conditional Probability Terms (2)
For a continuous node C with continuous parents A and B, the conditional probability term p(c|A,B) can be represented as a function of a, b and c.
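A common concrete choice (our assumption; the lecture does not commit to a particular form) is a linear-Gaussian model, where C is normally distributed with a mean that depends linearly on its parents:

```python
import math

def p_c_given_ab(c: float, a: float, b: float,
                 w_a: float = 1.0, w_b: float = 0.5,
                 bias: float = 0.0, sigma: float = 1.0) -> float:
    """Linear-Gaussian conditional density: C ~ N(w_a*a + w_b*b + bias, sigma^2).

    The weights and sigma are illustrative placeholders; in practice
    they would be learned from data.
    """
    mean = w_a * a + w_b * b + bias
    return math.exp(-0.5 * ((c - mean) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

print(p_c_given_ab(c=1.5, a=1.0, b=1.0))  # density at c=1.5 given a=b=1
```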

Specifying the Conditional Probability Terms (3)
For a continuous node C with one continuous parent A and one discrete parent B ∈ {true, false}, the conditional probability term p(c|A,B) can be represented as a set of functions, with the continuous function selected according to a "context" determined by B.

Directed Acyclic Graph (DAG)
[Figure: the example network, with a dotted red arc that would close a loop]
– Arcs encode "causal" relationships between nodes
– A DAG contains no directed cycles; in a singly connected DAG there is no more than one path (regardless of arc direction) between any node and any other node
– If we added the dotted red arc, we would have a loopy graph
– Loopy graphs can be approximated by acyclic ones for inference, but this is outside the scope of this course

Inference and Learning
Inference: calculating a probability over a set of nodes given the values of other nodes
– The two most useful modes of inference are PREDICTIVE (from root to leaf) and DIAGNOSTIC (from leaf to root), as sketched in the code below
Exact and approximate methods:
– Exact methods exist for Directed Acyclic Graphs (DAGs)
– Approximations exist for other graph types
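For a two-node network H → E (the zebra detector again), both modes can be written out directly; this is a hand-rolled sketch, not a library API:

```python
# Two-node network: H -> E, with the zebra-example numbers.
p_h = 0.02                             # prior p(h=true)
p_e_given_h = {True: 0.8, False: 0.1}  # p(e=true | h)

# PREDICTIVE inference (root to leaf): p(e=true) by marginalising over H.
p_e = p_e_given_h[True] * p_h + p_e_given_h[False] * (1 - p_h)
print(f"predictive: p(e=true) = {p_e:.3f}")  # 0.114

# DIAGNOSTIC inference (leaf to root): p(h=true | e=true) via Bayes rule.
p_h_given_e = p_e_given_h[True] * p_h / p_e
print(f"diagnostic: p(h=true | e=true) = {p_h_given_e:.3f}")  # ~0.140
```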

Summary
– Bayes rule allows us to deal with uncertain data
– Bayesian networks encode conditional independence
– Simple DAGs can be used in causal and diagnostic modes

Next time …
– Examples of inference using Bayesian networks
– A lot of excellent reference material on Bayesian reasoning can be found at: