Cognitive Computer Vision

Presentation on theme: "Cognitive Computer Vision"— Presentation transcript:

1 Cognitive Computer Vision
Kingsley Sage and Hilary Buxton Prepared under ECVision Specific Action 8-3

2 Lecture 6 Inference in Bayesian networks
Predictive inference
Diagnostic inference
Combined inference
Intercausal inference
General approaches for inference
Bayesian inference tools

3 So why is Bayesian inference relevant to Cognitive CV?
Provides a well-founded methodology for reasoning with uncertainty
These methods are the basis for our model of perception guided by expectation
We can develop well-founded methods of learning rather than just being stuck with hand-coded models

4 Inference
Inference: calculating a probability over a set of nodes given the values of other nodes
Four modes of inference:
PREDICTIVE (from root to leaf)
DIAGNOSTIC (from leaf to root)
COMBINED (predictive and diagnostic)
INTERCAUSAL

5 Inference
Inference is also called conditioning or belief updating
We will have some values (evidence nodes) and want to establish others (query nodes)
Don’t confuse priors with evidence:
Priors are statistical statements of how likely something is to “happen” (frequentist view)
Evidence means that you know it has happened

6 A vision example
All nodes are discrete:
A and B are feature detectors for some area in an image (perhaps A is colour based and B is shape based)
O is an object detector that bases its decision solely on A and B
N determines how likely another object is to be found nearby when the object detector finds its object
C represents an action context that is relevant when the object detector finds its object
[Network diagram: A → O ← B, with O → C and O → N]

7 A vision example
A detects red areas, B detects the cup shape, O detects the cup of tea, the potential nearby object is a saucer, and the action context is someone picking up the tea to drink it!

8 A vision example
These priors are established during a training process:
p(a=detect) = 0.2    p(b=detect) = 0.1

This table specifies the performance of the object detector, where T = detected and F = not detected:
a=T, b=T: p(o=T|A,B) = 0.95
a=T, b=F: p(o=T|A,B) = 0.6
a=F, b=T: p(o=T|A,B) = 0.5
a=F, b=F: p(o=T|A,B) = 0.01

o=T: p(c=T|O) = 0.7    o=F: p(c=T|O) = 0.1
o=T: p(n=T|O) = 0.7    o=F: p(n=T|O) = 0.2

The context is “will be picked up” if c=T. The saucer object is nearby if n=T

9 Predictive inference Let’s see this applied to our example
We use marginalisation to evaluate our queries based on the evidence we have observed (if we have any)

10 Predictive inference In the absence of any observed evidence

11 Predictive inference Let’s say we now have evidence that a=T
And if a=T and b=T
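The numbers behind these predictive queries can be reproduced by direct enumeration over the root nodes. A minimal Python sketch, assuming the CPT values from slide 8 (the ordering of the 0.6 and 0.5 entries in the object-detector table is a reconstruction from the transcript):

```python
from itertools import product

# Priors and object-detector CPT from slide 8 (True = detect)
p_a = {True: 0.2, False: 0.8}
p_b = {True: 0.1, False: 0.9}
p_o_given_ab = {(True, True): 0.95, (True, False): 0.6,
                (False, True): 0.5, (False, False): 0.01}

def p_o_true(evidence_a=None, evidence_b=None):
    """Predictive inference: p(o=T | evidence), marginalising any unobserved roots."""
    num = den = 0.0
    for a, b in product((True, False), repeat=2):
        if evidence_a is not None and a != evidence_a:
            continue
        if evidence_b is not None and b != evidence_b:
            continue
        w = p_a[a] * p_b[b]              # weight of this root configuration
        num += w * p_o_given_ab[(a, b)]
        den += w
    return num / den

print(round(p_o_true(), 4))                                  # 0.1742 (no evidence)
print(round(p_o_true(evidence_a=True), 4))                   # 0.635  (a=T)
print(round(p_o_true(evidence_a=True, evidence_b=True), 4))  # 0.95   (a=T and b=T)
```

With both roots observed the marginalisation collapses to a single CPT lookup, which is why the last query just returns the table entry.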

12 Diagnostic inference
Reasoning from leaf nodes upwards to root nodes
Use Bayes’ rule
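As a sketch of that Bayes-rule step, the diagnostic query p(o=T|n=T) can be computed from the prior p(o=T) = 0.1742 (obtained by predictive inference over the slide-8 network) and the p(n=T|O) table:

```python
# Diagnostic inference via Bayes' rule: p(o=T | n=T)
p_o_T = 0.1742                          # prior p(o=T), from predictive inference
p_n_given_o = {True: 0.7, False: 0.2}   # p(n=T | O) from slide 8

# Normalising constant p(n=T), marginalising over O
p_n_T = p_n_given_o[True] * p_o_T + p_n_given_o[False] * (1 - p_o_T)

# Bayes' rule: p(o=T | n=T) = p(n=T | o=T) p(o=T) / p(n=T)
p_o_given_n = p_n_given_o[True] * p_o_T / p_n_T
print(round(p_n_T, 4), round(p_o_given_n, 4))  # 0.2871 0.4247
```

Observing the saucer more than doubles the belief that the cup of tea is present (0.1742 → 0.4247).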

13 Diagnostic inference
If there had been a link from another node X into N, we would have needed to normalise our expression over the additional node
[Diagram: O → N ← X]

14 Combined inference
Where you have evidence from, say, N and B and form a query on an intermediate node
E.g. use diagnostic inference to determine p(o=T|n=?) and then use predictive inference to determine p(o=T) given the evidence
Can compute, for example, p(o=T|n=T,b=T)
[Network diagram: A → O ← B, O → C, O → N; evidence at B and N, query at O]
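This combined query can be sketched by enumerating the joint distribution and conditioning on the evidence. A minimal version, assuming the slide-8 CPTs (C is omitted since it plays no part in this query):

```python
from itertools import product

# CPTs from slide 8 (True = detect / present)
p_a = {True: 0.2, False: 0.8}
p_b = {True: 0.1, False: 0.9}
p_o_given_ab = {(True, True): 0.95, (True, False): 0.6,
                (False, True): 0.5, (False, False): 0.01}
p_n_given_o = {True: 0.7, False: 0.2}

def joint(a, b, o, n):
    """Joint probability p(a, b, o, n) factorised along the network."""
    po = p_o_given_ab[(a, b)] if o else 1 - p_o_given_ab[(a, b)]
    pn = p_n_given_o[o] if n else 1 - p_n_given_o[o]
    return p_a[a] * p_b[b] * po * pn

# p(o=T | n=T, b=T): fix the evidence, marginalise the unobserved node A
num = sum(joint(a, True, True, True) for a in (True, False))
den = sum(joint(a, True, o, True) for a, o in product((True, False), repeat=2))
print(round(num / den, 4))  # 0.8343
```

The diagnostic pull from n=T and the predictive push from b=T combine to lift p(o=T) well above its prior of 0.1742.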

15 Intercausal inference: “explaining away”
A and B are independent
A is dependent on B given O
If, for example, p(a=T|o=T) > p(a=T|o=T,b=T), then the evidence b=T “explains away” o=T: knowing b=T shifts the odds towards b=T rather than a=T having caused o=T
[Diagram: A → O ← B; evidence at O and B, query at A]
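The explaining-away inequality can be checked numerically. A sketch, again assuming the slide-8 CPTs; o=T is always observed here and b is optional extra evidence:

```python
# Intercausal inference ("explaining away") on the slide-8 network
p_a = {True: 0.2, False: 0.8}
p_b = {True: 0.1, False: 0.9}
p_o_given_ab = {(True, True): 0.95, (True, False): 0.6,
                (False, True): 0.5, (False, False): 0.01}

def p_a_given(b=None):
    """p(a=T | o=T) or, if b is supplied, p(a=T | o=T, b)."""
    num = den = 0.0
    for a in (True, False):
        for bb in (True, False):
            if b is not None and bb != b:
                continue
            w = p_a[a] * p_b[bb] * p_o_given_ab[(a, bb)]  # weight with o=T observed
            den += w
            if a:
                num += w
    return num / den

print(round(p_a_given(), 4))        # 0.729  = p(a=T | o=T)
print(round(p_a_given(b=True), 4))  # 0.322  = p(a=T | o=T, b=T)
```

Since 0.729 > 0.322, learning that the shape detector also fired makes the colour detector a much less necessary explanation for o=T: exactly the inequality on the slide.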

16 General approach to inference
These schemes have their origins in Pearl’s work on message passing in belief networks (“Probabilistic Reasoning in Intelligent Systems”, Pearl 1988)
Efficient schemes exist for global computation of probabilities using local message passing in junction trees (e.g. Jensen and Lauritzen 1990; Lauritzen and Spiegelhalter 1988)
Beyond the scope of this course, but …

17 Bayesian inference tools
There are a number of packages out there to do the work for you!
Kevin Murphy’s BNT (Bayes Net Toolbox)
An excellent summary of various packages and their capabilities

18 Summary
Bayesian inference allows the values of evidence nodes to be used systematically to update query nodes
We can distinguish four modes of inference: predictive, diagnostic, combined and intercausal (“explaining away”)
Large Bayesian networks can be evaluated efficiently using Bayesian inference toolkits available on the Internet

19 Next time … Gaussian mixtures
A lot of excellent reference material on Bayesian reasoning can be found at:

