Integration and Graphical Models


1 Integration and Graphical Models
Derek Hoiem, CS 598, Spring 2009. April 14, 2009.

2 Why? The goal of vision is to make useful inferences about the scene. In most cases, this requires integrative reasoning about many types of information.

3 Example: 3D modeling

4 Object context. From Divvala et al., CVPR 2009.

5 How? Two main mechanisms: feature passing and graphical models.

6 Class Today
Feature passing, with an example
Graphical models: Bayesian networks and Markov networks
Various inference and learning methods
Example

7 Properties of a good mechanism for integration
Modular: different processes/estimates can be improved independently
Symbiotic: each estimate improves the others
Robust: mistakes in one process are not fatal for others that partially rely on it
Feasible: training and inference are fast and easy

8 Feature Passing Compute features from one estimated scene property to help estimate another.
[Pipeline: Image → X features → X estimate → Y features → Y estimate]

9 Feature passing: example
Use features computed from “geometric context” confidence images to improve object detection. Features: the average confidence in each region above, within, and below the object window. Hoiem et al., ICCV 2005.
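A minimal sketch of such features, assuming a per-pixel confidence map and a window given by (x0, y0, x1, y1) corners; the region definitions and names are illustrative, not the paper's actual code:

```python
import numpy as np

def context_features(confidence, window):
    """Average confidence above, within, and below a candidate object
    window, in the spirit of the geometric-context features (the exact
    region definitions here are assumptions of this sketch)."""
    x0, y0, x1, y1 = window                 # y grows downward
    above  = confidence[:y0, x0:x1]
    inside = confidence[y0:y1, x0:x1]
    below  = confidence[y1:, x0:x1]
    return np.array([r.mean() if r.size else 0.0
                     for r in (above, inside, below)])

conf = np.random.rand(240, 320)             # e.g., P(support) per pixel
feats = context_features(conf, (100, 80, 140, 160))
# These three numbers would be appended to the detector's feature vector.
```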

10 Feature Passing Pros and cons
Pros: simple training and inference; very flexible in modeling interactions.
Cons: not modular (if we get a new method for the first estimates, we may need to retrain); requires iteration to be symbiotic, which complicates things; robust in expectation but not for any particular instance.

11 Probabilistic graphical models
Explicitly model uncertainty and dependency structure. Three common forms: directed graphs, undirected graphs, and factor graphs.
[Figure: directed, undirected, and factor-graph models over nodes a, b, c, d.]
Key concept: the Markov blanket, the set of nodes that makes a node conditionally independent of everything else.

12 Directed acyclic graph (Bayes net)
Arrow directions matter.
Chain graph a → b → {c, d}: P(a,b,c,d) = P(c|b) P(d|b) P(b|a) P(a). Here c is independent of a given b, and d is independent of a given b.
Common-child graph {a, c, d} → b: P(a,b,c,d) = P(b|a,c,d) P(a) P(c) P(d). Here a, c, and d become dependent when conditioned on b (explaining away).
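To make the factorization concrete, here is a quick numerical check of the chain graph with made-up CPT values; it verifies that c is independent of a given b:

```python
import numpy as np

# Made-up CPTs for the chain a -> b -> {c, d}; all variables are binary.
pa   = np.array([0.6, 0.4])                  # P(a)
pb_a = np.array([[0.9, 0.2], [0.1, 0.8]])    # P(b|a), rows index b
pc_b = np.array([[0.7, 0.3], [0.3, 0.7]])    # P(c|b)
pd_b = np.array([[0.5, 0.1], [0.5, 0.9]])    # P(d|b)

# Joint from the factorization P(a,b,c,d) = P(c|b) P(d|b) P(b|a) P(a)
joint = np.einsum('a,ba,cb,db->abcd', pa, pb_a, pc_b, pd_b)
assert np.isclose(joint.sum(), 1.0)

# c independent of a given b: P(c | a, b) should not depend on a.
p_abc = joint.sum(axis=3)                    # marginalize out d
p_c_given_ab = p_abc / p_abc.sum(axis=2, keepdims=True)
print(np.allclose(p_c_given_ab[0], p_c_given_ab[1]))   # True
```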

13 Directed acyclic graph (Bayes net)
Can model causality.
Parameter learning decomposes: learn each term separately (maximum likelihood).
Inference: simple exact inference if tree-shaped (belief propagation).
Example graph a → b → {c, d}: P(a,b,c,d) = P(c|b) P(d|b) P(b|a) P(a)

14 Directed acyclic graph (Bayes net)
Can model causality.
Parameter learning decomposes: learn each term separately (maximum likelihood).
Inference: simple exact inference if tree-shaped (belief propagation). Loops require approximation: loopy BP, tree-reweighted BP, or sampling.
Example graph with a loop: P(a,b,c,d) = P(c|b) P(d|a,b) P(b|a) P(a)

15 Directed graph Example: Places and scenes
Place: office, kitchen, street, etc. The presence of each object (fire hydrant, car, person, toaster, microwave) depends only on the place:
P(place, car, person, toaster, microwave, hydrant) = P(place) P(car | place) P(person | place) … P(hydrant | place)
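A minimal sketch of inference in this model with invented probability tables (in the real system these would be learned): observe which objects are present and score each place by Bayes' rule.

```python
import numpy as np

places = ['office', 'kitchen', 'street']
p_place = np.array([1/3, 1/3, 1/3])          # uniform prior (assumption)

# Made-up P(object present | place); columns follow `places`.
p_obj = {
    'car':       np.array([0.01, 0.01, 0.70]),
    'person':    np.array([0.80, 0.60, 0.90]),
    'toaster':   np.array([0.02, 0.70, 0.01]),
    'microwave': np.array([0.10, 0.80, 0.01]),
    'hydrant':   np.array([0.001, 0.001, 0.30]),
}

def posterior(observed):
    """P(place | observed object presences); observed maps name -> 0/1."""
    post = p_place.copy()
    for name, present in observed.items():
        post *= p_obj[name] if present else (1 - p_obj[name])
    return post / post.sum()

print(dict(zip(places, posterior({'toaster': 1, 'car': 0}))))
```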

16 Directed graph Example: “Putting Objects in Perspective”.

17 Undirected graph (Markov Networks)
Does not model causality. Often pairwise: P(x) = (1/Z) ∏ over edges (i,j) of psi(x_i, x_j).
Parameter learning is difficult; inference is usually approximate.
[Figure: pairwise network over nodes x1, x2, x3, x4.]

18 Markov Networks Example: “label smoothing” grid
Binary nodes on a grid; the pairwise potential charges K whenever neighboring labels disagree:

             x_j = 0   x_j = 1
  x_i = 0       0         K
  x_i = 1       K         0
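A short sketch of the corresponding energy on a 4-connected grid, with hypothetical unary costs; MAP inference would minimize this energy:

```python
import numpy as np

def smoothing_energy(labels, unary, K=1.0):
    """Energy of a binary labeling on a 4-connected grid: sum of unary
    costs plus K for each pair of disagreeing neighbors (the 0/K
    pairwise potential from the slide)."""
    h, w = labels.shape
    e = unary[np.arange(h)[:, None], np.arange(w), labels].sum()
    e += K * (labels[1:, :] != labels[:-1, :]).sum()   # vertical pairs
    e += K * (labels[:, 1:] != labels[:, :-1]).sum()   # horizontal pairs
    return e

unary = np.random.rand(4, 5, 2)            # unary[i, j, label]
labels = np.random.randint(0, 2, (4, 5))
print(smoothing_energy(labels, unary, K=0.5))
```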

19 Factor graphs A general representation.
[Figure: a Bayes net over nodes a, b, c, d and its equivalent factor graph.]

20 Factor graphs A general representation.
[Figure: a Markov net over nodes a, b, c, d and its equivalent factor graph.]

21 Factor graphs Exercise: write the distribution shown on the slide as a factor graph.

22 Inference: Belief Propagation
Very general, but approximate except for tree-shaped graphs.
Generalized variants of BP can have better convergence for graphs with many loops or strong potentials.
Standard packages are available (BNT toolbox, my website).
To learn more: Yedidia, J. S., Freeman, W. T., and Weiss, Y., “Understanding Belief Propagation and Its Generalizations,” Technical Report, 2001.
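For intuition, here is a minimal sum-product implementation on a chain, where BP is exact; the potential and normalization conventions are choices of this sketch:

```python
import numpy as np

def chain_marginals(unary, pairwise):
    """Exact sum-product BP on a chain of n discrete nodes.
    unary: list of n potential vectors; pairwise: list of n-1 matrices,
    pairwise[i][xi, xj] scoring the edge between nodes i and i+1."""
    n = len(unary)
    fwd = [None] * n                 # messages passed left-to-right
    bwd = [None] * n                 # messages passed right-to-left
    fwd[0] = np.ones_like(unary[0])
    for i in range(1, n):
        m = pairwise[i - 1].T @ (fwd[i - 1] * unary[i - 1])
        fwd[i] = m / m.sum()         # normalize for numerical stability
    bwd[n - 1] = np.ones_like(unary[n - 1])
    for i in range(n - 2, -1, -1):
        m = pairwise[i] @ (bwd[i + 1] * unary[i + 1])
        bwd[i] = m / m.sum()
    margs = [unary[i] * fwd[i] * bwd[i] for i in range(n)]
    return [m / m.sum() for m in margs]

unary = [np.array([0.9, 0.1]), np.array([0.5, 0.5]), np.array([0.2, 0.8])]
pair = [np.array([[2.0, 1.0], [1.0, 2.0]])] * 2   # smoothness potential
print(chain_marginals(unary, pair))
```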

23 Inference: Graph Cuts
Associative: edge potentials penalize different labels.
Associative binary networks can be solved optimally (and quickly) using graph cuts (see the sketch below); multilabel associative networks can be handled by alpha-expansion or alpha-beta swaps.
To learn more, see the classic paper “What Energy Functions Can Be Minimized via Graph Cuts?” (Kolmogorov and Zabih, ECCV 2002 / PAMI 2004).
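Here is a sketch of the standard s-t cut construction for a binary associative network, solved with networkx's min-cut (an assumed dependency; fast max-flow libraries such as Boykov-Kolmogorov's are the usual choice in vision code):

```python
import networkx as nx

def binary_mrf_mincut(unary, edges, K):
    """Exact MAP for a binary associative network via an s-t min cut.
    unary[i] = (cost of label 0, cost of label 1); each edge pays K
    when its endpoints disagree (Potts). Node i takes label 0 if it
    stays on the source side of the cut."""
    G = nx.DiGraph()
    for i, (c0, c1) in enumerate(unary):
        G.add_edge('s', i, capacity=c1)   # cut if i lands on the t side
        G.add_edge(i, 't', capacity=c0)   # cut if i stays on the s side
    for i, j in edges:
        G.add_edge(i, j, capacity=K)      # one direction is cut when
        G.add_edge(j, i, capacity=K)      # the two labels disagree
    value, (s_side, _) = nx.minimum_cut(G, 's', 't')
    return value, [0 if i in s_side else 1 for i in range(len(unary))]

# A 3-node chain whose nearly tied middle node is pulled by its neighbors.
print(binary_mrf_mincut([(0.0, 2.0), (1.0, 1.1), (2.0, 0.0)],
                        [(0, 1), (1, 2)], K=0.5))
```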

24 Inference: Sampling (MCMC)
Metropolis-Hastings algorithm:
Define transitions and transition probabilities.
Make sure you can get from any state to any other (ergodicity).
Make a proposal and accept if rand(1) < [P(new state) × P(backward transition)] / [P(old state) × P(forward transition)].
Note: if P(state) decomposes, this ratio is easy to compute.
Example: “Image Parsing” by Tu and Zhu uses MCMC to find a good segmentation.
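A minimal random-walk Metropolis sketch on a made-up 1-D target: with a symmetric proposal, the forward and backward transition probabilities cancel, recovering the simpler acceptance rule above.

```python
import numpy as np

def metropolis_hastings(log_p, x0, n_steps, step=0.5, rng=None):
    """Random-walk Metropolis: the symmetric Gaussian proposal makes
    the forward/backward transition terms cancel, so we accept with
    probability min(1, P(new)/P(old))."""
    rng = rng or np.random.default_rng(0)
    x, samples = x0, []
    for _ in range(n_steps):
        x_new = x + step * rng.standard_normal()
        if np.log(rng.random()) < log_p(x_new) - log_p(x):
            x = x_new                 # accept; otherwise keep old state
        samples.append(x)
    return np.array(samples)

# Made-up target: mixture of two Gaussians (unnormalized is fine for MH).
log_p = lambda x: np.log(np.exp(-0.5 * (x - 2)**2) + np.exp(-0.5 * (x + 2)**2))
print(metropolis_hastings(log_p, 0.0, 5000).mean())
```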

25 Learning parameters: maximize likelihood
For a Bayes network with discrete variables, simply count.
For a Markov network, run BP to estimate the marginals needed for the gradient, then do gradient descent.
Often we do not care about the full likelihood anyway.
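A sketch of the counting estimate for one conditional table P(b|a); the Laplace smoothing term alpha is an addition of this example (alpha = 0 gives the pure maximum-likelihood counts):

```python
import numpy as np

def fit_cpt(data_a, data_b, n_a=2, n_b=2, alpha=1.0):
    """Estimate P(b|a) by counting co-occurrences, with optional
    Laplace smoothing; each column is normalized over b."""
    counts = np.full((n_b, n_a), alpha)
    for a, b in zip(data_a, data_b):
        counts[b, a] += 1
    return counts / counts.sum(axis=0, keepdims=True)

a = [0, 0, 0, 1, 1]
b = [0, 0, 1, 1, 1]
print(fit_cpt(a, b))   # with alpha=0 the columns would be [2/3, 1/3] and [0, 1]
```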

26 Learning parameters: maximize objective
SPSA (simultaneous perturbation stochastic approximation) algorithm:
Take two trial steps in a random direction, one forward and one backward.
Compute the loss (or objective) for each to get a pseudo-gradient.
Take a step according to the results.
Refs: Li and Huttenlocher, “Learning for Optical Flow Using Stochastic Optimization,” ECCV 2008; various papers by Spall on SPSA.
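A sketch of SPSA on a toy quadratic objective; the constant gains a and c are illustrative (Spall's papers use decaying schedules):

```python
import numpy as np

def spsa_minimize(loss, theta, n_iters=200, a=0.1, c=0.1, rng=None):
    """SPSA: perturb all parameters at once along a random +/-1
    direction, evaluate the loss twice, and step along the
    resulting pseudo-gradient."""
    rng = rng or np.random.default_rng(0)
    for _ in range(n_iters):
        delta = rng.choice([-1.0, 1.0], size=theta.shape)
        # 1/delta_i == delta_i for +/-1 entries, hence the final multiply.
        g_hat = (loss(theta + c * delta) - loss(theta - c * delta)) / (2 * c) * delta
        theta = theta - a * g_hat
    return theta

# Toy objective; in the papers cited above, the loss would be, e.g.,
# flow error on a training set rather than this quadratic.
print(spsa_minimize(lambda t: ((t - 3.0) ** 2).sum(), np.zeros(4)))
```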

27 Learning parameters: structured learning
See also Tsochantaridis et al. (structured SVMs) and Szummer et al., 2008.

28 How to get the structure?
Set by hand (most common).
Learn it (mostly done for Bayes nets): maximize a score with greedy search; use independence tests; or find each node's Markov blanket with L1-regularized logistic regression (a sketch follows).
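A sketch of the L1 logistic-regression idea with scikit-learn (an assumed dependency): regress each binary variable on all the others; the variables with nonzero coefficients form its estimated Markov blanket.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def markov_blanket(X, i, C=0.1):
    """Estimate the Markov blanket of binary variable i by L1-regularized
    logistic regression of X[:, i] on all other columns; the nonzero
    coefficients pick out its neighbors."""
    others = [j for j in range(X.shape[1]) if j != i]
    clf = LogisticRegression(penalty='l1', C=C, solver='liblinear')
    clf.fit(X[:, others], X[:, i])
    return [j for j, w in zip(others, clf.coef_[0]) if abs(w) > 1e-6]

# Toy data: x1 copies x0 with noise; x2 is independent of both.
rng = np.random.default_rng(0)
x0 = rng.integers(0, 2, 500)
x1 = np.where(rng.random(500) < 0.9, x0, 1 - x0)
x2 = rng.integers(0, 2, 500)
X = np.column_stack([x0, x1, x2])
print(markov_blanket(X, 1))   # likely [0]
```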

29 Graphical Models Pros and cons
Very powerful if the dependency structure is sparse and known.
Modular (especially Bayesian networks).
Flexible representation (though not as flexible as feature passing).
Many inference methods.
Recent developments in learning Markov network parameters, but it is still tricky.

30 Which techniques have I used?
Almost all of them:
Feature passing (ICCV 2005, CVPR 2008)
Bayesian networks (CVPR 2006); in factor graph form (ICCV 2007); semi-naïve Bayes (CVPR 2004)
Markov networks (ECCV 2008, CVPR 2007; CVPR 2005: HMM)
Belief propagation (CVPR 2006, ICCV 2007)
Structured learning (ECCV 2008)
Graph cuts (CVPR 2008, ECCV 2008)
MCMC (IJCV 2007; it didn’t work well)
Learning Bayesian structure (not published)

31 Example: faces, skin, cloth

