Discrete Optimization Lecture 3 – Part 1 M. Pawan Kumar Slides available online

Energy Minimization
Variables V_a, V_b, V_c, V_d, each taking Label l_0 or Label l_1.
Q(f; θ) = ∑_a θ_{a;f(a)} + ∑_{(a,b)} θ_{ab;f(a)f(b)}

For one example labelling the energy evaluates to Q(f; θ) = 13; for another, Q(f; θ) = 27.

The optimal labelling and energy are
f* = argmin_f Q(f; θ)    q* = min_f Q(f; θ) = Q(f*; θ)

Min-Marginals
q_{a;i} = min_f Q(f; θ) such that f(a) = i
(the min-marginal of variable V_a and label l_i)

Min-Marginals and MAP
The minimum min-marginal of any variable equals the energy of the MAP labelling:
min_i q_{a;i} = min_i ( min_f Q(f; θ) s.t. f(a) = i ) = min_f Q(f; θ)
because V_a has to take one label.
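
As a sanity check, here is a minimal brute-force sketch in Python (a hypothetical 4-variable chain with made-up potentials, not the numbers from the slides) that verifies min_i q_{a;i} = min_f Q(f; θ):

```python
import itertools

# Hypothetical chain V_a - V_b - V_c - V_d with labels {0, 1}.
# unary[a][i] = theta_{a;i}; pairwise[(a,b)][i][k] = theta_{ab;ik}.
unary = [[0, 4], [4, 0], [0, 4], [4, 0]]
edges = [(0, 1), (1, 2), (2, 3)]
pairwise = {e: [[0, 3], [3, 0]] for e in edges}  # made-up Potts costs

def energy(f):
    return (sum(unary[a][f[a]] for a in range(len(unary))) +
            sum(pairwise[e][f[e[0]]][f[e[1]]] for e in edges))

def min_marginal(a, i):
    # q_{a;i}: best energy over all labellings with f(a) = i
    return min(energy(f) for f in itertools.product([0, 1], repeat=4)
               if f[a] == i)

map_energy = min(energy(f) for f in itertools.product([0, 1], repeat=4))
# The minimum min-marginal of every variable equals the MAP energy.
assert all(min(min_marginal(a, 0), min_marginal(a, 1)) == map_energy
           for a in range(4))
print("MAP energy:", map_energy)
```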

Recap
We only need to know two sets of equations.
General form of reparameterization:
θ'_{a;i} = θ_{a;i} + M_{ba;i}
θ'_{b;k} = θ_{b;k} + M_{ab;k}
θ'_{ab;ik} = θ_{ab;ik} − M_{ab;k} − M_{ba;i}
Reparameterization of (a,b) in Belief Propagation:
M_{ab;k} = min_i { θ_{a;i} + θ_{ab;ik} },  M_{ba;i} = 0
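
To make the reparameterization concrete, here is a quick check (reusing the toy potentials and energy function from the sketch above, so still a hypothetical example) that reparameterizing one edge leaves every labelling's energy unchanged:

```python
import copy
import itertools

u2, p2 = copy.deepcopy(unary), copy.deepcopy(pairwise)
a, b = edges[0]
for k in range(2):
    # M_{ab;k} = min_i { theta_{a;i} + theta_{ab;ik} }
    M = min(u2[a][i] + p2[(a, b)][i][k] for i in range(2))
    u2[b][k] += M                 # theta'_{b;k} = theta_{b;k} + M_{ab;k}
    for i in range(2):
        p2[(a, b)][i][k] -= M     # theta'_{ab;ik} = theta_{ab;ik} - M_{ab;k}

def energy2(f):
    return (sum(u2[x][f[x]] for x in range(len(u2))) +
            sum(p2[e][f[e[0]]][f[e[1]]] for e in edges))

# A reparameterization changes theta, not Q: same energy for every labelling.
assert all(energy(f) == energy2(f)
           for f in itertools.product([0, 1], repeat=4))
```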

Dynamic Programming
3 variables → 2 variables + book-keeping; n variables → (n−1) variables + book-keeping.
Start from the left, go to the right. Reparameterize the current edge (a,b):
M_{ab;k} = min_i { θ_{a;i} + θ_{ab;ik} }
θ'_{ab;ik} = θ_{ab;ik} − M_{ab;k}
θ'_{b;k} = θ_{b;k} + M_{ab;k}
Repeat.
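
A minimal sketch of this left-to-right forward pass on the toy chain from the earlier snippets (same hypothetical potentials); after the pass, the updated unaries of the last variable are its min-marginals:

```python
def forward_pass(unary, edges, pairwise):
    u = [row[:] for row in unary]          # work on a copy of the unaries
    for (a, b) in edges:                   # left to right along the chain
        for k in range(len(u[b])):
            # M_{ab;k} = min_i { theta_{a;i} + theta_{ab;ik} }
            m = min(u[a][i] + pairwise[(a, b)][i][k]
                    for i in range(len(u[a])))
            u[b][k] += m                   # theta'_{b;k} = theta_{b;k} + M_{ab;k}
    return u

u = forward_pass(unary, edges, pairwise)
print("min-marginals of the last variable:", u[-1])
print("their minimum equals the MAP energy:", min(u[-1]))
```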

Outline
–LP Relaxation and its Dual
–TRW Message Passing

Integer Programming Formulation
min ∑_a ∑_i θ_{a;i} y_{a;i} + ∑_{(a,b)} ∑_{ik} θ_{ab;ik} y_{ab;ik}
y_{a;i} ∈ {0,1}    ∑_i y_{a;i} = 1    y_{ab;ik} = y_{a;i} y_{b;k}

Integer Programming Formulation
min θᵀy
y_{a;i} ∈ {0,1}    ∑_i y_{a;i} = 1    y_{ab;ik} = y_{a;i} y_{b;k}
θ = [ … θ_{a;i} … ; … θ_{ab;ik} … ]    y = [ … y_{a;i} … ; … y_{ab;ik} … ]

Linear Programming Relaxation
min θᵀy
y_{a;i} ∈ {0,1}    ∑_i y_{a;i} = 1    y_{ab;ik} = y_{a;i} y_{b;k}
Two reasons why we can’t solve this.

Linear Programming Relaxation
min θᵀy
y_{a;i} ∈ [0,1]    ∑_i y_{a;i} = 1    y_{ab;ik} = y_{a;i} y_{b;k}
One reason why we can’t solve this.

Linear Programming Relaxation
min θᵀy
y_{a;i} ∈ [0,1]    ∑_i y_{a;i} = 1    ∑_k y_{ab;ik} = ∑_k y_{a;i} y_{b;k}
One reason why we can’t solve this.

Linear Programming Relaxation
min θᵀy
y_{a;i} ∈ [0,1]    ∑_i y_{a;i} = 1    ∑_k y_{ab;ik} = y_{a;i} ∑_k y_{b;k} = y_{a;i} · 1
One reason why we can’t solve this.

Linear Programming Relaxation
min θᵀy
y_{a;i} ∈ [0,1]    ∑_i y_{a;i} = 1    ∑_k y_{ab;ik} = y_{a;i}
One reason why we can’t solve this.

Linear Programming Relaxation
min θᵀy
y_{a;i} ∈ [0,1]    ∑_i y_{a;i} = 1    ∑_k y_{ab;ik} = y_{a;i}
No reason why we can’t solve this* (*up to memory requirements and time complexity).
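
As an illustration, here is a hedged sketch that solves this LP relaxation for the toy chain above with scipy.optimize.linprog. The variable ordering and the symmetric marginalization constraint ∑_i y_{ab;ik} = y_{b;k} (which completes the standard local polytope) are my additions, not from the slides:

```python
import numpy as np
from scipy.optimize import linprog

n, L = len(unary), 2
idx_u = lambda a, i: a * L + i                             # position of y_{a;i}
off = n * L
eidx = {e: j for j, e in enumerate(edges)}
idx_p = lambda e, i, k: off + eidx[e] * L * L + i * L + k  # position of y_{ab;ik}
nvar = off + len(edges) * L * L

c = np.zeros(nvar)                                         # objective: theta^T y
for a in range(n):
    for i in range(L):
        c[idx_u(a, i)] = unary[a][i]
for e in edges:
    for i in range(L):
        for k in range(L):
            c[idx_p(e, i, k)] = pairwise[e][i][k]

A_eq, b_eq = [], []
for a in range(n):                                         # sum_i y_{a;i} = 1
    row = np.zeros(nvar)
    row[[idx_u(a, i) for i in range(L)]] = 1
    A_eq.append(row); b_eq.append(1)
for (a, b) in edges:
    for i in range(L):                                     # sum_k y_{ab;ik} = y_{a;i}
        row = np.zeros(nvar)
        for k in range(L):
            row[idx_p((a, b), i, k)] = 1
        row[idx_u(a, i)] = -1
        A_eq.append(row); b_eq.append(0)
    for k in range(L):                                     # sum_i y_{ab;ik} = y_{b;k}
        row = np.zeros(nvar)
        for i in range(L):
            row[idx_p((a, b), i, k)] = 1
        row[idx_u(b, k)] = -1
        A_eq.append(row); b_eq.append(0)

res = linprog(c, A_eq=np.array(A_eq), b_eq=b_eq, bounds=(0, 1))
print("LP optimum:", res.fun)   # the relaxation is tight on a chain
```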

Dual
Let’s try to understand it intuitively.

Dual of the LP Relaxation
Wainwright et al., 2001
Take the 3×3 grid over V_a, …, V_i and cover it with trees: three row trees and three column trees, with their own parameters θ^1, …, θ^6 chosen so that ∑_i θ^i ≡ θ.

Dual of the LP Relaxation
Wainwright et al., 2001
Each tree i has a minimum energy q*(θ^i). The dual of the LP is
max ∑_i q*(θ^i)  such that  ∑_i θ^i ≡ θ

Dual of the LP Relaxation
Wainwright et al., 2001
max ∑_i q*(θ^i)  such that  ∑_i θ^i ≡ θ
I can easily compute q*(θ^i) (BP on a tree). I can easily maintain the reparameterization constraint. So can I easily solve the dual?

Outline
–LP Relaxation and its Dual
–TRW Message Passing

Things to Remember
–The forward pass computes the min-marginals of the root.
–BP is exact for trees.
–Every iteration provides a reparameterization.

TRW Message Passing (Kolmogorov, 2006)
max ∑_i q*(θ^i)  such that  ∑_i θ^i ≡ θ
Pick a variable, say V_a.

TRW Message Passing (Kolmogorov, 2006)
Consider the two trees containing V_a: tree 1 over V_c, V_b, V_a with unaries θ^1_{c;0}, θ^1_{c;1}, θ^1_{b;0}, θ^1_{b;1}, θ^1_{a;0}, θ^1_{a;1}, and tree 4 over V_a, V_d, V_g with unaries θ^4_{a;0}, θ^4_{a;1}, θ^4_{d;0}, θ^4_{d;1}, θ^4_{g;0}, θ^4_{g;1}.

TRW Message Passing (Kolmogorov, 2006)
θ^1 + θ^4 + θ_rest ≡ θ; the dual value is q*(θ^1) + q*(θ^4) + K.
Reparameterize trees 1 and 4 to obtain the min-marginals of V_a.

TRW Message Passing (Kolmogorov, 2006)
One pass of Belief Propagation in each tree gives θ'^1 + θ'^4 + θ_rest, with dual value q*(θ'^1) + q*(θ'^4) + K.

TRW Message Passing (Kolmogorov, 2006)
θ'^1 + θ'^4 + θ_rest ≡ θ: the reparameterized trees still sum to θ, and the dual value q*(θ'^1) + q*(θ'^4) + K remains the same.

TRW Message Passing (Kolmogorov, 2006)
The unaries of V_a are now its min-marginals in each tree, so
q*(θ'^1) + q*(θ'^4) + K = min{θ'^1_{a;0}, θ'^1_{a;1}} + min{θ'^4_{a;0}, θ'^4_{a;1}} + K

TRW Message Passing (Kolmogorov, 2006)
Compute the average of the min-marginals of V_a over the two trees.

TRW Message Passing (Kolmogorov, 2006)
θ''_{a;0} = (θ'^1_{a;0} + θ'^4_{a;0}) / 2    θ''_{a;1} = (θ'^1_{a;1} + θ'^4_{a;1}) / 2

TRW Message Passing (Kolmogorov, 2006)
Replace the unaries of V_a in both trees by these averages, giving θ''^1 + θ''^4 + θ_rest.

TRW Message Passing (Kolmogorov, 2006)
θ''^1 + θ''^4 + θ_rest ≡ θ still holds: averaging leaves the sum over trees unchanged.

TRW Message Passing (Kolmogorov, 2006)
The dual value is now 2 min{θ''_{a;0}, θ''_{a;1}} + K.

TRW Message Passing (Kolmogorov, 2006)
min{p_1 + p_2, q_1 + q_2} ≥ min{p_1, q_1} + min{p_2, q_2}
so 2 min{θ''_{a;0}, θ''_{a;1}} + K ≥ min{θ'^1_{a;0}, θ'^1_{a;1}} + min{θ'^4_{a;0}, θ'^4_{a;1}} + K.
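
A quick numeric check of this inequality, with toy values of my own choosing:

```python
# min{p1 + p2, q1 + q2} >= min{p1, q1} + min{p2, q2}
p1, q1 = 3.0, 5.0   # min-marginals of V_a in tree 1 (made-up values)
p2, q2 = 6.0, 2.0   # min-marginals of V_a in tree 4 (made-up values)
assert min(p1 + p2, q1 + q2) >= min(p1, q1) + min(p2, q2)   # 7 >= 5
```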

TRW Message Passing (Kolmogorov, 2006)
The objective function (the dual value) increases or remains constant.

TRW Message Passing (Kolmogorov, 2006)
Initialize θ^i, taking care of the reparameterization constraint.
REPEAT:
–Choose a random variable V_a.
–Compute the min-marginals of V_a for all trees.
–Node-average the min-marginals.
Can also do edge-averaging.
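
Below is a minimal sketch of this loop on the toy chain from the earlier snippets, decomposed into one single-edge tree per edge with each unary split equally among the trees containing it. The decomposition is my own choice for brevity, and this is plain node-averaging TRW, not Kolmogorov's sequential TRW-S:

```python
import random

deg = [sum(a in e for e in edges) for a in range(n)]
# Each single-edge tree (a,b) keeps its own copies of the two unaries.
tree_u = {e: {v: [unary[v][i] / deg[v] for i in range(L)] for v in e}
          for e in edges}

def tree_min_marginals(e, v):
    """Min-marginals of endpoint v in the single-edge tree e."""
    a, b = e
    other = b if v == a else a
    return [tree_u[e][v][i] +
            min((pairwise[e][i][k] if v == a else pairwise[e][k][i]) +
                tree_u[e][other][k] for k in range(L))
            for i in range(L)]

def dual():
    # Each tree contributes its minimum energy; the sum is a lower bound.
    return sum(min(tree_min_marginals(e, e[0])) for e in edges)

random.seed(0)
for _ in range(100):
    a = random.randrange(n)                  # choose a random variable
    ts = [e for e in edges if a in e]
    mms = {e: tree_min_marginals(e, a) for e in ts}
    avg = [sum(mms[e][i] for e in ts) / len(ts) for i in range(L)]
    for e in ts:                             # node-average: shift each tree's
        for i in range(L):                   # unaries so its min-marginals
            tree_u[e][a][i] += avg[i] - mms[e][i]   # equal the average

# Non-decreasing over the iterations; a lower bound on the MAP energy.
print("TRW dual value:", dual())
```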

Outline
–Preliminaries
–LP Relaxation and its Dual
–TRW Message Passing: Examples, Primal Solution, Results

Example 1
Three single-edge trees over (V_a, V_b), (V_b, V_c), (V_c, V_a); labels l_0 and l_1. Pick variable V_a. Reparameterize.

Example 1
Average the min-marginals of V_a.

Example 1
Pick variable V_b. Reparameterize.

Example 1
Average the min-marginals of V_b.

Example 1
The value of the dual does not increase.

Example 1
Maybe it will increase for V_c? NO.

Example 1
Strong Tree Agreement: f^1(a) = 0, f^1(b) = 0, f^2(b) = 0, f^2(c) = 0, f^3(c) = 0, f^3(a) = 0.
Exact MAP estimate.

Example 2
Three single-edge trees over (V_a, V_b), (V_b, V_c), (V_c, V_a); labels l_0 and l_1. Pick variable V_a. Reparameterize.

Example 2
Average the min-marginals of V_a.

Example 2
The value of the dual does not increase.

Example 2
Maybe it will increase for V_b or V_c? NO.

Example 2
f^1(a) = 1, f^1(b) = 1; f^2(b) = 1, f^2(c) = 0 or f^2(b) = 0, f^2(c) = 1; f^3(c) = 1, f^3(a) = 1.
Weak Tree Agreement. Not an exact MAP estimate.

Example 2
Weak Tree Agreement: a convergence point of TRW.

Outline
–Preliminaries
–LP Relaxation and its Dual
–TRW Message Passing: Examples, Primal Solution, Results

Obtaining the Labelling
TRW only solves the dual. What about primal solutions?
θ' = ∑_i θ^i (≡ θ). Fix the label of V_a.

Obtaining the Labelling
θ' = ∑_i θ^i (≡ θ). Fix the label of V_b, and continue in some fixed order. Meltzer et al., 2006
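
A minimal sketch of this sequential rounding on the toy chain (a hypothetical helper; in practice one would run it on the reparameterized θ' produced by TRW rather than on the raw potentials):

```python
def round_primal(unary, edges, pairwise):
    f = {}
    for a in range(len(unary)):            # fix variables in a fixed order
        costs = []
        for i in range(L):
            c = unary[a][i]
            for (u, v) in edges:           # condition on already-fixed neighbours
                if v == a and u in f:
                    c += pairwise[(u, v)][f[u]][i]
                if u == a and v in f:
                    c += pairwise[(u, v)][i][f[v]]
            costs.append(c)
        f[a] = costs.index(min(costs))     # pick the locally best label
    return [f[a] for a in range(len(unary))]

print("rounded labelling:", round_primal(unary, edges, pairwise))
```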

Computational Issues of TRW
–The basic component is Belief Propagation; speed-ups exist for some pairwise potentials (Felzenszwalb & Huttenlocher, 2004).
–Memory requirements can be cut down by half (Kolmogorov, 2006).
–Further speed-ups using monotonic chains (Kolmogorov, 2006).

Theoretical Properties of TRW
–Always converges, unlike BP (Kolmogorov, 2006).
–Strong tree agreement implies an exact MAP estimate (Wainwright et al., 2001).
–Optimal MAP for two-label submodular problems (Kolmogorov and Wainwright, 2005), i.e. when θ_{ab;00} + θ_{ab;11} ≤ θ_{ab;01} + θ_{ab;10}.

Outline
–Preliminaries
–LP Relaxation and its Dual
–TRW Message Passing: Examples, Primal Solution, Results

Results: Binary Segmentation (Szeliski et al., 2008)
Labels: {foreground, background}
Unary potentials: −log(likelihood) using learnt foreground/background models
Pairwise potentials: 0 if the labels are the same; 1 − exp(|d_a − d_b|) if they differ
[Result images comparing TRW and Belief Propagation]

Results: Stereo Correspondence (Szeliski et al., 2008)
Labels: {disparities}
Unary potentials: similarity of pixel colours
Pairwise potentials: 0 if the labels are the same; 1 − exp(|d_a − d_b|) if they differ
[Result images comparing TRW and Belief Propagation]

Results: Non-submodular Problems (Kolmogorov, 2006)
[Plot: energy of BP vs. TRW-S on a 30×30 grid, K = 50]
BP outperforms TRW-S.

Code + Standard Data