Efficiently Solving Convex Relaxations for MAP Estimation
M. Pawan Kumar, University of Oxford
Philip Torr, Oxford Brookes University



Aim: to solve convex relaxations of MAP estimation.
Random variables V = {a, b, c, d}
Label set L = {0, 1} (label ‘0’, label ‘1’)
Edges E = {(a, b), (b, c), (c, d)}
Labelling m = {1, 0, 0, 1}

Aim: to solve convex relaxations of MAP estimation.
Minimum cost labelling? An NP-hard problem.
For the labelling shown, Cost(m) = 13.
Approximate using convex relaxations.

Aim: to solve convex relaxations of MAP estimation.
Objectives:
- Solve tighter convex relaxations: LP and SOCP
- Handle a large number of random variables, e.g. image pixels

Outline
- Integer Programming Formulation
- Linear Programming Relaxation
- Additional Constraints
- Solving the Convex Relaxations
- Results and Conclusions

Integer Programming Formulation
Two variables a and b, labels ‘0’ and ‘1’, labelling m = {1, 0}.
Unary cost vector u = [5 2; 2 4]^T
(5 = cost of a = 0, 2 = cost of a = 1; likewise 2 and 4 for b).

Integer Programming Formulation
Unary cost vector u = [5 2; 2 4]^T, labelling m = {1, 0}.
Label vector x = [-1 1; 1 -1]^T
(x_i = 1 if the corresponding label is assigned, e.g. a = 1; x_i = -1 otherwise, e.g. a ≠ 0).
Recall that the aim is to find the optimal x.

Integer Programming Formulation
Unary cost vector u = [5 2; 2 4]^T, label vector x = [-1 1; 1 -1]^T.
Sum of unary costs = (1/2) ∑_i u_i (1 + x_i)
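As a quick numerical check of this identity, here is a minimal Python sketch using the numbers from the slides (u = [5 2; 2 4]^T and x for m = {1, 0}): each factor (1 + x_i) is 2 for the assigned label and 0 otherwise, so the sum picks out exactly the chosen unary costs.

```python
# Unary cost vector u and label vector x for the labelling m = {1, 0}:
# a takes label 1 (entries [-1, 1]) and b takes label 0 (entries [1, -1]).
u = [5, 2, 2, 4]
x = [-1, 1, 1, -1]

# Sum of unary costs: (1/2) * sum_i u_i * (1 + x_i).
# (1 + x_i) is 2 where the label is assigned and 0 elsewhere,
# so only the assigned labels contribute.
unary = 0.5 * sum(ui * (1 + xi) for ui, xi in zip(u, x))
print(unary)  # cost of a = 1 plus cost of b = 0, i.e. 2 + 2 = 4.0
```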

Integer Programming Formulation
Pairwise costs, labelling m = {1, 0}: the pairwise cost matrix P holds the cost of each pair of assignments (e.g. the cost of a = 0 and b = 0 is 3; the pairwise cost of a variable with itself is 0).

Integer Programming Formulation
Pairwise cost matrix P, labelling m = {1, 0}.
Sum of pairwise costs = (1/4) ∑_ij P_ij (1 + x_i)(1 + x_j)

Integer Programming Formulation
Sum of pairwise costs = (1/4) ∑_ij P_ij (1 + x_i + x_j + x_i x_j)
= (1/4) ∑_ij P_ij (1 + x_i + x_j + X_ij), where X = x x^T, i.e. X_ij = x_i x_j.

Integer Programming Formulation
Constraints:
- Uniqueness constraint: ∑_{i ∈ a} x_i = 2 - |L|
- Integer constraints: x_i ∈ {-1, 1}
- X = x x^T

Integer Programming Formulation
x* = argmin (1/2) ∑_i u_i (1 + x_i) + (1/4) ∑_ij P_ij (1 + x_i + x_j + X_ij)
s.t. ∑_{i ∈ a} x_i = 2 - |L|  (convex)
     x_i ∈ {-1, 1}  (non-convex)
     X = x x^T  (non-convex)
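The integer program can be checked by brute force on the two-variable example. This is an illustrative sketch only: the unary costs are from the slides, while the pairwise cost matrix P is hypothetical (only the cost 3 for a = 0, b = 0 appears in the transcript; the other entries are invented).

```python
from itertools import product

# Unary cost vector u (from the slides) over the index order
# [a=0, a=1, b=0, b=1], and a HYPOTHETICAL pairwise cost matrix P:
# only P[a=0][b=0] = 3 comes from the slides, the rest is made up.
# Each pairwise cost is stored once, at (a-index, b-index).
u = [5, 2, 2, 4]
P = [[0, 0, 3, 1],
     [0, 0, 1, 3],
     [0, 0, 0, 0],
     [0, 0, 0, 0]]

def energy(x):
    """Energy in the {-1, 1} encoding:
    (1/2) sum_i u_i (1 + x_i) + (1/4) sum_ij P_ij (1 + x_i + x_j + x_i x_j)."""
    unary = 0.5 * sum(u[i] * (1 + x[i]) for i in range(4))
    pair = 0.25 * sum(P[i][j] * (1 + x[i] + x[j] + x[i] * x[j])
                      for i in range(4) for j in range(4))
    return unary + pair

def vec(ma, mb):
    """Label vector for labelling (ma, mb): exactly one +1 per variable."""
    x = [-1] * 4
    x[ma] = 1        # a's chosen label
    x[2 + mb] = 1    # b's chosen label
    return x

# Exhaustive minimization over the 2^2 labellings -- exactly what the
# integer program expresses, written without the matrix notation.
best = min(product([0, 1], repeat=2), key=lambda m: energy(vec(*m)))
print(best, energy(vec(*best)))  # (1, 0) 5.0 for these costs
```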

Outline
- Integer Programming Formulation
- Linear Programming Relaxation
- Additional Constraints
- Solving the Convex Relaxations
- Results and Conclusions

Linear Programming Relaxation (Schlesinger, 1976)
x* = argmin (1/2) ∑_i u_i (1 + x_i) + (1/4) ∑_ij P_ij (1 + x_i + x_j + X_ij)
s.t. ∑_{i ∈ a} x_i = 2 - |L|
     x_i ∈ {-1, 1},  X = x x^T
Retain the convex part; relax the non-convex constraints.

Linear Programming Relaxation (Schlesinger, 1976)
x* = argmin (1/2) ∑_i u_i (1 + x_i) + (1/4) ∑_ij P_ij (1 + x_i + x_j + X_ij)
s.t. ∑_{i ∈ a} x_i = 2 - |L|
     x_i ∈ [-1, 1],  X_ij ∈ [-1, 1]
     1 + x_i + x_j + X_ij ≥ 0
     ∑_{j ∈ b} X_ij = (2 - |L|) x_i
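One way to sanity-check the relaxation is to verify that every integer labelling remains feasible under the relaxed constraints, so the LP optimum can only lower-bound the integer optimum. A brute-force sketch for two variables and two labels (|L| = 2), with the index order [a=0, a=1, b=0, b=1]:

```python
from itertools import product

L = 2  # number of labels; variable a owns indices {0, 1}, b owns {2, 3}

def feasible_lp(x, X):
    """Check the relaxed LP constraints listed above."""
    # Box constraints: x_i in [-1, 1] and X_ij in [-1, 1].
    if any(not -1 <= xi <= 1 for xi in x):
        return False
    if any(not -1 <= X[i][j] <= 1 for i in range(4) for j in range(4)):
        return False
    # Non-negativity: 1 + x_i + x_j + X_ij >= 0.
    if any(1 + x[i] + x[j] + X[i][j] < 0
           for i in range(4) for j in range(4)):
        return False
    # Uniqueness: each variable's entries sum to 2 - |L|.
    if sum(x[0:2]) != 2 - L or sum(x[2:4]) != 2 - L:
        return False
    # Marginalization: sum_{j in b} X_ij = (2 - |L|) x_i for i in a.
    for i in range(2):
        if sum(X[i][j] for j in range(2, 4)) != (2 - L) * x[i]:
            return False
    return True

# Every integer labelling, with X = x x^T, satisfies the relaxation.
for ma, mb in product(range(L), repeat=2):
    x = [-1] * 4
    x[ma], x[2 + mb] = 1, 1
    X = [[xi * xj for xj in x] for xi in x]
    assert feasible_lp(x, X)
print("all integer labellings are LP-feasible")
```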

Dual of the LP Relaxation (Wainwright et al., 2001)
Decompose the problem θ = (u, P) on the 3x3 grid {a, ..., i} into tree-structured subproblems θ^1, ..., θ^6 (the three rows and the three columns), such that ∑_i θ^i ≡ θ.

Dual of the LP Relaxation (Wainwright et al., 2001)
Let Q(θ^i) denote the optimal energy of tree subproblem θ^i.
Dual of the LP: max ∑_i Q(θ^i), subject to ∑_i θ^i ≡ θ.

Tree-Reweighted Message Passing (Kolmogorov, 2005)
Pick a variable, say a (it appears in a row tree and a column tree).
Reparameterize those trees so that the unary costs u_i of a are min-marginals.
This needs only one pass of belief propagation per tree.

Tree-Reweighted Message Passing (Kolmogorov, 2005)
Average the unary costs of a across its trees: (u_1 + u_3)/2 and (u_2 + u_4)/2.
Repeat for all variables: this is TRW-S.
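The reparameterization step relies on min-marginals, which one forward and one backward pass of min-sum belief propagation computes for every node of a chain at once. A self-contained sketch on a hypothetical three-node chain (all costs invented for illustration), checked against brute force:

```python
from itertools import product

# Hypothetical chain a - b - c with two labels per node.
unary = [[1.0, 3.0], [2.0, 1.0], [4.0, 0.5]]   # unary[node][label]
pair = [[[0.0, 2.0], [2.0, 0.0]],               # pair[edge][label][label]
        [[0.0, 2.0], [2.0, 0.0]]]

# Forward pass: fwd[i][l] = min energy of the prefix up to node i
# with node i taking label l.
fwd = [unary[0][:]]
for i in (1, 2):
    fwd.append([unary[i][l] + min(fwd[i - 1][k] + pair[i - 1][k][l]
                                  for k in range(2)) for l in range(2)])

# Backward pass: bwd[i][l] = min energy of the suffix from node i.
bwd = [None, None, unary[2][:]]
for i in (1, 0):
    bwd[i] = [unary[i][l] + min(bwd[i + 1][k] + pair[i][l][k]
                                for k in range(2)) for l in range(2)]

# Min-marginals combine both passes, counting node i's unary cost once.
mm = [[fwd[i][l] + bwd[i][l] - unary[i][l] for l in range(2)]
      for i in range(3)]

# Sanity check against brute force over all 2^3 labellings.
for i in range(3):
    for l in range(2):
        brute = min(sum(unary[n][m[n]] for n in range(3))
                    + pair[0][m[0]][m[1]] + pair[1][m[1]][m[2]]
                    for m in product(range(2), repeat=3) if m[i] == l)
        assert mm[i][l] == brute

print(mm[1])  # min-marginals of the middle node
```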

Outline
- Integer Programming Formulation
- Linear Programming Relaxation
- Additional Constraints
- Solving the Convex Relaxations
- Results and Conclusions

Cycle Inequalities (Chopra and Rao, 1991)
Consider a cycle over x_i, x_j, x_k. At least two of the three have the same sign, so at least one of X_ij, X_jk, X_ki equals 1 (since X = x x^T).
Hence X_ij + X_jk + X_ki ≥ -1.

Cycle Inequalities (Chopra and Rao, 1991)
For a cycle over x_i, x_j, x_k, x_l: X_ij + X_jk + X_kl - X_li ≥ -2.
This generalizes to all cycles. Adding these inequalities to the LP gives LP-C.
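Both inequalities can be verified exhaustively: for any sign vector x, setting X = x x^T makes the cycle sums respect the stated bounds. A small sketch:

```python
from itertools import product

# 3-cycle: among any three signs, at least two agree, so at least one
# pairwise product is +1 and X_ij + X_jk + X_ki >= -1.
for xi, xj, xk in product([-1, 1], repeat=3):
    assert xi * xj + xj * xk + xk * xi >= -1

# 4-cycle generalization: X_ij + X_jk + X_kl - X_li >= -2.
for xi, xj, xk, xl in product([-1, 1], repeat=4):
    assert xi * xj + xj * xk + xk * xl - xl * xi >= -2

print("cycle inequalities hold for all integer sign vectors")
```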

Second-Order Cone Constraints (Kumar et al., 2007)
For a clique c over x_i, x_j, x_k, let x_c = [x_i x_j x_k]^T and
X_c = [1 X_ij X_ik; X_ij 1 X_jk; X_ik X_jk 1].
Relax X_c = x_c x_c^T to X_c ⪰ x_c x_c^T, i.e. (X_c - x_c x_c^T) ⪰ 0.
This gives the second-order cone constraint
(x_i + x_j + x_k)^2 ≤ 3 + 2(X_ij + X_jk + X_ki).
Adding these constraints yields SOCP-C.
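The scalar constraint comes from multiplying (X_c - x_c x_c^T) ⪰ 0 on both sides by the all-ones vector: 1^T X_c 1 = 3 + 2(X_ij + X_jk + X_ki), while 1^T x_c x_c^T 1 = (x_i + x_j + x_k)^2. A brute-force sketch confirming the inequality holds with equality at every integer point:

```python
from itertools import product

# For integer x with X = x x^T, both sides of
#   (x_i + x_j + x_k)^2 <= 3 + 2 (X_ij + X_jk + X_ki)
# equal 1^T (x_c x_c^T) 1, so the constraint is tight.
for x in product([-1, 1], repeat=3):
    lhs = sum(x) ** 2
    rhs = 3 + 2 * (x[0] * x[1] + x[1] * x[2] + x[2] * x[0])
    assert lhs == rhs

print("SOC constraint is tight at every integer point")
```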

Second-Order Cone Constraints (Kumar et al., 2007)
For a 4-clique over x_i, x_j, x_k, x_l, let x_c = [x_i x_j x_k x_l]^T and
X_c = [1 X_ij X_ik X_il; X_ij 1 X_jk X_jl; X_ik X_jk 1 X_kl; X_il X_jl X_kl 1].
The constraint (X_c - x_c x_c^T) ⪰ 0 yields SOCP-Q.

Outline
- Integer Programming Formulation
- Linear Programming Relaxation
- Additional Constraints
- Solving the Convex Relaxations
- Results and Conclusions

Modifying the Dual
Add cycle/clique subproblems to the decomposition: besides the row trees (abc, def, ghi) and column trees (adg, beh, cfi), include subproblems over the 2x2 squares {a, b, d, e}, {b, c, e, f}, {d, e, g, h}, {e, f, h, i}, with optimal values s_j.
Modified dual: max ∑_i Q(θ^i) + ∑_j s_j, subject to the decomposition reparameterizing θ.

Modifying TRW-S
Maximize ∑_i Q(θ^i) + ∑_j s_j:
REPEAT
- Pick a variable, say a.
- Run the TRW-S steps for the trees containing a.
- Pick a cycle/clique containing a and solve its subproblem (this can be done efficiently).

Properties of the Algorithm
- The algorithm satisfies the reparameterization constraint.
- The value of the dual never decreases: CONVERGENCE.
- The solution satisfies Weak Tree Agreement (WTA).
- WTA is not sufficient for convergence.
- More accurate results than TRW-S.

Outline
- Integer Programming Formulation
- Linear Programming Relaxation
- Additional Constraints
- Solving the Convex Relaxations
- Results and Conclusions

4-Neighbourhood MRF
Test LP-C and SOCP-C on 50 binary MRFs of size 30x30, with u ~ N(0, 1) and P ~ N(0, σ²).

4-Neighbourhood MRF, σ = 5: LP-C dominates SOCP-C.

8-Neighbourhood MRF
Test SOCP-Q on 50 binary MRFs of size 30x30, with u ~ N(0, 1) and P ~ N(0, σ²).

8-Neighbourhood MRF, σ = 5/√2: SOCP-Q dominates LP-C.

Conclusions
- Modified the LP dual to include more constraints.
- Extended TRW-S to solve the tighter dual.
- Experiments show an improvement.
- More results in the poster.

Future Work
- More efficient subroutines for solving cycles/cliques.
- Using more accurate LP solvers, e.g. proximal projections.
- Analysis of SOCP-C vs. LP-C.

Questions?

Timings
Time per iteration for BP, TRW-S, LP-C, SOCP-C and SOCP-Q (values shown on the slide): linear in the number of variables!

Video Segmentation
Given a keyframe and its user segmentation, segment the remaining video.

Video Segmentation: Belief Propagation (input vs. result)

Video Segmentation: α-swap (input vs. result)

Video Segmentation: α-expansion (input vs. result)

Video Segmentation: TRW-S (input vs. result)

Video Segmentation: LP-C (input vs. result)

Video Segmentation: SOCP-Q (input vs. result)

4-Neighbourhood MRF σ = 1

4-Neighbourhood MRF σ = 2.5

8-Neighbourhood MRF, σ = 1/√2

8-Neighbourhood MRF, σ = 2.5/√2