An Analysis of Convex Relaxations for MAP Estimation

M. Pawan Kumar, Vladimir Kolmogorov, Philip H.S. Torr
http://cms.brookes.ac.uk/research/visiongroup
http://www.adastral.ucl.ac.uk/~vladkolm

Aim: To analyze Maximum a Posteriori (MAP) estimation methods based on convex relaxations.

MAP Estimation - Integer Programming Formulation

A random field is defined over variables V_a, each of which takes one of h labels (the poster's running example uses n = 2 variables and h = 2 labels). The problem data are:
- a label vector x in {-1,1}^{nh}, where x_{a;i} = 1 if V_a takes label i and x_{a;i} = -1 otherwise;
- a unary vector u of unary costs and a pairwise matrix P of pairwise costs;
- the matrix X = x x^T, with A . B = sum_{ij} A_{ij} B_{ij} denoting the inner product of two matrices.

MAP estimation is equivalent to the integer program

  x* = argmin_x  x^T (4u + 2P1) + P . X
  subject to  sum_i x_{a;i} = 2 - h,   x in {-1,1}^{nh},   X = x x^T.

The constraints x in {-1,1}^{nh} and X = x x^T are non-convex; each relaxation below keeps the linear constraint sum_i x_{a;i} = 2 - h and replaces the non-convex constraints with convex ones.

Comparing Relaxations

- Relaxation A dominates relaxation B if, for all (u, P), the optimal value of A is greater than or equal to that of B, i.e. A gives a lower bound on the MAP energy that is at least as tight.
- A strictly dominates B if, in addition, its optimal value is strictly greater than that of B for at least one (u, P).
- A and B are equivalent relaxations if A dominates B and B dominates A.

Linear Programming (LP) Relaxation LP-S (Schlesinger, 1976)

The non-convex constraints are replaced by the convex constraints

  x in [-1,1]^{nh},
  1 + x_{a;i} + x_{b;j} + X_{ab;ij} >= 0   for the constrained pairs (a,b) in E,
  sum_j X_{ab;ij} = (2 - h) x_{a;i}        for the constrained variables a in V.

Second Order Cone Programming (SOCP) Relaxations (Kim and Kojima, 2000)

The non-convex constraint X = x x^T is replaced by second order cone constraints of the form

  ||U^T x||^2 <= C . X,   where C = U U^T,

for one or more positive semidefinite matrices C; different choices of the C matrix give different SOCP relaxations.

SOCP-MS (Muramatsu and Suzuki, 2003)

The C matrices are chosen so that, for each constrained pair (a,b) and each pair of labels (i,j),

  (x_{a;i} + x_{b;j})^2 <= 2 + 2 X_{ab;ij},
  (x_{a;i} - x_{b;j})^2 <= 2 - 2 X_{ab;ij}.

Comparing Existing SOCP and QP Relaxations

Only the term P . X of the objective depends on X, so at an optimum of SOCP-MS each X_{ab;ij} takes an extreme value of the interval allowed by the constraints above: its infimum when P_{ab;ij} >= 0 and its supremum when P_{ab;ij} < 0. It follows that SOCP-MS is equivalent to the QP relaxation QP-RL (Ravikumar and Lafferty, 2006); both are dominated by LP-S, since a single edge is a special case of the tree result below.
[Poster figure omitted: a worked random-field example with its unary and pairwise costs and the fractional solutions of the relaxations.]

LP-S vs. SOCP Relaxations over Trees and Cycles

For SOCP relaxations whose constraints are defined over a subgraph G = (V,E) of the random field:
- Tree (SOCP-T): LP-S dominates SOCP-T; the special case of a single edge recovers SOCP-MS/QP-RL.
- Even cycle (SOCP-E), with P_{ab;ij} >= 0 or P_{ab;ij} <= 0 on the cycle: LP-S dominates SOCP-E.
- Odd cycle (SOCP-O), with P_{ab;ij} >= 0 for one (a,b), or P_{ab;ij} <= 0 for one/all (a,b): LP-S dominates SOCP-O.

Two New SOCP Relaxations

- SOCP-C: all LP-S constraints, plus second order cone constraints defined over a cycle G.
- SOCP-Q: all linear cycle inequalities, plus second order cone constraints defined over a clique G.

In the poster's example with zero unary costs, LP-S = 0 while SOCP-C = 0.75. Since SOCP-Q contains all the cycle inequalities, it dominates them; whether the new cycle-based constraints are in turn dominated by the linear cycle inequalities is posed as a question.

Open Questions

- Cycle inequalities vs. SOCP/QP relaxations?
- Best choice of the C matrix for special cases?
- Efficient solutions for the SOCP relaxations?

Future Work

Experiments on 50 random fields with 4-neighbourhood and 8-neighbourhood connectivity.
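
To make the LP-S relaxation concrete, here is a minimal sketch using cvxpy on a hypothetical two-variable, two-label field with a single edge. None of this code comes from the poster or its authors: the toy costs, variable names, and single-edge layout are assumptions for illustration; only the constraint forms follow the LP-S description above (plus explicit bounds on X, which are consistent with X = x x^T but not listed on the poster), and the bookkeeping of the linear term is simplified.

```python
# Minimal LP-S sketch for a toy 2-variable, 2-label field with one edge (a, b) = (0, 1).
# Toy costs and names are illustrative assumptions; only the constraint forms follow the poster.
import numpy as np
import cvxpy as cp

n, h = 2, 2
u = np.array([1.0, 0.0, 0.0, 1.0])           # unary costs, indexed (a, i) -> a*h + i (made up)
P_edge = np.array([[-1.0, 1.0],              # pairwise costs P_{ab;ij} for the edge (made up)
                   [ 1.0, -1.0]])

x = cp.Variable(n * h)                       # relaxed label vector, x in [-1, 1]^{nh}
X = cp.Variable((h, h))                      # relaxed X_{ab;ij} for the single edge

constraints = [x >= -1, x <= 1,
               X >= -1, X <= 1]              # explicit bounds on X (consistent with X = x x^T)
for a in range(n):                           # kept from the integer program: sum_i x_{a;i} = 2 - h
    constraints.append(cp.sum(x[a * h:(a + 1) * h]) == 2 - h)
for i in range(h):                           # LP-S: sum_j X_{ab;ij} = (2 - h) x_{a;i}, both directions
    constraints.append(cp.sum(X[i, :]) == (2 - h) * x[0 * h + i])
    constraints.append(cp.sum(X[:, i]) == (2 - h) * x[1 * h + i])
for i in range(h):                           # LP-S: 1 + x_{a;i} + x_{b;j} + X_{ab;ij} >= 0
    for j in range(h):
        constraints.append(1 + x[0 * h + i] + x[1 * h + j] + X[i, j] >= 0)

# Objective x^T (4u + 2 P 1) + P . X, assembled for the single edge.
# (Bookkeeping for the symmetric (b, a) block of P is glossed over in this sketch.)
lin = 4 * u
lin[0 * h:1 * h] += 2 * P_edge.sum(axis=1)   # contribution of (P 1) to V_a
lin[1 * h:2 * h] += 2 * P_edge.sum(axis=0)   # contribution of (P 1) to V_b
objective = cp.Minimize(lin @ x + cp.sum(cp.multiply(P_edge, X)))

lp_s = cp.Problem(objective, constraints)
lp_s.solve()
print("LP-S lower bound:", lp_s.value)
print("fractional labelling x:", x.value)
```

The same pattern extends to larger fields by adding one X block and one set of constraints per constrained pair (a, b).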
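
For comparison, the same toy edge under the SOCP-MS constraints. Again a hedged sketch under the same assumptions (made-up costs, single edge, cvxpy's default conic solver), not the authors' implementation.

```python
# Minimal SOCP-MS sketch on the same toy edge: the non-convex X = x x^T is replaced
# by the two second order cone constraints quoted on the poster.
import numpy as np
import cvxpy as cp

n, h = 2, 2
u = np.array([1.0, 0.0, 0.0, 1.0])           # same made-up unary costs as the LP-S sketch
P_edge = np.array([[-1.0, 1.0],
                   [ 1.0, -1.0]])

x = cp.Variable(n * h)
X = cp.Variable((h, h))

constraints = [x >= -1, x <= 1]
for a in range(n):
    constraints.append(cp.sum(x[a * h:(a + 1) * h]) == 2 - h)
for i in range(h):
    for j in range(h):
        # (x_{a;i} + x_{b;j})^2 <= 2 + 2 X_{ab;ij}
        constraints.append(cp.square(x[0 * h + i] + x[1 * h + j]) <= 2 + 2 * X[i, j])
        # (x_{a;i} - x_{b;j})^2 <= 2 - 2 X_{ab;ij}
        constraints.append(cp.square(x[0 * h + i] - x[1 * h + j]) <= 2 - 2 * X[i, j])

lin = 4 * u                                   # simplified linear term, as in the LP-S sketch
lin[0 * h:1 * h] += 2 * P_edge.sum(axis=1)
lin[1 * h:2 * h] += 2 * P_edge.sum(axis=0)
objective = cp.Minimize(lin @ x + cp.sum(cp.multiply(P_edge, X)))

socp_ms = cp.Problem(objective, constraints)
socp_ms.solve()                               # conic solver handles the SOC constraints
print("SOCP-MS lower bound:", socp_ms.value)  # per the poster, never above the LP-S bound
```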