
Rounding-based Moves for Metric Labeling M. Pawan Kumar École Centrale Paris INRIA Saclay, Île-de-France

Metric Labeling Variables V = {V_1, V_2, …, V_n}

Metric Labeling
Variables V = {V_1, V_2, …, V_n}
Labels L = {l_1, l_2, …, l_h}
Labeling f: {1, 2, …, n} → {1, 2, …, h}
min_f E(f) = Σ_a θ_a(f(a)) + Σ_(a,b) w_ab d(f(a), f(b)), where w_ab ≥ 0 and d is a metric

Metric Labeling
min_f E(f) = Σ_a θ_a(f(a)) + Σ_(a,b) w_ab d(f(a), f(b))
NP-hard. Low-level vision applications.
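Before turning to approximate algorithms, it may help to see the objective spelled out in code. The following is a minimal sketch of evaluating E(f) on a toy instance; the truncated-linear distance, the instance values, and all names are illustrative assumptions, not from the slides.

```python
# Minimal sketch: evaluating the metric labeling energy E(f).
# The toy instance (unaries, edges, truncated-linear distance) is illustrative.

def truncated_linear(i, k, truncation=2):
    # A common metric distance: d(i, k) = min(|i - k|, truncation).
    return min(abs(i - k), truncation)

def energy(f, unary, edges, dist):
    """E(f) = sum_a theta_a(f(a)) + sum_(a,b) w_ab * d(f(a), f(b))."""
    e = sum(unary[a][f[a]] for a in range(len(f)))
    e += sum(w * dist(f[a], f[b]) for (a, b, w) in edges)
    return e

# Two variables, three labels, one edge with weight w_ab = 1.0.
unary = [[0.0, 1.0, 2.0],
         [2.0, 0.5, 0.0]]
edges = [(0, 1, 1.0)]
print(energy([0, 2], unary, edges, truncated_linear))  # 0.0 + 0.0 + 1.0*2 = 2.0
```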

Outline Approximate Algorithms Comparison Rounding-based Moves

Move-Making Algorithms (Boykov, Veksler and Zabih) vs. Convex Relaxations (Kleinberg and Tardos): a trade-off between efficiency and accuracy

Move-Making Algorithms (Kolmogorov and Boykov) vs. Convex Relaxations (Chekuri, Khanna, Naor and Zosin): a trade-off between efficiency and accuracy

Outline Approximate Algorithms –Move-Making Algorithms –Linear Programming Relaxation Comparison Rounding-based Moves

Move-Making Algorithms Space of All Labelings f

Expansion Algorithm Variables take label l_α or retain their current label. Slide courtesy Pushmeet Kohli. Boykov, Veksler and Zabih, 2001

Expansion Algorithm (segmentation example with labels Sky, House, Tree, Ground): initialize with Tree, then expand Ground, expand House, expand Sky. Variables take label l_α or retain their current label. Slide courtesy Pushmeet Kohli. Boykov, Veksler and Zabih, 2001
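As a rough illustration of the expansion outer loop, here is a sketch that reuses the energy helper above. The binary expansion subproblem is solved by brute force purely for illustration; the actual algorithm of Boykov, Veksler and Zabih (2001) solves it exactly with a minimum cut, and all helper names here are assumptions.

```python
# Sketch of the alpha-expansion outer loop. The binary expansion subproblem is
# solved by brute force purely for illustration; the real algorithm solves it
# exactly with a minimum cut (Boykov, Veksler and Zabih, 2001).
from itertools import product

def expansion(f, labels, energy_fn):
    """Repeatedly expand each label alpha until no move lowers the energy."""
    improved = True
    while improved:
        improved = False
        for alpha in labels:
            best_f, best_e = f, energy_fn(f)
            # Each variable either keeps its current label or switches to alpha.
            for keep in product([True, False], repeat=len(f)):
                g = [f[a] if keep[a] else alpha for a in range(len(f))]
                e = energy_fn(g)
                if e < best_e:
                    best_f, best_e = g, e
            if best_f != f:
                f, improved = best_f, True
    return f

# Usage with the energy helper sketched earlier:
# expansion([0, 0], labels=range(3),
#           energy_fn=lambda g: energy(g, unary, edges, truncated_linear))
```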

Multiplicative Bounds f*: optimal labeling, f: estimated labeling. Σ_a θ_a(f(a)) + Σ_(a,b) w_ab d(f(a),f(b)) ≥ Σ_a θ_a(f*(a)) + Σ_(a,b) w_ab d(f*(a),f*(b))

Multiplicative Bounds f*: optimal labeling, f: estimated labeling. Σ_a θ_a(f(a)) + Σ_(a,b) w_ab d(f(a),f(b)) ≤ B · (Σ_a θ_a(f*(a)) + Σ_(a,b) w_ab d(f*(a),f*(b))). Ask me the obvious question.

Outline Approximate Algorithms –Move-Making Algorithms –Linear Programming Relaxation Comparison Rounding-based Moves

Integer Linear Program
Minimize a linear function over a set of feasible solutions
Indicator x_a(i) ∈ {0,1} for each variable V_a and label l_i
Indicator x_ab(i,k) ∈ {0,1} for each neighbor pair (V_a,V_b) and labels l_i, l_k
Number of facets grows exponentially in the problem size

Linear Programming Relaxation (Schlesinger, 1976; Chekuri et al., 2001; Wainwright et al., 2003)
Indicator x_a(i) ∈ {0,1} for each variable V_a and label l_i
Indicator x_ab(i,k) ∈ {0,1} for each neighbor pair (V_a,V_b) and labels l_i, l_k

Linear Programming Relaxation (Schlesinger, 1976; Chekuri et al., 2001; Wainwright et al., 2003)
Indicator x_a(i) ∈ [0,1] for each variable V_a and label l_i
Indicator x_ab(i,k) ∈ [0,1] for each neighbor pair (V_a,V_b) and labels l_i, l_k
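For a tiny instance the relaxation can be written down explicitly and handed to a generic LP solver. The sketch below assumes a two-variable, two-label problem with a single edge and a uniform (Potts) distance, and builds the usual marginalization constraints; the instance and the indexing scheme are illustrative assumptions.

```python
# Sketch: LP relaxation of a tiny metric labeling instance, solved with scipy.
import numpy as np
from scipy.optimize import linprog

n, h, w = 2, 2, 1.0
theta = [[0.0, 1.0], [1.0, 0.0]]               # unary costs theta_a(i)
d = lambda i, k: 0.0 if i == k else 1.0        # uniform (Potts) distance

def xa(a, i):                                  # index of x_a(i)
    return a * h + i

def xab(i, k):                                 # index of x_ab(i,k) for the single edge (0,1)
    return n * h + i * h + k

num = n * h + h * h
c = np.zeros(num)
for a in range(n):
    for i in range(h):
        c[xa(a, i)] = theta[a][i]
for i in range(h):
    for k in range(h):
        c[xab(i, k)] = w * d(i, k)

A_eq, b_eq = [], []
for a in range(n):                             # sum_i x_a(i) = 1
    row = np.zeros(num)
    row[[xa(a, i) for i in range(h)]] = 1.0
    A_eq.append(row)
    b_eq.append(1.0)
for i in range(h):                             # sum_k x_ab(i,k) = x_a(i)
    row = np.zeros(num)
    for k in range(h):
        row[xab(i, k)] = 1.0
    row[xa(0, i)] = -1.0
    A_eq.append(row)
    b_eq.append(0.0)
for k in range(h):                             # sum_i x_ab(i,k) = x_b(k)
    row = np.zeros(num)
    for i in range(h):
        row[xab(i, k)] = 1.0
    row[xa(1, k)] = -1.0
    A_eq.append(row)
    b_eq.append(0.0)

res = linprog(c, A_eq=np.array(A_eq), b_eq=b_eq, bounds=(0, 1))
print(res.fun, res.x)                          # LP value and (possibly fractional) solution
```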

Approximation Factor x*: LP optimal solution, x: estimated integral solution. Σ_a Σ_i θ_a(i) x_a(i) + Σ_(a,b) Σ_(i,k) w_ab d(i,k) x_ab(i,k) ≥ Σ_a Σ_i θ_a(i) x*_a(i) + Σ_(a,b) Σ_(i,k) w_ab d(i,k) x*_ab(i,k)

Approximation Factor x*: LP optimal solution, x: estimated integral solution. Σ_a Σ_i θ_a(i) x_a(i) + Σ_(a,b) Σ_(i,k) w_ab d(i,k) x_ab(i,k) ≤ F · (Σ_a Σ_i θ_a(i) x*_a(i) + Σ_(a,b) Σ_(i,k) w_ab d(i,k) x*_ab(i,k))

Outline Approximate Algorithms Comparison Rounding-based Moves

Theoretical Guarantees
Distance              Expansion   LP
Uniform               2           2
Metric                2M          O(log h)
Truncated Linear      2M          2 + √2
Truncated Quadratic   2M          O(√M)
M = ratio of maximum and minimum non-zero distance

Outline Approximate Algorithms Comparison Rounding-based Moves –Complete Rounding –Interval Rounding –Hierarchical Rounding

Complete Rounding
Treat x_a(i) ∈ [0,1] as the probability that f(a) = i
Cumulative probability y_a(i) = Σ_{j≤i} x_a(j), so y_a(h) = 1
Generate a random number r ∈ (0,1]
Assign to each variable V_a the label l_i whose cumulative interval contains r, i.e. y_a(i−1) < r ≤ y_a(i)

Example: three variables a, b and c with four labels; the same random number r is used to round all three variables against their cumulative probabilities y_a(·), y_b(·), y_c(·).
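A minimal sketch of complete rounding follows, assuming the LP values are given row-wise per variable; a single random draw is shared by all variables, and all names are illustrative.

```python
# Sketch of complete rounding: a single random r in (0,1] is shared by all
# variables, and each variable takes the first label whose cumulative
# probability reaches r. x[a][i] is the LP value x_a(i).
import random

def complete_rounding(x, rng=random.random):
    r = rng()                     # one draw, shared across all variables
    f = []
    for probs in x:               # probs = [x_a(1), ..., x_a(h)]
        cum, label = 0.0, len(probs) - 1
        for i, p in enumerate(probs):
            cum += p
            if r <= cum:
                label = i
                break
        f.append(label)
    return f

# Example: two variables, three labels.
x = [[0.2, 0.5, 0.3],
     [0.6, 0.1, 0.3]]
print(complete_rounding(x, rng=lambda: 0.55))   # -> [1, 0]
```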

Complete Move A move that mimics complete rounding Considers all random variables and labels Assigns labels in one iteration

Key Observation If d is submodular, i.e. d(i,k) + d(i+1,k+1) ≤ d(i,k+1) + d(i+1,k) for all i, k, the energy can be minimized via minimum cut (Schlesinger and Flach, 2003)
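The condition can be checked numerically for any given distance; the helper below is a small sketch (the function name and the test distances are assumptions). The linear distance satisfies it, while a truncated linear distance does not, which is why the submodular over-estimate d' used on the next slides is needed.

```python
# Sketch: test the submodularity condition of Schlesinger and Flach, 2003,
# d(i,k) + d(i+1,k+1) <= d(i,k+1) + d(i+1,k) for all i, k, over h labels.
def is_submodular(d, h):
    return all(d(i, k) + d(i + 1, k + 1) <= d(i, k + 1) + d(i + 1, k)
               for i in range(h - 1) for k in range(h - 1))

print(is_submodular(lambda i, k: abs(i - k), 5))          # True: linear distance
print(is_submodular(lambda i, k: min(abs(i - k), 2), 5))  # False: truncated linear
```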

Complete Move θ_ab(i,k) = w_ab d(i,k): with the original distance d, the problem is NP-hard

Complete Move θ_ab(i,k) = w_ab d'(i,k), where d'(i,k) ≥ d(i,k) and d' is submodular

Complete Move New problem can be solved using minimum cut Same multiplicative bound as complete rounding Multiplicative bound is tight

Outline Approximate Algorithms Comparison Rounding-based Moves –Complete Rounding –Interval Rounding –Hierarchical Rounding

Interval Rounding
Treat x_a(i) ∈ [0,1] as the probability that f(a) = i
Cumulative probability y_a(i) = Σ_{j≤i} x_a(j), so y_a(h) = 1
Choose an interval of h' consecutive labels

Interval Rounding
Choose an interval of h' consecutive labels
Generate a random number r ∈ (0,1]
Assign to V_a a label from the interval if r falls within the interval's cumulative mass y_a(k) − y_a(i); otherwise leave V_a unassigned
REPEAT until every variable is labeled

Example: three variables a, b and c with four labels. A first interval is chosen and a random number r is drawn; here r falls inside the interval's cumulative mass for a and b, which get labeled, but not for c. A new interval is then chosen (the cumulative probabilities of c are shifted by y_c(1)) and the procedure repeats until c is also labeled.
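A sketch of interval rounding along these lines is below; the random choice of interval start, the fixed seed, and the helper names are illustrative assumptions.

```python
# Sketch of interval rounding: repeatedly choose an interval of h' consecutive
# labels, draw one random number, and label every still-unassigned variable
# whose shifted cumulative probability over the interval covers the draw.
import random

def interval_rounding(x, h_prime, rng=random.Random(0)):
    n, h = len(x), len(x[0])
    f = [None] * n
    while any(label is None for label in f):
        start = rng.randrange(h)                 # interval = labels start .. start+h'-1
        end = min(start + h_prime, h)
        r = rng.random()
        for a in range(n):
            if f[a] is not None:
                continue
            cum = 0.0                            # cumulative mass inside the interval,
            for i in range(start, end):          # i.e. y_a(i) - y_a(start)
                cum += x[a][i]
                if r <= cum:
                    f[a] = i
                    break
    return f

# Example: two variables, three labels.
x = [[0.2, 0.5, 0.3],
     [0.6, 0.1, 0.3]]
print(interval_rounding(x, h_prime=2))           # a complete labeling of both variables
```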

Interval Move A move that mimics interval rounding Considers all variables and an interval of labels Changes labeling iteratively

Key Observation If d is submodular, i.e. d(i,k) + d(i+1,k+1) ≤ d(i,k+1) + d(i+1,k) for all i, k, the energy can be minimized via minimum cut (Schlesinger and Flach, 2003)

Interval Move θ_ab(i,k) = w_ab d(i,k). Choose an interval of h' consecutive labels

Interval Move θ_ab(i,k) = w_ab d(i,k). Choose an interval of h' consecutive labels and add each variable's current label to its candidate set

Interval Move θ_ab(i,k) = w_ab d'(i,k), where d'(i,k) ≥ d(i,k) and d' is submodular. Choose an interval of h' consecutive labels, add each variable's current label, solve to update the labeling, and repeat until convergence

Interval Move Each problem can be solved using minimum cut Same multiplicative bound as interval rounding Multiplicative bound is tight

Move-Making Algorithms (Boykov, Veksler and Zabih): length of interval = 1. Convex Relaxations (Kleinberg and Tardos).

Move-Making Algorithms (Boykov, Veksler and Zabih): length of interval = 1. Convex Relaxations (Chekuri, Khanna, Naor and Zosin): optimal interval length.

Theoretical Guarantees
Distance              Moves      LP
Uniform               2          2
Metric                2M         O(log h)
Truncated Linear      2 + √2     2 + √2
Truncated Quadratic   O(√M)      O(√M)
M = ratio of maximum and minimum non-zero distance

Outline Approximate Algorithms Comparison Rounding-based Moves –Complete Rounding –Interval Rounding –Hierarchical Rounding

Hierarchical Rounding Hierarchical clustering of labels (e.g. r-HST metrics): clusters L_1, L_2, L_3 with leaf labels l_1, …, l_9

Hierarchical Rounding Assign variables to clusters L_1, L_2 or L_3; move down the hierarchy until the leaf level

Hierarchical Rounding Assign variables to labels l_1, l_2 or l_3

Hierarchical Rounding Assign variables to labels l_4, l_5 or l_6

Hierarchical Rounding Assign variables to labels l_7, l_8 or l_9
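A simplified sketch of hierarchical rounding is below. The label hierarchy is encoded as nested lists, one random draw is shared by all variables at each level, and the per-level step is a plain cumulative rounding over the children's LP mass; the encoding and names are illustrative assumptions rather than the exact scheme from the paper.

```python
# Sketch of hierarchical rounding over a label hierarchy. Each tree node is
# either a leaf label (an int) or a list of children; at every level the
# variables are distributed among the children of their current cluster by
# rounding the summed LP mass of each child's leaves.
import random

def leaves(node):
    return [node] if isinstance(node, int) else [l for c in node for l in leaves(c)]

def hierarchical_rounding(x, tree, rng=random.Random(0)):
    n = len(x)
    assignment = [tree] * n                            # every variable starts at the root
    while any(not isinstance(node, int) for node in assignment):
        r = rng.random()                               # one draw shared by all variables
        for a in range(n):
            node = assignment[a]
            if isinstance(node, int):                  # already at a leaf label
                continue
            # Conditional probability of each child cluster for variable a.
            masses = [sum(x[a][l] for l in leaves(c)) for c in node]
            total = sum(masses) or 1.0
            cum, chosen = 0.0, node[-1]
            for child, m in zip(node, masses):
                cum += m / total
                if r <= cum:
                    chosen = child
                    break
            assignment[a] = chosen
    return assignment

# Hierarchy over nine labels: three clusters of three leaves each, as on the slide.
tree = [[0, 1, 2], [3, 4, 5], [6, 7, 8]]
x = [[1.0 / 9] * 9 for _ in range(3)]                  # three variables, uniform LP values
print(hierarchical_rounding(x, tree))
```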

Hierarchical Move Hierarchical clustering of labels (e.g. r-HST metrics): clusters L_1, L_2, L_3 with leaf labels l_1, …, l_9

Hierarchical Move Obtain labeling f_1 restricted to labels {l_1, l_2, l_3}

Hierarchical Move Obtain labeling f_2 restricted to labels {l_4, l_5, l_6}

Hierarchical Move Obtain labeling f_3 restricted to labels {l_7, l_8, l_9}

Hierarchical Move Combine the labelings: each variable V_a has candidates f_1(a), f_2(a), f_3(a); move up the hierarchy until we reach the root

Hierarchical Move Each problem can be solved using minimum cut Same multiplicative bound as hierarchical rounding Multiplicative bound is tight

Move-Making Algorithms (Boykov, Veksler and Zabih): flat hierarchy. Convex Relaxations (Kleinberg and Tardos): r-HST hierarchy.

Theoretical Guarantees
Distance              Moves      LP
Uniform               2          2
Metric                O(log h)   O(log h)
Truncated Linear      2 + √2     2 + √2
Truncated Quadratic   O(√M)      O(√M)
M = ratio of maximum and minimum non-zero distance

Questions?