Efficiently Solving Convex Relaxations for MAP Estimation. M. Pawan Kumar, University of Oxford; Philip Torr, Oxford Brookes University.




1 Efficiently Solving Convex Relaxations for MAP Estimation. M. Pawan Kumar, University of Oxford; Philip Torr, Oxford Brookes University

2 Aim: to solve convex relaxations of MAP estimation. [Figure: a chain MRF with random variables V = {a, b, c, d}, edges E = {(a, b), (b, c), (c, d)}, label set L = {0, 1} (label '0' and label '1' per variable), unary and pairwise costs on the nodes and edges, and the labelling m = {1, 0, 0, 1}.]

3 Aim: to solve convex relaxations of MAP estimation. For the example labelling, Cost(m) = 2 + 1 + 2 + 1 + 3 + 1 + 3 = 13. Finding the minimum-cost labelling is NP-hard, so we approximate it using convex relaxations.
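The exact costs in the slide's figure are garbled in this transcript, so the sketch below uses made-up values of the same shape (a four-variable binary chain) to illustrate why the problem is hard: exhaustive search evaluates all |L|^|V| labellings, which is infeasible for image-sized problems.

```python
from itertools import product

# Hypothetical chain MRF in the spirit of the slide (costs are illustrative,
# not the ones from the talk). unary[v][l] = cost of variable v taking label l;
# pairwise[(p, q)][lp][lq] = cost of the edge (p, q) taking labels (lp, lq).
unary = {"a": [5, 2], "b": [2, 4], "c": [4, 6], "d": [6, 3]}
pairwise = {("a", "b"): [[0, 3], [1, 0]],
            ("b", "c"): [[0, 3], [1, 0]],
            ("c", "d"): [[0, 3], [1, 0]]}
variables = ["a", "b", "c", "d"]

def cost(m):
    """Sum of unary costs plus pairwise costs along the chain."""
    total = sum(unary[v][m[v]] for v in variables)
    total += sum(pairwise[e][m[e[0]]][m[e[1]]] for e in pairwise)
    return total

# Brute force over all 2^4 labellings; this is the part that is NP-hard
# in general and motivates the convex relaxations that follow.
best = min((dict(zip(variables, labels))
            for labels in product((0, 1), repeat=len(variables))), key=cost)
print(best, cost(best))
```

For these made-up costs the minimum is 15; the point is that the enumeration grows exponentially in the number of variables.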

4 Aim: to solve convex relaxations of MAP estimation. Objectives: (i) solve tighter convex relaxations, namely LP and SOCP; (ii) handle a large number of random variables, e.g. all the pixels of an image.

5 Outline: Integer Programming Formulation; Linear Programming Relaxation; Additional Constraints; Solving the Convex Relaxations; Results and Conclusions

6 Integer Programming Formulation. Example with two variables a and b. Unary cost vector u = [5 2; 2 4]^T, where 5 is the cost of a = 0 and 2 is the cost of a = 1 (similarly 2 and 4 for b). Labelling m = {1, 0}.

7 Integer Programming Formulation. Unary cost vector u = [5 2; 2 4]^T, labelling m = {1, 0}. Label vector x = [-1 1; 1 -1]^T, where x_i = 1 if the corresponding variable takes that label (e.g. a = 1) and x_i = -1 otherwise. Recall that the aim is to find the optimal x.

8 Integer Programming Formulation. With u and x as above, the sum of unary costs = (1/2) Σ_i u_i (1 + x_i).

9 Integer Programming Formulation. Pairwise cost matrix P, where P_ij is the cost of the corresponding pair of (variable, label) assignments: e.g. the entry for a = 0, b = 0 is 0 and the entry for a = 0, b = 1 is 3; entries linking labels of the same variable are 0.

10 Integer Programming Formulation. Sum of pairwise costs = (1/4) Σ_ij P_ij (1 + x_i)(1 + x_j).

11 Integer Programming Formulation. Expanding the product: (1/4) Σ_ij P_ij (1 + x_i + x_j + x_i x_j) = (1/4) Σ_ij P_ij (1 + x_i + x_j + X_ij), where X = x x^T, i.e. X_ij = x_i x_j.
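The algebra above can be checked numerically on the two-variable example. The sketch below reconstructs u and P from the slides (treat the exact entries as illustrative) and sums over unordered pairs so that each pairwise cost is counted once, a convention choice since the slide's Σ_ij leaves the pair ordering implicit.

```python
import numpy as np

# Two variables a, b with labels {0, 1}; index order: a=0, a=1, b=0, b=1.
u = np.array([5.0, 2.0, 2.0, 4.0])
P = np.zeros((4, 4))
P[0, 3] = P[3, 0] = 3.0   # cost of a = 0, b = 1 (from the slide)
P[1, 2] = P[2, 1] = 1.0   # cost of a = 1, b = 0 (from the slide)

x = np.array([-1.0, 1.0, 1.0, -1.0])   # labelling m = {1, 0}
X = np.outer(x, x)                     # X = x x^T, so X_ij = x_i x_j

unary = 0.5 * np.sum(u * (1 + x))
# Sum over unordered pairs (i < j) so each pairwise cost appears once.
pairwise = 0.25 * sum(P[i, j] * (1 + x[i] + x[j] + X[i, j])
                      for i in range(4) for j in range(i + 1, 4))
energy = unary + pairwise
print(energy)   # direct cost: u_a(1) + u_b(0) + P(a=1, b=0) = 2 + 2 + 1 = 5
```

The quadratic pseudo-Boolean form reproduces the direct labelling cost because (1 + x_i)(1 + x_j)/4 is 1 exactly when both indicator entries are +1 and 0 otherwise.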

12 Integer Programming Formulation. Constraints: uniqueness constraint Σ_{i ∈ a} x_i = 2 - |L| (each variable takes exactly one label); integer constraints x_i ∈ {-1, 1}; and X = x x^T.

13 Integer Programming Formulation. x* = argmin (1/2) Σ u_i (1 + x_i) + (1/4) Σ P_ij (1 + x_i + x_j + X_ij), subject to Σ_{i ∈ a} x_i = 2 - |L| (convex), x_i ∈ {-1, 1} and X = x x^T (non-convex).

14 Outline: Integer Programming Formulation; Linear Programming Relaxation; Additional Constraints; Solving the Convex Relaxations; Results and Conclusions

15 Linear Programming Relaxation (Schlesinger, 1976). Retain the convex part of the integer program: x* = argmin (1/2) Σ u_i (1 + x_i) + (1/4) Σ P_ij (1 + x_i + x_j + X_ij) with Σ_{i ∈ a} x_i = 2 - |L|, and relax the non-convex constraints x_i ∈ {-1, 1} and X = x x^T.

16 Linear Programming Relaxation (Schlesinger, 1976). x* = argmin (1/2) Σ u_i (1 + x_i) + (1/4) Σ P_ij (1 + x_i + x_j + X_ij), subject to: Σ_{i ∈ a} x_i = 2 - |L|; x_i ∈ [-1, 1]; X_ij ∈ [-1, 1]; 1 + x_i + x_j + X_ij ≥ 0; Σ_{j ∈ b} X_ij = (2 - |L|) x_i.
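The relaxed program above is an ordinary LP, so it can be solved with a stock solver. The sketch below builds it for the two-variable example (u and P reconstructed from the slides; treat them as illustrative) using `scipy.optimize.linprog`. With |L| = 2 the right-hand sides 2 - |L| are 0.

```python
import numpy as np
from scipy.optimize import linprog

# Variable order: z = [x_a0, x_a1, x_b0, x_b1, X_00, X_01, X_10, X_11],
# where X_st pairs label s of a with label t of b.
u = np.array([5.0, 2.0, 2.0, 4.0])
P = np.array([[0.0, 3.0], [1.0, 0.0]])   # P[s, t] = cost of a = s, b = t

# Objective: (1/2) sum u_i (1 + x_i) + (1/4) sum P_st (1 + x_as + x_bt + X_st),
# split into a linear part c.z plus a constant.
c = np.zeros(8)
c[:4] = u / 2.0
for s in range(2):
    for t in range(2):
        c[s] += P[s, t] / 4.0             # coefficient of x_a,s
        c[2 + t] += P[s, t] / 4.0         # coefficient of x_b,t
        c[4 + 2 * s + t] = P[s, t] / 4.0  # coefficient of X_st
const = u.sum() / 2.0 + P.sum() / 4.0

# Equalities: uniqueness per variable, and marginalization of X (both = 0).
A_eq = np.zeros((6, 8)); b_eq = np.zeros(6)
A_eq[0, 0:2] = 1        # x_a0 + x_a1 = 0
A_eq[1, 2:4] = 1        # x_b0 + x_b1 = 0
A_eq[2, [4, 5]] = 1     # X_00 + X_01 = 0
A_eq[3, [6, 7]] = 1     # X_10 + X_11 = 0
A_eq[4, [4, 6]] = 1     # X_00 + X_10 = 0
A_eq[5, [5, 7]] = 1     # X_01 + X_11 = 0

# Inequalities: 1 + x_a,s + x_b,t + X_st >= 0, i.e. -x - x - X <= 1.
A_ub = np.zeros((4, 8)); b_ub = np.ones(4)
for s in range(2):
    for t in range(2):
        row = 2 * s + t
        A_ub[row, s] = -1
        A_ub[row, 2 + t] = -1
        A_ub[row, 4 + 2 * s + t] = -1

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(-1, 1)] * 8)
energy = res.fun + const
print(round(energy, 4))
```

Because this example is submodular (the costs of the agreeing label pairs sum to less than those of the disagreeing pairs), the LP relaxation is tight and the optimum equals the MAP energy, 5, attained at m = {1, 0}.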

17 Dual of the LP Relaxation (Wainwright et al., 2001). [Figure: a 3×3 grid MRF over variables a–i, with parameters θ = (u, P), decomposed into six trees τ_1, ..., τ_6 (the rows and columns of the grid) with weights ρ_1, ..., ρ_6 and per-tree parameters θ_i such that Σ_i ρ_i θ_i = θ.]

18 Dual of the LP Relaxation (Wainwright et al., 2001). Each tree τ_i contributes its optimal energy Q(θ_i). The dual of the LP is max Σ_i ρ_i Q(θ_i), subject to Σ_i ρ_i θ_i = θ.

19 Tree-Reweighted Message Passing (Kolmogorov, 2005). Pick a variable, say a. For each tree containing a, reparameterize the tree so that the unary costs u_i of a are min-marginals; this requires only one pass of belief propagation.

20 Tree-Reweighted Message Passing (Kolmogorov, 2005). Average the unary costs of a across the trees containing it, e.g. replace u_1 and u_3 by (u_1 + u_3)/2, and u_2 and u_4 by (u_2 + u_4)/2. Repeat for all variables. This is the TRW-S algorithm.
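The min-marginal step above can be sketched on a tiny chain. The code below folds the costs of a neighbour b into a's unary costs by one backward pass of min-sum belief propagation, using the illustrative two-variable costs from the earlier slides; in TRW-S this is done once per tree before the unary costs are averaged.

```python
import numpy as np

# Illustrative two-node chain (a)--(b): unary costs and one pairwise table.
u_a = np.array([5.0, 2.0])
u_b = np.array([2.0, 4.0])
P = np.array([[0.0, 3.0], [1.0, 0.0]])   # P[s, t] = cost of a = s, b = t

# One pass of min-sum belief propagation: for each label s of a, take the
# cheapest completion over b's labels, then add a's own unary costs.
msg_b_to_a = np.min(P + u_b[None, :], axis=1)
min_marginal_a = u_a + msg_b_to_a
print(min_marginal_a)
```

Here `min_marginal_a[s]` is the minimum energy of any labelling with a = s ([7, 5] for these costs), which is exactly the reparameterized unary cost that TRW-S averages across trees.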

21 Outline: Integer Programming Formulation; Linear Programming Relaxation; Additional Constraints; Solving the Convex Relaxations; Results and Conclusions

22 Cycle Inequalities (Chopra and Rao, 1991). Consider three variables with labels x_i, x_j, x_k ∈ {-1, 1}. At least two of them have the same sign, so with X = x x^T at least one of X_ij, X_jk, X_ki equals 1. Hence X_ij + X_jk + X_ki ≥ -1.
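The triangle case of the cycle inequality can be verified exhaustively, since there are only eight sign patterns:

```python
from itertools import product

# For every x_i, x_j, x_k in {-1, +1}, with X_ij = x_i x_j, at least two
# labels share a sign, so at least one pairwise product is +1 and the sum
# of the three products cannot drop below -1.
sums = [xi * xj + xj * xk + xk * xi
        for xi, xj, xk in product((-1, 1), repeat=3)]
print(min(sums), max(sums))   # -1 3: the inequality is valid and tight
```

The minimum of -1 shows the inequality X_ij + X_jk + X_ki ≥ -1 is tight, so adding it genuinely cuts off fractional points without excluding any integer labelling.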

23 Cycle Inequalities (Chopra and Rao, 1991). For a 4-cycle over x_i, x_j, x_k, x_l: X_ij + X_jk + X_kl - X_li ≥ -2. This generalizes to cycles of any length. The LP relaxation augmented with cycle inequalities is denoted LP-C.

24 Second-Order Cone Constraints (Kumar et al., 2007). Let x_c = (x_i, x_j, x_k)^T and let X_c be the corresponding submatrix of X, with unit diagonal and off-diagonal entries X_ij, X_ik, X_jk. Relax X_c = x_c x_c^T to the positive semidefiniteness constraint X_c - x_c x_c^T ⪰ 0, which implies the second-order cone constraint (x_i + x_j + x_k)^2 ≤ 3 + 2(X_ij + X_jk + X_ki). The SOCP with these constraints is denoted SOCP-C.
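The SOC constraint follows from 1^T X_c 1 ≥ (1^T x_c)^2, since summing the entries of X_c gives 3 + 2(X_ij + X_jk + X_ki). A quick enumeration confirms it holds with equality at every integer point, so the relaxation excludes no valid labelling:

```python
from itertools import product

# At integer points X_ij = x_i x_j exactly, so the SOC constraint derived
# from X_c - x_c x_c^T >= 0 (PSD) is tight: both sides coincide.
for xi, xj, xk in product((-1, 1), repeat=3):
    lhs = (xi + xj + xk) ** 2
    rhs = 3 + 2 * (xi * xj + xj * xk + xk * xi)
    assert lhs == rhs
print("SOC constraint is tight at all integer points")
```

For fractional (x_c, X_c) the constraint only requires lhs ≤ rhs, which is where it cuts the relaxed feasible region.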

25 Second-Order Cone Constraints (Kumar et al., 2007). The same construction extends to four-variable cliques: x_c = (x_i, x_j, x_k, x_l)^T and X_c is the 4×4 submatrix with unit diagonal and off-diagonal entries X_ij, X_ik, X_il, X_jk, X_jl, X_kl, with X_c - x_c x_c^T ⪰ 0. This relaxation is denoted SOCP-Q.

26 Outline: Integer Programming Formulation; Linear Programming Relaxation; Additional Constraints; Solving the Convex Relaxations; Results and Conclusions

27 Modifying the Dual. Augment the tree decomposition of the 3×3 grid (rows adg, beh, cfi and columns ab-de-style chains) with the 2×2 cycles/cliques {a, b, d, e}, {b, c, e, f}, {d, e, g, h}, {e, f, h, i}. The dual becomes max Σ_i ρ_i Q(θ_i) + Σ_j s_j, where the s_j are the contributions of the additional cycle/clique subproblems, subject to the combined reparameterization constraint Σ_i ρ_i θ_i = θ.

28 Modifying TRW-S. The dual is max Σ_i ρ_i Q(θ_i) + Σ_j s_j. REPEAT: pick a variable a; run the TRW-S updates for the trees containing a; pick a cycle/clique containing a and solve its subproblem, which can be done efficiently; average the unary costs as before.

29 Properties of the Algorithm. The algorithm satisfies the reparameterization constraint. The value of the dual never decreases, so the algorithm converges. The solution satisfies Weak Tree Agreement (WTA), although WTA alone is not sufficient for convergence. The results are more accurate than TRW-S.

30 Outline: Integer Programming Formulation; Linear Programming Relaxation; Additional Constraints; Solving the Convex Relaxations; Results and Conclusions

31 4-Neighbourhood MRF. Test LP-C and SOCP-C on 50 binary MRFs of size 30×30, with unary costs u ~ N(0, 1) and pairwise costs P ~ N(0, σ²).

32 4-Neighbourhood MRF. For σ = 5, LP-C dominates SOCP-C.

33 8-Neighbourhood MRF. Test SOCP-Q on 50 binary MRFs of size 30×30, with unary costs u ~ N(0, 1) and pairwise costs P ~ N(0, σ²).

34 8-Neighbourhood MRF. For σ = 5/√2, SOCP-Q dominates LP-C.

35 Conclusions: modified the LP dual to include more constraints; extended TRW-S to solve the tighter dual; experiments show an improvement; more results in the poster.

36 Future Work: more efficient subroutines for solving cycles/cliques; using more accurate LP solvers, e.g. proximal projections; analysis of SOCP-C vs. LP-C.

37 Questions?

38 Timings (per iteration, in seconds):

Method   Time/Iteration
BP       0.0027
TRW-S    0.0027
LP-C     7.7778
SOCP-C   8.8091
SOCP-Q   9.1170

Linear in the number of variables!

39 Video Segmentation. Given a keyframe and a user-provided segmentation of it, segment the remaining frames of the video.

40 Video Segmentation: Belief Propagation. [Figure: input frames and the segmentations produced by belief propagation, with per-frame error counts overlaid.]

41 Video Segmentation: α-swap. [Figure: input frames and the segmentations produced by α-swap, with per-frame error counts overlaid.]

42 Video Segmentation: α-expansion. [Figure: input frames and the segmentations produced by α-expansion, with per-frame error counts overlaid.]

43 Video Segmentation: TRW-S. [Figure: input frames and the segmentations produced by TRW-S, with per-frame error counts overlaid.]

44 Video Segmentation: LP-C. [Figure: input frames and the segmentations produced by LP-C, with per-frame error counts overlaid.]

45 Video Segmentation: SOCP-Q. [Figure: input frames and the segmentations produced by SOCP-Q; the overlaid per-frame error counts are all 0.]

46 4-Neighbourhood MRF σ = 1

47 4-Neighbourhood MRF σ = 2.5

48 8-Neighbourhood MRF σ = 1/√2

49 8-Neighbourhood MRF σ = 2.5/√2

