
1 Inference for Learning: Belief Propagation

2 So far... Exact methods for submodular energies; approximations for non-submodular energies; move-making (N_Variables >> N_Labels).

3 [image-only slide]

4 Motivating Application. Image and desired output (labeled body parts). Only 10 variables!!

5 Motivating Application. Parts: head, torso, uleg1, lleg1, uleg2, lleg2, uleg3, lleg3, uleg4, lleg4. Only 10 variables!! Thousands of labels!! Millions of pairwise potentials!!

6 Belief Propagation. An algorithm for solving MAP estimation: f* = argmin_f E(f; θ), where E(f; θ) = Σ_a θ_a;f(a) + Σ_(a,b) θ_ab;f(a)f(b). RECALL: potentials θ_a;i and θ_ab;ij; labeling f : V → L. Exact for tree-structured models. Pearl, 1988.

7 Belief Propagation. Message M_ab;i: V_a's opinion on V_b taking label i. V_b gathers information from V_a and computes the belief B_b;i. (Figure: V_a with unary costs 5, 2; V_b with unary costs 2, 4; pairwise costs 0, 1, 1, 0.)

8 Two Variables. M_ab;i = min_j [θ_a;j + θ_ab;ji]. M_ab;0 = min(θ_a;0 + θ_ab;00, θ_a;1 + θ_ab;10) = min(5 + 0, 2 + 1) = 3.

9 Two Variables. M_ab;1 = min(θ_a;0 + θ_ab;01, θ_a;1 + θ_ab;11) = min(5 + 1, 2 + 0) = 2. The minimizing label of V_a is stored for backtracking: f(a) = 1 in both cases.

10 Two Variables. Belief: B_b;i = θ_b;i + Σ_a M_ab;i. B_b;0 = 2 + 3 = 5; B_b;1 = 4 + 2 = 6. f*(b) = argmin_i B_b;i = 0, and backtracking gives f*(a) = 1.
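The two-variable computation above can be checked with a short script. A minimal sketch in Python, using the unary and pairwise costs from the figure:

```python
# Min-sum message passing for the two-variable example.
# theta_a[j]: unary cost of V_a taking label j;
# theta_ab[j][i]: pairwise cost of V_a = j and V_b = i.
theta_a = [5, 2]
theta_b = [2, 4]
theta_ab = [[0, 1], [1, 0]]

# Message M_ab;i = min_j (theta_a[j] + theta_ab[j][i]); also record the
# minimizing label of V_a for backtracking.
M_ab, arg = [], []
for i in range(2):
    costs = [theta_a[j] + theta_ab[j][i] for j in range(2)]
    M_ab.append(min(costs))
    arg.append(costs.index(min(costs)))

# Belief B_b;i = theta_b[i] + M_ab;i, minimized over i.
B_b = [theta_b[i] + M_ab[i] for i in range(2)]
f_b = B_b.index(min(B_b))
f_a = arg[f_b]
print(M_ab, B_b, f_b, f_a)  # [3, 2] [5, 6] 0 1
```

This reproduces the slides: M_ab = (3, 2), beliefs (5, 6), so f*(b) = 0 and backtracking gives f*(a) = 1.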

13 Three Variables. Chain V_a – V_b – V_c, each taking label l_0 or l_1; unary costs θ_a = (5, 2), θ_b = (2, 4), θ_c = (3, 6); pairwise costs θ_ab = (0, 1; 1, 0) and θ_bc = (1, 3; 2, 0). Pass message from "a" to "b" as before.

14 Three Variables. As before: M_ab;0 = 3 (with f(a) = 1) and M_ab;1 = 2 (with f(a) = 1).

15 Three Variables. Pass message from "b" to "c" as before.

16 Three Variables. M_bc;i = min_j [θ_b;j + θ_bc;ji + Σ_{n ∈ N(b)\c} M_nb;j]. M_bc;0 = min(θ_b;0 + θ_bc;00 + M_ab;0, θ_b;1 + θ_bc;10 + M_ab;1) = min(2 + 1 + 3, 4 + 2 + 2) = 6.

17 Three Variables. M_bc;0 = 6, with minimizing label f(b) = 0.

18 Three Variables. M_bc;1 = min(θ_b;0 + θ_bc;01 + M_ab;0, θ_b;1 + θ_bc;11 + M_ab;1) = min(2 + 3 + 3, 4 + 0 + 2) = 6.

19 Three Variables. M_bc;1 = 6, with minimizing label f(b) = 1.

20 Three Variables. Belief: B_c;i = θ_c;i + Σ_b M_bc;i. B_c;0 = 3 + 6 = 9; B_c;1 = 6 + 6 = 12. f*(c) = argmin_i B_c;i = 0; backtracking gives f*(b) = 0 and then f*(a) = 1.

24 Tree-structured Models: head, torso, uleg1, lleg1, uleg2, lleg2, uleg3, lleg3, uleg4, lleg4. Message passing proceeds over the tree, from the leaves towards the root; beliefs are then computed and labels read off by backtracking.

28 Loopy Graphs. Cycle V_a – V_b – V_c – V_d: messages get overcounted.

29 Summary of BP. Messages: M_bc;i = min_j [θ_b;j + θ_bc;ji + Σ_{n ∈ N(b)\c} M_nb;j]. Beliefs: B_c;i = θ_c;i + Σ_b M_bc;i. Exact for trees; approximate MAP for general graphs; convergence is not guaranteed.
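The two recurrences above can be implemented directly for the chain case of tree-structured BP. A minimal sketch, run on the three-variable example from the earlier slides:

```python
def chain_map(unary, pairwise):
    """Min-sum BP on a chain. unary[k][i]: cost of variable k taking label i;
    pairwise[k][j][i]: cost of variable k taking j and variable k+1 taking i.
    Returns the MAP labeling and its energy."""
    n = len(unary)
    msgs = [[0] * len(unary[0])]            # message into variable 0 is zero
    args = []
    for k in range(n - 1):
        m, a = [], []
        for i in range(len(unary[k + 1])):
            costs = [unary[k][j] + pairwise[k][j][i] + msgs[k][j]
                     for j in range(len(unary[k]))]
            m.append(min(costs))
            a.append(costs.index(min(costs)))  # store minimizer for backtracking
        msgs.append(m)
        args.append(a)
    # Belief at the last variable, then backtrack towards the first.
    beliefs = [unary[-1][i] + msgs[-1][i] for i in range(len(unary[-1]))]
    labels = [beliefs.index(min(beliefs))]
    for k in range(n - 2, -1, -1):
        labels.append(args[k][labels[-1]])
    labels.reverse()
    return labels, min(beliefs)

# Three-variable example: theta_a = (5, 2), theta_b = (2, 4), theta_c = (3, 6).
unary = [[5, 2], [2, 4], [3, 6]]
pairwise = [[[0, 1], [1, 0]],   # theta_ab
            [[1, 3], [2, 0]]]   # theta_bc
print(chain_map(unary, pairwise))  # ([1, 0, 0], 9)
```

The result matches the slides: f*(a) = 1, f*(b) = 0, f*(c) = 0 with minimum energy 9. Extending this from a chain to a general tree only changes the message schedule (leaves to root) and the sum over neighbors.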

30 [image-only slide]

31 Inference for Learning: Linear Programming Relaxation

32 Linear Integer Programming. min_x g_0^T x s.t. g_i^T x ≤ 0, h_i^T x = 0. Linear objective and linear constraints, but x is a vector of integers, for example x ∈ {0,1}^N. Hard to solve!!

33 Linear Programming Relaxation. min_x g_0^T x s.t. g_i^T x ≤ 0, h_i^T x = 0. Linear objective and linear constraints, and x is a vector of reals, for example x ∈ [0,1]^N. Easy to solve!!

34 Roadmap. Express MAP as an integer program; relax to a linear program and solve; round the fractional solution to integers.

35 Integer Programming Formulation. Two variables V_1, V_2 with labels '0' and '1'; unary costs 5, 2 for V_1 and 2, 4 for V_2; pairwise costs 0, 1, 3, 0 on the edge. Unary cost vector u = [5 2; 2 4]^T: the first entry is the cost of V_1 = 0, the second the cost of V_1 = 1, and so on.

36 Integer Programming Formulation. Unary cost vector u = [5 2; 2 4]^T. Label vector x = [0 1; 1 0]^T: the first entry is 0 because V_1 ≠ 0, the second is 1 because V_1 = 1, and so on.

37 Integer Programming Formulation. With u = [5 2; 2 4]^T and x = [0 1; 1 0]^T, the sum of unary costs = Σ_i u_i x_i.

38 Integer Programming Formulation. Pairwise cost matrix P, with index order (V_1 = 0, V_1 = 1, V_2 = 0, V_2 = 1): P = [0 0 0 3; 0 0 1 0; 0 1 0 0; 3 0 0 0]. For example, the cost of V_1 = 0 and V_2 = 0 is 0, the cost of V_1 = 0 and V_2 = 1 is 3, and entries between labels of the same variable (e.g. V_1 = 0 and V_1 = 0) are 0.

39 Integer Programming Formulation. Sum of pairwise costs = Σ_{i<j} P_ij x_i x_j = Σ_{i<j} P_ij X_ij, where X = x x^T.

40 Integer Programming Formulation. Constraints: uniqueness, Σ_{i ∈ V_a} x_i = 1 (each variable takes exactly one label); integrality, x_i ∈ {0,1}; and X = x x^T.

41 Integer Programming Formulation. x* = argmin Σ_i u_i x_i + Σ_{i<j} P_ij X_ij, s.t. Σ_{i ∈ V_a} x_i = 1, x_i ∈ {0,1}, X = x x^T.
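On this two-variable example the integer program is small enough to verify by enumeration. A minimal sketch, with u and P as built on the previous slides:

```python
from itertools import product

# Unary vector u and pairwise matrix P for the two-variable example;
# index order: (V1=0, V1=1, V2=0, V2=1).
u = [5, 2, 2, 4]
P = [[0, 0, 0, 3],
     [0, 0, 1, 0],
     [0, 1, 0, 0],
     [3, 0, 0, 0]]

best = None
for f1, f2 in product(range(2), repeat=2):
    # Indicator vector x with exactly one label per variable (the
    # uniqueness constraint); X = x x^T gives the pairwise term.
    x = [int(f1 == 0), int(f1 == 1), int(f2 == 0), int(f2 == 1)]
    cost = sum(u[i] * x[i] for i in range(4))
    cost += sum(P[i][j] * x[i] * x[j]
                for i in range(4) for j in range(i + 1, 4))
    if best is None or cost < best[0]:
        best = (cost, x)
print(best)  # (5, [0, 1, 1, 0])
```

The minimizer is x = [0 1; 1 0]^T with objective 2 + 2 + 1 = 5, i.e. V_1 = 1 and V_2 = 0, matching the label vector on the earlier slides.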

42 Roadmap. Express MAP as an integer program; relax to a linear program and solve; round the fractional solution to integers.

43 Integer Programming Formulation. x* = argmin Σ u_i x_i + Σ P_ij X_ij s.t. Σ_{i ∈ V_a} x_i = 1 (convex); x_i ∈ {0,1} and X = x x^T (non-convex).

44 Integer Programming Formulation. Relax x_i ∈ {0,1} to x_i ∈ [0,1] (convex); X = x x^T remains non-convex.

45 Integer Programming Formulation. Replace X = x x^T with the linear constraints X_ij ∈ [0,1] and Σ_{j ∈ V_b} X_ij = x_i. All constraints are now convex.

46 Linear Programming Formulation. x* = argmin Σ u_i x_i + Σ P_ij X_ij s.t. Σ_{i ∈ V_a} x_i = 1, x_i ∈ [0,1], X_ij ∈ [0,1], Σ_{j ∈ V_b} X_ij = x_i. Convex. Schlesinger, 1976; Chekuri et al., 2001; Wainwright et al., 2001.

47 Roadmap. Express MAP as an integer program; relax to a linear program and solve; round the fractional solution to integers.

48 Properties. Dominates many convex relaxations. Best known multiplicative bounds: 2 for Potts (uniform) energies; 2 + √2 for truncated linear energies; O(log n) for metric labeling. Matched by move-making. Kumar and Torr, 2008; Kumar and Koller, UAI 2009; Kumar, Kolmogorov and Torr, 2007.

49 Algorithms. Tree-reweighted message passing (TRW); max-product linear programming (MPLP); dual decomposition. Komodakis and Paragios, ICCV 2007.

