1
Probabilistic Inference, Lecture 3. M. Pawan Kumar (pawan.kumar@ecp.fr). Slides available online: http://cvc.centrale-ponts.fr/personnel/pawan/
2
Recap of lecture 1
3
Exponential Family P(v) = exp{-Σ_α θ_α Φ_α(v) - A(θ)}, where Φ_α(v) are the sufficient statistics, θ_α the parameters, and A(θ) the log-partition function. Random variables V = {V_1, V_2, …, V_n}; labeling V = v; label set L = {l_1, l_2, …, l_h}. Random variable V_a takes a value, or label, v_a.
4
Overcomplete Representation P(v) = exp{-Σ_α θ_α Φ_α(v) - A(θ)}, with sufficient statistics Φ_α(v), parameters θ_α, and log-partition function A(θ). The representation is overcomplete: there exists a non-zero c such that Σ_α c_α Φ_α(v) = constant for all v.
5
Pairwise MRF Random variables V = {V_1, V_2, …, V_n}; neighborhood over variables specified by edges E; label set L = {l_1, l_2, …, l_h}. P(v) = exp{-Σ_α θ_α Φ_α(v) - A(θ)}. Sufficient statistics and parameters: I_a;i(v_a) with θ_a;i for all V_a ∈ V, l_i ∈ L; I_ab;ik(v_a,v_b) with θ_ab;ik for all (V_a,V_b) ∈ E, l_i, l_k ∈ L.
6
Pairwise MRF Random variables V = {V_1, V_2, …, V_n}; neighborhood over variables specified by edges E; label set L = {l_1, l_2, …, l_h}. P(v) = exp{-Σ_a Σ_i θ_a;i I_a;i(v_a) - Σ_(a,b) Σ_(i,k) θ_ab;ik I_ab;ik(v_a,v_b) - A(θ)}, with A(θ) = log Z. Equivalently, P(v) = (1/Z) Π_a ψ_a(v_a) Π_(a,b) ψ_ab(v_a,v_b), where ψ_a(l_i) = exp(-θ_a;i) and ψ_ab(l_i,l_k) = exp(-θ_ab;ik). The parameters θ are sometimes also referred to as potentials.
7
Pairwise MRF Random variables V = {V_1, V_2, …, V_n}; neighborhood over variables specified by edges E; label set L = {l_1, l_2, …, l_h}. P(v) = exp{-Σ_a Σ_i θ_a;i I_a;i(v_a) - Σ_(a,b) Σ_(i,k) θ_ab;ik I_ab;ik(v_a,v_b) - A(θ)}. Labeling as a function f : {1, 2, …, n} → {1, 2, …, h}; variable V_a takes the label l_f(a).
8
Pairwise MRF Random variables V = {V_1, V_2, …, V_n}; neighborhood over variables specified by edges E; label set L = {l_1, l_2, …, l_h}. P(f) = exp{-Σ_a θ_a;f(a) - Σ_(a,b) θ_ab;f(a)f(b) - A(θ)}. Labeling as a function f : {1, 2, …, n} → {1, 2, …, h}; variable V_a takes the label l_f(a). Energy Q(f) = Σ_a θ_a;f(a) + Σ_(a,b) θ_ab;f(a)f(b).
9
Pairwise MRF Random variables V = {V_1, V_2, …, V_n}; neighborhood over variables specified by edges E; label set L = {l_1, l_2, …, l_h}. P(f) = exp{-Q(f) - A(θ)}. Labeling as a function f : {1, 2, …, n} → {1, 2, …, h}; variable V_a takes the label l_f(a). Energy Q(f) = Σ_a θ_a;f(a) + Σ_(a,b) θ_ab;f(a)f(b).
10
Inference Maximum a posteriori (MAP) estimation: max_v P(v) = exp{-Σ_a Σ_i θ_a;i I_a;i(v_a) - Σ_(a,b) Σ_(i,k) θ_ab;ik I_ab;ik(v_a,v_b) - A(θ)}; equivalently, energy minimization: min_f Q(f) = Σ_a θ_a;f(a) + Σ_(a,b) θ_ab;f(a)f(b). Computing marginals: P(v_a = l_i) = Σ_v P(v) δ(v_a = l_i); P(v_a = l_i, v_b = l_k) = Σ_v P(v) δ(v_a = l_i) δ(v_b = l_k).
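The two inference tasks above can be made concrete with a brute-force sketch on a toy chain MRF. The variables, labels, and potential values below are made up for illustration; this only works for tiny models, since it enumerates all h^n labelings.

```python
import itertools
import math

# Hypothetical toy pairwise MRF: a chain of 3 variables, 2 labels each.
n, h = 3, 2
theta_unary = [[0.0, 1.0], [0.5, 0.0], [1.0, 0.0]]   # theta_a;i
theta_pair = {(0, 1): [[0.0, 1.0], [1.0, 0.0]],      # theta_ab;ik, Ising-like
              (1, 2): [[0.0, 1.0], [1.0, 0.0]]}

def energy(f):
    """Q(f) = sum_a theta_a;f(a) + sum_(a,b) theta_ab;f(a)f(b)."""
    q = sum(theta_unary[a][f[a]] for a in range(n))
    q += sum(theta_pair[(a, b)][f[a]][f[b]] for (a, b) in theta_pair)
    return q

# MAP estimation = energy minimization over all h^n labelings.
f_map = min(itertools.product(range(h), repeat=n), key=energy)

# Marginals require the partition function Z = exp(A(theta)).
Z = sum(math.exp(-energy(f)) for f in itertools.product(range(h), repeat=n))
p_v0_is_0 = sum(math.exp(-energy(f)) / Z
                for f in itertools.product(range(h), repeat=n) if f[0] == 0)
```

Note that the MAP labeling needs only comparisons of energies, while the marginal needs the sum over all labelings; this is why the two tasks lead to different algorithms later in the lecture.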
11
Recap of lecture 2
12
Definitions Energy minimization: f* = argmin_f Q(f; θ), where Q(f; θ) = Σ_a θ_a;f(a) + Σ_(a,b) θ_ab;f(a)f(b). Min-marginals: q_a;i = min_f Q(f; θ) s.t. f(a) = i. Reparameterization: θ' is a reparameterization of θ if Q(f; θ') = Q(f; θ) for all f.
13
Belief Propagation (Pearl, 1988) General form of reparameterization: θ'_a;i = θ_a;i + M_ba;i, θ'_b;k = θ_b;k + M_ab;k, θ'_ab;ik = θ_ab;ik - M_ab;k - M_ba;i. Reparameterization of edge (a,b) in belief propagation: M_ab;k = min_i { θ_a;i + θ_ab;ik }, M_ba;i = 0.
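The message computation at the heart of this reparameterization is a one-liner. The sketch below, with hypothetical cost vectors, computes M_ab;k = min_i { θ_a;i + θ_ab;ik } for every label k.

```python
def message(theta_a, theta_ab):
    """Min-sum BP message M_ab;k = min_i { theta_a;i + theta_ab;ik }.

    theta_a: list of unary costs theta_a;i.
    theta_ab: matrix of pairwise costs, theta_ab[i][k].
    """
    h = len(theta_a)
    return [min(theta_a[i] + theta_ab[i][k] for i in range(h))
            for k in range(h)]

# Reparameterization: subtract M_ab;k from the pairwise term and add it to
# theta_b;k; the energy Q(f) of every labeling f is unchanged.
```

For example, with θ_a = [0, 2] and an Ising pairwise term [[0, 1], [1, 0]], the message is [0, 1].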
14
Belief Propagation on Trees (Figure: tree over V_a, V_b, V_c, V_d, V_e, V_g, V_h.) Forward pass: leaf → root. Backward pass: root → leaf. All min-marginals are computed.
15
Computational Complexity Each constant takes O(|L|) time to compute; the number of constants is O(|E||L|), giving O(|E||L|²) total time. Memory required? O(|E||L|).
16
Belief Propagation on Cycles (Figure: cycle over V_a, V_b, V_c, V_d with unary potentials θ_a;0, θ_a;1, θ_b;0, θ_b;1, θ_c;0, θ_c;1, θ_d;0, θ_d;1.) Remember my suggestion? Fix the label of V_a.
17
Belief Propagation on Cycles (Figure: V_a fixed to label l_0.) Equivalent to a tree-structured problem.
18
Belief Propagation on Cycles (Figure: V_a fixed to label l_1.) Equivalent to a tree-structured problem.
19
Belief Propagation on Cycles Choose the minimum energy solution over all fixed labels of V_a. This approach quickly becomes infeasible.
20
Vincent Algayres Algorithm (Figure: cycle over V_a, V_b, V_c, V_d.) Compute zero cost paths from all labels of V_a to all labels of V_d. Requires fixing V_a.
21
Speed-Ups for Special Cases (Felzenszwalb and Huttenlocher, 2004) θ_ab;ik = 0 if i = k, and C otherwise. M_ab;k = min_i { θ_a;i + θ_ab;ik }.
22
Speed-Ups for Special Cases (Felzenszwalb and Huttenlocher, 2004) θ_ab;ik = w_ab |i - k|. M_ab;k = min_i { θ_a;i + θ_ab;ik }.
23
Speed-Ups for Special Cases (Felzenszwalb and Huttenlocher, 2004) θ_ab;ik = min{w_ab |i - k|, C}. M_ab;k = min_i { θ_a;i + θ_ab;ik }.
24
Speed-Ups for Special Cases (Felzenszwalb and Huttenlocher, 2004) θ_ab;ik = min{w_ab (i - k)², C}. M_ab;k = min_i { θ_a;i + θ_ab;ik }.
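For the first of these special cases, θ_ab;ik = C whenever i ≠ k, the inner minimum min_i (θ_a;i + C) is the same for every destination label k, so the whole message costs O(|L|) instead of O(|L|²). A minimal sketch of this idea (function name and inputs are illustrative, not from the paper):

```python
def potts_message(theta_a, C):
    """O(h) message for theta_ab;ik = 0 if i == k, C otherwise.

    M_ab;k = min(theta_a;k, min_i theta_a;i + C): either keep label k
    (zero pairwise cost) or pay C once for the cheapest source label.
    """
    m = min(theta_a) + C                 # shared term, computed once
    return [min(u, m) for u in theta_a]  # O(h) total instead of O(h^2)
```

The truncated linear and truncated quadratic cases on the following slides admit similar linear-time messages via distance transforms, which is the main contribution of Felzenszwalb and Huttenlocher's paper.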
25
Lecture 3
26
Ising Model P(v) = exp{-Σ_α θ_α Φ_α(v) - A(θ)}. Random variables V = {V_1, V_2, …, V_n}; label set L = {0, 1}; neighborhood over variables specified by edges E. Sufficient statistics and parameters: I_a;i(v_a) with θ_a;i for all V_a ∈ V, l_i ∈ L; I_ab;ik(v_a,v_b) with θ_ab;ik for all (V_a,V_b) ∈ E, l_i, l_k ∈ L. I_a;i(v_a): indicator for v_a = l_i. I_ab;ik(v_a,v_b): indicator for v_a = l_i, v_b = l_k.
27
Ising Model P(v) = exp{-Σ_a Σ_i θ_a;i I_a;i(v_a) - Σ_(a,b) Σ_(i,k) θ_ab;ik I_ab;ik(v_a,v_b) - A(θ)}. Random variables V = {V_1, V_2, …, V_n}; label set L = {0, 1}; neighborhood over variables specified by edges E. Sufficient statistics and parameters: I_a;i(v_a) with θ_a;i for all V_a ∈ V, l_i ∈ L; I_ab;ik(v_a,v_b) with θ_ab;ik for all (V_a,V_b) ∈ E, l_i, l_k ∈ L. I_a;i(v_a): indicator for v_a = l_i. I_ab;ik(v_a,v_b): indicator for v_a = l_i, v_b = l_k.
28
Interactive Binary Segmentation FG: foreground histogram of RGB values; BG: background histogram of RGB values. Label '1' indicates foreground and '0' indicates background.
29
Interactive Binary Segmentation More likely to be foreground than background
30
Interactive Binary Segmentation More likely to be background than foreground. θ_a;0 is proportional to -log(BG(d_a)); θ_a;1 is proportional to -log(FG(d_a)).
31
Interactive Binary Segmentation More likely to belong to same label
32
Interactive Binary Segmentation Less likely to belong to the same label. θ_ab;ik is proportional to exp(-(d_a - d_b)²) if i ≠ k, and θ_ab;ik = 0 if i = k.
33
Outline Minimum Cut Problem Two-Label Submodular Energy Functions Move-Making Algorithms
34
Directed Graph D = (N, A). (Figure: nodes n_1, n_2, n_3, n_4 with arc lengths 10, 5, 3, 2.) Two important restrictions: (1) rational arc lengths; (2) positive arc lengths.
35
Cut D = (N, A). Let N_1 and N_2 be such that N_1 ∪ N_2 = N and N_1 ∩ N_2 = ∅. C is the set of arcs (n_1,n_2) ∈ A such that n_1 ∈ N_1 and n_2 ∈ N_2. Then C is a cut in the digraph D.
36
Cut (Figure: a partition N_1, N_2 of the digraph.) What is C? {(n_1,n_2),(n_1,n_4)}? {(n_1,n_4),(n_3,n_2)}? {(n_1,n_4)}? ✓
37
Cut (Figure: another partition N_1, N_2.) What is C? {(n_1,n_2),(n_1,n_4),(n_3,n_2)}? {(n_1,n_4),(n_3,n_2)}? {(n_4,n_3)}? ✓
38
Cut (Figure: another partition N_1, N_2.) What is C? {(n_1,n_2),(n_1,n_4),(n_3,n_2)}? {(n_1,n_4),(n_3,n_2)}? {(n_3,n_2)}? ✓
39
Cut D = (N, A). Let N_1 and N_2 be such that N_1 ∪ N_2 = N and N_1 ∩ N_2 = ∅. C is the set of arcs (n_1,n_2) ∈ A such that n_1 ∈ N_1 and n_2 ∈ N_2. Then C is a cut in the digraph D.
40
Weight of a Cut D = (N, A). The sum of the lengths of all arcs in C.
41
Weight of a Cut D = (N, A). w(C) = Σ_((n_1,n_2) ∈ C) l(n_1,n_2).
42
Weight of a Cut (Figure: partition N_1, N_2.) What is w(C)? 3.
43
Weight of a Cut (Figure: another partition N_1, N_2.) What is w(C)? 5.
44
Weight of a Cut (Figure: another partition N_1, N_2.) What is w(C)? 15.
45
st-Cut D = (N, A). Add a source "s" and a sink "t" (figure: arcs from s and into t with their lengths). C is a cut such that s ∈ N_1 and t ∈ N_2; such a C is an st-cut.
46
Weight of an st-Cut D = (N, A). w(C) = Σ_((n_1,n_2) ∈ C) l(n_1,n_2).
47
Weight of an st-Cut (Figure.) What is w(C)? 3.
48
Weight of an st-Cut (Figure.) What is w(C)? 15.
49
Minimum Cut Problem D = (N, A). Find a cut with the minimum weight: C* = argmin_C w(C).
50
Solvers for the Minimum-Cut Problem [Slide credit: Andrew Goldberg] Augmenting-path and push-relabel algorithms. n: # nodes, m: # arcs, U: maximum arc length.
51
Remember … Two important restrictions (1) Rational arc lengths (2) Positive arc lengths
52
Cut D = (N, A). Let N_1 and N_2 be such that N_1 ∪ N_2 = N and N_1 ∩ N_2 = ∅. C is the set of arcs (n_1,n_2) ∈ A such that n_1 ∈ N_1 and n_2 ∈ N_2. Then C is a cut in the digraph D.
53
st-Cut D = (N, A). Add a source "s" and a sink "t". C is a cut such that s ∈ N_1 and t ∈ N_2; such a C is an st-cut.
54
Minimum Cut Problem D = (N, A). Find a cut with the minimum weight: C* = argmin_C w(C), where w(C) = Σ_((n_1,n_2) ∈ C) l(n_1,n_2).
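Since the slides' own arc lengths live in the figures, the sketch below uses a small hypothetical digraph and finds the minimum st-cut by enumerating every partition with s ∈ N_1 and t ∈ N_2. This is exponential in |N| and only meant to make the definitions concrete; the next slides cover the real solvers.

```python
import itertools

# Hypothetical digraph with assumed arc lengths.
nodes = ['s', 'n1', 'n2', 't']
arcs = {('s', 'n1'): 7, ('s', 'n2'): 3, ('n1', 'n2'): 2,
        ('n1', 't'): 5, ('n2', 't'): 10}

def cut_weight(N1):
    # w(C) sums the lengths of arcs leaving N_1; arcs entering N_1 do not count.
    return sum(l for (u, v), l in arcs.items() if u in N1 and v not in N1)

# Enumerate all N_1 with s in N_1 and t in N_2 (2^(|N|-2) partitions).
inner = [n for n in nodes if n not in ('s', 't')]
best = min((frozenset({'s'}) | frozenset(sub)
            for r in range(len(inner) + 1)
            for sub in itertools.combinations(inner, r)),
           key=cut_weight)
```

For this graph both N_1 = {s} and N_1 = {s, n1} achieve the minimum weight of 10, illustrating that the minimizing cut need not be unique.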
55
Outline Minimum Cut Problem Two-Label Submodular Energy Functions Move-Making Algorithms Hammer, 1965; Kolmogorov and Zabih, 2004
56
Overview Energy Q → digraph D: one node per random variable, plus additional nodes "s" and "t". Compute the minimum cut, which partitions N = N_1 ∪ N_2. Read off the labeling f*: n_a ∈ N_1 implies f(a) = 0; n_a ∈ N_2 implies f(a) = 1.
57
Outline Minimum Cut Problem Two-Label Submodular Energy Functions Unary Potentials Pairwise Potentials Energy Minimization Move-Making Algorithms
58
Digraph for Unary Potentials Variable V_a with unary potentials θ_a;0 = P (cost of f(a) = 0) and θ_a;1 = Q (cost of f(a) = 1).
59
Digraph for Unary Potentials (Figure: node n_a between s and t; arc s → n_a with length Q, arc n_a → t with length P, for f(a) = 0 vs f(a) = 1.)
60
Digraph for Unary Potentials Let P ≥ Q. Subtract the constant Q from both arcs: arc s → n_a gets length 0 and arc n_a → t gets length P - Q, plus a constant Q added to every cut.
61
Digraph for Unary Potentials Let P ≥ Q. If f(a) = 1, the cut contains the arc s → n_a, so w(C) = 0 (plus the constant Q).
62
Digraph for Unary Potentials Let P ≥ Q. If f(a) = 0, the cut contains the arc n_a → t, so w(C) = P - Q (plus the constant Q).
63
Digraph for Unary Potentials Let P < Q. Subtract the constant P: arc s → n_a gets length Q - P and arc n_a → t gets length 0, plus a constant P.
64
Digraph for Unary Potentials Let P < Q. If f(a) = 1, the cut contains the arc s → n_a, so w(C) = Q - P (plus the constant P).
65
Digraph for Unary Potentials Let P < Q. If f(a) = 0, the cut contains the arc n_a → t, so w(C) = 0 (plus the constant P).
66
Outline Minimum Cut Problem Two-Label Submodular Energy Functions Unary Potentials Pairwise Potentials Energy Minimization Move-Making Algorithms
67
Digraph for Pairwise Potentials Pairwise potentials θ_ab;00 = P, θ_ab;10 = R, θ_ab;01 = Q, θ_ab;11 = S, arranged as a table:

           f(a) = 0   f(a) = 1
f(b) = 0      P          R
f(b) = 1      Q          S

Decomposition: a constant P, plus Q - P whenever f(b) = 1, plus S - Q whenever f(a) = 1, plus R + Q - S - P when f(a) = 1 and f(b) = 0.
68
Digraph for Pairwise Potentials (Figure: nodes n_a, n_b with source s and sink t.) The term P is a constant, added to the weight of every st-cut.
69
Digraph for Pairwise Potentials The term Q - P is a unary potential, paid when f(b) = 1; it is encoded with an arc at n_b exactly as in the unary construction.
70
Digraph for Pairwise Potentials The term S - Q is a unary potential, paid when f(a) = 1; it is encoded with an arc at n_a as in the unary construction.
71
Digraph for Pairwise Potentials The term R + Q - S - P is a pairwise potential, paid when f(a) = 1 and f(b) = 0; it is encoded as an arc between n_a and n_b of length R + Q - S - P.
72
Digraph for Pairwise Potentials Arc lengths must be non-negative, so the construction requires R + Q - S - P ≥ 0 (submodularity). General 2-label MAP estimation is NP-hard.
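The decomposition on the preceding slides can be checked mechanically. The sketch below (function and key names are illustrative) splits a pairwise term into the constant, the two unary terms, and the single arc length, asserting the submodularity condition R + Q - S - P ≥ 0 required for non-negative arc lengths.

```python
def decompose(P, Q, R, S):
    """Decompose pairwise costs P = theta_ab;00, Q = theta_ab;01,
    R = theta_ab;10, S = theta_ab;11 as on the slides."""
    residual = R + Q - S - P
    # Submodularity (Hammer 1965; Kolmogorov and Zabih 2004): the arc
    # length in the min-cut construction must be non-negative.
    assert residual >= 0, "pairwise term is not submodular"
    return {'constant': P,          # added to every cut
            'unary_b1': Q - P,      # paid when f(b) = 1
            'unary_a1': S - Q,      # paid when f(a) = 1
            'arc_a1_b0': residual}  # paid when f(a) = 1 and f(b) = 0
```

Summing the applicable terms reconstructs each original entry: e.g. for f(a) = 1, f(b) = 0 the total is P + (S - Q) + (R + Q - S - P) = R.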
73
Outline Minimum Cut Problem Two-Label Submodular Energy Functions Unary Potentials Pairwise Potentials Energy Minimization Move-Making Algorithms
74
st-Flow D = (N, A). A function flow: A → R such that: the flow on an arc is at most its length; flow is non-negative; and for all nodes except s and t, incoming flow = outgoing flow.
75
st-Flow D = (N, A). Function flow: A → R. flow(n_1,n_2) ≤ l(n_1,n_2). Flow is non-negative. For all nodes except s and t, incoming flow = outgoing flow.
76
st-Flow D = (N, A). Function flow: A → R. flow(n_1,n_2) ≤ l(n_1,n_2). flow(n_1,n_2) ≥ 0. For all nodes except s and t, incoming flow = outgoing flow.
77
st-Flow D = (N, A). Function flow: A → R. flow(n_1,n_2) ≤ l(n_1,n_2). flow(n_1,n_2) ≥ 0. For all a ∈ N \ {s,t}, incoming flow = outgoing flow.
78
st-Flow D = (N, A). Function flow: A → R. flow(n_1,n_2) ≤ l(n_1,n_2). flow(n_1,n_2) ≥ 0. For all a ∈ N \ {s,t}, Σ_((n,a) ∈ A) flow(n,a) = outgoing flow.
79
st-Flow D = (N, A). Function flow: A → R. flow(n_1,n_2) ≤ l(n_1,n_2). flow(n_1,n_2) ≥ 0. For all a ∈ N \ {s,t}, Σ_((n,a) ∈ A) flow(n,a) = Σ_((a,n) ∈ A) flow(a,n).
80
Weight of an st-Flow D = (N, A). Function flow: A → R. Weight = outgoing flow of s minus incoming flow of s.
81
Weight of an st-Flow D = (N, A). Function flow: A → R. Σ_((s,n) ∈ A) flow(s,n) - Σ_((n,s) ∈ A) flow(n,s); the incoming term is 0 here, since s has no incoming arcs.
82
Weight of an st-Flow D = (N, A). Function flow: A → R. Σ_((s,n) ∈ A) flow(s,n).
83
Weight of an st-Flow D = (N, A). Function flow: A → R. Σ_((s,n) ∈ A) flow(s,n) = incoming flow of t.
84
Weight of an st-Flow D = (N, A). Function flow: A → R. Σ_((s,n) ∈ A) flow(s,n) = Σ_((n,t) ∈ A) flow(n,t).
85
Max-Flow Problem D = (N, A). Function flow: A → R. Find the maximum flow!
86
Min-Cut Max-Flow Theorem D = (N, A). Weight of the minimum cut = weight of the maximum flow.
87
Max-Flow via Reparameterization !!
88
Following slides courtesy Pushmeet Kohli
89
Maxflow Algorithms Augmenting path based algorithms: 1. Find a path from source to sink with positive capacity. 2. Push the maximum possible flow through this path. 3. Repeat until no path can be found. (Figure: Source, n_1, n_2, Sink with capacities 2, 5, 9, 4, 2, 1.) Algorithms assume non-negative capacity. Flow = 0.
91
Maxflow Algorithms (Figure: push 2 units along the chosen path; capacities become 2-2 and 5-2.) Flow = 0 + 2.
92
Maxflow Algorithms (Figure: residual capacities 0, 3, 9, 4, 2, 1.) Flow = 2.
95
Maxflow Algorithms (Figure: push 4 units along the next path; residual capacities 0, 3, 5, 0, 2, 1.) Flow = 2 + 4.
96
Maxflow Algorithms (Figure: residual capacities 0, 3, 5, 0, 2, 1.) Flow = 6.
98
Maxflow Algorithms (Figure: push 1 unit, using a reverse arc; residual capacities 0, 2, 4, 0, 2+1, 1-1.) Flow = 6 + 1.
99
Maxflow Algorithms (Figure: residual capacities 0, 2, 4, 0, 3, 0; no augmenting path remains.) Flow = 7.
101
History of Maxflow Algorithms [Slide credit: Andrew Goldberg] Augmenting-path and push-relabel algorithms. n: # nodes, m: # arcs, U: maximum arc length. Algorithms assume non-negative arc lengths.
103
Augmenting Path Based Algorithms (Figure: Source, a_1, a_2, Sink; capacity 1000 on the four outer arcs and 1 on the arc between a_1 and a_2. Flow = 0.) Ford-Fulkerson: choose any augmenting path.
104
Augmenting Path Based Algorithms Bad augmenting paths. Ford-Fulkerson: choose any augmenting path.
106
Augmenting Path Based Algorithms (Figure: after pushing 1 unit through the middle arc, residual capacities 999, 0, 1000, 999, 1.) Ford-Fulkerson: choose any augmenting path.
107
Augmenting Path Based Algorithms We will have to perform 2000 augmentations! Worst case complexity: O(m × Total_Flow), a pseudo-polynomial bound that depends on the flow (n: # nodes, m: # arcs).
108
Augmenting Path Based Algorithms Dinic: choose the shortest augmenting path. Worst case complexity: O(m n²) (n: # nodes, m: # arcs).
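The 2000-augmentation example above can be reproduced in code. The sketch below implements augmenting paths found by BFS (shortest paths, in the spirit of Edmonds-Karp/Dinic rather than arbitrary Ford-Fulkerson choices), so it reaches the maximum flow of 2000 in just two augmentations; the capacity dictionary mirrors the a_1/a_2 example, with assumed node names.

```python
from collections import deque

def max_flow(cap, s, t):
    """Augmenting-path max-flow with BFS path selection.

    cap: dict of dicts, cap[u][v] = non-negative integer capacity.
    """
    flow = 0
    # Residual capacities, including reverse arcs initialised to 0.
    res = {u: dict(vs) for u, vs in cap.items()}
    for u, vs in cap.items():
        for v in vs:
            res.setdefault(v, {}).setdefault(u, 0)
    while True:
        # BFS for a shortest augmenting path with positive residual capacity.
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v, c in res[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow          # no augmenting path remains
        # Find the bottleneck along the path, then push and update residuals.
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(res[u][v] for u, v in path)
        for u, v in path:
            res[u][v] -= push
            res[v][u] += push    # reverse arc allows later undo
        flow += push

# The bad-path example from the slides: 1000-capacity outer arcs,
# a capacity-1 arc between a_1 and a_2.
cap = {'s': {'a1': 1000, 'a2': 1000}, 'a1': {'a2': 1, 't': 1000},
       'a2': {'t': 1000}, 't': {}}
```

Choosing shortest paths avoids ever routing flow through the capacity-1 arc here, which is exactly the pathology the Ford-Fulkerson slides illustrate.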
110
Maxflow in Computer Vision Specialized algorithms for vision problems: grid graphs; low connectivity (m ~ O(n)). Dual search tree augmenting path algorithm [Boykov and Kolmogorov, PAMI 2004]: finds approximate shortest augmenting paths efficiently; high worst-case time complexity; empirically outperforms other algorithms on vision problems. Efficient code available on the web: http://pub.ist.ac.at/~vnk/software.html
111
Outline Minimum Cut Problem Two-Label Submodular Energy Functions Move-Making Algorithms
112
Metric Labeling P(v) = exp{-Σ_α θ_α Φ_α(v) - A(θ)}. Random variables V = {V_1, V_2, …, V_n}; neighborhood over variables specified by edges E; label set L = {0, …, h-1}. Sufficient statistics and parameters: I_a;i(v_a) with θ_a;i for all V_a ∈ V, l_i ∈ L; I_ab;ik(v_a,v_b) with θ_ab;ik for all (V_a,V_b) ∈ E, l_i, l_k ∈ L. θ_ab;ik is a metric distance function over labels.
113
Metric Labeling P(v) = exp{-Σ_a Σ_i θ_a;i I_a;i(v_a) - Σ_(a,b) Σ_(i,k) θ_ab;ik I_ab;ik(v_a,v_b) - A(θ)}. Random variables V = {V_1, V_2, …, V_n}; neighborhood over variables specified by edges E; label set L = {0, …, h-1}. Sufficient statistics and parameters: I_a;i(v_a) with θ_a;i for all V_a ∈ V, l_i ∈ L; I_ab;ik(v_a,v_b) with θ_ab;ik for all (V_a,V_b) ∈ E, l_i, l_k ∈ L. θ_ab;ik is a metric distance function over labels.
114
Stereo Correspondence Disparity Map
115
Stereo Correspondence L = {disparities}. Pixel (x_a, y_a) in the left image corresponds to pixel (x_a + v_a, y_a) in the right image.
116
Stereo Correspondence L = {disparities}. θ_a;i is proportional to the difference in RGB values.
117
Stereo Correspondence L = {disparities}. θ_ab;ik = w_ab d(i,k), with w_ab proportional to exp(-(d_a - d_b)²).
118
Move-Making Algorithms Space of All Labelings f
119
Expansion Algorithm (Boykov, Veksler and Zabih, 2001) Initialize the labeling f = f_0 (say f_0(a) = 0 for all V_a). For α = 0, 1, …, h-1: compute f_α = argmin_f' Q(f') s.t. f'(a) ∈ {f(a)} ∪ {l_α}, and update f = f_α. Repeat until convergence.
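The loop above can be sketched directly. Here the expansion move is solved by brute force over the two choices per variable (keep the current label or switch to l_α), which is exponential and only viable on toy problems; in practice that argmin is exactly the two-label st-min-cut construction from earlier. The toy energy below is made up for illustration.

```python
import itertools

def expansion(energy, n, labels, f0):
    """Alpha-expansion: each move lets every variable keep its label
    or switch to alpha; the move is solved by brute force here."""
    f = list(f0)
    while True:
        improved = False
        for alpha in labels:
            choices = [(f[a], alpha) for a in range(n)]
            best = min(itertools.product(*choices), key=energy)
            if energy(list(best)) < energy(f):
                f, improved = list(best), True
        if not improved:
            return f

# Toy energy (illustrative): unary terms prefer the target labeling,
# Potts pairwise terms on a chain encourage equal neighboring labels.
target = [0, 2, 2]
def toy_energy(f):
    unary = sum(f[a] != target[a] for a in range(3))
    pairwise = 0.5 * sum(f[a] != f[a + 1] for a in range(2))
    return unary + pairwise

f_star = expansion(toy_energy, 3, [0, 1, 2], [1, 1, 1])
```

Each accepted move strictly decreases the energy, so the loop terminates at a labeling from which no single expansion improves.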
120
Expansion Algorithm Variables take label l α or retain current label Slide courtesy Pushmeet Kohli
121
Expansion Algorithm (Figure: labels Sky, House, Tree, Ground. Initialize with Tree; status: expand Ground, expand House, expand Sky.) Variables take label l_α or retain their current label. Slide courtesy Pushmeet Kohli.
122
Expansion Algorithm Restriction on pairwise potentials? θ_ab;ik + θ_ab;αα ≤ θ_ab;iα + θ_ab;αk, which holds for metric labeling.
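The condition above is easy to test for a given pairwise cost table; the helper below (hypothetical, not from the slides) checks it for every triple of labels. Any metric distance satisfies it, since θ_αα = 0 and the triangle inequality gives θ_ik ≤ θ_iα + θ_αk.

```python
def admits_expansion(theta_ab, labels):
    """Check theta_ab;ik + theta_ab;aa <= theta_ab;ia + theta_ab;ak
    for all labels i, k and all expansion labels a (alpha)."""
    return all(theta_ab[i][k] + theta_ab[a][a] <= theta_ab[i][a] + theta_ab[a][k]
               for i in labels for k in labels for a in labels)
```

For example, the Potts model passes, while the (untruncated) quadratic distance over three labels fails, which is why expansion moves require truncation or another metric there.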