Slide 1: Mean Field Approximation for CRF Inference
Slide 2: CRF Inference Problem
A CRF is defined over a set of variables X = {X_1, ..., X_N}, with a Gibbs distribution given by a product of clique potentials.
MAP inference: find the joint assignment with maximum posterior probability.
MPM (maximum posterior marginals) inference: find, for each variable, the label that maximizes its posterior marginal.
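The slide's equations were lost in extraction; below is a standard reconstruction of the CRF distribution and the two inference problems (the symbols E, Z, and psi_c are conventional choices, not taken from the original slides):

```latex
% Gibbs distribution of a CRF over X = {X_1, ..., X_N}
P(X) = \frac{1}{Z} \exp\bigl(-E(X)\bigr), \qquad
E(X) = \sum_{c \in \mathcal{C}} \psi_c(X_c), \qquad
Z = \sum_{X} \exp\bigl(-E(X)\bigr)

% MAP inference:
x^* = \arg\max_{x} P(X = x)

% MPM inference: for each variable i,
x_i^* = \arg\max_{x_i} P(X_i = x_i) = \arg\max_{x_i} \sum_{X \setminus X_i} P(X)
```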
Slide 3: Other Notation
The unnormalized distribution (the product of potentials before dividing by Z), the variational distribution Q, expectations under Q, and the entropy of Q.
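A plausible reconstruction of the notation the slide introduces (the lecture's original symbols may differ):

```latex
\tilde{P}(X) = \exp\bigl(-E(X)\bigr) = Z \cdot P(X)
  \qquad \text{(unnormalized distribution)}

Q(X) \qquad \text{(variational distribution approximating } P\text{)}

\mathbb{E}_Q[f] = \sum_{x} Q(x)\, f(x)
  \qquad \text{(expectation under } Q\text{)}

H(Q) = -\sum_{x} Q(x) \log Q(x)
  \qquad \text{(entropy of } Q\text{)}
```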
Slide 4: Variational Inference
Inference is cast as minimizing the KL divergence KL(Q || P) between a tractable variational distribution Q and the true distribution P, which is equivalent to maximizing a general objective (free-energy) functional.
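The equivalence between KL minimization and the general objective, reconstructed in the notation above:

```latex
KL(Q \,\|\, P) = \sum_{x} Q(x) \log \frac{Q(x)}{P(x)}
             = -\mathbb{E}_Q\bigl[\log \tilde{P}(X)\bigr] - H(Q) + \log Z

% Since log Z is a constant, minimizing KL(Q || P) is equivalent to
% maximizing the objective functional
F[Q] = \mathbb{E}_Q\bigl[\log \tilde{P}(X)\bigr] + H(Q)
```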
Slide 5: Mean Field Approximation
The variational distribution is restricted to a product of independent marginals. Expectations of clique potentials then factor over the marginals, and the entropy decomposes into a sum of per-variable entropies.
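In formulas (a standard reconstruction of the slide's lost equations):

```latex
Q(X) = \prod_{i=1}^{N} Q_i(X_i)

\mathbb{E}_Q\bigl[\psi_c(X_c)\bigr]
  = \sum_{x_c} \Bigl(\prod_{i \in c} Q_i(x_i)\Bigr)\, \psi_c(x_c)

H(Q) = \sum_{i=1}^{N} H(Q_i)
```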
Slide 6: Mean Field Objective
Substituting the factored distribution into the general objective yields the mean field objective.
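With log P-tilde = -E(X) = -sum of clique potentials, the mean field objective becomes (reconstructed):

```latex
F[Q] = -\sum_{c \in \mathcal{C}} \mathbb{E}_Q\bigl[\psi_c(X_c)\bigr]
       + \sum_{i=1}^{N} H(Q_i)
```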
Slide 7: Local Optimality Conditions
Form the Lagrangian of the objective with a normalization constraint on each Q_i; setting its derivatives to zero gives the conditions for local optimality.
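The resulting fixed-point condition, reconstructed in standard form (Z_i is a per-variable normalizer; Q_{-i} denotes the marginals of all variables other than i):

```latex
\mathcal{L} = F[Q] + \sum_{i} \lambda_i \Bigl(\sum_{x_i} Q_i(x_i) - 1\Bigr)

% Setting \partial \mathcal{L} / \partial Q_i(x_i) = 0 gives
Q_i(x_i) = \frac{1}{Z_i} \exp\Bigl(
  -\sum_{c \ni i} \mathbb{E}_{Q_{-i}}\bigl[\psi_c(X_c) \mid X_i = x_i\bigr]
\Bigr)
```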
Slide 8: Coordinate Ascent
Sequential coordinate ascent:
1. Initialize each Q_i to the uniform distribution.
2. For i = 1, ..., N, update the vector Q_i by summing expectations over all cliques involving X_i, while holding all Q_j, j != i, fixed.
Parallel updates algorithm: as above, but perform the updates in step 2 for all Q_i's in parallel (i.e., generating Q^1, Q^2, ...).
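The sequential algorithm above can be sketched in code. This is a minimal illustration on a pairwise MRF with a shared pairwise potential, not the lecture's implementation; the two-node example at the bottom is hypothetical.

```python
import numpy as np

def mean_field(unary, pairwise, edges, n_iters=50):
    """Sequential coordinate-ascent mean field for a pairwise MRF.

    unary:    (N, L) array of unary potentials psi_i(x_i) (energies; lower is better)
    pairwise: (L, L) array of pairwise potentials psi(x_i, x_j), shared by all edges
    edges:    list of (i, j) index pairs
    Returns an (N, L) array of approximate marginals Q_i(x_i).
    """
    N, L = unary.shape
    Q = np.full((N, L), 1.0 / L)           # step 1: uniform initialization
    nbrs = [[] for _ in range(N)]          # adjacency list
    for i, j in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)
    for _ in range(n_iters):
        for i in range(N):                 # step 2: sequential sweep over variables
            # expected energy of each label x_i, holding all other Q_j fixed
            e = unary[i].copy()
            for j in nbrs[i]:
                e += pairwise @ Q[j]       # sum_{x_j} psi(x_i, x_j) Q_j(x_j)
            q = np.exp(-(e - e.min()))     # subtract min for numerical stability
            Q[i] = q / q.sum()
    return Q

# Hypothetical two-node chain with a Potts smoothness term: the strong unary
# preference of node 0 for label 0 pulls node 1 toward the same label.
unary = np.array([[0.0, 2.0], [1.0, 1.0]])
pairwise = np.array([[0.0, 1.0], [1.0, 0.0]])   # penalty 1 for disagreement
Q = mean_field(unary, pairwise, [(0, 1)])
```

Each update is exactly the fixed-point condition from the previous slide, applied with the other marginals frozen; the parallel variant would compute all updated rows from the old Q before overwriting any of them.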
Slide 9: Comparison with Belief Propagation
Belief propagation optimizes a different objective: the factored energy functional, over the local polytope of locally consistent pseudo-marginals rather than over fully factored distributions.
Slide 10: Comparison with Belief Propagation
Message updates, and extraction of beliefs after convergence.
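The slide's equations were lost; the standard sum-product updates for a pairwise model, in the energy convention used above, are:

```latex
% Message from node i to node j:
m_{i \to j}(x_j) \propto \sum_{x_i}
  \exp\bigl(-\psi_i(x_i) - \psi_{ij}(x_i, x_j)\bigr)
  \prod_{k \in N(i) \setminus \{j\}} m_{k \to i}(x_i)

% Beliefs after convergence:
b_i(x_i) \propto \exp\bigl(-\psi_i(x_i)\bigr) \prod_{k \in N(i)} m_{k \to i}(x_i)
```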
Slide 11: Comparison with Belief Propagation
On Bethe cluster graphs, the factored energy functional reduces to the Bethe free energy; for pairwise graphs this takes its familiar form.
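The pairwise Bethe free energy, reconstructed in the sign convention of the mean field objective above (d_i denotes the degree of node i):

```latex
F_{\text{Bethe}}[b] =
  -\sum_{(i,j)} \mathbb{E}_{b_{ij}}\bigl[\psi_{ij}\bigr]
  - \sum_{i} \mathbb{E}_{b_i}\bigl[\psi_i\bigr]
  + \sum_{(i,j)} H(b_{ij})
  - \sum_{i} (d_i - 1)\, H(b_i)
```

Compared with the mean field objective, the fully factored entropy is replaced by the Bethe entropy over edge and node beliefs.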
Slide 12: Mean Field Updates in the Dense CRF (Krähenbühl & Koltun, NIPS '11)
In the fully connected CRF with Gaussian pairwise potentials, the message-passing step of each mean field update is evaluated efficiently using high-dimensional filtering.
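The dense CRF update has the following form (mu is the label compatibility function, k^(m) are Gaussian kernels over feature vectors f_i with weights w^(m)):

```latex
Q_i(l) \propto \exp\Bigl(
  -\psi_u(x_i = l)
  - \sum_{l'} \mu(l, l') \sum_{m=1}^{K} w^{(m)}
      \sum_{j \neq i} k^{(m)}(\mathbf{f}_i, \mathbf{f}_j)\, Q_j(l')
\Bigr)
```

The inner sum over j is a Gaussian convolution of the current marginals in feature space, which is what the filtering step evaluates in linear time.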
Slide 13: Higher-Order Potentials
Pattern-based potentials and P^n Potts potentials.
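The P^n Potts potential assigns a low cost when all variables in a clique take the same label and a fixed higher cost otherwise; a standard reconstruction (gamma symbols assumed, not from the slides):

```latex
\psi_c(x_c) =
\begin{cases}
\gamma_l & \text{if } x_i = l \ \text{for all } i \in c \\
\gamma_{\max} & \text{otherwise}
\end{cases}
\qquad \gamma_l \le \gamma_{\max}
```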
Slide 14: Higher-Order Potentials
Co-occurrence potentials penalize unlikely combinations of labels appearing together:
L(X) = the set of labels present in the assignment X
{Y_1, ..., Y_L} = a set of binary latent variables, one per label, indicating whether that label is used