Hierarchical Image-Motion Segmentation using Swendsen-Wang Cuts
Adrian Barbu, Siemens Corporate Research, Princeton, NJ
Harvard, May 14th
Acknowledgements: S.C. Zhu, Y.N. Wu, A.L. Yuille et al.
Talk Outline
- The Swendsen-Wang Cuts algorithm
  - The original Swendsen-Wang algorithm
  - Generalization to arbitrary probabilities
  - Multi-Grid and Multi-Level Swendsen-Wang Cuts
- Application: Hierarchical Image-Motion Segmentation
- Conclusions and future work
Swendsen-Wang for Ising / Potts Models
Swendsen-Wang (1987) is an extremely smart idea that flips a patch at a time. Each edge in the lattice e = ⟨s, t⟩ is associated with a probability q = e^{-β}.
1. If s and t have different labels at the current state, e is turned off. If s and t have the same label, e is turned off with probability q. Thus each object is broken into a number of connected components (subgraphs).
2. One or many components are chosen at random.
3. The collective label is changed randomly to any of the labels.
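As a concrete illustration, a minimal NumPy sketch of one such sweep on a 2D lattice follows; the function name, the small union-find helper, and the use of NumPy are my own choices rather than anything specified in the talk.

```python
import numpy as np

def swendsen_wang_sweep(labels, beta, num_states, rng):
    """One Swendsen-Wang sweep for a Potts model on a 2D lattice (no external field)."""
    H, W = labels.shape
    # Bond variables: between same-label neighbours, turn the edge "off" with
    # probability q = exp(-beta), i.e. "on" with probability 1 - exp(-beta).
    p_on = 1.0 - np.exp(-beta)
    on_right = (labels[:, :-1] == labels[:, 1:]) & (rng.random((H, W - 1)) < p_on)
    on_down = (labels[:-1, :] == labels[1:, :]) & (rng.random((H - 1, W)) < p_on)

    # Connected components of the "on" bonds via a small union-find.
    parent = list(range(H * W))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    idx = np.arange(H * W).reshape(H, W)
    for a, b in zip(idx[:, :-1][on_right], idx[:, 1:][on_right]):
        parent[find(a)] = find(b)
    for a, b in zip(idx[:-1, :][on_down], idx[1:, :][on_down]):
        parent[find(a)] = find(b)

    # Relabel each component with a fresh, uniformly random Potts state.
    roots = [find(i) for i in range(H * W)]
    new_state = {r: int(rng.integers(num_states)) for r in set(roots)}
    return np.array([new_state[r] for r in roots]).reshape(H, W)

# Usage sketch: rng = np.random.default_rng(0)
#               labels = rng.integers(3, size=(64, 64))
#               labels = swendsen_wang_sweep(labels, beta=1.0, num_states=3, rng=rng)
```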
The Swendsen-Wang Algorithm
Pros:
- Computationally efficient in sampling the Ising/Potts models
Cons:
- Limited to Ising/Potts models and factorized distributions
- Not informed by data; slows down in the presence of an external field (data term)
Swendsen-Wang Cuts:
- Generalizes Swendsen-Wang to arbitrary posterior probabilities
- Improves the clustering step by using the image data
SW Cuts: the Acceptance Probability
Theorem (Metropolis-Hastings). For any proposal probability q(A→B) and target probability p(A), if the Markov chain moves by taking samples from q(A→B) which are accepted with probability α(A→B), then the Markov chain is reversible with respect to p and has stationary distribution p.
Theorem (Barbu, Zhu '03). The acceptance probability for the Swendsen-Wang Cuts algorithm is given by the formula below.
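Written out, the two acceptance probabilities take the following form (the SWC expression follows the Barbu-Zhu PAMI paper; here V_l denotes the subgraph containing V_0 in state A, V_{l'} the subgraph it is proposed to join in state B, and C(V_0, V_l \ V_0) the set of edges between V_0 and the rest of V_l):

```latex
% Metropolis-Hastings acceptance probability:
\[
\alpha(A \to B) \;=\; \min\!\left(1,\;
  \frac{q(B \to A)\, p(B)}{q(A \to B)\, p(A)}\right)
\]
% SWC acceptance probability (Barbu-Zhu):
\[
\alpha(A \to B) \;=\; \min\!\left(1,\;
  \frac{\prod_{e \in C(V_0,\, V_{l'} \setminus V_0)} (1 - q_e)}
       {\prod_{e \in C(V_0,\, V_{l} \setminus V_0)} (1 - q_e)}
  \cdot
  \frac{q(l \mid V_0, B)}{q(l' \mid V_0, A)}
  \cdot
  \frac{p(B \mid I)}{p(A \mid I)}\right)
\]
```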
The Swendsen-Wang Cuts Algorithm
Swendsen-Wang Cuts: SWC
Input: G_o = ⟨V, E_o⟩, discriminative probabilities q_e, e ∈ E_o, and generative posterior probability p(W|I).
Output: Samples W ~ p(W|I).
1. Initialize a graph partition π.
2. Repeat, for the current state A = π:
3. Repeat for each subgraph G_l = ⟨V_l, E_l⟩, l = 1, 2, ..., n in A:
4. For e ∈ E_l, turn e "on" with probability q_e.
5. Partition G_l into n_l connected components: g_li = ⟨V_li, E_li⟩, i = 1, ..., n_l.
6. Collect all the connected components in CP = {V_li : l = 1, ..., n, i = 1, ..., n_l}.
7. Select a connected component V_0 ∈ CP at random.
8. Propose to reassign V_0 to a subgraph G_l', where l' follows a probability q(l'|V_0, A).
9. Accept the move with probability α(A→B).
(Illustration on the slide: the initial graph G_o, the component set CP, and the states A and B.)
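Putting steps 1-9 together, a minimal Python sketch of a single SWC move follows. It is a hypothetical skeleton, not the implementation from the paper: adj, q_e, posterior (standing in for p(W|I)), and proposal_prob (standing in for the normalized q(l'|V_0, A)) are placeholders the caller supplies.

```python
import random

def swc_step(adj, q_e, labels, posterior, proposal_prob, rng=random):
    """One Swendsen-Wang Cuts move.
    adj: {node: set of neighbours}; q_e: {(s, t): discriminative edge prob};
    labels: {node: integer subgraph label}; posterior(labels) ~ p(W|I);
    proposal_prob(V0, l, labels): assumed to return the normalized q(l | V0, state)."""
    # Steps 3-7: grow a connected component V0 inside the current subgraph by
    # turning edges "on" with probability q_e (seeding a random vertex and
    # growing its cluster is one common way to realize the clustering step).
    seed = rng.choice(list(adj))
    V0, frontier = {seed}, [seed]
    while frontier:
        s = frontier.pop()
        for t in adj[s]:
            if t not in V0 and labels[t] == labels[seed] and rng.random() < q_e[(s, t)]:
                V0.add(t)
                frontier.append(t)

    # Step 8: propose to reassign V0 to a subgraph l' (possibly a brand-new one).
    old_label = labels[seed]
    candidates = sorted(set(labels.values())) + [max(labels.values()) + 1]
    weights = [proposal_prob(V0, l, labels) for l in candidates]
    new_label = rng.choices(candidates, weights=weights, k=1)[0]
    labels_B = dict(labels)
    for v in V0:
        labels_B[v] = new_label

    # Step 9: acceptance ratio = cut ratio * proposal ratio * posterior ratio.
    def cut(target_label):
        w = 1.0
        for s in V0:
            for t in adj[s]:
                if t not in V0 and labels[t] == target_label:
                    w *= 1.0 - q_e[(s, t)]
        return w

    ratio = (cut(new_label) / cut(old_label)) \
        * (proposal_prob(V0, old_label, labels_B) / proposal_prob(V0, new_label, labels)) \
        * (posterior(labels_B) / posterior(labels))
    return labels_B if rng.random() < min(1.0, ratio) else dict(labels)
```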
Advantages of the SW Cuts Algorithm
Our algorithm bridges the gap between the specialized and generic algorithms:
- Generally applicable: allows usage of complex models beyond the scope of the specialized algorithms
- Computationally efficient: performance comparable with the specialized algorithms
- Reversible and ergodic: theoretically guaranteed to eventually find the global optimum
Hierarchical Image-Motion Segmentation
Three-level representation:
- Level 0: Pixels are grouped into atomic regions r_ijk of relatively constant motion and intensity, with motion parameters (u_ijk, v_ijk) and intensity histogram h_ijk.
- Level 1: Atomic regions are grouped into intensity regions R_ij of coherent motion, with intensity models H_ij.
- Level 2: Intensity regions are grouped into moving objects O_i with motion parameters θ_i.
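For concreteness, the three levels can be held in simple containers like the following sketch; the class and field names are illustrative, and only the quantities they store come from the slide.

```python
from dataclasses import dataclass
from typing import List, Tuple
import numpy as np

@dataclass
class AtomicRegion:                        # Level 0: r_ijk
    pixels: List[Tuple[int, int]]          # pixel coordinates
    motion: Tuple[float, float]            # (u_ijk, v_ijk)
    histogram: np.ndarray                  # intensity histogram h_ijk

@dataclass
class IntensityRegion:                     # Level 1: R_ij
    atomic_regions: List[AtomicRegion]     # grouped atomic regions of coherent motion
    intensity_model: np.ndarray            # intensity model H_ij

@dataclass
class MovingObject:                        # Level 2: O_i
    regions: List[IntensityRegion]         # grouped intensity regions
    motion_params: np.ndarray              # object-level motion parameters θ_i
```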
Multi-Grid SWC
1. Select an attention window within G.
2. Cluster the vertices within the window and select a connected component R.
3. Swap the label of R.
4. Accept the swap with probability α(A→B), using the labels outside the window as boundary condition.
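Reusing the hypothetical swc_step sketch above, one multi-grid move can be sketched as an SWC move restricted to the window, with the frozen outside labels entering only through the conditioned posterior:

```python
import random

def multigrid_swc_step(adj, q_e, labels, posterior, proposal_prob, window, rng=random):
    """One SWC move restricted to `window` (a set of vertices); labels outside
    the window are held fixed and act as boundary conditions via the posterior."""
    inside = set(window)
    # Restrict the graph to the window; edges leaving the window are dropped.
    sub_adj = {v: {t for t in adj[v] if t in inside} for v in inside}
    sub_labels = {v: labels[v] for v in inside}
    # Condition the posterior on the frozen labels outside the window.
    cond_posterior = lambda lab: posterior({**labels, **lab})
    new_sub = swc_step(sub_adj, q_e, sub_labels, cond_posterior, proposal_prob, rng)
    return {**labels, **new_sub}
```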
Multi-Level SWC
1. Select a level s, usually in increasing order.
2. Cluster the vertices in G(s) and select a connected component R.
3. Swap the label of R.
4. Accept the swap with probability α(A→B), using the lower levels, denoted by X(<s), as boundary conditions.
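Similarly, a multi-level sweep can be sketched as running swc_step level by level while conditioning on the lower levels X(<s); level_graphs, level_edge_probs, and posterior_given_lower are hypothetical containers for the per-level state:

```python
import random

def multilevel_swc_sweep(level_graphs, level_edge_probs, level_labels,
                         posterior_given_lower, proposal_prob, rng=random):
    """Sweep the levels s = 0, 1, 2, ... in increasing order; at each level the
    lower levels X(<s) are held fixed and enter only through the posterior."""
    for s, adj_s in enumerate(level_graphs):
        lower = [dict(l) for l in level_labels[:s]]      # boundary condition X(<s)
        posterior_s = lambda lab, lower=lower: posterior_given_lower(lab, lower)
        level_labels[s] = swc_step(adj_s, level_edge_probs[s], level_labels[s],
                                   posterior_s, proposal_prob, rng)
    return level_labels
```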
Hierarchical Image-Motion Segmentation: Bayesian Formulation
Modeling occlusion: the pixels of each moving object are divided into motion pixels and accreted (disoccluded) pixels.
In the Bayesian formulation, the motion pixels are explained by the motion model, while the intensity segmentation factor uses generative and histogram models.
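Schematically, and in illustrative notation rather than the exact model from the paper, the likelihood factorizes with motion pixels explained by the object motion (u_i, v_i) and accreted pixels A_i falling back to the histogram model H of their region:

```latex
% Schematic likelihood (illustrative notation): motion pixels in O_i \setminus A_i
% are explained by the motion (u_i, v_i); accreted pixels A_i are explained by
% the intensity (histogram) model H of their region.
\[
p(I_{t+1} \mid W, I_t) \;\propto\;
\prod_i \Bigl[
  \prod_{(x,y) \in O_i \setminus A_i}
     P\bigl(I_{t+1}(x,y) \mid I_t(x - u_i,\, y - v_i)\bigr)
  \prod_{(x,y) \in A_i}
     P\bigl(I_{t+1}(x,y) \mid H\bigr)
\Bigr]
\]
```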
Hierarchical Image-Motion Segmentation
The prior has factors for:
- Smoothness of motion
- Main motion for each object
- Boundary length
- Number of labels
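In schematic form (the λ weights and energy-term names are illustrative, not the paper's exact definitions), such a prior can be written as:

```latex
% Schematic prior with the four factors named on the slide; |W| stands for the
% number of labels and |\partial R_j| for the boundary length of region R_j.
\[
p(W) \;\propto\; \exp\Bigl\{
  - \lambda_1 \sum_i E_{\mathrm{smooth}}(u_i, v_i)
  - \lambda_2 \sum_i E_{\mathrm{main}}(O_i)
  - \lambda_3 \sum_j |\partial R_j|
  - \lambda_4 \, |W|
\Bigr\}
\]
```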
Designing the Edge Weights
- Level 0: edge weights based on pixel intensity similarity and common motion.
- Level 1: edge weights based on similarity of the intensity histograms H_i and H_j.
- Level 2: edge weights based on similarity of the motion histograms M_i and M_j.
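For the histogram-based levels (1 and 2), one common way to turn histogram similarity into a discriminative edge probability q_e is via a symmetrized KL divergence; the exact divergence and the temperature T below are illustrative choices, not necessarily the formulas used in the talk.

```python
import numpy as np

def kl(p, q, eps=1e-8):
    """KL divergence between two (unnormalized) histograms, with smoothing."""
    p = p.astype(float) + eps
    q = q.astype(float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def edge_weight(hist_i, hist_j, T=1.0):
    """q_e in (0, 1]: close to 1 for similar histograms, small for dissimilar ones."""
    d = 0.5 * (kl(hist_i, hist_j) + kl(hist_j, hist_i))   # symmetrized KL
    return float(np.exp(-d / T))
```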
Experiments
For each input sequence, the image segmentation and the motion segmentation results are shown.
Conclusion
Two extensions:
- Swendsen-Wang Cuts
  - Samples arbitrary probabilities on graph partitions
  - Efficient by using data-driven techniques
  - Hundreds of times faster than the Gibbs sampler
- Marginal Space Learning
  - Constrains the search by learning in marginal spaces
  - Six orders of magnitude speedup with great accuracy
  - Robust, complex statistical models by supervised learning
Future Work
- Algorithm Boosting
  - Any algorithm has a success rate and an error rate
  - Algorithms can be combined into a more robust algorithm by supervised learning
  - Proof of concept for image registration
- Hierarchical Computing
  - Efficient representation of top-down and bottom-up communication using specialized dictionaries
  - Robust integration of multiple MSL paths by Algorithm Boosting
- Applications to medical imaging
  - 3D curve localization and tracking
  - Brain segmentation
  - Lymph node detection
References
A. Barbu, S.C. Zhu. Generalizing Swendsen-Wang to Sampling Arbitrary Posterior Probabilities. IEEE Trans. PAMI, August 2005.
A. Barbu, S.C. Zhu. Generalizing Swendsen-Wang for Image Analysis. To appear in J. Comp. Graph. Stat.
Thank You!