Ch 6. Markov Random Fields (6.1 ~ 6.3)
Adaptive Cooperative Systems, Martin Beckerman, 1997
Summarized by H.-W. Lim
Biointelligence Laboratory, Seoul National University
http://bi.snu.ac.kr/
Contents
6.1 Definitions and Introductory Remarks
    6.1.1 The Markov P-Process
    6.1.2 Markov Random Fields and Neighborhood Systems
    6.1.3 Gibbs Random Fields and Clique Potentials
    6.1.4 The Gibbs-Markov Equivalence
6.2 Random Fields and Image Processes
    6.2.1 Random Field Models
    6.2.2 Gibbs Distributions and Simulated Annealing
    6.2.3 Potential and Deterministic Approaches
6.3 The Hammersley-Clifford Theorem for Finite Lattices
Markov P-Process
Markov random field (MRF)
  - A mathematical generalization of the notion of a one-dimensional temporal Markov chain to a two-dimensional (or higher-dimensional) spatial lattice or graph.
Markov P-process
  - Partitioning of space: the region G+ is at least a distance P from the region G-, with the region G separating them.
  - X_{G+}, X_G, X_{G-}: the configurations of the lattice elements in each region.
  - Spatial Markovian property: conditioned on the separating region G, the configuration on G- is independent of the configuration on G+ (written out below).
(Figure: space partitioned into regions G+, G, and G-, with G separating G+ from G-.)
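A sketch of the spatial Markovian property in LaTeX notation (the chapter's exact typography may differ):

p\big(X_{G^-} \mid X_{G}, X_{G^+}\big) \;=\; p\big(X_{G^-} \mid X_{G}\big)

That is, once the configuration on the separating region G is fixed, the configuration on G- carries no further dependence on the configuration on G+.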
Neighborhood System
Random field (lattice, configuration space, joint probability measure p)
  - The lattice: a discrete two-dimensional rectangular lattice.
  - X_mn: a random variable defined on the lattice that takes on the value x_mn at lattice site (m, n).
  - X: a configuration of the lattice system, i.e. the set of values of all the random variables.
  - The configuration space: the set of all possible configurations of the random variables.
  - p: the joint probability measure.
Neighborhood system
  - No causality is assumed.
  - There are several different orders of neighborhood system on the lattice (Fig. 6.2).
  - A neighborhood system N_ij is associated with a lattice in the same way that adjacency is associated with an undirected graph.
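A minimal sketch (not from the book) of neighborhood systems on a rectangular lattice; the order definitions follow the usual convention for Fig. 6.2-style systems (first order = the 4 nearest sites, second order adds the 4 diagonal sites):

# Minimal sketch (not from the book): neighborhood systems N_ij on an H x W
# rectangular lattice. First order = the 4 nearest sites; second order adds
# the 4 diagonal sites.
FIRST_ORDER = [(-1, 0), (1, 0), (0, -1), (0, 1)]
SECOND_ORDER = FIRST_ORDER + [(-1, -1), (-1, 1), (1, -1), (1, 1)]

def neighborhood(i, j, height, width, offsets=FIRST_ORDER):
    """Return the neighborhood N_ij of site (i, j), clipped at the lattice boundary."""
    return [(i + di, j + dj) for di, dj in offsets
            if 0 <= i + di < height and 0 <= j + dj < width]

# Usage: a corner site of a 4x4 lattice has only two first-order neighbors.
print(neighborhood(0, 0, 4, 4))                # [(1, 0), (0, 1)]
print(neighborhood(2, 2, 4, 4, SECOND_ORDER))  # 8 neighbors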
Markov Random Field
A generalization of the Ising model of Ch. 3.
A random field for which:
  - The joint probability distribution has associated conditional probabilities that are local, i.e. they satisfy the spatial Markovian relationship written out below, where N_mn is the neighborhood system for lattice site (m, n).
  - The probability distribution is positive definite for all values of the random variables.
  - The conditional probabilities are invariant with respect to neighborhood translations.
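The local conditional-probability relationship, reconstructed in LaTeX from the definitions above (the chapter's exact notation may differ):

p\big(x_{mn} \mid x_{kl},\ (k,l) \neq (m,n)\big) \;=\; p\big(x_{mn} \mid x_{kl},\ (k,l) \in N_{mn}\big)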
Gibbs Random Fields and Clique Potentials (1)
Defining a random field by a Gibbs distribution
Gibbs random field (lattice, configuration space, p)
  - The joint probability distribution is of the Gibbs form p(x) = (1/Z) exp[-U(x)/T], where
  - x: a configuration
  - U(x): a potential encapsulating the global properties of the system
  - Z: the partition function
  - T: the temperature (as in simulated annealing)
Potentials
  - U(x): the sum of individual contributions V_i(x_i) from each lattice site.
  - V_c(x_c): a clique potential.
  - C_i: the set of all cliques associated with site i.
(Question raised in the summary: is the potential defined only over maximal cliques?)
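A minimal sketch (the lattice size and the constants H, J, T are illustrative, not from the book) of a Gibbs distribution p(x) = exp[-U(x)/T]/Z built from single-site and pair clique potentials on a tiny binary lattice, with the partition function Z evaluated by brute-force enumeration:

import itertools
import math

# Minimal sketch: a 2x2 binary (+/-1) lattice with an Ising-like Gibbs
# distribution built from single-site and pair clique potentials.
SITES = [(0, 0), (0, 1), (1, 0), (1, 1)]
PAIRS = [((0, 0), (0, 1)), ((1, 0), (1, 1)),   # horizontal neighbor cliques
         ((0, 0), (1, 0)), ((0, 1), (1, 1))]   # vertical neighbor cliques
H, J, T = 0.5, 1.0, 1.0   # external field, coupling, temperature (illustrative values)

def potential(x):
    """U(x) = sum of clique potentials: single-site terms plus pair terms."""
    u = sum(-H * x[s] for s in SITES)               # single-site cliques
    u += sum(-J * x[a] * x[b] for a, b in PAIRS)    # pair cliques
    return u

configs = [dict(zip(SITES, vals)) for vals in itertools.product([-1, 1], repeat=len(SITES))]
Z = sum(math.exp(-potential(x) / T) for x in configs)   # partition function
probs = {tuple(x[s] for s in SITES): math.exp(-potential(x) / T) / Z for x in configs}

print(f"Z = {Z:.4f}")
print("most probable configuration:", max(probs, key=probs.get))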
Gibbs Random Fields and Clique Potentials (2)
A generalization of the nearest-neighbor interactions of the Ising model.
In the Ising model:
  - Single-site terms denote the interaction of a spin element with an external field.
  - Two-body terms denote the interaction between adjacent spin elements.
In the Gibbs random field:
  - The clique potentials can reflect more elaborate sets of interactions: single-site, pair, three-body, and higher-order interactions.
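For concreteness, the Ising energy written as a sum of clique potentials (a standard form; H and J denote the external field and the nearest-neighbor coupling):

U(x) \;=\; -H \sum_{i} x_i \;-\; J \sum_{\langle i, j \rangle} x_i x_j,
\qquad x_i \in \{-1, +1\},

so the only nonzero clique potentials are the single-site terms and the nearest-neighbor pair terms.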
The Gibbs-Markov Equivalence
Hammersley-Clifford theorem
  - The global character of a Gibbs random field (defined through local interactions) is equivalent to the purely local character of a Markov random field.
  - Gibbs random field == Markov random field, for finite lattices and graphs.
Proofs
  - Polynomial expansion, by Besag (Ch. 6.3)
  - Equivalence based on Möbius inversion, by Grimmett (Ch. 6.4)
Random Field Models for Images (1)
Two-dimensional images are highly organized spatial systems.
MRFs are used for texture analysis, synthesis, reconstruction, segmentation, ...
Causal random field models
  - Markov mesh random field (Abend, Harley, and Kanal)
  - Pickard random field
Causal random field
  - A random variable X_ij in an image, conditioned on the random variables in the region above and to its left, depends only on the random variables at sites immediately above and to the left (written out below).
  - The larger segment of the lattice above and to the left of site (i, j) is distinguished from S_ij, the small subset of that segment called the support set.
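A sketch of the causal (Markov-mesh) property; here L_ij is only a placeholder name for the above-and-left segment (the chapter's own symbol was lost in extraction), and the support set shown is the common third-order choice:

p\big(x_{ij} \mid x_{L_{ij}}\big) \;=\; p\big(x_{ij} \mid x_{S_{ij}}\big),
\qquad S_{ij} \subset L_{ij}, \ \ \text{e.g. } S_{ij} = \{(i-1,j),\,(i,j-1),\,(i-1,j-1)\}.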
Random Field Models for Images (2)
Noncausal models (to be described in later sections)
  - Gauss-Markov random fields
  - Simultaneous autoregressive (SAR) model (can be either causal or noncausal)
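For reference, the SAR model in its commonly used form (an assumption about the standard definition, not taken from the slide): each pixel value is a linear combination of its neighbors plus noise,

x_{mn} \;=\; \sum_{(k,l) \in N} \theta_{kl}\, x_{m+k,\,n+l} \;+\; e_{mn},
\qquad e_{mn} \sim \mathcal{N}(0, \sigma^2),

where a causal model restricts the offset set N to sites above and to the left, and a noncausal model does not.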
Gibbs Distribution and Simulated Annealing in Image Processing
One implication of the Gibbs-Markov equivalence
  - We may exploit the global properties to model the problem, then use the local characteristics to design an algorithm that evaluates the consequences.
  - Example: use simulated annealing (SA) to endow the system with an equilibrium dynamics and carry out a pixel-by-pixel iterative reconstruction of the image (a sketch follows below).
Examples
  - The Geman and Geman method (Ch. 6.8): MRF + SA + Bayesian inference for two-dimensional image processing.
Complementary approximations and alternatives
  - Marroquin's maximizer of the posterior marginals (Ch. 6.9)
  - Besag's iterated conditional modes of the posterior distributions (Ch. 6.10): more rapid convergence than SA.
  - Multiple random field model (Jeng and Woods) (Ch. 6.11)
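A minimal sketch of pixel-by-pixel annealed Gibbs updates for image restoration, not the Geman-Geman implementation itself; the binary +/-1 image, the Ising smoothness prior with coupling BETA, the Gaussian noise model with standard deviation SIGMA, and the cooling schedule are all illustrative assumptions:

import math
import random

BETA, SIGMA = 1.0, 0.8   # illustrative prior coupling and noise level
H, W = 16, 16            # illustrative lattice size

def neighbors(i, j):
    """First-order (4-nearest) neighborhood, clipped at the image border."""
    return [(a, b) for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
            if 0 <= a < H and 0 <= b < W]

def local_energy(x, y, i, j, v):
    """Posterior energy contribution of setting pixel (i, j) to value v."""
    prior = -BETA * sum(v * x[a][b] for a, b in neighbors(i, j))
    likelihood = (y[i][j] - v) ** 2 / (2 * SIGMA ** 2)
    return prior + likelihood

def anneal(y, sweeps=50, t0=4.0, t_min=0.1):
    """Annealed pixel-by-pixel Gibbs sampling over the posterior p(x | y)."""
    x = [[1 if y[i][j] > 0 else -1 for j in range(W)] for i in range(H)]
    t = t0
    for _ in range(sweeps):
        for i in range(H):
            for j in range(W):
                e_plus = local_energy(x, y, i, j, +1)
                e_minus = local_energy(x, y, i, j, -1)
                p_plus = 1.0 / (1.0 + math.exp((e_plus - e_minus) / t))
                x[i][j] = +1 if random.random() < p_plus else -1
        t = max(t * 0.9, t_min)   # geometric cooling schedule
    return x

# Usage: restore a noisy all-ones image.
truth = [[1.0] * W for _ in range(H)]
noisy = [[truth[i][j] + random.gauss(0, SIGMA) for j in range(W)] for i in range(H)]
restored = anneal(noisy)
print(sum(row.count(1) for row in restored), "of", H * W, "pixels restored to +1")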
The Hammersley-Clifford Theorem for Finite Lattices (1)
Assumptions and preliminaries
  - Positivity condition: if p(x_m) > 0 at each site m, then p(x_1, ..., x_m, ..., x_r) > 0.
  - Joint probability p(x); the sample space is the set of all possible configurations with positive probability.
  - Joint and conditional probabilities are related through Bayes's theorem.
Definition of Q(x)
  - Assuming that the value 0 is available at every site, we have p(0) > 0 by the positivity condition. Q(x) is then defined as the log-ratio of p(x) to p(0), which yields Eq. 6.15 (both written out below).
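A reconstruction of the definition and of Eq. 6.15 in Besag's standard notation (the chapter's typography may differ); here x_i on the left denotes the configuration x with its value at site i replaced by 0:

Q(x) \;\equiv\; \ln\!\big[\, p(x) / p(0) \,\big],

\exp\!\big[ Q(x) - Q(x_i) \big] \;=\; \frac{p(x)}{p(x_i)}
  \;=\; \frac{p(x_i \mid x_1, \dots, x_{i-1}, x_{i+1}, \dots, x_r)}
             {p(0 \mid x_1, \dots, x_{i-1}, x_{i+1}, \dots, x_r)}   \qquad (eq.\ 6.15)

In the last ratio x_i stands for the value of x at site i, so Eq. 6.15 ties Q to the conditional probability at a single site given the rest of the lattice.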
The Hammersley-Clifford Theorem for Finite Lattices (2)
The nearest-neighbor interactions are generalized mathematically by expanding Q(x) in a series of G-functions (sketched below).
The main result to establish, relating the G-functions to the Markovian properties of the conditional probabilities:
  - For any i < j < ... < s in the range 1, 2, ..., r, the functions G_{i,j,...,s} are nonzero if and only if the sites i, j, ..., s form a clique. (Subject to this restriction, the G-functions may be chosen arbitrarily.)
The proof proceeds as follows:
  - By Eq. 6.15, for any x, Q(x) - Q(x_i) can depend only on x_i and the values at the neighbors of site i.
  - Without loss of generality, let i = 1 and collect the terms of the expansion that contain x_1 (shown below).
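A reconstruction of the standard Besag expansion over the r sites (the chapter's typography may differ):

Q(x) \;=\; \sum_{1 \le i \le r} x_i\, G_i(x_i)
      \;+\; \sum_{1 \le i < j \le r} x_i x_j\, G_{ij}(x_i, x_j)
      \;+\; \cdots
      \;+\; x_1 x_2 \cdots x_r\, G_{12 \cdots r}(x_1, \dots, x_r)

Setting the value at site 1 to zero kills every term containing x_1, so

Q(x) - Q(x_1) \;=\; x_1 \Big\{ G_1(x_1)
      \;+\; \sum_{2 \le j \le r} x_j\, G_{1j}(x_1, x_j)
      \;+\; \sum_{2 \le j < k \le r} x_j x_k\, G_{1jk}(x_1, x_j, x_k)
      \;+\; \cdots
      \;+\; x_2 \cdots x_r\, G_{12 \cdots r}(x_1, \dots, x_r) \Big\}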
The Hammersley-Clifford Theorem for Finite Lattices (3)
Suppose that site m is neither site 1 nor a neighbor of site 1:
  - Then Q(x) - Q(x_1) must be independent of x_m for all x.
  - Choose x such that x_i = 0 for all i except 1 and m. Then Q(x) - Q(x_1) = x_1 { G_1(x_1) + x_m G_{1m}(x_1, x_m) }, and since this must not depend on x_m, G_{1m} must vanish identically.
  - Likewise, an identical result is obtained for G_{1mn} (and the other G-functions containing both 1 and m) with suitably chosen configurations, and so on.
  - Hence, if there is no edge between sites i and j, the G-functions containing both i and j must be zero: the G-functions are nonzero only if the sites i, j, ..., s form a clique.
Conversely,
  - Any set of G-functions will generate probability distributions that are positive definite.
  - Since Q(x) - Q(x_1) depends on x_m only if there is a nonzero G-function linking x_1 to x_m, the conditional probabilities generated by the G-functions are Markovian by construction.