1
Markov random field Institute of Electronics, NCTU
Advisor: S. J. Wang
Student: Jie-Wei Luo
2
Prior Knowledge
Site: a point at which an event happens, drawn from the index set S = {1, …, m}. Ex: a pixel, or a feature (line, surface patch).
Label: an event that happens at a site, drawn from the label set L. Ex: L = {edge, non-edge}, L = {0, …, 255}.
3
Labeling Problem
A labeling f = {f_1, …, f_m} assigns a label from L to each site, i.e. f : S → L. In a random field, a labeling is also called a configuration.
4
Prior knowledge(conti)
To explain the concept of the MRF, we first introduce the following definitions:
1. i: a site (pixel)
2. N_i: the set of sites neighboring i
3. S: the set of sites (the image)
4. f_i: the value at site i (intensity)
[Figure: a 3×3 image grid of values f_1 … f_9, with f_i at the center]
5
Neighborhood system
The sites in S are related to one another via a neighborhood system, defined as N = {N_i | ∀i ∈ S}, where N_i is the set of sites neighboring i. The neighboring relationship has the following properties:
(1) A site is not a neighbor of itself: i ∉ N_i
(2) The neighboring relationship is mutual: i ∈ N_{i'} ⟺ i' ∈ N_i
[Figure: a 3×3 image grid illustrating the neighbors of the center site f_i]
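As a concrete illustration, here is a minimal Python sketch (the function name first_order_neighbors is ours, not from the slides) of a first-order neighborhood system on a regular lattice, checking both properties:

```python
# A minimal sketch of a first-order (4-connected) neighborhood system
# on an H x W lattice, checking both neighborhood properties.

def first_order_neighbors(i, j, H, W):
    """Return N_i, the set of sites neighboring pixel (i, j)."""
    candidates = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return {(r, c) for r, c in candidates if 0 <= r < H and 0 <= c < W}

# (1) A site is not a neighbor of itself.
assert (1, 1) not in first_order_neighbors(1, 1, 3, 3)
# (2) The relationship is mutual.
assert (1, 2) in first_order_neighbors(1, 1, 3, 3)
assert (1, 1) in first_order_neighbors(1, 2, 3, 3)
```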
6
Example(Regular sites)
[Figure: first-order (4 nearest sites), second-order (8 nearest sites), and nth-order neighborhood systems on a regular lattice]
7
Example(Irregular sites)
The neighboring sites of site i are m, n, and f; the neighboring sites of site j are r and x.
8
Clique
A clique c is a subset of sites in S in which every pair of distinct sites are neighbors; a single site also forms a clique. Some examples are single-site, pair-site, and triple-site cliques.
[Figure: single-site, pair-site, and triple-site clique shapes]
9
Clique: Example
Take the first-order and second-order neighborhood systems as examples:
[Figure: the two neighborhood systems and their associated clique types]
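A small sketch (names are illustrative) that enumerates the single-site and pair-site cliques induced by the first-order system on an H × W lattice:

```python
# A sketch enumerating the clique types of the first-order neighborhood system:
# single-site cliques and horizontal/vertical pair-site cliques.

def cliques_first_order(H, W):
    singles = [((r, c),) for r in range(H) for c in range(W)]
    pairs = []
    for r in range(H):
        for c in range(W):
            if c + 1 < W:
                pairs.append(((r, c), (r, c + 1)))   # horizontal pair clique
            if r + 1 < H:
                pairs.append(((r, c), (r + 1, c)))   # vertical pair clique
    return singles, pairs

singles, pairs = cliques_first_order(3, 3)
print(len(singles), len(pairs))   # 9 single-site and 12 pair-site cliques on a 3x3 grid
```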
10
Random field
A random field is a collection of random variables whose indices are mapped onto a space (of n dimensions). Let F = {F_1, …, F_m} be a family of random variables defined on the set S, in which each random variable F_i takes a value f_i in L. The family F is called a random field.
11
Markov Random field
View the 2D image f as a collection of random variables (a random field). A Markov random field is a random field having the Markov property: the label at a site depends on the rest of the field only through its neighbors, i.e. P(f_i | f_{S−{i}}) = P(f_i | f_{N_i}).
12
Gibbs random field (GRF) and Gibbs distribution
A random field is said to be a Gibbs random field if and only if its configurations f obey a Gibbs distribution, that is:
P(f) = Z⁻¹ × exp(−U(f)/T), with Z = Σ_f exp(−U(f)/T) and U(f) = Σ_{c∈C} V_c(f)
U(f): energy function; T: temperature; Z: partition function; V_c(f): clique potential
Design U for different applications.
[Figure: a 3×3 image configuration f]
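To make the definition concrete, a minimal sketch that enumerates a tiny binary field and computes its Gibbs distribution by brute force, assuming a pair-site potential V(f_i, f_j) = (f_i − f_j)² of our choosing:

```python
import itertools, math

# A minimal sketch of a Gibbs distribution on a tiny field: m = 4 sites on a
# chain, binary labels L = {0, 1}, and an illustrative pair-site potential
# V(f_i, f_j) = (f_i - f_j)^2 over the pair cliques (i, i+1).

m, T = 4, 1.0

def energy(f):
    # U(f) = sum of clique potentials V_c(f)
    return sum((f[i] - f[i + 1]) ** 2 for i in range(len(f) - 1))

configs = list(itertools.product([0, 1], repeat=m))
Z = sum(math.exp(-energy(f) / T) for f in configs)        # partition function
P = {f: math.exp(-energy(f) / T) / Z for f in configs}    # Gibbs distribution
assert abs(sum(P.values()) - 1.0) < 1e-12                 # P sums to one
```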
13
Role of Energy Function
(1) As the quantitative measure of the global quality of the solution, and (2) as a guide to the search for a minimal solution. MRF modeling finds f* = arg min_f E(f|d).
14
Role of Temperature
The temperature T controls the sharpness of the distribution. When T is high, all configurations tend to be equally likely; as T decreases, the distribution concentrates on the low-energy configurations.
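A short sketch of this effect, using four illustrative configuration energies:

```python
import math

# A sketch of the role of T: the same four illustrative configuration
# energies, turned into Gibbs probabilities at three temperatures.

energies = [0.0, 1.0, 2.0, 3.0]

for T in (10.0, 1.0, 0.1):
    w = [math.exp(-u / T) for u in energies]
    Z = sum(w)
    print(T, [round(x / Z, 3) for x in w])
# T = 10 : nearly uniform probabilities
# T = 1  : moderately peaked
# T = 0.1: almost all mass on the lowest-energy configuration
```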
15
Markov-Gibbs equivalence
Hammersley-Clifford theorem: a random field F is an MRF if and only if F is a GRF.
Proof (GRF ⇒ MRF): let P(f) be a Gibbs distribution on S with the neighborhood system N, and consider the conditional probability P(f_i | f_{S−{i}}) = P(f) / Σ_{f_i'} P(f'), where f' agrees with f at every site except i.
[Figure: a 3×3 image grid]
16
Markov-Gibbs equivalence
Divide C into two sets, A and B, with A consisting of the cliques containing i and B of the cliques not containing i. The potentials over the cliques in B cancel between numerator and denominator, so P(f_i | f_{S−{i}}) depends only on the labels at the neighbors of i; hence F is an MRF.
[Figure: a 3×3 image grid]
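The equivalence can also be checked numerically on a toy example; the following sketch (our construction: a 5-site chain with binary labels) verifies that the Gibbs conditional at a site depends only on its neighbors:

```python
import itertools, math

# A sketch that checks Markovianity of a Gibbs field numerically: a 5-site
# chain with binary labels and pair potentials V(f_i, f_{i+1}) = (f_i - f_{i+1})^2.
# The conditional at site 2 should depend only on its neighbors, sites 1 and 3.

m = 5

def U(f):
    return sum((f[i] - f[i + 1]) ** 2 for i in range(m - 1))

configs = list(itertools.product([0, 1], repeat=m))
Z = sum(math.exp(-U(f)) for f in configs)
P = {f: math.exp(-U(f)) / Z for f in configs}

def cond(i, f):
    """P(f_i = f[i] | f_j = f[j] for all j != i)."""
    marginal = sum(P[f[:i] + (v,) + f[i + 1:]] for v in (0, 1))
    return P[f] / marginal

# fa and fb agree at sites 1, 2, 3 but differ at the distant sites 0 and 4:
fa = (0, 1, 1, 0, 0)
fb = (1, 1, 1, 0, 1)
assert abs(cond(2, fa) - cond(2, fb)) < 1e-12   # same conditional: Markov property
```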
17
Optimization-based vision problem
18
Denoising
[Figure: a noisy signal d (left) and the denoised signal f (right)]
19
The MAP-MRF Framework
When both the prior and the likelihood are known, MAP-MRF labeling seeks
f* = arg max_f P(f|d). Since P(f|d) ∝ P(f, d) = P(d|f) P(f), this is
f* = arg max_f P(d|f) P(f).
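On a problem small enough to enumerate every configuration, the MAP labeling can be found by brute force. A sketch with an illustrative Gaussian likelihood and smoothness prior (d, sigma, and lam are our choices):

```python
import itertools

# A sketch of brute-force MAP-MRF labeling f* = argmax_f P(d|f) P(f)
# on a tiny binary problem.

d = [0.1, 0.9, 0.8, 0.2]            # observations
sigma, lam = 0.3, 1.0               # noise level and smoothness weight

def log_posterior(f):
    log_lik = -sum((fi - di) ** 2 for fi, di in zip(f, d)) / (2 * sigma ** 2)
    log_prior = -lam * sum((f[i] - f[i + 1]) ** 2 for i in range(len(f) - 1))
    return log_lik + log_prior      # maximizing this maximizes P(d|f) P(f)

f_star = max(itertools.product([0, 1], repeat=len(d)), key=log_posterior)
print(f_star)                       # (0, 1, 1, 0): labels follow the data, smoothed
```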
20
MAP formulation for denoising problem
Assume the observation is the true signal plus independent Gaussian noise, that is, d_i = f_i + e_i with e_i ~ N(0, σ²). Under this assumption the observation model can be expressed as
P(d|f) ∝ exp(−U(d|f)), with likelihood energy U(d|f) = Σ_i (f_i − d_i)² / (2σ²).
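A minimal sketch of this likelihood energy (the function name is ours):

```python
# U(d|f) = sum_i (f_i - d_i)^2 / (2 sigma^2); lower energy means the
# labeling f explains the observation d better.

def likelihood_energy(f, d, sigma):
    return sum((fi - di) ** 2 for fi, di in zip(f, d)) / (2 * sigma ** 2)

print(likelihood_energy([1.0, 2.0], [1.1, 1.8], sigma=0.5))   # 0.1
```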
21
MAP formulation for denoising problem
Assume the unknown data f is an MRF; the prior model is P(f) ∝ exp(−U(f)). Based on the above, the posterior probability becomes P(f|d) ∝ exp(−(U(d|f) + U(f))).
22
MAP formulation for denoising problem
The MAP estimator for the problem is
f* = arg max_f P(f|d) = arg min_f { U(d|f) + U(f) }.
23
The Smoothness Prior
U(f) = ∫ [f^(n)(x)]² dx; the order n determines the number of sites in the cliques involved.
n = 1 (constant gray level): U(f) = ∫ [f′(x)]² dx, discretized as U(f) = Σ_i (f_i − f_{i−1})²
n = 2 (constant gradient): U(f) = ∫ [f″(x)]² dx, discretized as U(f) = Σ_i (f_{i+1} − 2f_i + f_{i−1})²
n = 3 (constant curvature): U(f) = ∫ [f‴(x)]² dx, discretized as U(f) = Σ_i (f_{i+1} − 3f_i + 3f_{i−1} − f_{i−2})²
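These discrete priors are straightforward to compute; a sketch (function names ours) showing that a constant-gradient ramp is penalized only by the first-order prior:

```python
# A sketch of the discrete smoothness priors of order n = 1, 2, 3.

def U1(f):   # n = 1: penalizes deviation from a constant gray level
    return sum((f[i] - f[i - 1]) ** 2 for i in range(1, len(f)))

def U2(f):   # n = 2: penalizes deviation from a constant gradient
    return sum((f[i + 1] - 2 * f[i] + f[i - 1]) ** 2 for i in range(1, len(f) - 1))

def U3(f):   # n = 3: penalizes deviation from a constant curvature
    return sum((f[i + 1] - 3 * f[i] + 3 * f[i - 1] - f[i - 2]) ** 2
               for i in range(2, len(f) - 1))

ramp = [0, 1, 2, 3, 4]                 # a constant-gradient signal
print(U1(ramp), U2(ramp), U3(ramp))    # 4 0 0: only the n = 1 prior penalizes a ramp
```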
24
MAP formulation for denoising problem
Define the smoothness prior U(f) = λ Σ_i (f_i − f_{i−1})². Substituting this into the MAP estimator, we get the posterior energy function
E(f) = U(d|f) + U(f) = Σ_i (f_i − d_i)² / (2σ²) + λ Σ_i (f_i − f_{i−1})²
where the first term is the observation model (a similarity measure) and the second is the prior model (a reconstruction constraint).
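One simple way to minimize this posterior energy is gradient descent; the following sketch (sigma, lam, step size, and iteration count are illustrative, not from the slides) implements it for a 1D signal:

```python
# A sketch of minimizing the posterior energy
#   E(f) = sum_i (f_i - d_i)^2 / (2 sigma^2) + lam * sum_i (f_i - f_{i-1})^2
# by plain gradient descent.

def denoise(d, sigma=1.0, lam=1.0, step=0.1, iters=500):
    f = list(d)                                  # start from the noisy observation
    for _ in range(iters):
        g = [(f[i] - d[i]) / sigma ** 2 for i in range(len(f))]  # data-term gradient
        for i in range(1, len(f)):                               # smoothness gradient
            diff = 2 * lam * (f[i] - f[i - 1])
            g[i] += diff
            g[i - 1] -= diff
        f = [fi - step * gi for fi, gi in zip(f, g)]
    return f

noisy = [1.0, 3.0, 1.2, 2.8, 1.1]
print([round(x, 2) for x in denoise(noisy)])  # neighbors pulled together: a smoother f
```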
25
Piecewise Continuous Restoration
E(f) = Σ_{i=1}^m (f_i − d_i)² + 2λ Σ_i g(f_i − f_{i−1})
If g(x) = x², then at discontinuities (f_i − f_{i−1})² tends to be very large, giving an oversmoothed result. To encode piecewise smoothness, g should saturate at an asymptotic upper bound so as to allow discontinuities: g(x) = min{x², C}.
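A sketch of this idea (C, λ, the candidate grid, and the coordinate-descent scheme are our illustrative stand-in for ICM): with the truncated quadratic, the step edge in a toy signal survives restoration rather than being smoothed away:

```python
# A sketch of restoration with the truncated quadratic g(x) = min(x^2, C),
# minimized by coordinate descent over a small candidate set.

def g(x, C=1.0):
    return min(x * x, C)         # saturates at C: a discontinuity has bounded cost

def E(f, d, lam=2.0):
    data = sum((fi - di) ** 2 for fi, di in zip(f, d))
    smooth = sum(g(f[i] - f[i - 1]) for i in range(1, len(f)))
    return data + 2 * lam * smooth

def restore(d, candidates, sweeps=10):
    f = list(d)
    for _ in range(sweeps):
        for i in range(len(f)):  # update one site at a time, keeping the others fixed
            f[i] = min(candidates, key=lambda v: E(f[:i] + [v] + f[i + 1:], d))
    return f

# A noisy step edge: with the saturating g, the jump between indices 2 and 3
# should survive restoration instead of being smoothed away.
d = [0.1, -0.2, 0.15, 2.1, 1.9, 2.05]
print(restore(d, candidates=[round(0.1 * k, 1) for k in range(-5, 26)]))
```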
26
Result