Markov random field Institute of Electronics, NCTU


1 Markov random field Institute of Electronics, NCTU
Advisor: S. J. Wang (王聖智)    Student: Jie-Wei Luo (羅介暐)

2 Prior Knowledge
Sites: S = {1, …, m}, e.g. pixels or features (lines, surface patches).
Labels: an event that happens to a site, e.g. L = {edge, nonedge} or L = {0, …, 255}.

3 Labeling Problem
f = {f1, …, fm}, where each fi labels site i with a label from L; that is, f : S → L.
In a random field, a labeling is called a configuration.
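As a minimal sketch of a labeling f : S → L (the concrete sites and labels below are our own toy choices, not from the slides), a configuration can be represented as a mapping from every site to exactly one label:

```python
# Toy labeling f : S -> L; sites and labels here are illustrative only.
S = [1, 2, 3, 4]                 # sites (e.g. four pixels)
L = ["edge", "nonedge"]          # label set

# One configuration: every site is assigned exactly one label from L.
f = {1: "edge", 2: "nonedge", 3: "nonedge", 4: "edge"}

# A valid configuration covers all of S and only uses labels from L.
assert set(f) == set(S) and all(label in L for label in f.values())
print(f[1])
```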

4 Prior knowledge (cont.)
To explain the concept of the MRF, we first introduce the following definitions:
1. i: a site (pixel)
2. Ni: the set of sites neighboring i
3. S: the set of sites (the image)
4. fi: the value at site i (intensity)
(Figure: a 3×3 image with sites f1 … f9.)

5 Neighborhood system
The sites in S are related to one another via a neighborhood system, defined as N = {Ni | i ∈ S}, where Ni is the set of sites neighboring i. The neighboring relationship has the following properties:
(1) A site is not a neighbor of itself: i ∉ Ni.
(2) The neighboring relationship is mutual: i ∈ Ni′ ⇔ i′ ∈ Ni.
(Figure: a 3×3 image with sites f1 … f9.)

6 Example (regular sites)
(Figure panels: first-order neighborhood system, second-order neighborhood system, …, nth-order neighborhood system.)
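The first- and second-order systems on a regular grid can be sketched as follows; the helper name is our own, and the code checks the two neighborhood properties from the previous slide:

```python
# Sketch: first-order (4-connected) and second-order (8-connected)
# neighborhoods of a pixel in an h x w image (helper name is our own).
def neighbors(i, j, h, w, order=1):
    """Return the in-bounds neighbors of pixel (i, j)."""
    if order == 1:
        offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    else:  # order 2 adds the four diagonal neighbors
        offsets = [(di, dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)
                   if (di, dj) != (0, 0)]
    return [(i + di, j + dj) for di, dj in offsets
            if 0 <= i + di < h and 0 <= j + dj < w]

print(neighbors(1, 1, 3, 3, order=1))       # center pixel: 4 neighbors
print(len(neighbors(0, 0, 3, 3, order=2)))  # corner pixel: 3 neighbors
```

Note that a site is never its own neighbor, and the relation is mutual: if p is in `neighbors(q)` then q is in `neighbors(p)`.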

7 Example (irregular sites)
The neighboring sites of site i are m, n, and f. The neighboring sites of site j are r and x.

8 Clique
A clique C is a subset of sites in S in which every pair of distinct sites are neighbors of each other. Some examples: single-site, pair-site, and triple-site cliques.

9 Clique: Example Take the first-order and second-order neighborhood systems for example: each neighborhood system determines its own set of clique types. (Figure: neighborhood systems and their clique types.)
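For the first-order system, the clique types are single sites and horizontally/vertically adjacent pairs. A small enumeration sketch (the helper name is our own):

```python
# Sketch: enumerate the single-site and pair-site cliques of a
# first-order (4-connected) system on an h x w grid.
def cliques_first_order(h, w):
    singles = [((i, j),) for i in range(h) for j in range(w)]
    pairs = []
    for i in range(h):
        for j in range(w):
            if j + 1 < w:                      # horizontal pair clique
                pairs.append(((i, j), (i, j + 1)))
            if i + 1 < h:                      # vertical pair clique
                pairs.append(((i, j), (i + 1, j)))
    return singles, pairs

singles, pairs = cliques_first_order(2, 2)
print(len(singles), len(pairs))  # a 2x2 grid: 4 single-site, 4 pair-site cliques
```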

10 Random field A random field is a collection of random variables whose indices are mapped onto a space (of n dimensions). Let F = {F1, …, Fm} be a family of random variables defined on the set S, in which each random variable Fi takes a value fi in L. The family F is called a random field.

11 Markov Random field View the 2D image f as a collection of random variables (a random field). A Markov random field is a random field whose variables have the Markov property: P(fi | f(S−{i})) = P(fi | f(Ni)), i.e. each site depends on the rest of the field only through its neighbors.

12 Gibbs random field (GRF) and Gibbs distribution
A random field is said to be a Gibbs random field if and only if its configurations f obey a Gibbs distribution:
P(f) = Z⁻¹ e^(−U(f)/T), with partition function Z = Σ_f e^(−U(f)/T),
where U(f) = Σ_{c∈C} Vc(f) is the energy function (a sum of clique potentials Vc(f)) and T is the temperature. Design U for different applications.
(Figure: a 3×3 image configuration f.)
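As a toy check of the Gibbs form P(f) = Z⁻¹e^(−U(f)/T) on a 3-site chain with labels {0, 1}; the pair-clique potential below (penalizing disagreeing neighbors) is our own choice:

```python
import itertools
import math

def U(f):
    # Pair-clique potential: count neighboring sites that disagree.
    return sum(1.0 for a, b in zip(f, f[1:]) if a != b)

T = 1.0
configs = list(itertools.product([0, 1], repeat=3))
Z = sum(math.exp(-U(f) / T) for f in configs)          # partition function
P = {f: math.exp(-U(f) / T) / Z for f in configs}

assert abs(sum(P.values()) - 1.0) < 1e-12              # a proper distribution
print(max(P, key=P.get))  # the smooth configurations are the most likely
```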

13 Role of Energy Function
(1) As a quantitative measure of the global quality of the solution, and (2) as a guide to the search for a minimal solution. MRF modeling seeks f* = arg min_f E(f | d).

14 Role of Temperature The temperature T controls the sharpness of the distribution. When T is high, all configurations tend toward a uniform distribution; when T is low, the distribution concentrates on low-energy configurations.
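This flattening effect can be checked numerically on two toy energy levels (the values 0 and 2 are our own choices):

```python
import math

# Sketch: the Gibbs weights exp(-U/T) normalized over two energies.
def gibbs(energies, T):
    w = [math.exp(-u / T) for u in energies]
    z = sum(w)
    return [x / z for x in w]

low = gibbs([0.0, 2.0], T=0.1)   # low T: mass concentrates on the minimum
high = gibbs([0.0, 2.0], T=100)  # high T: nearly uniform
print(low, high)
```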

15 Markov-Gibbs equivalence
Hammersley–Clifford theorem: a random field F is an MRF if and only if F is a GRF.
Proof sketch: let P(f) be a Gibbs distribution on S with the neighborhood system N.
(Figure: a 3×3 image with sites f1 … f9.)

16 Markov-Gibbs equivalence
Divide the clique set C into two sets A and B, with A consisting of the cliques containing i and B of the cliques not containing i.

17 Optimization-based vision problem

18 Denoising (Figure: noisy signal d and denoised signal f.)

19 The MAP-MRF Framework When both the prior and the likelihood are known
MAP-MRF labeling: f* = arg max_f P(f | d). Since P(f | d) ∝ P(f, d) = p(d | f) P(f), we have f* = arg max_f p(d | f) P(f).

20 MAP formulation for denoising problem
Assume the observation is the true signal plus independent Gaussian noise: di = fi + ei, ei ~ N(0, σ²). Under this assumption, the observation model can be expressed as
p(d | f) ∝ e^(−U(d|f)), with likelihood energy U(d | f) = Σ_i (fi − di)² / (2σ²).
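The likelihood energy for the Gaussian observation model can be computed directly (the signal values below are toy data):

```python
# Sketch: Gaussian likelihood energy U(d|f) = sum_i (f_i - d_i)^2 / (2*sigma^2).
def likelihood_energy(f, d, sigma):
    return sum((fi - di) ** 2 for fi, di in zip(f, d)) / (2 * sigma ** 2)

f = [1.0, 2.0, 3.0]
d = [1.2, 1.9, 3.1]   # noisy observation of f
print(likelihood_energy(f, d, sigma=0.1))
```

A perfect reconstruction (f = d) has zero likelihood energy; the farther f drifts from the data, the larger the energy.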

21 MAP formulation for denoising problem
Assume the unknown data f is an MRF; the prior model is then P(f) ∝ e^(−U(f)). Based on the above information, the posterior probability becomes P(f | d) ∝ e^(−(U(d|f) + U(f))).

22 MAP formulation for denoising problem
The MAP estimator for the problem is: f* = arg min_f { U(d | f) + U(f) }.

23 The Smoothness Prior U(f) = ∫ [f⁽ⁿ⁾(x)]² dx; the order n determines the number of sites in the cliques involved.
n = 1 (constant gray level): U(f) = ∫ [f′(x)]² dx ≈ Σ_i (fi − f(i−1))²
n = 2 (constant gradient): U(f) = ∫ [f″(x)]² dx ≈ Σ_i (f(i+1) − 2fi + f(i−1))²
n = 3 (constant curvature): U(f) = ∫ [f‴(x)]² dx ≈ Σ_i (f(i+1) − 3fi + 3f(i−1) − f(i−2))²
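The discrete first- and second-order smoothness energies can be checked on a linear ramp, which has constant gradient and so is penalized by order 1 but not by order 2 (the test signal is our own):

```python
# Sketch: discrete smoothness energies of order n = 1 and n = 2.
def U1(f):  # penalizes deviation from a constant signal
    return sum((f[i] - f[i - 1]) ** 2 for i in range(1, len(f)))

def U2(f):  # penalizes deviation from a constant gradient
    return sum((f[i + 1] - 2 * f[i] + f[i - 1]) ** 2
               for i in range(1, len(f) - 1))

ramp = [0, 1, 2, 3, 4]     # constant gradient
print(U1(ramp), U2(ramp))  # the ramp is "smooth" at order 2 but not order 1
```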

24 MAP formulation for denoising problem
Define the smoothness prior U(f) = λ Σ_i (fi − f(i−1))². Substituting the above information into the MAP estimator, we get the posterior energy function
E(f) = Σ_i (fi − di)² / (2σ²) + λ Σ_i (fi − f(i−1))²,
where the first term is the observation model (similarity measure) and the second the prior model (reconstruction constraint).
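The posterior energy combining a quadratic data term with the first-order smoothness prior can be minimized by plain gradient descent; a minimal sketch (step size, iteration count, and λ are our own choices, and real systems would solve the linear system directly or use ICM/graph cuts):

```python
# Sketch: minimize E(f) = sum_i (f_i - d_i)^2 + lam * sum_i (f_i - f_{i-1})^2
# by gradient descent, starting from the noisy data itself.
def denoise(d, lam=1.0, step=0.05, iters=500):
    f = list(d)
    for _ in range(iters):
        g = [2 * (f[i] - d[i]) for i in range(len(f))]  # data-term gradient
        for i in range(1, len(f)):                       # smoothness gradient
            diff = 2 * lam * (f[i] - f[i - 1])
            g[i] += diff
            g[i - 1] -= diff
        f = [fi - step * gi for fi, gi in zip(f, g)]
    return f

d = [0.0, 0.1, 1.9, 0.2, -0.1]   # a noisy spike
print(denoise(d))                 # the spike is pulled toward its neighbors
```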

25 Piecewise Continuous Restoration
E(f) = Σ_{i=1..m} (fi − di)² + 2λ Σ_{i=1..m} g(fi − f(i−1))
If g(x) = x², then at discontinuities (fi − f(i−1))² tends to be very large, giving an oversmoothed result. To encode piecewise smoothness, g should saturate at an asymptotic upper bound to allow discontinuities: g(x) = min{x², C}.
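The effect of truncation can be seen by pricing a step edge under both potentials (the constant C and the test signal are our own choices):

```python
# Sketch: the truncated quadratic g(x) = min(x^2, C) caps the smoothness
# penalty at a discontinuity, so edges are no longer over-penalized.
def g_quad(x):
    return x * x

def g_trunc(x, C=1.0):
    return min(x * x, C)

edge = [0, 0, 0, 5, 5, 5]  # a step edge of height 5
cost_quad = sum(g_quad(edge[i] - edge[i - 1]) for i in range(1, len(edge)))
cost_trunc = sum(g_trunc(edge[i] - edge[i - 1]) for i in range(1, len(edge)))
print(cost_quad, cost_trunc)  # 25 vs 1: the edge costs at most C under g_trunc
```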

26 Result

