Belief Propagation on Markov Random Fields Aggeliki Tsoli.


1 Belief Propagation on Markov Random Fields Aggeliki Tsoli

2 Outline: Graphical Models; Markov Random Fields (MRFs); Belief Propagation

3 Graphical Models: diagrams whose nodes are random variables and whose edges are statistical dependencies among those variables. Advantages: 1. Better visualization: conditional independence properties become apparent and new models are easier to design. 2. Factorization of the joint distribution.

4 Types of graphical models: Directed: encode causal relationships, e.g. Bayesian networks. Undirected: impose no constraints on the causality of events ("weak dependencies"), e.g. Markov Random Fields (MRFs).

5 Example MRF application: image denoising. Question: how can we retrieve the original image given the noisy one? [Figure: original (binary) image next to a noisy version, e.g. with 10% noise.]
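
To make the setup concrete, here is a minimal sketch (not from the slides) of how such a noisy observation could be simulated: take a binary image and flip roughly 10% of its pixels. The image size and random seed are arbitrary choices.

```python
# Minimal sketch: simulate the "10% noise" observation assumed on the slide.
import numpy as np

rng = np.random.default_rng(0)
original = (rng.random((32, 32)) > 0.5).astype(int)   # hypothetical binary image
flip = rng.random(original.shape) < 0.10              # select ~10% of the pixels
noisy = np.where(flip, 1 - original, original)        # flip the selected pixels
```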

6 MRF formulation, nodes: for each pixel i, x_i is a latent variable (the pixel's value in the original image) and y_i is an observed variable (its value in the noisy image), with x_i, y_i ∈ {0, 1}. [Figure: latent nodes x_1, x_2, ..., x_i, ..., x_n, each connected to its observed node y_1, y_2, ..., y_i, ..., y_n.]

7 MRF formulation, edges: the x_i and y_i of each pixel i are correlated, captured by a local evidence function φ(x_i, y_i), e.g. φ(x_i, y_i) = 0.9 if x_i = y_i and φ(x_i, y_i) = 0.1 otherwise (10% noise). Neighboring pixels tend to have similar values, captured by a compatibility function ψ(x_i, x_j). [Figure: grid of latent nodes x_1, ..., x_n with their observations y_1, ..., y_n.]
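
A small sketch of these two potentials as lookup tables. The φ values (0.9/0.1) are the ones on the slide; the ψ values are illustrative assumptions, since the slide only names the compatibility function without giving numbers.

```python
import numpy as np

# Local evidence phi(x_i, y_i): 0.9 if latent and observed values agree,
# 0.1 otherwise (the 10% noise model from the slide).
phi = np.array([[0.9, 0.1],
                [0.1, 0.9]])   # phi[x_i, y_i]

# Compatibility psi(x_i, x_j): neighboring pixels prefer equal values.
# The 0.8/0.2 numbers are assumptions for illustration, not from the slide.
psi = np.array([[0.8, 0.2],
                [0.2, 0.8]])   # psi[x_i, x_j]
```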

8 MRF formulation, question: what are the marginal distributions of x_i, i = 1, ..., n? The joint distribution factorizes as P(x_1, x_2, ..., x_n) = (1/Z) Π_{(i,j)} ψ(x_i, x_j) Π_i φ(x_i, y_i).
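
To make the factorized joint concrete, here is a brute-force sketch that enumerates all configurations of a tiny 3-pixel chain and computes the exact marginals. The potential tables and observed values y are assumptions for illustration; the point is only to show what Z and the products in the formula mean.

```python
# Brute-force marginals of P(x_1..x_n) = (1/Z) prod_(ij) psi(x_i,x_j) prod_i phi(x_i,y_i)
# on a tiny 3-node chain (assumed example, not from the slides).
import itertools
import numpy as np

phi = np.array([[0.9, 0.1], [0.1, 0.9]])        # phi[x_i, y_i]
psi = np.array([[0.8, 0.2], [0.2, 0.8]])        # psi[x_i, x_j] (illustrative)
y = [1, 1, 0]                                   # hypothetical noisy observations
edges = [(0, 1), (1, 2)]                        # chain x_0 - x_1 - x_2

marginals = np.zeros((3, 2))
for x in itertools.product([0, 1], repeat=3):   # enumerate all 2^3 configurations
    p = 1.0
    for i, j in edges:
        p *= psi[x[i], x[j]]                    # pairwise compatibilities
    for i in range(3):
        p *= phi[x[i], y[i]]                    # local evidence
    for i in range(3):
        marginals[i, x[i]] += p                 # accumulate unnormalized marginals
marginals /= marginals.sum(axis=1, keepdims=True)   # normalizing by Z
print(marginals)   # exact marginals; BP should reproduce these on a tree
```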

9 Belief Propagation. Goal: compute the marginals of the latent nodes of the underlying graphical model. Attributes: iterative algorithm; message passing between neighboring latent-variable nodes. Question: can it also be applied to directed graphs? Answer: yes, but here we apply it to MRFs.

10 Belief Propagation Algorithm: 1) Select random neighboring latent nodes x_i, x_j. 2) Send message m_{i→j} from x_i to x_j. 3) Update the belief about the marginal distribution at node x_j. 4) Go to step 1, until convergence. How is convergence defined? [Figure: nodes x_i, x_j with observations y_i, y_j and message m_{i→j}.]
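
The slides do not pin down what "convergence" means. A common criterion, shown below as an assumption rather than the deck's definition, is that no message changes by more than a small tolerance between sweeps.

```python
# Possible convergence check (assumed, not defined in the slides):
# stop when every message is numerically stable between iterations.
import numpy as np

def converged(old_msgs, new_msgs, tol=1e-6):
    """old_msgs/new_msgs: dict mapping directed edge (i, j) -> length-2 message vector."""
    return all(np.max(np.abs(new_msgs[e] - old_msgs[e])) < tol for e in old_msgs)
```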

11 Step 2, message passing. Message m_{i→j} from x_i to x_j: what node x_i thinks about the marginal distribution of x_j. m_{i→j}(x_j) = Σ_{x_i} φ(x_i, y_i) ψ(x_i, x_j) Π_{k ∈ N(i)\j} m_{k→i}(x_i). Messages are initially uniformly distributed.
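
A sketch of this message update for binary variables. The potential tables and the `msgs` dictionary layout (directed edge → length-2 vector) are assumptions for illustration; the update itself follows the slide's formula.

```python
# m_{i->j}(x_j) = sum_{x_i} phi(x_i, y_i) psi(x_i, x_j) prod_{k in N(i)\j} m_{k->i}(x_i)
import numpy as np

phi = np.array([[0.9, 0.1], [0.1, 0.9]])   # phi[x_i, y_i]
psi = np.array([[0.8, 0.2], [0.2, 0.8]])   # psi[x_i, x_j] (illustrative)

def send_message(i, j, y, neighbors, msgs):
    """Compute m_{i->j} as a length-2 vector over x_j."""
    # Product of incoming messages to i from all neighbors except j.
    prod_in = np.ones(2)
    for k in neighbors[i]:
        if k != j:
            prod_in *= msgs[(k, i)]
    # Sum over x_i of phi(x_i, y_i) * psi(x_i, x_j) * prod_in(x_i).
    m = np.zeros(2)
    for xj in (0, 1):
        m[xj] = sum(phi[xi, y[i]] * psi[xi, xj] * prod_in[xi] for xi in (0, 1))
    return m / m.sum()   # normalization for numerical stability (optional)
```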

12 Step 3, belief update. Belief b(x_j): what node x_j thinks its marginal distribution is. b(x_j) = k φ(x_j, y_j) Π_{q ∈ N(j)} m_{q→j}(x_j), where k is a normalization constant.
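
A matching sketch of the belief update, under the same assumed data layout as the message-passing snippet above.

```python
# b(x_j) = k * phi(x_j, y_j) * prod_{q in N(j)} m_{q->j}(x_j), with k normalizing b to sum to 1.
import numpy as np

phi = np.array([[0.9, 0.1], [0.1, 0.9]])   # phi[x_j, y_j]

def belief(j, y, neighbors, msgs):
    b = phi[:, y[j]].copy()                # phi(x_j, y_j) for x_j = 0, 1
    for q in neighbors[j]:
        b *= msgs[(q, j)]                  # multiply in every incoming message
    return b / b.sum()                     # k = 1 / sum, so the belief sums to 1
```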

13 Belief Propagation Algorithm (recap): 1) Select random neighboring latent nodes x_i, x_j. 2) Send message m_{i→j} from x_i to x_j. 3) Update the belief about the marginal distribution at node x_j. 4) Go to step 1, until convergence. A sketch putting these steps together follows. [Figure: nodes x_i, x_j with observations y_i, y_j and message m_{i→j}.]
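
The following sketch ties the steps together on the same assumed 3-node chain used earlier. The slides pick random edges one at a time; sweeping over all edges synchronously, as done here, is a common equivalent variant and is my assumption, as are all numeric values.

```python
# Minimal synchronous sum-product BP on a 3-node chain (assumed example).
import numpy as np

phi = np.array([[0.9, 0.1], [0.1, 0.9]])
psi = np.array([[0.8, 0.2], [0.2, 0.8]])
y = [1, 1, 0]
neighbors = {0: [1], 1: [0, 2], 2: [1]}
edges = [(i, j) for i in neighbors for j in neighbors[i]]    # directed edges

msgs = {e: np.ones(2) / 2 for e in edges}                    # uniform initialization
for _ in range(50):
    new = {}
    for i, j in edges:
        prod_in = np.ones(2)
        for k in neighbors[i]:
            if k != j:
                prod_in *= msgs[(k, i)]
        m = np.array([sum(phi[xi, y[i]] * psi[xi, xj] * prod_in[xi]
                          for xi in (0, 1)) for xj in (0, 1)])
        new[(i, j)] = m / m.sum()
    done = max(np.max(np.abs(new[e] - msgs[e])) for e in edges) < 1e-6
    msgs = new
    if done:
        break

for j in neighbors:                      # belief update at every node
    b = phi[:, y[j]].copy()
    for q in neighbors[j]:
        b *= msgs[(q, j)]
    print(j, b / b.sum())                # matches the brute-force marginals on this tree
```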

14 Example: compute the belief at node 1. [Figure: a four-node tree (nodes 1, 2, 3, 4) with messages m_{2→1}, m_{3→2}, m_{4→2}; Fig. 12 of Yedidia et al.]
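
A sketch of this example on a tree with edges 1-2, 2-3, 2-4: leaves 3 and 4 send m_{3→2} and m_{4→2}, node 2 combines them into m_{2→1}, and b(x_1) is proportional to φ_1(x_1) m_{2→1}(x_1). Following Yedidia et al.'s pairwise formulation, the evidence is written as singleton potentials φ_i(x_i) (which play the role of φ(x_i, y_i) once y_i is fixed); all numeric values below are made up for illustration.

```python
# Belief at node 1 of the four-node tree from the slide's figure (assumed numbers).
import numpy as np

psi = np.array([[0.8, 0.2], [0.2, 0.8]])             # psi[x_i, x_j], shared by all edges
phi = {1: np.array([0.7, 0.3]),                      # phi_i(x_i), hypothetical evidence
       2: np.array([0.5, 0.5]),
       3: np.array([0.9, 0.1]),
       4: np.array([0.4, 0.6])}

def leaf_message(src, dst):
    """m_{src->dst}(x_dst) = sum_{x_src} phi_src(x_src) psi(x_src, x_dst)."""
    m = psi.T @ phi[src]
    return m / m.sum()

m32 = leaf_message(3, 2)
m42 = leaf_message(4, 2)
# m_{2->1}(x_1) = sum_{x_2} phi_2(x_2) psi(x_2, x_1) m_{3->2}(x_2) m_{4->2}(x_2)
m21 = psi.T @ (phi[2] * m32 * m42)
b1 = phi[1] * m21
print(b1 / b1.sum())   # normalized belief at node 1
```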

15 Does graph topology matter? The BP procedure is the same, but performance differs: failure to converge or inaccurate beliefs [Murphy, Weiss, Jordan 1999]; success at decoding error-correcting codes [Frey and Mackay 1998]; success on computer vision problems where the underlying MRF is full of loops [Freeman, Pasztor, Carmichael 2000]. [Figure: tree-structured graph vs. graph with loops.]

16 How long does it take? There is no explicit reference in the paper; in my opinion it depends on the number of nodes in the graph and on the graph topology. There is work on improving the running time of BP (for specific applications). Next time?

17 Questions?

18 Next time? BP on directed graphs; improving the running time of BP; more about loopy BP; can a non-uniform initial estimate of the messages alleviate the problem?

