Belief Propagation on Markov Random Fields
Aggeliki Tsoli


Outline
- Graphical Models
- Markov Random Fields (MRFs)
- Belief Propagation

Graphical Models
Diagrams in which:
- Nodes represent random variables
- Edges represent statistical dependencies among random variables
Advantages:
1. Better visualization: conditional independence properties become apparent, and new models are easier to design
2. Factorization of the joint distribution into local terms

Types of graphical models
- Directed: encode causal relationships, e.g. Bayesian networks
- Undirected: no constraints imposed on the causality of events ("weak dependencies"), e.g. Markov Random Fields (MRFs)

Example MRF application: Image denoising
Question: how can we retrieve the original image given the noisy one?
- Original image (binary)
- Noisy image, e.g. with 10% noise
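To make the setup concrete, here is a minimal sketch that builds a small binary image and flips roughly 10% of its pixels, matching the noise level quoted on the slide; the image contents, size, and random seed are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy 8x8 binary "original" image: a filled square on a zero background.
original = np.zeros((8, 8), dtype=int)
original[2:6, 2:6] = 1

# Corrupt it: flip each pixel independently with probability 0.1 (10% noise).
flip = rng.random(original.shape) < 0.1
noisy = np.where(flip, 1 - original, original)

print(original)
print(noisy)
```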

MRF formulation: Nodes
For each pixel i:
- x_i: latent variable (pixel value in the original image)
- y_i: observed variable (pixel value in the noisy image)
- x_i, y_i ∈ {0, 1}
[Figure: grid of latent nodes x_1, x_2, ..., x_i, ..., x_n, each attached to its observed node y_1, y_2, ..., y_i, ..., y_n]

MRF formulation: Edges
- x_i and y_i of each pixel i are correlated: local evidence function φ(x_i, y_i).
  E.g. φ(x_i, y_i) = 0.9 if x_i = y_i and φ(x_i, y_i) = 0.1 otherwise (10% noise)
- Neighboring pixels tend to have similar values: compatibility function ψ(x_i, x_j)
[Figure: grid of latent nodes x_i with attached observed nodes y_i]
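A minimal sketch of the two factors for the binary denoising example. The 0.9/0.1 local evidence values come from the slide; the compatibility values (1.0 for agreeing neighbours, 0.5 otherwise) are an illustrative assumption, since the slides do not specify ψ numerically.

```python
import numpy as np

def phi(x_i, y_i):
    """Local evidence: how well latent pixel value x_i matches observation y_i.
    The values 0.9 / 0.1 correspond to the 10% noise model on the slide."""
    return 0.9 if x_i == y_i else 0.1

def psi(x_i, x_j):
    """Pairwise compatibility encouraging neighbouring pixels to agree.
    The strength (1.0 vs 0.5) is an illustrative assumption."""
    return 1.0 if x_i == x_j else 0.5

# Tabulated forms (2x2 matrices over {0,1}) are often more convenient for BP.
PHI = np.array([[phi(a, b) for b in (0, 1)] for a in (0, 1)])
PSI = np.array([[psi(a, b) for b in (0, 1)] for a in (0, 1)])
```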

MRF formulation
Question: what are the marginal distributions of x_i, i = 1, ..., n?

P(x_1, x_2, ..., x_n) = (1/Z) ∏_{(i,j)} ψ(x_i, x_j) ∏_i φ(x_i, y_i)

[Figure: grid of latent nodes x_i with attached observed nodes y_i]
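To see what the factorization means, the sketch below evaluates the unnormalized joint by brute force on a tiny assumed 3-node chain and sums out variables to get exact marginals; this costs 2^n evaluations, which is exactly what belief propagation avoids on larger graphs. The chain topology and the observations are assumptions for illustration (the denoising MRF is really a 2-D grid).

```python
import itertools
import numpy as np

# Same factors as in the previous sketch (0.9/0.1 evidence, agreement-favouring psi).
def phi(x, y): return 0.9 if x == y else 0.1
def psi(a, b): return 1.0 if a == b else 0.5

edges = [(0, 1), (1, 2)]   # assumed 3-node chain x0 - x1 - x2
y = [1, 1, 0]              # assumed noisy observations

def unnorm_joint(x):
    """Unnormalized joint: prod_(ij) psi(x_i, x_j) * prod_i phi(x_i, y_i)."""
    p = 1.0
    for i, j in edges:
        p *= psi(x[i], x[j])
    for i, yi in enumerate(y):
        p *= phi(x[i], yi)
    return p

configs = list(itertools.product((0, 1), repeat=3))
Z = sum(unnorm_joint(x) for x in configs)   # partition function

# Exact marginal of each x_i, obtained by summing out all other variables.
marginals = np.zeros((3, 2))
for x in configs:
    p = unnorm_joint(x) / Z
    for i, xi in enumerate(x):
        marginals[i, xi] += p
print(marginals)
```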

Belief Propagation
Goal: compute the marginals of the latent nodes of the underlying graphical model
Attributes:
- iterative algorithm
- message passing between neighboring latent-variable nodes
Question: can it also be applied to directed graphs?
Answer: yes, but here we will apply it to MRFs

Belief Propagation Algorithm
1) Select random neighboring latent nodes x_i, x_j
2) Send message m_{i→j} from x_i to x_j
3) Update the belief about the marginal distribution at node x_j
4) Go to step 1, until convergence
How is convergence defined?
[Figure: nodes x_i and x_j with observations y_i, y_j and message m_{i→j}]

Step 2: Message Passing
Message m_{i→j} from x_i to x_j: what node x_i thinks about the marginal distribution of x_j

m_{i→j}(x_j) = Σ_{x_i} φ(x_i, y_i) ψ(x_i, x_j) ∏_{k ∈ N(i)\j} m_{k→i}(x_i)

Messages are initially uniformly distributed.
[Figure: nodes x_i and x_j with observations y_i, y_j; incoming messages from the neighbors N(i)\j]
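A direct transcription of the message update into code, assuming binary states, an adjacency dictionary `neighbors`, and messages stored in a dictionary keyed by the directed edge (i, j); these data structures, and the normalization of the outgoing message, are choices of this sketch rather than part of the slide.

```python
import numpy as np

STATES = (0, 1)

def send_message(i, j, neighbors, y, messages, phi, psi):
    """Compute m_{i->j}(x_j) = sum_{x_i} phi(x_i, y_i) psi(x_i, x_j)
                               prod_{k in N(i)\\{j}} m_{k->i}(x_i)."""
    m = np.zeros(len(STATES))
    for xj in STATES:
        total = 0.0
        for xi in STATES:
            prod = phi(xi, y[i]) * psi(xi, xj)
            for k in neighbors[i]:
                if k != j:
                    prod *= messages[(k, i)][xi]
            total += prod
        m[xj] = total
    # Normalizing messages is not required by the formula but keeps values
    # from under/overflowing; beliefs are renormalized anyway.
    return m / m.sum()
```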

Step 3: Belief Update
Belief b(x_j): what node x_j thinks its marginal distribution is

b(x_j) = k φ(x_j, y_j) ∏_{q ∈ N(j)} m_{q→j}(x_j)    (k is a normalization constant)

[Figure: node x_j with observation y_j receiving messages from all neighbors N(j)]
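The matching belief computation, using the same assumed data structures as the message sketch above; the constant k from the slide is realized here by dividing by the sum over states.

```python
import numpy as np

def belief(j, neighbors, y, messages, phi):
    """b(x_j) proportional to phi(x_j, y_j) * prod_{q in N(j)} m_{q->j}(x_j)."""
    b = np.array([phi(xj, y[j]) for xj in (0, 1)], dtype=float)
    for q in neighbors[j]:
        b *= messages[(q, j)]
    return b / b.sum()   # the constant k is just this normalization
```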

Belief Propagation Algorithm (recap)
1) Select random neighboring latent nodes x_i, x_j
2) Send message m_{i→j} from x_i to x_j
3) Update the belief about the marginal distribution at node x_j
4) Go to step 1, until convergence
[Figure: nodes x_i and x_j with observations y_i, y_j and message m_{i→j}]
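Putting the pieces together, here is a sketch of the loop on the small assumed chain from the earlier brute-force example. It uses a synchronous schedule (all messages recomputed each sweep) instead of the random pairwise schedule on the slide, and it treats convergence as no message changing by more than a small threshold; both choices are assumptions of the sketch. On a tree-structured graph like this chain, the resulting beliefs match the exact marginals computed by brute force, which is a useful sanity check.

```python
import numpy as np

STATES = (0, 1)
def phi(x, y): return 0.9 if x == y else 0.1
def psi(a, b): return 1.0 if a == b else 0.5

neighbors = {0: [1], 1: [0, 2], 2: [1]}   # assumed 3-node chain
y = [1, 1, 0]                             # assumed observations

# Messages start out uniform, as the slides note.
messages = {(i, j): np.full(len(STATES), 0.5)
            for i in neighbors for j in neighbors[i]}

for sweep in range(50):
    new = {}
    for (i, j) in messages:
        m = np.zeros(len(STATES))
        for xj in STATES:
            m[xj] = sum(phi(xi, y[i]) * psi(xi, xj) *
                        np.prod([messages[(k, i)][xi]
                                 for k in neighbors[i] if k != j])
                        for xi in STATES)
        new[(i, j)] = m / m.sum()
    # Converged when no message moved by more than a small tolerance.
    if max(np.abs(new[e] - messages[e]).max() for e in messages) < 1e-6:
        messages = new
        break
    messages = new

beliefs = {}
for j in neighbors:
    b = np.array([phi(xj, y[j]) for xj in STATES], dtype=float)
    for q in neighbors[j]:
        b *= messages[(q, j)]
    beliefs[j] = b / b.sum()
print(beliefs)   # on a tree these equal the exact marginals
```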

Example
Compute the belief at node 1 from the incoming messages m_{2→1}, m_{3→2}, and m_{4→2} (Fig. 12, Yedidia et al.)

Does graph topology matter?
The BP procedure is the same, but performance differs:
- failure to converge or to predict accurate beliefs [Murphy, Weiss, Jordan 1999]
- success at decoding error-correcting codes [Frey and MacKay 1998]
- success on computer vision problems whose underlying MRF is full of loops [Freeman, Pasztor, Carmichael 2000]
[Figure: a tree-structured graph vs. a graph with loops]

How long does it take?
- No explicit reference in the paper
- In my opinion, it depends on the number of nodes in the graph and on the graph topology
- There is work on improving the running time of BP (for specific applications); next time?

Questions?

Next time?
- BP on directed graphs
- Improving the running time of BP
- More about loopy BP: can a non-uniform initial estimate of the messages alleviate the convergence problem?