CS774. Markov Random Field : Theory and Application Lecture 04 Kyomin Jung KAIST Sep 15 2009.



Basic Idea of Belief Propagation (BP)
Let m_{j→i} denote the marginal probability of the MRF restricted to the subtree rooted at j, and similarly for the subtrees rooted at the other neighbors of i, and so on. [Figure: node i with child subtrees rooted at j, k, ...]

Belief Propagation (BP)
[Figure: message passing among nodes i, j, k]

Belief at node i at time t: b_i^t(x_i) ∝ ψ_i(x_i) ∏_{k ∈ N_i} m_{k→i}^t(x_i), where N_i is the set of neighbors of i. On a tree with n nodes, for t > n the beliefs equal the exact marginals.
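As an illustration of the message and belief equations, here is a minimal sum-product sketch for a binary-variable MRF on a tree. This is not the lecture's code; the names tree_bp, psi_node, and psi_edge are placeholders.

```python
import numpy as np

def tree_bp(n, edges, psi_node, psi_edge, iters):
    """Sum-product BP for a binary MRF on a tree.
    psi_node[i]: length-2 node potential; psi_edge[(i, j)]: 2x2 edge
    potential indexed [x_i, x_j]."""
    nbrs = {i: [] for i in range(n)}
    for i, j in edges:
        nbrs[i].append(j)
        nbrs[j].append(i)
    # One message per directed edge, initialized uniform.
    msgs = {}
    for i, j in edges:
        msgs[(i, j)] = np.ones(2)
        msgs[(j, i)] = np.ones(2)
    for _ in range(iters):  # iters >= diameter suffices on a tree
        new = {}
        for (i, j) in msgs:
            # Product of node potential and incoming messages, excluding j's.
            prod = psi_node[i].copy()
            for k in nbrs[i]:
                if k != j:
                    prod = prod * msgs[(k, i)]
            pe = psi_edge[(i, j)] if (i, j) in psi_edge else psi_edge[(j, i)].T
            m = pe.T @ prod             # sum over x_i
            new[(i, j)] = m / m.sum()   # normalizing leaves beliefs unchanged
        msgs = new
    # Belief at i: node potential times all incoming messages.
    beliefs = []
    for i in range(n):
        b = psi_node[i].copy()
        for k in nbrs[i]:
            b = b * msgs[(k, i)]
        beliefs.append(b / b.sum())
    return beliefs
```

On a small path graph the beliefs match brute-force marginals exactly once iters exceeds the diameter, as the slide states.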

Properties of BP (and MP)
Exact for trees:
- Each node separates the graph into disjoint components.
- On a tree, the BP algorithm converges in time proportional to the diameter of the graph, which is at most linear in the number of nodes.
For general graphs:
- Exact inference is NP-hard.
- Even constant-factor approximate inference is hard.

Loopy Belief Propagation
Approaches for general graphs:
Exact inference:
- Computation-tree-based approach (for graphs with large girth)
- Junction tree algorithm (for bounded-treewidth graphs)
- Graph cut algorithms (for submodular MRFs)
Approximate inference:
- Loopy BP
- Sampling-based algorithms
- Graph-decomposition-based approximation

Loopy Belief Propagation
If BP is used on graphs with loops, messages may circulate indefinitely. Empirically, a good approximation is still achievable:
- Stop after a fixed number of iterations.
- Stop when there is no significant change in the beliefs.
- If the solution does not oscillate but converges, it is usually a good approximation.
Example: LDPC codes
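The stopping criteria above can be sketched as a loopy sum-product iteration that halts when no message (and hence no belief) changes significantly. This is an illustrative helper, not the lecture's code; loopy_bp and the potential names are placeholders.

```python
import numpy as np

def loopy_bp(nbrs, psi_node, psi_edge, max_iters=200, tol=1e-8):
    """Loopy sum-product BP on a binary MRF with cycles.
    Stops when no message changes by more than tol, or after max_iters."""
    msgs = {(i, j): np.ones(2) / 2 for i in nbrs for j in nbrs[i]}
    for t in range(1, max_iters + 1):
        new = {}
        for (i, j) in msgs:
            prod = psi_node[i].copy()
            for k in nbrs[i]:
                if k != j:
                    prod = prod * msgs[(k, i)]
            pe = psi_edge[(i, j)] if (i, j) in psi_edge else psi_edge[(j, i)].T
            m = pe.T @ prod
            new[(i, j)] = m / m.sum()
        delta = max(np.abs(new[e] - msgs[e]).max() for e in msgs)
        msgs = new
        if delta < tol:  # "no significant change in beliefs" criterion
            break
    beliefs = {}
    for i in nbrs:
        b = psi_node[i].copy()
        for k in nbrs[i]:
            b = b * msgs[(k, i)]
        beliefs[i] = b / b.sum()
    return beliefs, t
```

On a triangle with weakly attractive potentials the iteration converges quickly and the resulting beliefs are close to, but in general not equal to, the true marginals.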

Fixed point of BP
The messages of BP at time t form a finite-dimensional real vector (one coordinate per directed edge and per state); let M(t) be this vector. If we normalize the messages, the output of BP (the marginal probabilities) is unchanged. The BP update is a continuous function that maps M(t) to M(t+1):
BP: M(t) ↦ M(t+1)
Hence, by the Brouwer fixed point theorem, BP has at least one fixed point, since the domain of normalized messages is a convex, compact set.

Fixed point of BP
The important questions now are:
- Is there a unique fixed point?
- Does BP converge to a fixed point?
- If it does, how fast?
These questions are topics of current research, e.g.:
- Studying them for restricted classes of MRFs (e.g., graphs with large girth)
- Studying the relation of BP fixed points to other quantities (e.g., minima of the Bethe free energy)

Girth of a Graph
For a graph G = (V, E), the girth of G is the length of a shortest cycle contained in G. If G has large girth and bounded degree, and the MRF satisfies exponential (spatial) correlation decay, then BP computes a good approximation of the solution.
- Proof: by considering the computation tree of BP.
- This can be used to design systems based on MRFs. Example: LDPC codes
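The girth itself is easy to compute: run a BFS from every vertex and, whenever a non-tree edge closes at depths d(u) and d(v), record the cycle length d(u) + d(v) + 1; the minimum over all start vertices is exact. A sketch with illustrative names (standard O(|V|·|E|) method, not from the lecture):

```python
from collections import deque

def girth(nbrs):
    """Length of a shortest cycle in a simple undirected graph given as
    an adjacency dict {v: [neighbors]}; float('inf') if acyclic."""
    best = float('inf')
    for s in nbrs:
        dist = {s: 0}
        parent = {s: None}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in nbrs[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    parent[v] = u
                    q.append(v)
                elif v != parent[u]:
                    # Non-tree edge (u, v): closes a cycle through the BFS tree.
                    best = min(best, dist[u] + dist[v] + 1)
    return best
```

Per source the recorded value may overestimate the shortest cycle through that source, but when the BFS starts on a shortest cycle the bound is tight, so the minimum over all sources is the girth.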

Computation Tree of BP
[Figure: a graph G and the computation tree of G rooted at x1]

Correlation Decay
(Temporal) decay of correlations in Markov chains: a Markov chain with a given transition matrix satisfies decay of correlations (mixes), i.e., forgets its initial state, if and only if it is irreducible and aperiodic.
(Spatial) decay of correlations: the same phenomenon, but with time replaced by a "spatial" distance.
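The temporal statement can be checked numerically: for an irreducible, aperiodic chain, every row of P^t converges to the stationary distribution, so the state at time t becomes independent of the start state. A small numpy demonstration (the matrix is an arbitrary example, not from the lecture):

```python
import numpy as np

# A small irreducible, aperiodic chain on three states.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# Rows of P^t all converge to the stationary distribution pi:
# after many steps the chain has forgotten where it started.
Pt = np.linalg.matrix_power(P, 50)
pi = Pt[0]
assert np.allclose(Pt, np.tile(pi, (3, 1)), atol=1e-10)  # rows identical
assert np.allclose(pi @ P, pi)                           # pi is stationary
```

A periodic chain (e.g., a deterministic 2-cycle) would fail this test: its powers oscillate instead of converging.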

Correlation Decay
A sequence of spatially (graph-)related random variables {X_v} exhibits correlation decay (long-range independence) if X_u and X_v are nearly independent when the graph distance between u and v is large.
Principal motivation: statistical physics. Uniqueness of Gibbs measures on infinite lattices, Dobrushin [1960s].

What is known about correlation decay?
- Weitz [05]: independent sets, general graphs.
- Goldberg, Martin & Paterson [05]: coloring, general graphs.
- Jonasson [01]: coloring, regular trees.
Here Δ is the maximum vertex degree of G; in the independent set problem, λ is the weight (activity) of each vertex (i.e., the weight of an independent set of size I is λ^I); q in the coloring problem is the number of possible colors.
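The hard-core (weighted independent set) model makes the λ^I weighting concrete. The brute-force sketch below (illustrative names; exponential-time, for tiny instances only) exhibits the long-range independence these results establish: on a path, pinning the far endpoint barely moves the marginal at vertex 0.

```python
import itertools

def hardcore_marginal(n, lam, pin_last=None):
    """P(x_0 = 1) in the hard-core model on a path of n vertices:
    each independent set S gets weight lam**|S|.  Optionally condition
    on the last vertex's value (pin_last in {0, 1})."""
    Z = 0.0
    occ0 = 0.0
    for x in itertools.product([0, 1], repeat=n):
        if any(x[i] and x[i + 1] for i in range(n - 1)):
            continue  # adjacent occupied vertices: not an independent set
        if pin_last is not None and x[-1] != pin_last:
            continue
        w = lam ** sum(x)
        Z += w
        if x[0]:
            occ0 += w
    return occ0 / Z

# Conditioning the far end of a 12-vertex path changes the marginal
# at vertex 0 only negligibly: correlation decays with distance.
p_free = hardcore_marginal(12, 1.0)
p0 = hardcore_marginal(12, 1.0, pin_last=0)
p1 = hardcore_marginal(12, 1.0, pin_last=1)
```

The gap |p0 − p1| shrinks geometrically as the path gets longer, which is exactly the spatial decay of correlations discussed above.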