
Junction Trees And Belief Propagation

Junction Trees: Motivation
What if we want to compute all marginals, not just one? Doing variable elimination for each one in turn is inefficient.
Solution: junction trees (a.k.a. join trees, clique trees).

Junction Trees: Basic Idea
In HMMs, we efficiently computed all marginals using dynamic programming.
An HMM is a linear chain, but the same method applies if the graph is a tree.
If the graph is not a tree, reduce it to one by clustering variables.
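
As a concrete reminder of that dynamic program, here is a minimal sketch (Python with numpy; the chain length, the pairwise table T, and the unary potentials are all invented for illustration) that recovers every node marginal on a 3-node binary chain from one forward and one backward sweep:

```python
import numpy as np

T = np.array([[0.9, 0.1], [0.2, 0.8]])               # pairwise potential phi(x_t, x_{t+1})
phi = np.array([[0.5, 0.5], [0.7, 0.3], [0.1, 0.9]]) # unary potentials for 3 nodes

n = phi.shape[0]
fwd = [phi[0]]                                       # forward messages (unnormalized)
for t in range(1, n):
    fwd.append(phi[t] * (fwd[-1] @ T))               # sum out x_{t-1}
bwd = [np.ones(2)]                                   # backward messages
for t in range(n - 2, -1, -1):
    bwd.insert(0, T @ (phi[t + 1] * bwd[0]))         # sum out x_{t+1}

for t in range(n):
    m = fwd[t] * bwd[t]                              # all marginals from two sweeps
    print(f"P(x_{t}) =", m / m.sum())
```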

The Junction Tree Algorithm
1. Moralize graph (if Bayes net)
2. Remove arrows (if Bayes net)
3. Triangulate graph
4. Build clique graph
5. Build junction tree
6. Choose root
7. Populate cliques
8. Do belief propagation

Example

Step 1: Moralize the Graph
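
A minimal sketch of this step (it also performs Step 2) using networkx, with an invented example DAG: "marry" the parents of each node, then drop edge directions.

```python
import networkx as nx
from itertools import combinations

dag = nx.DiGraph([("A", "C"), ("B", "C"), ("C", "D"), ("C", "E")])

moral = dag.to_undirected()                  # Step 2 (remove arrows) happens here
for node in dag.nodes:
    for u, v in combinations(dag.predecessors(node), 2):
        moral.add_edge(u, v)                 # marry co-parents

print(sorted(moral.edges))                   # A-B appears: A and B share child C
```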

Step 2: Remove Arrows

Step 3: Triangulate the Graph
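
Triangulation can be done by node elimination: repeatedly remove a node, first connecting all of its remaining neighbors; the edges added along the way ("fill-in" edges) make the graph chordal. Below is a sketch using a min-fill ordering heuristic; both the 4-cycle example and the choice of heuristic are illustrative, not prescribed by the slides.

```python
import networkx as nx
from itertools import combinations

def triangulate(g):
    g = g.copy()                             # result: original plus fill-in edges
    work = g.copy()                          # scratch copy that gets eliminated
    while work.nodes:
        def fill(n):                         # fill-in edges needed to remove n
            nbrs = list(work.neighbors(n))
            return sum(1 for u, v in combinations(nbrs, 2)
                       if not work.has_edge(u, v))
        node = min(work.nodes, key=fill)     # min-fill heuristic
        nbrs = list(work.neighbors(node))
        for u, v in combinations(nbrs, 2):
            g.add_edge(u, v)                 # record the fill-in edge
            work.add_edge(u, v)
        work.remove_node(node)
    return g

cycle = nx.cycle_graph(["A", "B", "C", "D"])  # a 4-cycle needs one chord
print(sorted(triangulate(cycle).edges))
```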

Step 4: Build Clique Graph

The Clique Graph

Junction Trees
A junction tree is a subgraph of the clique graph that:
1. Is a tree
2. Contains all the nodes of the clique graph
3. Satisfies the running intersection property
Running intersection property: for each pair U, V of cliques with intersection S, all cliques on the path between U and V contain S.
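
A sketch of Steps 4 and 5 together, on an invented triangulated graph: take the maximal cliques as clique-graph nodes, weight each clique-graph edge by the size of the shared separator, and extract a maximum-weight spanning tree. For a triangulated graph, any maximum-weight spanning tree of the clique graph satisfies the running intersection property, which is why this construction works.

```python
import networkx as nx

tri = nx.Graph([("A", "B"), ("B", "C"), ("A", "C"), ("C", "D"), ("B", "D")])

cliques = [frozenset(c) for c in nx.find_cliques(tri)]   # maximal cliques
cg = nx.Graph()                                          # Step 4: clique graph
cg.add_nodes_from(cliques)
for i, u in enumerate(cliques):
    for v in cliques[i + 1:]:
        sep = u & v
        if sep:
            cg.add_edge(u, v, weight=len(sep), sep=sep)  # weight = |separator|

jt = nx.maximum_spanning_tree(cg)                        # Step 5: junction tree
for u, v, d in jt.edges(data=True):
    print(set(u), "--", set(d["sep"]), "--", set(v))
```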

Step 5: Build the Junction Tree

Step 6: Choose a Root

Step 7: Populate the Cliques
Place each potential from the original network in a clique containing all the variables it references.
For each clique node, form the product of the distributions in it (as in variable elimination).
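
A sketch of this step with numpy, assuming binary variables; the cliques and potential tables are invented. Each potential is broadcast into the axes of the first clique that covers its variables, then multiplied into that clique's table:

```python
import numpy as np

rng = np.random.default_rng(0)
cliques = [("A", "B", "C"), ("B", "C", "D")]
potentials = {                                    # variables -> table over them
    ("A",): np.array([0.6, 0.4]),
    ("B", "A"): rng.random((2, 2)),
    ("C", "A", "B"): rng.random((2, 2, 2)),
    ("D", "B", "C"): rng.random((2, 2, 2)),
}

def embed(table, vars_, clique):
    """Reorder `table`'s axes to the clique's order and add size-1 axes."""
    perm = sorted(range(len(vars_)), key=lambda i: clique.index(vars_[i]))
    t = np.transpose(table, perm)                 # axes now in clique order
    shape = [2 if v in vars_ else 1 for v in clique]
    return t.reshape(shape)                       # size-1 axes broadcast later

clique_tables = {c: np.ones((2,) * len(c)) for c in cliques}
for vars_, table in potentials.items():
    home = next(c for c in cliques if set(vars_) <= set(c))  # a covering clique
    clique_tables[home] = clique_tables[home] * embed(table, vars_, home)

for c, t in clique_tables.items():
    print(c, t.shape)
```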

Step 8: Belief Propagation
1. Incorporate evidence
2. Upward pass: send messages toward root
3. Downward pass: send messages toward leaves

Step 8.1: Incorporate Evidence
For each evidence variable, go to one table that includes that variable.
Set to 0 all entries in that table that disagree with the evidence.
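
A sketch of this step on a single numpy table (the variable names, table, and evidence are invented): multiply the table by an indicator along the evidence variable's axis, which zeroes every disagreeing entry.

```python
import numpy as np

table_vars = ("A", "B", "C")                   # variables indexing each axis
table = np.random.default_rng(0).random((2, 2, 2))
evidence = {"B": 1}                            # we observed B = 1

for var, val in evidence.items():
    axis = table_vars.index(var)
    keep = np.zeros(table.shape[axis])
    keep[val] = 1.0                            # indicator of the observed value
    shape = [1] * table.ndim
    shape[axis] = -1
    table = table * keep.reshape(shape)        # entries with B != 1 become 0

print(table[:, 0, :])                          # all zeros: disagrees with B = 1
```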

Step 8.2: Upward Pass
For each leaf in the junction tree, send a message to its parent. The message is the marginal of the leaf's table, summing out any variable not in the separator.
When a parent receives a message from a child, it multiplies its table by the message table to obtain its new table.
When a parent has received messages from all its children, it acts as a leaf and sends a message to its own parent.
This continues until the root has received messages from all its children.

Step 8.3: Downward Pass
Reverses the upward pass, starting at the root.
The root sends a message to each of its children: it marginalizes its current table to the separator, divides out the message received from that child during the upward pass, and sends the result.
Each child multiplies its table by the message it receives, then acts as a root toward its own children, until the leaves are reached.
The table at each clique is now the joint marginal of its variables; sum out as needed for single-variable marginals. We're done!
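
A sketch of both passes on the smallest interesting junction tree: cliques (A, B, C) and (B, C, D) joined by separator (B, C), all variables binary, tables invented. After both passes, both cliques agree on the marginal of any shared variable:

```python
import numpy as np

rng = np.random.default_rng(0)
root = rng.random((2, 2, 2))                  # clique (A, B, C), axes in that order
child = rng.random((2, 2, 2))                 # clique (B, C, D), axes in that order

# --- upward pass: child -> root over separator (B, C) ---
msg_up = child.sum(axis=2)                    # sum out D -> table over (B, C)
root = root * msg_up[None, :, :]              # broadcast over A, multiply in

# --- downward pass: root -> child ---
msg_down = root.sum(axis=0) / msg_up          # sum out A, divide out upward msg
child = child * msg_down[:, :, None]          # broadcast over D, multiply in

# every clique table is now proportional to the joint marginal of its variables
p_b = root.sum(axis=(0, 2))                   # marginal of B from the root
print("P(B) from root: ", p_b / p_b.sum())
p_b2 = child.sum(axis=(1, 2))                 # same marginal from the child
print("P(B) from child:", p_b2 / p_b2.sum())
```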

Inference Example: Going Up (No evidence)

Status After Upward Pass

Going Back Down

Status After Downward Pass

Why Does This Work?
The junction tree algorithm is just a way to do variable elimination in all directions at once, storing intermediate results at each step.

The Link Between Junction Trees and Variable Elimination
To eliminate a variable at any step, we combine all remaining tables involving that variable.
A node in the junction tree corresponds to the set of variables in one of the tables created during variable elimination (the variable being eliminated, together with the other variables those tables mention).
An arc in the junction tree shows the flow of data in the elimination computation.

Junction Tree Savings
Avoids the redundancy of running variable elimination once per query.
The junction tree needs to be built only once per model.
Belief propagation needs to be rerun only when new evidence is received.

Loopy Belief Propagation
Inference is efficient if the graph is a tree.
Inference cost is exponential in the treewidth (one less than the size of the largest clique in the triangulated graph).
What if the treewidth is too high? Solution: do belief propagation on the original graph.
It may not converge, or may converge to a bad approximation, but in practice it is often fast and a good approximation.
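
A sketch of loopy sum-product on a small factor graph with a cycle: three binary variables and three pairwise factors closing a loop. The potentials and iteration count are invented; messages are simply iterated, and on a loopy graph the resulting beliefs are approximations. In practice, damping the message updates is a common way to help convergence.

```python
import numpy as np

factors = {                                    # name -> (variables, table)
    "f01": ((0, 1), np.array([[2.0, 0.5], [0.5, 1.2]])),
    "f12": ((1, 2), np.array([[1.2, 0.5], [0.5, 1.2]])),
    "f20": ((2, 0), np.array([[1.2, 0.5], [0.5, 1.2]])),
}
n_vars = 3

# messages in both directions, initialized to uniform
v2f = {(v, f): np.ones(2) for f, (vs, _) in factors.items() for v in vs}
f2v = {(f, v): np.ones(2) for f, (vs, _) in factors.items() for v in vs}

for _ in range(50):                            # fixed number of sweeps
    for f, (vs, table) in factors.items():     # factor -> variable messages
        a, b = vs
        f2v[(f, a)] = table @ v2f[(b, f)]      # sum out the second variable
        f2v[(f, b)] = v2f[(a, f)] @ table      # sum out the first variable
    for (v, f) in v2f:                         # variable -> factor messages
        m = np.ones(2)
        for g, (vs, _) in factors.items():
            if g != f and v in vs:
                m = m * f2v[(g, v)]            # product of other factors' msgs
        v2f[(v, f)] = m / m.sum()              # normalize for stability

for v in range(n_vars):                        # approximate marginals (beliefs)
    bel = np.ones(2)
    for f, (vs, _) in factors.items():
        if v in vs:
            bel = bel * f2v[(f, v)]
    print(f"belief(x{v}) ~", bel / bel.sum())
```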

Loopy Belief Propagation
[Figure: a factor graph with variable nodes (x) and factor nodes (f)]