236372 - Bayesian Networks: Clique Tree Algorithm. Presented by Sergey Vichik.



Algorithm sequence
1. Translate the BN to a Markov graph (moralization)
2. Add edges to create a chordal graph
3. Find the cliques
4. Construct the clique tree
5. Enter evidence
6. Compute the a posteriori probabilities (inference)
(Diagram: Bayesian network -> moralization -> Markov graph -> add edges -> chordal graph -> find cliques -> clique graph -> create clique tree -> clique tree -> enter evidence -> tree with evidence -> inference)

BN to Markov
- Add an edge between every pair of parents sharing a child ("marry" the parents), then drop edge directions.
(Figure: the example graph on nodes A-G, before and after moralization)
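As a sketch, the moralization step can be implemented directly from a parent map. The DAG below is a small hypothetical example, since the slides do not spell out the full A-G structure:

```python
from itertools import combinations

def moralize(parents):
    """parents: dict mapping each node to a list of its parents.
    Returns the edge set of the moral (undirected) graph."""
    edges = set()
    for child, ps in parents.items():
        for p in ps:                      # keep each parent-child edge, now undirected
            edges.add(frozenset((p, child)))
        for p, q in combinations(ps, 2):  # "marry" every pair of co-parents
            edges.add(frozenset((p, q)))
    return edges

# Hypothetical stand-in DAG:
dag = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
moral = moralize(dag)
print(frozenset(("B", "C")) in moral)  # True: B and C share child D, so they are married
```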

Create chordal graph
- Add fill-in edges so that every cycle of four or more nodes has a chord.
(Figure: the moral graph on nodes A-G with its fill-in edges)
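One standard way to chordalize is to simulate variable elimination: eliminating a node connects all of its remaining neighbours, and those added connections are exactly the fill-in edges. A minimal sketch (the elimination order is an arbitrary choice here; finding an order with minimum fill-in is NP-hard):

```python
def triangulate(adj, order):
    """Simulate elimination in the given order, adding fill-in edges
    between not-yet-adjacent neighbours of each eliminated node."""
    adj = {v: set(ns) for v, ns in adj.items()}        # full graph, receives fill-ins
    remaining = {v: set(ns) for v, ns in adj.items()}  # shrinking working copy
    for v in order:
        nbrs = sorted(remaining[v])
        for i in range(len(nbrs)):
            for j in range(i + 1, len(nbrs)):
                a, b = nbrs[i], nbrs[j]
                if b not in adj[a]:                    # fill-in edge
                    adj[a].add(b); adj[b].add(a)
                    remaining[a].add(b); remaining[b].add(a)
        for n in remaining[v]:
            remaining[n].discard(v)
        del remaining[v]
    return adj

# The 4-cycle A-B-C-D-A is not chordal; eliminating A first adds the chord B-D.
square = {"A": {"B", "D"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"A", "C"}}
chordal = triangulate(square, ["A", "B", "C", "D"])
print("D" in chordal["B"])  # True: the fill-in chord B-D was added
```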

Clique graph
- Find all maximal cliques and connect them to form a clique graph.
- Two cliques are connected if they share a variable.
(Figure: the clique graph on the cliques ABD, ADC, AGC, CDE, EF)
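A sketch of this step: enumerate the maximal cliques (Bron-Kerbosch is the classic algorithm; in a chordal graph they can also be read off a perfect elimination order), then connect every pair of cliques that shares a variable, weighting the edge by the number of shared variables. The small graph below is a hypothetical example:

```python
def max_cliques(adj):
    """Enumerate maximal cliques with Bron-Kerbosch (no pivoting; fine for small graphs)."""
    found = []
    def bk(R, P, X):
        if not P and not X:
            found.append(frozenset(R))
            return
        for v in list(P):
            bk(R | {v}, P & adj[v], X & adj[v])
            P.remove(v)
            X.add(v)
    bk(set(), set(adj), set())
    return found

def clique_graph(cliques):
    """Connect cliques sharing a variable; weight = number of shared variables."""
    return [(len(c1 & c2), c1, c2)
            for i, c1 in enumerate(cliques)
            for c2 in cliques[i + 1:]
            if c1 & c2]

# Hypothetical chordal graph: triangle A-B-C with pendant node D attached to C.
adj = {"A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B", "D"}, "D": {"C"}}
cliques = max_cliques(adj)
print(sorted("".join(sorted(c)) for c in cliques))  # ['ABC', 'CD']
```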

Running intersection property
- Many trees may be embedded in the clique graph. Which tree should we choose?
- Let's try the chain {ADC}-{AGC}-{ABD}-{CDE}-{EF} and follow this order of cliques in an elimination:
  - Eliminate F, then eliminate E.
  - To continue we must eliminate either C or D, but eliminating either one creates an extra edge (C-B or G-D), enlarging the probability tables.
- The required property: if a variable x is contained in cliques Y and Z, it must be contained in every clique on the path from Y to Z.
(Figure: the clique graph on ABD, ADC, AGC, CDE, EF next to the chordal graph on A-G)
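The property can be checked mechanically: for each variable, the cliques containing it must induce a connected subtree. The sketch below verifies that the slide's bad chain fails (the cliques containing D are not connected through it) while the tree chosen later succeeds:

```python
def satisfies_rip(cliques, tree_edges):
    """Running intersection: for each variable, the cliques containing it
    must induce a connected subgraph of the candidate tree."""
    for x in set().union(*cliques):
        containing = [c for c in cliques if x in c]
        if len(containing) <= 1:
            continue
        adj = {c: [] for c in containing}
        for u, v in tree_edges:
            if x in u and x in v:
                adj[u].append(v)
                adj[v].append(u)
        seen, stack = {containing[0]}, [containing[0]]
        while stack:                      # DFS over the induced subgraph
            for n in adj[stack.pop()]:
                if n not in seen:
                    seen.add(n)
                    stack.append(n)
        if len(seen) != len(containing):
            return False
    return True

ABD, ADC, AGC, CDE, EF = (frozenset(s) for s in ("ABD", "ADC", "AGC", "CDE", "EF"))
cliques = [ABD, ADC, AGC, CDE, EF]
chain = [(ADC, AGC), (AGC, ABD), (ABD, CDE), (CDE, EF)]   # the slide's bad chain
tree = [(ABD, ADC), (ADC, AGC), (ADC, CDE), (CDE, EF)]    # the slide's clique tree
print(satisfies_rip(cliques, chain), satisfies_rip(cliques, tree))  # False True
```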

Clique tree construction
- A maximum spanning tree of the clique graph, with edge weights equal to the separator sizes, is the required clique tree.
- Proof: at the end of the lecture.
(Figure: the resulting clique tree on ABD, ADC, AGC, CDE, EF)
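A maximum spanning tree is just Kruskal's algorithm run on descending weights. The sketch below applies it to the clique graph from the slides, with each edge weighted by its separator size, and recovers the slide's clique tree:

```python
def max_spanning_tree(nodes, weighted_edges):
    """Kruskal's algorithm with edges taken in descending weight order."""
    parent = {n: n for n in nodes}
    def find(x):                           # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    tree = []
    for w, u, v in sorted(weighted_edges, key=lambda e: -e[0]):
        ru, rv = find(u), find(v)
        if ru != rv:                       # keep the edge unless it closes a cycle
            parent[ru] = rv
            tree.append((u, v))
    return tree

# Clique-graph edges from the slides, weighted by separator size:
edges = [(2, "ABD", "ADC"), (2, "ADC", "AGC"), (2, "ADC", "CDE"),
         (1, "ABD", "AGC"), (1, "ABD", "CDE"), (1, "AGC", "CDE"),
         (1, "CDE", "EF")]
tree = max_spanning_tree(["ABD", "ADC", "AGC", "CDE", "EF"], edges)
print(sorted(tree))  # the three weight-2 edges plus CDE-EF, as on the slide
```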

Enter evidence
- Entering evidence is equivalent to removing the evidence variables and recalculating the affected probability functions.
- General approach: build the clique tree without the evidence, then recalculate the affected cliques. In practice this means reducing the tables to the observed values: f(E,F) -> f(E, F=f).
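With factors stored as tables keyed by value tuples, reducing to evidence is a row filter. The table values below are hypothetical, purely for illustration:

```python
def reduce_factor(variables, table, evidence):
    """Drop rows inconsistent with the evidence, e.g. f(E,F) -> f(E, F=f1)."""
    idx = {v: i for i, v in enumerate(variables)}
    return {row: p for row, p in table.items()
            if all(row[idx[v]] == val for v, val in evidence.items() if v in idx)}

# Hypothetical table for f(E,F) over binary E and F:
f_EF = {("e0", "f0"): 0.2, ("e0", "f1"): 0.3,
        ("e1", "f0"): 0.4, ("e1", "f1"): 0.1}
reduced = reduce_factor(["E", "F"], f_EF, {"F": "f1"})
print(reduced)  # {('e0', 'f1'): 0.3, ('e1', 'f1'): 0.1}
```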

Efficient calculation of probabilities
- m_ij is the message from clique i to clique j.
- A clique sends m_ij once it has received messages from all neighbours except j (leaves can send immediately).
- Store the messages on the edges for efficient updates.
(Figure: the clique tree with separators AC, AD, CD and E marked on its edges)
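In the simplest case (a leaf clique with no other incoming messages), the message m_ij is just the clique table marginalized onto the separator. A sketch, reusing a hypothetical f(E,F) table for the message from clique {E,F} to clique {C,D,E} over separator {E}:

```python
def message(variables, table, separator):
    """m_ij: marginalize a clique table onto the separator shared with the neighbour."""
    keep = [i for i, v in enumerate(variables) if v in separator]
    out = {}
    for row, p in table.items():
        key = tuple(row[i] for i in keep)
        out[key] = out.get(key, 0.0) + p   # sum out all non-separator variables
    return out

# Hypothetical table for f(E,F):
f_EF = {("e0", "f0"): 0.2, ("e0", "f1"): 0.3,
        ("e1", "f0"): 0.4, ("e1", "f1"): 0.1}
m = message(["E", "F"], f_EF, {"E"})
print(m)
```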

Calculation of a marginal probability
1) Select a clique containing the variable.
2) Multiply all incoming messages into the clique's potential.
3) Marginalize down to the required variable.
The joint distribution factorizes over the cliques: P = f(ADC) f(AGC) f(ABD) f(CDE) f(EF).
(Figure: the clique tree with separators AC, AD, CD, E)
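The three steps can be sketched for a clique with a single incoming message. All numbers below are illustrative, not from the slides: clique {E,F} absorbs a message m(E) from its neighbour, then marginalizes to F:

```python
def absorb_message(variables, table, msg_var, msg):
    """Step 2: multiply a single-variable incoming message into a clique table."""
    i = variables.index(msg_var)
    return {row: p * msg[(row[i],)] for row, p in table.items()}

def marginal(variables, table, var):
    """Step 3: sum the message-absorbed clique table down to one variable."""
    i = variables.index(var)
    out = {}
    for row, p in table.items():
        out[row[i]] = out.get(row[i], 0.0) + p
    return out

# Hypothetical clique potential f(E,F) and incoming message m(E):
f_EF = {("e0", "f0"): 0.2, ("e0", "f1"): 0.3,
        ("e1", "f0"): 0.4, ("e1", "f1"): 0.1}
m_E = {("e0",): 0.6, ("e1",): 0.4}
p_F = marginal(["E", "F"], absorb_message(["E", "F"], f_EF, "E", m_E), "F")
print(p_F)
```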

Maximum spanning tree is a clique tree - proof (1/3)
1) From graph theory, the cycle property: for any cycle C in the graph, if the weight of an edge e of C is smaller than the weights of all the other edges of C, then e cannot belong to an MST.
2) For any chordal graph there exists a clique tree: every chordal graph has a perfect elimination order.

Proof (2/3)
- Take a clique tree CT that has as many edges in common with the MST as possible, but is still different from it.
- For the purpose of contradiction, assume that some edge (K1,K2) of the MST is not an edge of CT.
- Take the cut set associated with (K1,K2) in the MST from the full clique graph (the cut set is the set of all edges connecting the two partitions).
- This cut set must contain another edge (K3,K4), distinct from (K1,K2), that is in CT but not in the MST.
(Figure: (K1,K2) ∈ MST; (K3,K4) ∈ CT; (K3,K4) ∉ MST)

Proof (3/3)
- From the properties of a clique tree, K1∩K2 ⊆ K3∩K4. If the inclusion were strict, (K3,K4) would be heavier than (K1,K2), and swapping it into the MST would give a heavier spanning tree, contradicting the maximality of the MST.
- Therefore K1∩K2 = K3∩K4.
- We can replace (K3,K4) with (K1,K2) in CT and still have a clique tree:
  - Take any K5 and K6 on different sides of the cut. From the properties of CT, K5∩K6 ⊆ K3∩K4, and thus K5∩K6 ⊆ K1∩K2.
  - Therefore the clique intersection (running intersection) property still holds.
- This contradicts the choice of CT as different from the MST, and in fact gives an algorithm for turning any clique tree into the MST edge by edge.
(Figure: CT ⇒ K1∩K2 ⊆ K3∩K4; MST ⇒ K1∩K2 ⊇ K3∩K4; hence K1∩K2 = K3∩K4. ∎)