PGM 2002/03 Tirgul 5: Clique/Junction Tree Inference


Outline
- In class we saw how to construct a junction tree via graph-theoretic principles.
- In the last tirgul we saw the algebraic connection between elimination and message propagation.
- In this tirgul we will see how elimination in a general graph implies a triangulation and a junction tree, and use this to define a practical algorithm for exact inference in general graphs.

Undirected graph representation
- At each stage of the procedure, we have an algebraic term that we need to evaluate.
- In general this term is a product of factors, ∏_i f_i(Z_i), where the Z_i are sets of variables.
- We now draw a graph with an undirected edge X--Y whenever X and Y are arguments of some factor, that is, whenever X and Y appear together in some Z_i.
- Note: this is the Markov network that describes the probability distribution on the variables we have not yet eliminated.
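As a concrete illustration (our own minimal Python sketch, not part of the original slides; the function and variable names are ours), this is how the undirected graph can be built from a list of factor scopes:

    from itertools import combinations

    def markov_graph(factor_scopes):
        """Undirected graph induced by factor scopes: an edge X--Y is
        added whenever X and Y appear together in some scope Z_i."""
        edges = set()
        for scope in factor_scopes:
            for x, y in combinations(sorted(scope), 2):
                edges.add(frozenset({x, y}))
        return edges

    # The "Asia" factors P(V), P(S), P(T|V), P(L|S), P(B|S), P(A|T,L), P(X|A), P(D|A,B)
    asia_scopes = [{"V"}, {"S"}, {"T", "V"}, {"L", "S"}, {"B", "S"},
                   {"A", "T", "L"}, {"X", "A"}, {"D", "A", "B"}]
    print(sorted(tuple(sorted(e)) for e in markov_graph(asia_scopes)))

For the Asia scopes this prints exactly the edges of the moralized graph shown on the next slide.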

Undirected Graph Representation
- Consider the "Asia" example.
- The initial factors are P(V), P(S), P(T|V), P(L|S), P(B|S), P(A|T,L), P(X|A), P(D|A,B).
- Thus, the undirected graph connects every pair of variables that appear together in some factor.
- In this case this graph is just the moralized graph.
[Figure: the Asia network over V, S, T, L, A, B, X, D and its moralized undirected graph]

Elimination in Undirected Graphs
- Generalizing, we see that we can eliminate a variable X by:
  1. For all Y, Z such that Y--X and Z--X, add an edge Y--Z.
  2. Remove X and all edges adjacent to it.
- This procedure creates a clique that contains all the neighbors of X.
- After step 1 we have a clique that corresponds to the intermediate factor (before marginalization).
- The cost of the step is exponential in the size of this clique.
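A sketch of this elimination step, using the same edge representation as the sketch above (again our own illustrative code, not the slides'):

    def eliminate(edges, var):
        """One elimination step on an undirected graph (a set of frozenset
        edges): connect all neighbors of `var` (fill-in edges), then drop
        `var`. Returns the new edge set and the clique {var} + neighbors,
        which is the scope of the intermediate factor."""
        neighbors = {y for e in edges if var in e for y in e if y != var}
        fill = {frozenset({y, z}) for y in neighbors for z in neighbors if y != z}
        remaining = {e for e in edges if var not in e} | fill
        return remaining, neighbors | {var}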

Undirected Graphs
- The process of eliminating nodes from an undirected graph gives us a clue to the complexity of inference.
- To see this, we will examine the graph that contains all of the edges we added during the elimination.

Example
- Want to compute P(D).
- Moralizing.
[Figure: the Asia network and its moral graph]

Example
- Want to compute P(D).
- Moralizing.
- Eliminating v: multiply to get f'_v(v,t), result f_v(t).
[Figure: moral graph after eliminating V]

Example
- Want to compute P(D).
- Moralizing.
- Eliminating v.
- Eliminating x: multiply to get f'_x(a,x), result f_x(a).
[Figure: moral graph after eliminating V, X]

Example
- Want to compute P(D).
- Moralizing.
- Eliminating v.
- Eliminating x.
- Eliminating s: multiply to get f'_s(l,b,s), result f_s(l,b).
[Figure: moral graph after eliminating V, X, S, with fill-in edge L--B]

Example
- Want to compute P(D).
- Moralizing.
- Eliminating v.
- Eliminating x.
- Eliminating s.
- Eliminating t: multiply to get f'_t(a,l,t), result f_t(a,l).
[Figure: moral graph after eliminating V, X, S, T]

Example
- Want to compute P(D).
- Moralizing.
- Eliminating v.
- Eliminating x.
- Eliminating s.
- Eliminating t.
- Eliminating l: multiply to get f'_l(a,b,l), result f_l(a,b).
[Figure: moral graph after eliminating V, X, S, T, L]

Example
- Want to compute P(D).
- Moralizing.
- Eliminating v.
- Eliminating x.
- Eliminating s.
- Eliminating t.
- Eliminating l.
- Eliminating a, b: multiply to get f'_a(a,b,d), result f(d).
[Figure: moral graph after all eliminations]
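Running the eliminate sketch from above over this ordering reproduces the cliques in the example and accumulates the induced graph (illustrative code of ours, reusing markov_graph, asia_scopes, and eliminate from the earlier sketches):

    edges = markov_graph(asia_scopes)        # moralized Asia graph
    induced = set(edges)                     # collects all fill-in edges too
    for var in ["V", "X", "S", "T", "L", "A", "B"]:
        edges, clique = eliminate(edges, var)
        induced |= edges
        print(f"eliminated {var}: intermediate factor over {sorted(clique)}")
    # `induced` is the induced graph for this ordering; every maximal
    # clique of `induced` appears among the printed intermediate factors.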

Expanded Graphs
- The resulting graph is the induced graph (for this particular ordering).
- Main property:
  - Every maximal clique in the induced graph corresponds to an intermediate factor in the computation.
  - Every factor stored during the process is a subset of some maximal clique in the graph.
- These facts are true for any variable elimination ordering on any network.
[Figure: the induced graph for the example]

Induced Width
- The size of the largest clique in the induced graph is thus an indicator of the complexity of variable elimination.
- This quantity is called the induced width of the graph according to the specified ordering.
- Finding a good ordering for a graph amounts to finding an ordering that minimizes the induced width of the graph.
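Finding an optimal ordering is NP-hard in general, so in practice greedy heuristics such as min-fill are used. A sketch (our own, reusing the eliminate helper from above; here the width is reported as the largest clique size, matching the slide's definition):

    def min_fill_order(edges, variables):
        """Greedy min-fill heuristic: repeatedly eliminate the variable
        whose elimination would add the fewest fill-in edges."""
        edges, left, order, width = set(edges), set(variables), [], 0
        def fill_count(v):
            nbrs = {y for e in edges if v in e for y in e if y != v}
            return sum(1 for y in nbrs for z in nbrs
                       if y < z and frozenset({y, z}) not in edges)
        while left:
            v = min(left, key=fill_count)
            edges, clique = eliminate(edges, v)
            width = max(width, len(clique))   # largest clique seen so far
            order.append(v); left.remove(v)
        return order, width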

Chordal Graphs
- Recall: an elimination ordering induces an undirected chordal graph.
- Maximal cliques in the graph are factors in the elimination.
- Factors in the elimination are cliques in the graph.
- Complexity is exponential in the size of the largest clique in the graph.
[Figure: the induced chordal graph next to the original Asia network]

Cluster Trees
- Variable elimination induces a graph of clusters.
- Nodes in the graph are annotated by the variables in a factor:
  - Clusters (circles) correspond to multiplication.
  - Separators (boxes) correspond to marginalization.
[Figure: cluster tree with clusters T,V / A,L,T / A,L,B / B,L,S / X,A / A,B,D and separators T / A,L / B,L / A,B / A]

Properties of cluster trees
- The cluster graph must be a tree: only one path between any two clusters.
- A separator is labeled by the intersection of the labels of the two neighboring clusters.
- Running intersection property: all separators on the path between two clusters contain their intersection.
[Figure: the cluster tree from the previous slide]
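A small checker for the running intersection property on the example tree (a hypothetical helper of ours; cluster names follow the figure, and checking that every cluster on the path contains the intersection is equivalent to checking the separators):

    from collections import defaultdict, deque

    def satisfies_rip(clusters, tree_edges):
        """clusters: name -> set of variables; tree_edges: pairs forming
        a tree. Checks that every cluster on the path between any two
        clusters contains their intersection."""
        adj = defaultdict(list)
        for a, b in tree_edges:
            adj[a].append(b); adj[b].append(a)
        def path(u, v):                      # unique tree path via BFS
            parent, q = {u: None}, deque([u])
            while q:
                x = q.popleft()
                for y in adj[x]:
                    if y not in parent:
                        parent[y] = x; q.append(y)
            p = [v]
            while p[-1] != u:
                p.append(parent[p[-1]])
            return p
        names = list(clusters)
        return all(clusters[c] >= clusters[u] & clusters[v]
                   for i, u in enumerate(names) for v in names[i + 1:]
                   for c in path(u, v))

    clusters = {c: set(c.split(",")) for c in
                ["T,V", "A,L,T", "A,L,B", "B,L,S", "X,A", "A,B,D"]}
    tree = [("T,V", "A,L,T"), ("A,L,T", "A,L,B"), ("A,L,B", "B,L,S"),
            ("A,L,B", "A,B,D"), ("A,B,D", "X,A")]
    print(satisfies_rip(clusters, tree))     # True for the Asia cluster tree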

Cluster Trees & Chordal Graphs
- Combining the two representations we get that:
  - Every maximal clique in the chordal graph is a cluster in the tree.
  - Every separator in the tree is a separator in the chordal graph.
[Figure: the chordal graph side by side with the cluster tree]

Cluster Trees & Chordal Graphs
- Observation: if a cluster is not a maximal clique, then it must be adjacent to a cluster that is a superset of it.
- We might as well work with a cluster tree where each cluster is a maximal clique.
[Figure: the chordal graph and the cluster tree]

Cluster Trees & Chordal Graphs
Thm: If G is a chordal graph, then it can be embedded in a tree of cliques such that:
- Every clique in G is a subset of at least one node in the tree.
- The tree satisfies the running intersection property.

Elimination in Chordal Graphs
- A separator S divides the remaining variables in the graph into two groups.
- Variables in each group appear on one "side" of S in the cluster tree.
- Examples:
  - {A,B}: {L, S, T, V} & {D, X}
  - {A,L}: {T, V} & {B, D, S, X}
  - {B,L}: {S} & {A, D, T, V, X}
  - {A}: {X} & {B, D, L, S, T, V}
  - {T}: {V} & {A, B, D, L, S, X}
[Figure: the chordal graph and the cluster tree]

Elimination in Cluster Trees
- Let X and Y be the partition induced by S.
- Observation: eliminating all variables in X results in a factor f_X(S).
- Proof: since S is a separator, only variables in S are adjacent to variables in X.
- Note: the same factor would result regardless of the elimination ordering.
[Figure: separator S between sides X and Y, with factors f_X(S) and f_Y(S)]

Recursive Elimination in Cluster Trees
- How do we compute f_X(S)? By recursive decomposition along the cluster tree:
  - Let X_1 and X_2 be the disjoint partitioning of X - C implied by the separators S_1 and S_2.
  - Eliminate X_1 to get f_X1(S_1).
  - Eliminate X_2 to get f_X2(S_2).
  - Eliminate the variables in C - S to get f_X(S).
[Figure: cluster C with separators S, S_1, S_2, subtrees X_1, X_2, and remainder Y]
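To make the following slides concrete, here is a minimal table-based factor over binary variables with the two operations the recursion needs, multiplication and marginalization. This is our own illustrative sketch under simplifying assumptions (binary variables, dense tables); a real implementation would be more general.

    from itertools import product as iproduct

    class Factor:
        """Factor over binary variables: `scope` is a tuple of names,
        `table` maps assignments (tuples of 0/1, in scope order) to floats."""
        def __init__(self, scope, table):
            self.scope, self.table = tuple(scope), dict(table)

        def __mul__(self, other):
            scope = tuple(dict.fromkeys(self.scope + other.scope))
            table = {}
            for vals in iproduct([0, 1], repeat=len(scope)):
                a = dict(zip(scope, vals))
                table[vals] = (self.table[tuple(a[v] for v in self.scope)]
                               * other.table[tuple(a[v] for v in other.scope)])
            return Factor(scope, table)

        def marginalize_to(self, keep):
            """Sum out every variable not in `keep`."""
            keep = tuple(v for v in self.scope if v in keep)
            table = {}
            for vals, p in self.table.items():
                key = tuple(v for v, name in zip(vals, self.scope) if name in keep)
                table[key] = table.get(key, 0.0) + p
            return Factor(keep, table)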

Elimination in Cluster Trees (or Belief Propagation revisited)
- Assume we have a cluster tree with separators S_1,...,S_k.
- Each S_i determines two sets of variables X_i and Y_i, such that:
  - S_i ∪ X_i ∪ Y_i = {X_1,...,X_n}.
  - All paths from clusters containing variables in X_i to clusters containing variables in Y_i pass through S_i.
- We want to compute f_Xi(S_i) and f_Yi(S_i) for all i.

Elimination in Cluster Trees
Idea:
- Each of these factors can be decomposed as an expression involving some of the others.
- Use dynamic programming to avoid recomputation of factors.

Example
[Figure: the Asia cluster tree with clusters T,V / A,L,T / A,L,B / B,L,S / X,A / A,B,D and separators T / A,L / B,L / A,B / A]

Dynamic Programming
We now have the tools to solve the multi-query problem.
- Step 1: Inward propagation.
  - Pick a cluster C.
  - Compute all factors, eliminating from the fringes of the tree toward C.
  - This computes all "inward" factors associated with separators.
[Figure: inward messages converging on cluster C]

Dynamic Programming
We now have the tools to solve the multi-query problem.
- Step 1: Inward propagation.
- Step 2: Outward propagation.
  - Compute all factors on separators going outward from C to the fringes.
[Figure: outward messages from cluster C]

Dynamic Programming
We now have the tools to solve the multi-query problem.
- Step 1: Inward propagation.
- Step 2: Outward propagation.
- Step 3: Computing beliefs on clusters.
  - To get the belief on a cluster C', multiply:
    - the CPDs that involve only variables in C';
    - the factors on the separators adjacent to C', taken in the proper direction.
  - This simulates the result of eliminating all variables except those in C', using pre-computed factors.
[Figure: cluster C' collecting the factors on its adjacent separators]
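A compact sketch of this two-pass schedule using the Factor class above (our own code and naming: `tree` maps each cluster to its neighbors, `clusters` gives each cluster's variable set, `factors` holds the CPDs assigned to each cluster):

    def two_pass(tree, clusters, factors, root):
        """Inward pass (leaves -> root) then outward pass (root -> leaves).
        Returns messages[(src, dst)] for both directions of every edge."""
        parent, order = {root: None}, [root]
        for c in order:                           # BFS from the chosen root
            for n in tree[c]:
                if n not in parent:
                    parent[n] = c; order.append(n)

        messages = {}
        def send(src, dst):
            phi = factors[src]
            for n in tree[src]:                   # multiply other incoming messages
                if n != dst:
                    phi = phi * messages[(n, src)]
            messages[(src, dst)] = phi.marginalize_to(clusters[src] & clusters[dst])

        for c in reversed(order):                 # Step 1: inward propagation
            if parent[c] is not None:
                send(c, parent[c])
        for c in order:                           # Step 2: outward propagation
            for n in tree[c]:
                if parent.get(n) == c:
                    send(c, n)
        return messages

Step 3 is then just factors[C'] multiplied by the messages (n, C') coming from all neighbors n of C'.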

Complexity
Time complexity:
- Each traversal of the tree costs the same as standard variable elimination.
- Total computation cost is twice that of standard variable elimination.
Space complexity:
- We need to store partial results.
- This requires two factors for each separator.
- Space requirements can thus be up to a factor of 2n more expensive than variable elimination.

The "Asia" network with evidence
[Figure: the Asia network (Visit to Asia, Smoking, Tuberculosis, Lung Cancer, Bronchitis, Abnormality in Chest, X-Ray, Dyspnea)]
We want to compute P(L|D=t,V=t,S=f).

Initial factors with evidence
We want to compute P(L|D=t,V=t,S=f).

P(T|V), restricted to V = true:
  T=false  0.95
  T=true   0.05

P(B|S), restricted to S = false:
  B=false  0.7
  B=true   0.3

P(L|S), restricted to S = false:
  L=false  0.99
  L=true   0.01

P(D|B,A), restricted to D = true:
  B=false, A=false  0.1
  B=true,  A=false  0.8
  B=false, A=true   0.7
  B=true,  A=true   0.9

Initial factors with evidence (cont.)

P(A|L,T):
  T=false, L=false, A=false  1
  T=true,  L=false, A=false  0
  T=false, L=true,  A=false  0
  T=true,  L=true,  A=false  0
  T=false, L=false, A=true   0
  T=true,  L=false, A=true   1
  T=false, L=true,  A=true   1
  T=true,  L=true,  A=true   1

P(X|A):
  X=false, A=false  0.95
  X=true,  A=false  0.05
  X=false, A=true   0.02
  X=true,  A=true   0.98
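Encoded with the Factor sketch from earlier (0/1 stand for false/true; this encoding and the names f_T, ..., f_X are ours):

    f_T = Factor(("T",), {(0,): 0.95, (1,): 0.05})            # P(T|V=t)
    f_B = Factor(("B",), {(0,): 0.7, (1,): 0.3})              # P(B|S=f)
    f_L = Factor(("L",), {(0,): 0.99, (1,): 0.01})            # P(L|S=f)
    f_D = Factor(("B", "A"), {(0, 0): 0.1, (1, 0): 0.8,
                              (0, 1): 0.7, (1, 1): 0.9})      # P(D=t|B,A)
    f_A = Factor(("T", "L", "A"),
                 {(0, 0, 0): 1, (1, 0, 0): 0, (0, 1, 0): 0, (1, 1, 0): 0,
                  (0, 0, 1): 0, (1, 0, 1): 1, (0, 1, 1): 1, (1, 1, 1): 1})  # P(A|L,T)
    f_X = Factor(("X", "A"), {(0, 0): 0.95, (1, 0): 0.05,
                              (0, 1): 0.02, (1, 1): 0.98})    # P(X|A)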

Step 1: Initial Clique values
- C_T = P(T|V=t)
- C_T,L,A = P(A|L,T)
- C_B,L,A = 1
- C_B,L = P(L|S=f) P(B|S=f)
- C_B,A = P(D=t|B,A)
- C_X,A = P(X|A)
"Dummy" separators: each separator is the intersection of neighboring nodes in the junction tree and helps in defining the inference messages (see below).
[Figure: junction tree with clusters D,B,A / B,L,A / B,L,S / T,L,A / T,V / X,A and separators B,A / B,L / L,A / A / T]
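In code (our own continuation of the sketches above, with the tree structure read off the figure):

    unit = Factor((), {(): 1.0})
    tree = {"T,V": ["T,L,A"], "T,L,A": ["T,V", "B,L,A"],
            "B,L,A": ["T,L,A", "B,L,S", "D,B,A"],
            "B,L,S": ["B,L,A"], "D,B,A": ["B,L,A", "X,A"], "X,A": ["D,B,A"]}
    clusters = {"T,V": {"T", "V"}, "T,L,A": {"T", "L", "A"},
                "B,L,A": {"B", "L", "A"}, "B,L,S": {"B", "L", "S"},
                "D,B,A": {"D", "B", "A"}, "X,A": {"X", "A"}}
    factors = {"T,V": f_T,          # C_T
               "T,L,A": f_A,        # C_T,L,A
               "B,L,A": unit,       # C_B,L,A = 1
               "B,L,S": f_L * f_B,  # C_B,L
               "D,B,A": f_D,        # C_B,A (D instantiated to t)
               "X,A": f_X}          # C_X,A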

Step 2: Update from leaves
(Notation: S↑ denotes a message sent toward the central cluster B,L,A and S↓ a message sent away from it; each Σ sums out the cluster variables that are not in the separator.)
- S↑_B,L = Σ C_B,L
- S↑_T = C_T
- S↑_A = Σ C_X,A
[Figure: junction tree annotated with clique values C_T, C_T,L,A, C_B,L,A, C_B,L, C_B,A, C_X,A and the three leaf messages]

Step 3: Update (cont.)
- S↑_B,A = Σ(C_B,A × S↑_A)
- S↑_L,A = Σ(C_T,L,A × S↑_T)
[Figure: junction tree with the inward messages computed so far]

Step 4: Update (cont.)
- S↓_B,A = Σ(C_B,L,A × S↑_L,A × S↑_B,L)
- S↓_L,A = Σ(C_B,L,A × S↑_B,L × S↑_B,A)
- S↓_B,L = Σ(C_B,L,A × S↑_L,A × S↑_B,A)
[Figure: junction tree with the outward messages from B,L,A]

Step 5: Update (cont.)
- S↓_A = Σ(C_B,A × S↓_B,A)
- S↓_T = Σ(C_T,L,A × S↓_L,A)
[Figure: junction tree with all messages computed]

Step 6: Compute Query
P(L|D=t,V=t,S=f) = Σ(C_B,L × S↓_B,L) = Σ(C_B,L,A × S↑_L,A × S↑_B,L × S↑_B,A) = ..., and normalize.
[Figure: the fully propagated junction tree]
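Putting the sketches together (illustrative only; the tree, clusters, and factors dicts are the hypothetical ones defined after Step 1, and two_pass is the sketch above):

    msgs = two_pass(tree, clusters, factors, root="B,L,A")
    belief = factors["B,L,A"]                 # Step 3 at the cluster B,L,A
    for n in tree["B,L,A"]:
        belief = belief * msgs[(n, "B,L,A")]
    pL = belief.marginalize_to({"L"})
    Z = sum(pL.table.values())
    print({l: v / Z for l, v in pL.table.items()})   # P(L | D=t, V=t, S=f)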

How to avoid small numbers
- Normalize each message S by a constant N_i as it is computed (the figure annotates the inward messages with constants N_1, ..., N_5).
- The query is then P(L|D=t,V=t,S=f) = Σ(C_B,L × S↓_B,L) = Σ(C_B,L,A × S↑_L,A × S↑_B,L × S↑_B,A) = ..., and normalize (accounting for the overall constant N_1 × N_2 × N_3 × N_4 × N_5 × N_BLA).
[Figure: junction tree with each message annotated by its normalization constant]

A Theorem about elimination order
- Triangulated graph: a graph in which every cycle of length > 3 has a chord.
- Simplicial node: a node that can be eliminated without the need to add an extra edge, i.e. all its neighbouring nodes are connected (they form a complete subgraph).
- Eliminatable graph: a graph which has an elimination order that needs no added edges - every node is simplicial at the time it is eliminated.
- Thm: Every triangulated graph is eliminatable.
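The theorem gives a simple (if naive) chordality test: repeatedly remove a simplicial node, and the graph is triangulated iff this never gets stuck. A sketch reusing the edge representation from the earlier code (our own):

    def perfect_elimination_order(edges, variables):
        """Greedily eliminate simplicial nodes. Returns a no-fill-in
        elimination order if the graph is triangulated, else None."""
        edges, left, order = set(edges), set(variables), []
        while left:
            simplicial = None
            for v in left:
                nbrs = {y for e in edges if v in e for y in e if y != v}
                if all(frozenset({y, z}) in edges
                       for y in nbrs for z in nbrs if y != z):
                    simplicial = v
                    break
            if simplicial is None:
                return None          # stuck: the graph is not triangulated
            order.append(simplicial)
            edges = {e for e in edges if simplicial not in e}
            left.remove(simplicial)
        return order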

Lemma: An incomplete triangulated graph G with node set N (|N| ≥ 3) has a complete subset S which separates the graph: every path between the two parts of N\S goes through S.
Proof: Let A and B be non-adjacent nodes, and let S be a minimal set of nodes such that every path between A and B contains a node from S. Denote by G_A and G_B the parts of the graph that S separates, containing A and B respectively. Assume C, D in S are not neighbors. Since S is minimal, there is a path from A to B passing through S only at C, and likewise one passing through S only at D. There is therefore a path from C to D inside G_A and another inside G_B; together they form a cycle of length > 3 that a chord C--D must break, contradicting the assumption that C and D are not neighbors.
[Figure: separator S splitting G into G_A and G_B]

Claim: Let G be a triangulated graph. Then G has two simplicial nodes, and if the graph is not complete they can be chosen non-adjacent.
Proof (by induction on the number of nodes): The claim is trivial for a complete graph and for a graph with 2 nodes. Let G have n nodes and be incomplete; by the lemma it has a complete separator S splitting it into G_A and G_B, and the induction hypothesis applies to the subgraph induced by G_A and S. If that subgraph is complete, choose any simplicial node in G_A; if not, choose one of its two non-adjacent simplicial nodes that lies outside S (they cannot both be in S, or they would be adjacent, since S is complete). The same can be done for G_B, and the two chosen nodes are non-adjacent (separated by S).
Wrapping up: any graph with 2 nodes is triangulated and eliminatable, and the claim gives us more than the single simplicial node we need at each elimination step.
* The full proof can be found in Jensen, Appendix A.