Exact Inference


1 Exact Inference

2 Belief Update in Trees (Markov and Bayesian networks alike)
[Figure: a tree with edges U–V, U–W, U–X, X–Y, X–Z.]
P(x) = Σ_{u,v,w,y,z} P(u,…,z,x)
     = Σ_{u,v,w,y,z} g1(x,u) g2(v,u) g3(w,u) g4(y,x) g5(z,x)
     = Σ_u g1(x,u) [Σ_v g2(v,u)] [Σ_w g3(w,u)] · [Σ_y g4(y,x)] [Σ_z g5(z,x)]

3 Belief Update in Trees
P(x|e) = P(x,e) / Σ_x P(x,e), where
P(x,e) = Σ_{u,v,w,y,z} P(u,…,z,x,e)
       = Σ_{u,v,w,y,z} g1(x,u) g2(v,u) g3(w,u,e1) g4(y,x) g5(z,x,e2)
       = Σ_u g1(x,u) [Σ_v g2(v,u)] [Σ_w g3(w,u,e1)] · [Σ_y g4(y,x)] [Σ_z g5(z,x,e2)]
[Figure: the same tree, with evidence E1 attached to W and E2 attached to Z.]

4 Belief Update in Trees (Interpretation for Bayesian Networks)
[Figure: the same tree, with each g_i read as a conditional probability table of the Bayesian network.]

5 Update all variables given the evidence
Desired query (update all beliefs): P(x|e), …, P(v|e)?
Solution: keep all partial sums on the links, in both directions; messages are sent inwards first.
P(x,e) = Σ_u g1(x,u) [Σ_v g2(v,u)] [Σ_w g3(w,u,e1)] · [Σ_y g4(y,x)] [Σ_z g5(z,x,e2)]
[Figure: the same tree, with evidence E1 and E2 attached.]

6 Now add a link from W to Z. What changes?
P(x,e) = Σ_u g1(x,u) [Σ_v g2(v,u)] Σ_w g3(w,u,e1) [Σ_y g4(y,x)] Σ_z g5(z,x,w,e2)
The sum over z now depends on w as well, so it can no longer be pulled outside the sum over w.
[Figure: the same tree, with an added edge W–Z.]

7 Variable Elimination Order
We wish to compute P(l, d0) for every value l of L, given evidence D = d0.
Good summation order (variable A is summed last):
P(l, d0) = Σ_{a,t,x} P(a,t,x,l,d0) = Σ_a p(a) p(d0|a) p(l|a) [Σ_t p(t|a)] [Σ_x p(x|a)]
Bad summation order (variable A is summed first):
P(l, d0) = Σ_x Σ_t Σ_a p(a) p(l|a) p(d0|a) p(t|a) p(x|a)
The bad order yields a high-dimensional temporary table (over l, t, x after summing out A).
How do we choose a reasonable order? (Generate "small" cliques.)
[Figure: network fragment with root A and children T, X, L, D.]
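The cost gap between the two orders can be checked mechanically. Below is a minimal sketch (the function name, the binary-variable assumption, and the scope encoding are mine, not from the slides) that tracks the largest intermediate table each elimination order creates; with d0 instantiated as evidence, p(d0|a) becomes a factor over a alone:

```python
def max_table_size(factors, order, card=2):
    """Eliminate variables in `order` from a list of factor scopes
    (sets of variable names); return the number of entries in the
    largest intermediate table created along the way."""
    factors = [set(f) for f in factors]
    worst = 0
    for z in order:
        touched = [f for f in factors if z in f]         # terms that mention z
        rest = [f for f in factors if z not in f]        # irrelevant terms stay outside
        scope = set().union(*touched)                    # scope of the product before summing
        worst = max(worst, card ** len(scope))
        rest.append(scope - {z})                         # the new term after summing out z
        factors = rest
    return worst

# Scopes of p(a), p(d0|a), p(l|a), p(t|a), p(x|a); d0 is fixed evidence,
# l is the query, so only x, t, a are eliminated.
scopes = [{'a'}, {'a'}, {'a', 'l'}, {'a', 't'}, {'a', 'x'}]
good = max_table_size(scopes, ['x', 't', 'a'])   # A summed last
bad = max_table_size(scopes, ['a', 't', 'x'])    # A summed first
print(good, bad)  # → 4 16
```

Summing A first multiplies every factor together at once, creating a table over all remaining variables; summing A last never builds a table larger than the query's own neighborhood.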

8 Belief Update in General BNs
Input: a Bayesian network, a set of nodes E with evidence E = e, and an ordering x1,…,xm of all variables not in E.
Output: P(x1, e) for every value x1 of X1.
The algorithm:
Instantiate the evidence in every local probability table that is defined over some variables from E.
Then, iteratively (in some "optimal or good" order):
move all irrelevant terms outside of the innermost sum;
perform the innermost sum, producing a new term;
insert the new term into the product.
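One iteration of the loop above can be sketched in code. The sketch below (all names and the factor representation are illustrative, assuming binary variables and factors stored as a variable tuple plus a table dict) performs a single "sum out z" step: it separates the irrelevant terms, computes the innermost sum, and inserts the new term:

```python
from itertools import product

def eliminate(factors, z, card=2):
    """Sum variable z out of a list of factors.
    Each factor is (vars, table) with table[assignment] = value,
    where assignment is a tuple of values for vars, in order."""
    touched = [f for f in factors if z in f[0]]      # terms under the innermost sum
    rest = [f for f in factors if z not in f[0]]     # irrelevant terms moved outside
    scope = []
    for vs, _ in touched:
        scope += [x for x in vs if x not in scope]
    new_vars = tuple(x for x in scope if x != z)
    table = {}
    for assign in product(range(card), repeat=len(new_vars)):
        env = dict(zip(new_vars, assign))
        total = 0.0
        for zval in range(card):                     # the innermost sum over z
            env[z] = zval
            term = 1.0
            for vs, t in touched:
                term *= t[tuple(env[x] for x in vs)]
            total += term
        table[assign] = total
    rest.append((new_vars, table))                   # insert the new term
    return rest

# Tiny network A -> B: P(b) is obtained by summing A out.
p_a = (('a',), {(0,): 0.6, (1,): 0.4})
p_b_given_a = (('a', 'b'), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8})
(result_vars, result), = eliminate([p_a, p_b_given_a], 'a')
print(result_vars, result)  # ('b',) with P(b=0)=0.62, P(b=1)=0.38
```

Repeatedly calling `eliminate` for each variable in the chosen ordering, and normalizing at the end, yields P(x1 | e).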

9 Variable elimination in Bayes networks
Local tables: p(v), p(s), p(t|v), p(l|s), p(b|s), p(a|t,l), p(x|a), p(d|a,b).
Written as factors: g1(t,v), g2(a,t,l), g3(d,a,b), g4(x,a), g5(l,s), g6(b,s).
Eliminating V and X yields h1(t) and h4(a); eliminating T then yields h2(a,l); each step removes a variable and shrinks the network.
[Figure: three snapshots of the network as V, X, and then T are eliminated.]

10 Update all variables in a general network
Desired query (update all beliefs): P(a|d0), …, P(x|d0)?
Can we still compute them all at the cost of computing one term twice?
P(a,…,x) = K · g1(t,v) g2(a,t,l) g3(d,a,b) g4(x,a) g5(a,l,b) g6(l,b,s)
[Figure: the network with the six factors g1,…,g6 attached to its nodes.]

11 Reminder: Chordal Graphs

12 Keep Partial Results in a Tree of Cliques
Cliques: {T,V}, {A,T,L}, {A,L,B}, {L,B,S}, {D,A,B}, {X,A}.
Build a maximum spanning tree over the cliques, using the sizes of the separators as edge weights.
[Figure: the network with factors g1(t,v), g2(a,t,l), g3(d,a,b), g4(x,a), g5(a,l,b), g6(l,b,s), and the resulting clique tree.]

13 Keep Partial Results in a Tree of Cliques
P(a,…,x) = g1(t,v) g2(a,t,l) g3(d,a,b) g4(x,a) g5(a,l,b) g6(l,b,s)
Running intersection: a node that appears in two cliques also appears in every clique on the path connecting them.
Solution: keep all partial sums on the links, in both directions; messages are sent inwards first.
[Figure: the clique tree {T,V}–{A,T,L}–{A,L,B}–{L,B,S}, with {D,A,B} and {X,A} attached; each edge is labeled by its separator.]

14 Observations
Every elimination order creates a chordal graph.
Every elimination order creates a clique tree that helps store partial sums.
Computing a posterior belief takes time exponential in the size of the largest clique of the undirected graph with the fill-in edges added.
Computing posterior beliefs for all variables requires only twice the number of computations needed for a single variable update.
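The first observation can be simulated directly: eliminating a variable from the undirected graph and connecting its not-yet-eliminated neighbors adds exactly the fill-in edges, and the resulting graph is chordal. A small sketch (function and variable names are mine, not from the slides):

```python
def elimination_cliques(edges, order):
    """Simulate variable elimination on an undirected graph:
    eliminating v connects v's remaining neighbors (the fill-in
    edges), making the result chordal.  Returns the clique formed
    at each elimination step."""
    adj = {v: set() for v in order}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    cliques, removed = [], set()
    for v in order:
        nbrs = adj[v] - removed
        cliques.append(nbrs | {v})
        for a in nbrs:                       # add fill-in edges among the neighbors
            adj[a] |= nbrs - {a}
        removed.add(v)
    return cliques

# A 4-cycle A-B-C-D-A is not chordal; eliminating A adds the chord B-D.
cliques = elimination_cliques(
    [('A', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'A')],
    ['A', 'B', 'C', 'D'])
print(max(len(c) for c in cliques))  # → 3: the largest clique is a triangle
```

The cliques returned here are exactly the scopes of the intermediate tables variable elimination would build, which is why the running time is exponential in the largest of them.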

15 Global conditioning
Fixing the values of A and B yields an I-map of P(a, b, C, D, …) for those fixed values. Fixing values at the beginning of the summation can shrink the tables formed by variable elimination; this trades space for time. Special case: choose to fix a set of nodes that breaks all loops. This method is called cutset conditioning.
[Figure: a loopy network over A, B, C, D, E, I, J, K, L, M; fixing A = a and B = b cuts its loops.]

16 Cutset conditioning
Fixing the values of A, B, and L breaks all loops. But can we choose fewer variables to break all loops? Are some variables better choices than others? This optimization question translates to the well-known FVS (feedback vertex set) problem: choose a set of vertices of least weight that intersects every cycle of a given weighted undirected graph G.
[Figure: the network over A, B, C, D, E, I, J, K, L, M with the loop-breaking nodes highlighted.]
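FVS is NP-hard in general, so practical systems use heuristics. A minimal greedy sketch for the unweighted case (the approach and all names are illustrative, not the slides' method): strip nodes that cannot lie on any cycle, then repeatedly put a highest-degree node into the cutset until no cycles remain.

```python
def greedy_cutset(edges):
    """Greedy loop-cutset heuristic for an undirected graph:
    repeatedly strip degree-<=1 nodes (they lie on no cycle), then
    pick a highest-degree node for the cutset until nothing is left."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)

    def prune():
        leaves = [v for v, nbrs in adj.items() if len(nbrs) <= 1]
        while leaves:
            for v in leaves:
                for n in adj.pop(v):
                    adj[n].discard(v)
            leaves = [v for v, nbrs in adj.items() if len(nbrs) <= 1]

    cutset = []
    prune()
    while adj:                               # nodes remain => cycles remain
        v = max(adj, key=lambda x: len(adj[x]))
        cutset.append(v)
        for n in adj.pop(v):
            adj[n].discard(v)
        prune()
    return cutset

# One node suffices to break the single loop A-B-C-D-A.
print(len(greedy_cutset([('A', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'A')])))  # → 1
```

Picking the highest-degree node is only a heuristic; it has no optimality guarantee, but it tends to break many cycles per conditioned variable, which is what keeps the number of conditioning cases (exponential in the cutset size) small.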

17 The Noisy Or-Gate Model

18 Belief Update in Poly-Trees

19 Belief Update in Poly-Trees

