Solving Bayesian Decision Problems: Variable Elimination and Strong Junction Tree Methods Presented By: Jingsong Wang Scott Langevin May 8, 2009.


1 Solving Bayesian Decision Problems: Variable Elimination and Strong Junction Tree Methods Presented By: Jingsong Wang Scott Langevin May 8, 2009

2 Introduction: Solutions to Influence Diagrams; Variable Elimination; Strong Junction Tree; Hugin Architecture; Conclusions

3 Solutions to Influence Diagrams

4 The example influence diagram, DI. I0 = Ø, I1 = {T}, I2 = {A, B, C}. (Legend: chance nodes, decision nodes, utility nodes.)

5 Solutions to Influence Diagrams

6

7 The Chain Rule for Influence Diagrams
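As a hedged restatement of the rule named here (standard form; U_C denotes the chance variables of the diagram and U_D the decisions):

```latex
P(\mathcal{U}_C \mid \mathcal{U}_D) \;=\; \prod_{X \in \mathcal{U}_C} P\bigl(X \mid \mathrm{pa}(X)\bigr)
```

That is, the joint distribution over the chance variables, given a configuration of the decisions, factorizes over each chance variable's parents.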

8 Strategies and Expected Utilities

9 Strategies and Expected Utilities. DI unfolded into a decision tree; apply the average-out and fold-back algorithm. To reduce the size of the decision tree, the last chance node in each path is defined over the Cartesian product of A and C, and the utilities in the leaves are the sums of V1 and V2.
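The average-out and fold-back idea can be sketched directly on a tree: chance nodes average their children's values by probability, decision nodes take the maximum over their options. A minimal sketch with hypothetical numbers (not the DI example from the slides):

```python
# Average-out and fold-back on a tiny decision tree.
# Chance nodes average their children's values by probability;
# decision nodes fold back by taking the max over their options.

def fold_back(node):
    kind = node["kind"]
    if kind == "utility":                      # leaf: utility value
        return node["value"]
    if kind == "chance":                       # average out
        return sum(p * fold_back(child) for p, child in node["branches"])
    if kind == "decision":                     # fold back: pick best option
        return max(fold_back(child) for _, child in node["branches"])
    raise ValueError(kind)

tree = {
    "kind": "decision",
    "branches": [
        ("d1", {"kind": "chance",
                "branches": [(0.4, {"kind": "utility", "value": 10}),
                             (0.6, {"kind": "utility", "value": 0})]}),
        ("d2", {"kind": "utility", "value": 3}),
    ],
}

print(fold_back(tree))   # expected utility of the optimal strategy
```

Here option d1 averages to 0.4·10 + 0.6·0 = 4, which beats d2's certain 3, so the fold-back at the root returns 4.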

10 Strategies and Expected Utilities – D2

11 Strategies and Expected Utilities – D2

12 Strategies and Expected Utilities – D2. The decision tree with D2 replaced by a utility function reflecting that the policy δ2 for D2 is followed.

13 Strategies and Expected Utilities – D1

14 Strategies and Expected Utilities – Combined D1

15 Strategies and Expected Utilities – Combined D2

16 Strategies and Expected Utilities

17 Strategies and Expected Utilities – Proof

18 Strategies and Expected Utilities – Proof

19 Strategies and Expected Utilities – Proof

20 Variable Elimination. Compare the method for solving influence diagrams with the junction tree propagation algorithm. Similarities: both start off with a set of potentials and eliminate one variable at a time. Differences: the elimination order is constrained by the temporal order; there are two types of potentials to deal with; elimination needs to proceed in only one direction. Strong elimination order: sum-marginalize In, then max-marginalize Dn, then sum-marginalize In−1, and so on.

21 Variable Elimination. Analyze the calculations involved in eliminating a variable: Φ is a set of probability potentials and Ψ a set of utility potentials. The quantity of interest is the product of all probability potentials multiplied by the sum of all utility potentials:
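In the standard form (a hedged restatement; Φ_V and Ψ_V denote the potentials whose domains contain the variable V being eliminated), eliminating a chance variable V produces a new probability potential and a new utility potential:

```latex
\phi_V = \sum_V \prod \Phi_V,
\qquad
\psi_V = \frac{\sum_V \left(\prod \Phi_V\right)\left(\sum \Psi_V\right)}{\phi_V}
```

For a decision variable D, the sum over V is replaced by a max over D, which is the max-marginalization of the next slides.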

22 Variable Elimination – Sum-Marginalization

23 Variable Elimination – Max-Marginalization
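Sum- and max-marginalization differ only in the operator applied while a variable is removed. A minimal sketch over 0/1-valued variables (illustrative tables, not the slides' DI); a potential is a pair (vars, table) where the table maps assignment tuples to numbers:

```python
from itertools import product

def multiply(pots):
    """Pointwise product of potentials over the union of their variables."""
    vars_ = tuple(sorted({v for vs, _ in pots for v in vs}))
    table = {}
    for asg in product([0, 1], repeat=len(vars_)):
        env = dict(zip(vars_, asg))
        val = 1.0
        for vs, t in pots:
            val *= t[tuple(env[v] for v in vs)]
        table[asg] = val
    return vars_, table

def marginalize(pot, var, op):
    """Remove `var` with op = sum (chance variable) or max (decision)."""
    vs, t = pot
    i = vs.index(var)
    out_vs = vs[:i] + vs[i + 1:]
    out = {}
    for asg, val in t.items():
        key = asg[:i] + asg[i + 1:]
        out[key] = val if key not in out else op([out[key], val])
    return out_vs, out

# P(A) and P(B|A) as binary tables (hypothetical numbers)
pA = (("A",), {(0,): 0.3, (1,): 0.7})
pBA = (("A", "B"), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8})

joint = multiply([pA, pBA])
pB = marginalize(joint, "A", sum)   # sum-marginalize the chance variable A
print(pB[1])                        # marginal table over B
```

Passing `max` instead of `sum` to `marginalize` gives the max-marginalization used for decision variables.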

24 Variable Elimination

25 Strong Junction Tree Methods. Rely on a secondary computational structure to calculate the MEU and policies. Similar in idea to the junction tree method, except that here the order of elimination is constrained by the partial order. Two variants: the Hugin method and the lazy propagation method. Creating a strong junction tree: moralize the influence diagram; triangulate the moralized graph; arrange the cliques into a strong junction tree.

26 Running Example. Partial temporal order: I0 = {B}, D1, I1 = {E, F}, D2, I2 = Ø, D3, I3 = {G}, D4, I4 = {A, C, D, H, I, J, K, L}

27 Moralization of Influence Diagram. Remove informational links; add a link between each pair of nodes with a common child; remove utility nodes; remove directional arrows.
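The four steps above can be sketched on a toy influence diagram (hypothetical nodes, not the running example). The graph is a dict mapping each node to its set of parents, with node kinds tagged so informational links (arcs into decisions) and utility nodes can be handled:

```python
from itertools import combinations

def moralize(parents, kinds):
    # 1. drop informational links (arcs into decision nodes)
    pa = {n: set() if kinds[n] == "decision" else set(ps)
          for n, ps in parents.items()}
    edges = set()
    # 2. marry parents: link every pair of nodes with a common child
    #    (this must happen before utility nodes are removed)
    for n, ps in pa.items():
        for u, v in combinations(sorted(ps), 2):
            edges.add(frozenset((u, v)))
    # keep the remaining arcs, now as undirected edges
    for n, ps in pa.items():
        for p in ps:
            edges.add(frozenset((p, n)))
    # 3. remove utility nodes and every edge touching them
    util = {n for n, k in kinds.items() if k == "utility"}
    nodes = set(parents) - util
    edges = {e for e in edges if not (e & util)}
    # 4. arrows are already gone: edges are undirected frozensets
    return nodes, edges

parents = {"A": set(), "D": {"A"}, "B": {"A", "D"}, "U": {"B", "D"}}
kinds = {"A": "chance", "D": "decision", "B": "chance", "U": "utility"}
nodes, edges = moralize(parents, kinds)
print(sorted(nodes), sorted(tuple(sorted(e)) for e in edges))
```

Note that the marriage step links A and D (parents of B) even though the informational arc A → D was removed, and that the parents of the utility node U stay linked after U itself is dropped.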

28 Moralization of Influence Diagram (figure: graph before → after)

29 Strong Triangulation of Moral Graph. Triangulate by eliminating nodes from the moral graph according to the reverse of the partial order imposed by the influence diagram. Nodes within an Ik have no imposed order among themselves and can be eliminated in any order (e.g., using the min fill-in heuristic).
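Triangulation by elimination can be sketched as follows: eliminating a node connects its remaining neighbours (fill-in edges), and the node together with its neighbours at elimination time forms a clique candidate. The graph and order here are hypothetical, not the slides' example:

```python
def triangulate(adj, order):
    """Eliminate nodes in `order` from undirected graph `adj` (dict of
    adjacency sets); return the fill-in edges and the clique candidates."""
    adj = {n: set(ns) for n, ns in adj.items()}   # work on a copy
    fill_ins, cliques = [], []
    for v in order:
        nbrs = sorted(adj[v])
        cliques.append(frozenset([v, *nbrs]))
        for i, u in enumerate(nbrs):              # marry remaining neighbours
            for w in nbrs[i + 1:]:
                if w not in adj[u]:
                    adj[u].add(w); adj[w].add(u)
                    fill_ins.append((u, w))
        for u in nbrs:                            # remove v from the graph
            adj[u].discard(v)
        del adj[v]
    return fill_ins, cliques

# 4-cycle A-B-C-D: eliminating A forces the fill-in edge B-D
adj = {"A": {"B", "D"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"A", "C"}}
fills, cliques = triangulate(adj, ["A", "B", "C", "D"])
print(fills, cliques)
```

The maximal cliques of the triangulated graph are the subset of `cliques` not contained in another entry; here {A, B, D} and {B, C, D}.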

30 Strong Triangulation of Moral Graph. Partial order: I0 = {B}, D1, I1 = {E, F}, D2, I2 = Ø, D3, I3 = {G}, D4, I4 = {A, C, D, H, I, J, K, L}. Elimination order: A, L, I, J, K, H, C, D, D4, G, D3, D2, E, F, D1, B

31 Strong Junction Tree Construction. Organize the cliques of the triangulated graph into a strong junction tree: for each pair of cliques (C1, C2), C1 ∩ C2 is contained in every clique on the path connecting C1 and C2; for each pair of cliques (C1, C2) with C1 closer to the root R than C2, there exists an ordering of the variables in C2 that respects the partial order, with the nodes in C1 ∩ C2 preceding the variables in C2 \ C1. This ensures that the maximum expected utility can be computed via local message passing in the junction tree.

32 Strong Junction Tree Construction. Algorithm for generation of a strong junction tree: number the nodes in the triangulated graph according to the reverse of the elimination order chosen during triangulation. Let C be a clique in the triangulated graph and let v be the highest-numbered node in C that has a neighbor u not in C with the number of u less than the number of v. If such a node v exists, set the index of C to the number of v; otherwise set the index of C to 1. Order the cliques in increasing order of their index; this order has the running intersection property. To construct the strong junction tree, start with C1 (the root) and successively attach each clique Ck to some clique Cj that contains the separator Sk = Ck ∩ (C1 ∪ … ∪ Ck−1).
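The numbering rule and the attachment step can be sketched as below. The triangulated graph, cliques, and node numbers are hypothetical (the triangulated 4-cycle from before, with nodes numbered by reverse elimination order):

```python
def clique_index(clique, adj, number):
    """Index of a clique: the number of the highest-numbered node v in the
    clique having a neighbour u outside it with number[u] < number[v];
    1 if no such node exists."""
    best = None
    for v in clique:
        if any(u not in clique and number[u] < number[v] for u in adj[v]):
            if best is None or number[v] > number[best]:
                best = v
    return number[best] if best is not None else 1

def strong_tree(cliques, adj, number):
    """Order cliques by index, then attach each C_k to an earlier clique
    containing its separator S_k = C_k ∩ (C_1 ∪ … ∪ C_{k-1})."""
    ordered = sorted(cliques, key=lambda c: clique_index(c, adj, number))
    tree, seen = [], set(ordered[0])
    for k in range(1, len(ordered)):
        sep = ordered[k] & seen
        parent = next(j for j in range(k) if sep <= ordered[j])
        tree.append((k, parent, sep))            # edge C_parent — C_k over sep
        seen |= ordered[k]
    return ordered, tree

adj = {"A": {"B", "D"}, "B": {"A", "C", "D"},
       "C": {"B", "D"}, "D": {"A", "B", "C"}}
number = {"D": 1, "C": 2, "B": 3, "A": 4}        # reverse elimination order
ordered, tree = strong_tree([frozenset("ABD"), frozenset("BCD")], adj, number)
print(ordered, tree)
```

The clique with index 1 becomes the root; each later clique hangs off an earlier one through its separator, which is what the running intersection property guarantees is possible.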

33 Strong Junction Tree Construction. Partial order: I0 = {B}, D1, I1 = {E, F}, D2, I2 = Ø, D3, I3 = {G}, D4, I4 = {A, C, D, H, I, J, K, L}. Cliques: {B, D1, E, F, D}, {B, C, A}, {B, E, D, C}, {E, D2, G}, {D2, G, D4, I}, {D4, I, L}, {D3, H, K}, {H, K, J}, {F, D3, H}

34 Hugin Architecture. Each clique C and separator S in the junction tree contains a probability potential ϕ and a utility potential Ψ. Initialize the junction tree: assign each potential ϕ to one and only one clique C where dom(ϕ) ⊆ dom(C); combine the potentials in each clique; assign a unity potential to cliques with no probability potential assigned; assign a null utility potential to cliques with no utility potential assigned.
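The initialization step can be sketched structurally: each input potential lands in exactly one covering clique, and cliques left empty get unity/null defaults. Cliques and potential names here are hypothetical placeholders, not the running example:

```python
def initialize(cliques, prob_pots, util_pots):
    """Assign each (domain, name) potential to the first clique covering
    its domain; fill unassigned slots with unity / null defaults."""
    assigned = {i: {"phi": [], "psi": []} for i in range(len(cliques))}
    for dom, name in prob_pots:
        i = next(i for i, c in enumerate(cliques) if dom <= c)
        assigned[i]["phi"].append(name)
    for dom, name in util_pots:
        i = next(i for i, c in enumerate(cliques) if dom <= c)
        assigned[i]["psi"].append(name)
    for slot in assigned.values():
        slot["phi"] = slot["phi"] or ["unity"]   # unity probability potential
        slot["psi"] = slot["psi"] or ["null"]    # null (zero) utility potential
    return assigned

cliques = [frozenset("AB"), frozenset("BC")]
res = initialize(
    cliques,
    [({"A"}, "P(A)"), ({"A", "B"}, "P(B|A)"), ({"B", "C"}, "P(C|B)")],
    [({"C"}, "U(C)")],
)
print(res)
```

In a full implementation the names would be numeric tables and "combine" would multiply the ϕ's and add the Ψ's within each clique; this sketch only shows the assignment discipline.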

35 Hugin Architecture. Uses message passing in the strong junction tree. Messages are passed from the leaf nodes towards the root node via adjacent nodes; a clique node can pass a message once it has received messages from all adjacent nodes farther from the root. Messages are stored in the separator S connecting two adjacent nodes. The message consists of two potentials, a probability potential ϕS and a utility potential ΨS, that are calculated as: note that ∑ is a general marginalization operator, a summation for chance nodes and a max for decision nodes. Nodes are marginalized according to the reverse of the partial order. The message from Cj is absorbed by Ci by:
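A single message and its absorption can be sketched numerically: the clique's probability potential and its utility-weighted potential are marginalized down to the separator, dividing to recover Ψ_S; the receiving clique multiplies ϕ and adds Ψ. Binary variables and hypothetical numbers; the variables dropped here are chance variables (summed out), whereas a decision variable would be max-marginalized instead:

```python
def message(clique_vars, phi, psi, sep_vars):
    """Marginalize (phi, psi) from clique_vars down to sep_vars by summing
    out the dropped chance variables. phi and psi are dicts over full
    assignment tuples of clique_vars."""
    keep = [clique_vars.index(v) for v in sep_vars]
    phi_s, num = {}, {}                 # num accumulates phi * psi
    for asg, p in phi.items():
        key = tuple(asg[i] for i in keep)
        phi_s[key] = phi_s.get(key, 0.0) + p
        num[key] = num.get(key, 0.0) + p * psi[asg]
    psi_s = {k: (num[k] / phi_s[k]) if phi_s[k] else 0.0 for k in phi_s}
    return phi_s, psi_s

def absorb(phi_c, psi_c, phi_s, psi_s, clique_vars, sep_vars):
    """Receiving clique: multiply in phi_S and add psi_S pointwise."""
    keep = [clique_vars.index(v) for v in sep_vars]
    new_phi, new_psi = {}, {}
    for asg in phi_c:
        key = tuple(asg[i] for i in keep)
        new_phi[asg] = phi_c[asg] * phi_s[key]
        new_psi[asg] = psi_c[asg] + psi_s[key]
    return new_phi, new_psi

phi = {(0, 0): 0.2, (0, 1): 0.1, (1, 0): 0.3, (1, 1): 0.4}
psi = {(0, 0): 10.0, (0, 1): 0.0, (1, 0): 5.0, (1, 1): 2.0}
phi_s, psi_s = message(("A", "B"), phi, psi, ("B",))
print(phi_s, psi_s)
```

The division by ϕ_S when forming Ψ_S is what keeps utilities on the expected-utility scale after the probability mass has been summed out.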

36 Hugin Architecture. The optimal policy for a decision variable can be determined from the potentials of the clique or separator closest to the root that contains the decision variable (it may be the root itself). The MEU is calculated from the potentials in the root node after message passing has completed.
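Once the clique holding a decision has been contracted to an expected-utility table over (past, decision), the policy is just an argmax per configuration of the past. A minimal sketch with hypothetical numbers:

```python
def policy(eu):
    """Given EU[(past, d)] = expected utility, pick the best d per past."""
    best = {}
    for (past, d), u in eu.items():
        if past not in best or u > best[past][1]:
            best[past] = (d, u)
    return {past: d for past, (d, u) in best.items()}

eu = {(0, "d1"): 4.0, (0, "d2"): 3.0, (1, "d1"): 1.0, (1, "d2"): 2.5}
print(policy(eu))
```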

37 Hugin Architecture. Junction tree with cliques C1–C9. Elimination order: A, L, I, J, K, H, C, D, D4, G, D3, D2, E, F, D1, B

38 Hugin Architecture. Potentials assigned to the cliques C1–C9: {U4(L)}, {P(L|D4,I)}; {P(I|D2,G)}; {P(G|E)}; {U1(D1)}, {P(F|D), P(D|B,D1), P(B)}; {P(E|C,D)}; {P(C|A,B), P(A)}; {U3(J,K)}, {P(J|H)}; {P(K|D3,H)}; {U2(D3)}, {P(H|F)}. Elimination order: A, L, I, J, K, H, C, D, D4, G, D3, D2, E, F, D1, B

39 Hugin Architecture. Combined clique potentials: Ψ(L), ϕ(L,D4,I); ϕ(I,D2,G); ϕ(G,E); Ψ(D1), ϕ(B,D,F,D1); ϕ(E,C,D); ϕ(C,A,B); Ψ(J,K), ϕ(J,H); ϕ(K,D3,H); Ψ(D3), ϕ(H,F). Elimination order: A, L, I, J, K, H, C, D, D4, G, D3, D2, E, F, D1, B

40–45 Hugin Architecture. (Figures: successive steps over the junction tree with cliques C1–C9.) Elimination order: A, L, I, J, K, H, C, D, D4, G, D3, D2, E, F, D1, B

46 Hugin Architecture. Elimination order: A, L, I, J, K, H, C, D, D4, G, D3, D2, E, F, D1, B. Calculate policy: D1 uses C1; D2 uses C2; D3 uses C3; D4 uses C5. MEU: use C1.

47 Conclusion. We reviewed two methods for solving influence diagrams: variable elimination and the strong junction tree method (Hugin). Other methods were not discussed: lazy propagation, and node removal with arc reversal.

48 Questions?

