1 Tutorial #9 by Ma’ayan Fishelson

2 Bucket Elimination Algorithm An algorithm for performing inference in a Bayesian network. Similar algorithms can be used in other domains, such as constraint satisfaction.

3 Bayesian Network X = {X_1,…,X_n} is a set of random variables. A BN is a pair (G,P): – G is a directed acyclic graph whose nodes represent the random variables X. – P = {P_i | 1 ≤ i ≤ n}, where P_i = P(X_i | pa(X_i)) is the conditional probability table associated with node X_i, defined on X_i and its parents pa(X_i). The BN represents a probability distribution over X: P(X_1,…,X_n) = Π_i P(X_i | pa(X_i)).

4 Bayesian Network - Example [Figure: a Bayesian network over the nodes A, B, C, D, E, F with CPTs P(A), P(B|A), P(C|A), P(D|A,B), P(E|B,C), P(F|E), shown next to its moralized graph.] The joint distribution factorizes as P(A,B,C,D,E,F) = P(A) P(B|A) P(C|A) P(D|A,B) P(E|B,C) P(F|E).
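As a concrete illustration of this factorization (not part of the tutorial; binary variables and randomly generated CPTs are assumptions made only so the snippet is self-contained), the joint probability of a full assignment is simply the product of the local CPT entries:

```python
import itertools
import random

random.seed(0)

# The example network: each variable with its parents.
variables = ["A", "B", "C", "D", "E", "F"]
parents = {"A": [], "B": ["A"], "C": ["A"], "D": ["A", "B"],
           "E": ["B", "C"], "F": ["E"]}

def random_cpt(pars):
    """A CPT P(X | pars) over binary variables with random, normalized entries."""
    table = {}
    for pa_vals in itertools.product([0, 1], repeat=len(pars)):
        p1 = random.random()
        table[(0,) + pa_vals] = 1.0 - p1
        table[(1,) + pa_vals] = p1
    return table

cpts = {v: random_cpt(parents[v]) for v in variables}

def joint(assignment):
    """P(A,...,F) as the product of the local terms P(X_i | pa(X_i))."""
    p = 1.0
    for v in variables:
        key = (assignment[v],) + tuple(assignment[pa] for pa in parents[v])
        p *= cpts[v][key]
    return p

# Sanity check: the joint sums to 1 over all 2**6 full assignments.
total = sum(joint(dict(zip(variables, vals)))
            for vals in itertools.product([0, 1], repeat=len(variables)))
print(round(total, 6))  # 1.0
```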

5 Some Probabilistic Inference Tasks… Belief Assessment: given evidence e, find P(X_i = x_i | e). Most Probable Explanation (MPE): find a complete assignment x = (x_1,…,x_n) that maximizes P(x | e) (equivalently, maximizes P(x, e)).

6 Solution – Bucket Elimination Algorithms Given an ordering of the variables X_1,…,X_n: – Distribute P_1,…,P_n into buckets B_1,…,B_n: each P_i is placed in the bucket of the variable that is highest in the ordering among the variables in its scope A_i. – Process the buckets in reverse order, B_n → B_1. (When processing bucket B_i, multiply all the probability tables in B_i and eliminate the bucket's variable X_i by summing over all its possible values.) – Place the resulting function in the bucket of the highest variable (in the ordering) that is in its scope.
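The following is a minimal Python sketch of this procedure for belief assessment. It is not code from the tutorial: the dict-based factor representation and the helper names restrict, eliminate, and belief are assumptions made for illustration. Factors are pairs (scope, table), evidence variables are restricted first, and the query variable is assumed to be first in the ordering.

```python
from itertools import product

# A factor is a pair (scope, table): scope is a tuple of variable names and
# table maps value-tuples (aligned with scope) to non-negative numbers.

def restrict(factor, var, value):
    """Fix var=value and drop var from the factor's scope."""
    scope, table = factor
    i = scope.index(var)
    return (scope[:i] + scope[i + 1:],
            {k[:i] + k[i + 1:]: p for k, p in table.items() if k[i] == value})

def eliminate(factors, var, domains):
    """Multiply a bucket's factors and sum out var (the core bucket operation)."""
    scope = tuple(sorted({v for s, _ in factors for v in s if v != var}))
    table = {}
    for vals in product(*(domains[v] for v in scope)):
        asg = dict(zip(scope, vals))
        total = 0.0
        for x in domains[var]:
            asg[var] = x
            p = 1.0
            for s, t in factors:
                p *= t[tuple(asg[v] for v in s)]
            total += p
        table[vals] = total
    return scope, table

def belief(query, factors, order, domains, evidence):
    """P(query | evidence) by bucket elimination along `order` (query = order[0])."""
    rank = {v: i for i, v in enumerate(order)}
    # 1. Restrict the factors that mention evidence variables; drop constants.
    for var, val in evidence.items():
        factors = [restrict(f, var, val) if var in f[0] else f for f in factors]
    factors = [f for f in factors if f[0]]
    # 2. Distribute: a factor goes to the bucket of its highest-ordered scope variable.
    buckets = {v: [] for v in order}
    for f in factors:
        buckets[max(f[0], key=rank.get)].append(f)
    # 3. Process the buckets in reverse order; place each new function in the
    #    bucket of the highest-ordered variable in its scope.
    for var in reversed(order[1:]):
        if buckets[var]:
            new_scope, new_table = eliminate(buckets[var], var, domains)
            if new_scope:  # constant functions cancel in the final normalization
                buckets[max(new_scope, key=rank.get)].append((new_scope, new_table))
    # 4. Multiply whatever is left in the query's bucket and normalize.
    unnorm = {x: 1.0 for x in domains[query]}
    for s, t in buckets[query]:
        for x in domains[query]:
            unnorm[x] *= t[(x,)]
    z = sum(unnorm.values())
    return {x: p / z for x, p in unnorm.items()}
```

With factors for the example network above and the order (A, C, B, E, D, F), calling belief("A", factors, order, domains, {"F": 1}) would follow the same sequence of λ functions that is derived step by step in the slides below.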

7 Example – Belief Assessment Compute: P(A=a) = Σ_{B,C,D,E,F} P(A=a,B,C,D,E,F) = Σ_{B,C,D,E,F} P(A)P(B|A)P(C|A)P(D|A,B)P(E|B,C)P(F|E). Suppose the order A, C, B, E, D, F, and evidence that F=1. The distribution into buckets is as follows: B_1(A): P(A); B_2(C): P(C|A); B_3(B): P(B|A); B_4(E): P(E|B,C); B_5(D): P(D|A,B); B_6(F): P(F|E).
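A tiny stand-alone sketch (not from the tutorial) of the distribution rule for this ordering, reproducing the bucket table above:

```python
order = ["A", "C", "B", "E", "D", "F"]
rank = {v: i for i, v in enumerate(order)}

# Scope of each CPT in the example network.
scopes = {
    "P(A)":     ["A"],
    "P(B|A)":   ["B", "A"],
    "P(C|A)":   ["C", "A"],
    "P(D|A,B)": ["D", "A", "B"],
    "P(E|B,C)": ["E", "B", "C"],
    "P(F|E)":   ["F", "E"],
}

# Each CPT goes to the bucket of the highest-ordered variable in its scope.
buckets = {v: [] for v in order}
for name, scope in scopes.items():
    buckets[max(scope, key=rank.get)].append(name)

for i, v in enumerate(order, start=1):
    print(f"B{i}({v}): {buckets[v]}")
# B1(A): ['P(A)'], B2(C): ['P(C|A)'], B3(B): ['P(B|A)'],
# B4(E): ['P(E|B,C)'], B5(D): ['P(D|A,B)'], B6(F): ['P(F|E)']
```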

8 Example – Computing P(A=a) To process B_6: assign F=1, obtaining λ_F(E) = P(F=1|E), and place λ_F(E) in bucket B_4(E). Buckets after this step: B_1(A): P(A); B_2(C): P(C|A); B_3(B): P(B|A); B_4(E): P(E|B,C), λ_F(E); B_5(D): P(D|A,B).
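As a tiny illustration of this evidence step (the probability values below are made up; only the mechanics matter), restricting the table P(F|E) to F=1 keeps just the matching rows and drops F from the scope:

```python
# P(F|E) as a table keyed by (f, e); the numbers are illustrative only.
p_f_given_e = {
    (0, 0): 0.9, (1, 0): 0.1,   # column for E = 0
    (0, 1): 0.3, (1, 1): 0.7,   # column for E = 1
}

# Restricting to the evidence F = 1 keeps the matching rows and drops F.
lambda_f = {e: p for (f, e), p in p_f_given_e.items() if f == 1}
print(lambda_f)  # {0: 0.1, 1: 0.7}, i.e. lambda_F(E) = P(F=1 | E)
```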

9 Computing P(A=a) (cont. 1) To process B_5: compute λ_D(A,B) = Σ_D P(D|A,B) and place λ_D(A,B) in bucket B_3(B). Buckets after this step: B_1(A): P(A); B_2(C): P(C|A); B_3(B): P(B|A), λ_D(A,B); B_4(E): P(E|B,C), λ_F(E).

10 Computing P(A=a) (cont. 2) To process B_4: compute λ_E(B,C) = Σ_E P(E|B,C) λ_F(E) and place λ_E(B,C) in bucket B_3(B). Buckets after this step: B_1(A): P(A); B_2(C): P(C|A); B_3(B): P(B|A), λ_D(A,B), λ_E(B,C).
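This step combines the two operations at the heart of bucket elimination: multiply the factors in the bucket and sum out its variable. A small stand-alone sketch with made-up numbers (continuing the illustrative λ_F(E) from the snippet above):

```python
from itertools import product

# P(E|B,C) keyed by (e, b, c); the numbers are illustrative only.
p_e_given_bc = {
    (0, 0, 0): 0.8, (1, 0, 0): 0.2,
    (0, 0, 1): 0.5, (1, 0, 1): 0.5,
    (0, 1, 0): 0.4, (1, 1, 0): 0.6,
    (0, 1, 1): 0.1, (1, 1, 1): 0.9,
}
lambda_f = {0: 0.1, 1: 0.7}  # lambda_F(E) = P(F=1|E) from the previous step

# lambda_E(B,C) = sum over E of P(E|B,C) * lambda_F(E)
lambda_e = {
    (b, c): sum(p_e_given_bc[(e, b, c)] * lambda_f[e] for e in (0, 1))
    for b, c in product((0, 1), repeat=2)
}
print(lambda_e)  # a table indexed by (B, C)
```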

11 Computing P(A=a) (cont. 3) To process B_3: compute λ_B(A,C) = Σ_B P(B|A) λ_D(A,B) λ_E(B,C) and place λ_B(A,C) in bucket B_2(C). Buckets after this step: B_1(A): P(A); B_2(C): P(C|A), λ_B(A,C).

12 Computing P(A=a) (cont. 4) To process B_2: compute λ_C(A) = Σ_C P(C|A) λ_B(A,C) and place λ_C(A) in bucket B_1(A). Buckets after this step: B_1(A): P(A), λ_C(A).

13 Computing P(A=a) (cont. 5) Bucket B_1(A) now contains P(A) and λ_C(A). To compute the belief, we multiply them and obtain a function of A; this product equals P(A, F=1), so normalizing it over the values of A yields the belief P(A=a | F=1).

14 Bucket Elimination Complexity The time and space complexity are bounded by the size of the probability tables created by the algorithm, i.e., they are exponential in the induced width w*(d), where w*(d) is the induced width (max clique size) of the moral graph along the ordering d. [Figure: a moral graph over A, B, C, D, E with two orderings: for d_1 = B, C, D, E, A the induced width is w*(d_1) = 4; for d_2 = E, D, C, B, A it is w*(d_2) = 2.] The choice of ordering therefore strongly affects the cost.
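The induced width can be computed directly from the moral graph. Here is a small sketch (not from the tutorial): nodes are processed from last to first in the ordering, each node's earlier neighbors are connected, and the largest such neighbor set gives w*(d). The edge list below is the moral graph of the example network from slide 4, and the ordering is the one used in the belief-assessment example.

```python
def induced_width(edges, order):
    """Induced width of an undirected graph along an elimination ordering.

    Nodes are processed from last to first; when a node is processed, its
    earlier neighbors are connected to each other (fill-in edges), and the
    induced width is the largest number of earlier neighbors seen at any step.
    """
    rank = {v: i for i, v in enumerate(order)}
    adj = {v: set() for v in order}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    width = 0
    for v in reversed(order):
        earlier = {u for u in adj[v] if rank[u] < rank[v]}
        width = max(width, len(earlier))
        for u in earlier:               # add the fill-in edges
            adj[u] |= earlier - {u}
    return width

# Moral graph of the example network (slide 4), with co-parents connected.
edges = [("A", "B"), ("A", "C"), ("A", "D"), ("B", "C"),
         ("B", "D"), ("B", "E"), ("C", "E"), ("E", "F")]
print(induced_width(edges, ["A", "C", "B", "E", "D", "F"]))  # 2 for this ordering
```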

15 Constraint Satisfaction Problem (CSP) Input: – A set of variables: X = {X_1,…,X_n}. – A set of constraints: R = {R_1,…,R_m}. – Each constraint R_i has a scope A_i ⊆ X, and R_i is a relation over the variables in A_i (the set of allowed joint assignments to A_i). Output: – An assignment to X that is consistent with all the constraints in R.

16 Constraint Graph Nodes: {X_1,…,X_n}. Edges: two variables are connected if they appear together in the scope of some constraint. For simplicity, assume that constraints are defined on maximal cliques of the graph.

17 Example of CSP: Find Possible Genotypes Possible haplotypes: h_1 = A,A_1; h_2 = A,A_2; h_3 = O,A_1; h_4 = O,A_2. Possible genotypes (ordered): g_ij = h_i/h_j (father/mother). Ind #1: g_11, g_13, g_31. Ind #2: g_44. Ind #3: g_12, g_21, g_14, g_41, g_23, g_32. Ind #4: g_22, g_24, g_42. Ind #5: g_34, g_43. [Figure: a pedigree of individuals 1–5 typed at the ABO and AK1 loci, with phenotypes 1: A, A_1/A_1; 2: O, A_2/A_2; 3: A, A_1/A_2; 4: A, A_2/A_2; 5: O, A_1/A_2.] Constraints: R_13 = {…}, and similarly for R_23, R_35, and R_45.

18 Bucket Elimination for CSP Given an order X_1,…,X_n: – Distribute the constraints into buckets, according to the highest variable in each constraint's scope. – Process the buckets one by one, starting from X_n: join the constraints in the bucket (matching tuples on the value of the eliminated variable), project out the eliminated variable, and remember its supporting value(s) for each surviving tuple. Put the new constraint in the bucket of the highest variable in its scope.
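A minimal sketch of this join-and-project step (not code from the tutorial; the relation representation, the helper names natural_join and process_bucket, and the 0/1 placeholder values in the usage example are assumptions for illustration):

```python
def natural_join(c1, c2):
    """Natural join of two relations, each given as (scope, set_of_tuples)."""
    s1, t1 = c1
    s2, t2 = c2
    common = [v for v in s1 if v in s2]
    new_scope = s1 + tuple(v for v in s2 if v not in s1)
    joined = set()
    for a in t1:
        for b in t2:
            asg1, asg2 = dict(zip(s1, a)), dict(zip(s2, b))
            if all(asg1[v] == asg2[v] for v in common):
                merged = {**asg1, **asg2}
                joined.add(tuple(merged[v] for v in new_scope))
    return new_scope, joined

def process_bucket(constraints, var):
    """Join a bucket's constraints and eliminate `var`.

    Returns (new_constraint, remembered), where `remembered` maps each tuple of
    the new constraint to the values of `var` that support it (kept for the
    final backtracking step).
    """
    scope, tuples = constraints[0]
    for c in constraints[1:]:
        scope, tuples = natural_join((scope, tuples), c)
    i = scope.index(var)
    new_scope = scope[:i] + scope[i + 1:]
    remembered = {}
    for t in tuples:
        remembered.setdefault(t[:i] + t[i + 1:], set()).add(t[i])
    return (new_scope, set(remembered)), remembered

# Hypothetical bucket for variable "E" (the 0/1 values are placeholders, not the
# genotype relations from the slides).
r1 = (("C", "E"), {(0, 0), (1, 0), (1, 1)})
r2 = (("D", "E"), {(0, 0), (0, 1)})
new_c, remembered = process_bucket([r1, r2], "E")
print(new_c)       # the induced constraint over (C, D)
print(remembered)  # the supporting values of E for each (C, D) tuple
```

In the genotype example below, the corresponding step turns bucket B_E (containing R_CE and R_DE) into the new constraint R_CD while remembering the supporting value of E.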

19 Example of CSP: Find Possible Genotypes (cont. 1) The variables A, B, C, D, E correspond to individuals 1–5. Constraints: R_AC = {…}; R_BC = {…}; R_CE = {<g_{14,41,23,32}, g_{34,43}>}; R_DE = {…}. [Figure: the constraint graph over A, B, C, D, E.]

20 Find Possible Genotypes (cont. 2) Suppose the order A, B, C, D, E. The initial distribution into buckets is as follows: B_A: empty; B_B: empty; B_C: R_AC, R_BC; B_D: empty; B_E: R_CE, R_DE.

21 Find Possible Genotypes (cont. 3) When processing B_E: compute R_CD (the join of R_CE and R_DE with E eliminated); put R_CD into bucket B_D; remember the value g_34 for E. Buckets after this step: B_C: R_AC, R_BC; B_D: R_CD.

22 Find Possible Genotypes (cont. 4) When processing B_D: compute R_C = {g_{23,32,14,41}} (from R_CD with D eliminated); put R_C into bucket B_C; remember the values g_24, g_42 for D. Buckets after this step: B_C: R_AC, R_BC, R_C.

23 Find Possible Genotypes (cont. 5) When processing B_C: compute R_AB (the join of R_AC, R_BC, and R_C with C eliminated); put R_AB into bucket B_B; remember the value g_14 for C. Buckets after this step: B_B: R_AB.

24 Find Possible Genotypes (cont. 6) When processing B_B: compute R_A (from R_AB with B eliminated); remember the value g_44 for B.

25 Find Possible Genotypes (cont. 7) By backtracking we get the possible values: Ind #1 (A): g_11, g_13, g_31. Ind #2 (B): g_44. Ind #3 (C): g_14. Ind #4 (D): g_24, g_42. Ind #5 (E): g_34. [Figure: the pedigree annotated with the deduced genotypes at the two loci. Ind #1: A|A or A|O or O|A, A_1|A_1; Ind #2: O|O, A_2|A_2; Ind #3: A|O, A_1|A_2; Ind #4: A|O or O|A, A_2|A_2; Ind #5: O|O, A_1|A_2.]