Presentation transcript:

A Hybrid Algorithm to Compute Marginal and Joint Beliefs in Bayesian Networks and its Complexity
Mark Bloemeke, Artificial Intelligence Laboratory, University of South Carolina
Marco Valtorta, Artificial Intelligence Laboratory, University of South Carolina
Presentation by Sreeja Vallabhan
Instructor: Marco Valtorta
December 4, 2003

Abstract
There are two families of methods (algorithms) for updating probabilities in a Bayesian network:
- Build a secondary structure (a clique tree) and perform local, message-based calculations on it to extract the belief in each single variable.
- Use non-serial dynamic programming techniques to extract the belief in some desired group of variables.

Goal
Present a hybrid algorithm that is based on non-serial dynamic programming techniques and retains the ability to retrieve the belief in all single variables.

Symbolic Probabilistic Inference (SPI)
Consider a Bayesian network with DAG G = (V, E) and conditional probability tables P(v_i \mid \pi(v_i)), where \pi(v_i) denotes the parents of v_i in G. The total joint probability follows from the chain rule of Bayesian networks:

P(v_1, \ldots, v_n) = \prod_{i=1}^{n} P(v_i \mid \pi(v_i))    (1)

Marginalization then retrieves the belief in any subset of variables V' \subseteq V:

P(V') = \sum_{V \setminus V'} P(v_1, \ldots, v_n)    (2)

SPI is based on these two equations.
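A minimal sketch of equations (1) and (2) in Python, on a hypothetical two-variable network A → B with made-up numbers (the network and tables are assumptions for illustration, not the paper's example):

```python
# Hypothetical network A -> B, both binary; tables are illustrative only.
p_a = {0: 0.6, 1: 0.4}                    # P(A)
p_b_given_a = {(0, 0): 0.9, (0, 1): 0.1,  # P(B | A), keyed by (a, b)
               (1, 0): 0.3, (1, 1): 0.7}

def joint(a, b):
    """Chain rule, equation (1): P(A=a, B=b) = P(a) * P(b | a)."""
    return p_a[a] * p_b_given_a[(a, b)]

def marginal_b(b):
    """Marginalization, equation (2): P(B=b) = sum_a P(A=a, B=b)."""
    return sum(joint(a, b) for a in (0, 1))

print(marginal_b(1))  # 0.6*0.1 + 0.4*0.7 = 0.34
```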

Symbolic Probabilistic Inference (SPI)
To maintain control over the size and time complexity of the resulting tables, SPI:
- orders the variables before performing the calculations, and
- pushes summations down into the products (illustrated below).
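As a small illustration of the second point, on an assumed three-variable chain A → B → C (not the slide's network), distributing the sum over b into the product avoids multiplying by factors that do not mention b:

\sum_{b} P(a)\, P(b \mid a)\, P(c \mid b) \;=\; P(a) \sum_{b} P(b \mid a)\, P(c \mid b)

On the left-hand side, P(a) takes part in a multiplication for every value of b; on the right, it is multiplied in only once per (a, c) combination.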

Symbolic Probabilistic Inference (SPI)
Consider the following Bayesian network. [Figure: the example network is not reproduced in the transcript.] From equations (1) and (2), the joint probability of the variables A and C is obtained by multiplying all the conditional tables together and summing out every other variable. Assuming that each variable has two states, computing P(A, C) this way requires a total of 92 significant operations.

Symbolic Probabilistic Inference (SPI)
With a single reordering of the terms combined by equation (1), followed by distribution of the summations from equation (2) into the product, the same P(A, C) requires only 32 significant operations.
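The following sketch makes the operation counting concrete on a toy binary chain A → B → C (an assumed network, so the counts come out as 20 versus 16 rather than the 92 versus 32 quoted for the slide's larger example):

```python
from itertools import product

# Illustrative tables for the assumed chain A -> B -> C, all binary.
p_a = {0: 0.6, 1: 0.4}
p_b_a = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}  # P(B | A)
p_c_b = {(0, 0): 0.7, (0, 1): 0.3, (1, 0): 0.4, (1, 1): 0.6}  # P(C | B)

def naive_pac():
    """Equations (1)+(2) applied literally: build full terms, then sum."""
    mults = adds = 0
    pac = {}
    for a, c in product((0, 1), repeat=2):
        total = 0.0
        for b in (0, 1):
            total += p_a[a] * p_b_a[(a, b)] * p_c_b[(b, c)]  # 2 mults/term
            mults += 2
        adds += 1  # two terms -> one addition per (a, c) cell
        pac[(a, c)] = total
    return pac, mults + adds

def factored_pac():
    """Sum over b pushed inside: P(a,c) = P(a) * sum_b P(b|a) P(c|b)."""
    mults = adds = 0
    pac = {}
    for a, c in product((0, 1), repeat=2):
        inner = 0.0
        for b in (0, 1):
            inner += p_b_a[(a, b)] * p_c_b[(b, c)]  # 1 mult per term
            mults += 1
        adds += 1
        pac[(a, c)] = p_a[a] * inner  # one more mult per cell
        mults += 1
    return pac, mults + adds

pac1, ops1 = naive_pac()
pac2, ops2 = factored_pac()
assert all(abs(pac1[k] - pac2[k]) < 1e-12 for k in pac1)
print(ops1, ops2)  # 20 vs. 16 significant operations on this toy chain
```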

Factor Trees
A two-stage method for deriving the desired joint and single beliefs:
1. Create the factor tree.
2. Run a message-passing algorithm on the factor tree to retrieve the desired joint and single beliefs.

Factor Trees Algorithm
- Start by calculating the optimal factoring order for the network, given the target set of variables whose joint is desired.
- Construct a binary tree showing the combination of the initial probability tables into conformal tables.
- Label each edge along which variables are marginalized with the variables that are summed out before the combination.
- Above the current root, add an additional head: a conformal table labeled with the target set of variables, reached by an edge whose label is empty (no variables are marginalized along it).
A code sketch of this construction follows.
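A structural sketch of the construction, under stated assumptions: the pairwise combination order is taken as input rather than computed optimally, variables shared by the two tables being combined are summed out during the combination itself rather than on an edge, and the class and function names are hypothetical. This is an illustration, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FactorNode:
    vars: frozenset                        # variables the table ranges over
    left: Optional["FactorNode"] = None
    right: Optional["FactorNode"] = None
    left_label: frozenset = frozenset()    # summed out on the left edge
    right_label: frozenset = frozenset()   # summed out on the right edge

def build_factor_tree(cpt_scopes, order, targets):
    """cpt_scopes: variable sets of the initial probability tables.
    order: pairs of indices into the *current* node list, giving which
    two tables to combine next (a precomputed factoring order).
    targets: the set of variables whose joint belief is desired."""
    nodes = [FactorNode(vars=frozenset(s)) for s in cpt_scopes]
    for i, j in order:
        a, b = nodes[i], nodes[j]
        # Variables still needed: those in other tables or in the targets.
        rest = frozenset().union(
            *(n.vars for k, n in enumerate(nodes) if k not in (i, j)))
        keep = rest | frozenset(targets)
        parent = FactorNode(
            vars=(a.vars | b.vars) & keep,      # the conformal table's scope
            left=a, right=b,
            left_label=a.vars - b.vars - keep,  # marginalized before combining
            right_label=b.vars - a.vars - keep)
        nodes = [n for k, n in enumerate(nodes) if k not in (i, j)]
        nodes.append(parent)
    # The extra head: a conformal table over the targets, empty edge label.
    return FactorNode(vars=frozenset(targets), left=nodes[0])

# Toy use: chain A -> B -> C with CPT scopes {A}, {A,B}, {B,C}, targets {A, C}.
tree = build_factor_tree([{"A"}, {"A", "B"}, {"B", "C"}],
                         order=[(0, 1), (0, 1)], targets={"A", "C"})
print(sorted(tree.vars))  # ['A', 'C']
```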

Factor Trees Algorithm
[Figure: worked example of the factor-tree construction; not reproduced in the transcript.]