Bayesian Network Reasoning with Gibbs Sampling

Bayesian Network Reasoning with Gibbs Sampling
Todd W. Neller, Gettysburg College

Outline
- Prerequisites
- Objectives
- What is provided
- Example exercise
- Possible uses and extensions

Prerequisites
Audience: undergraduate Intro AI students
Prerequisite knowledge and skills:
- Directed acyclic graphs (DAGs)
- Axioms of probability
- Conditional probabilities
- Conditional probability tables (CPTs)
- Competent programming (implementing from pseudocode)

Objectives
Students will:
- Implement the core of the Gibbs sampling algorithm, a Markov chain Monte Carlo (MCMC) algorithm for Bayesian network (BN) reasoning (see the sketch below)
- Empirically observe important reasoning concepts, e.g. conditional independence; diagnostic, causal, intercausal, and mixed inference; and d-separation
- Experience both strengths and weaknesses of this MCMC approach, gaining an appreciation for its place in our algorithmic toolset
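The heart of the implementation is the per-variable update: resample each non-evidence variable from its distribution conditioned on its Markov blanket (its parents, children, and children's other parents). A minimal sketch of this update is below; the Node interface and method names are hypothetical, illustrating the idea rather than the provided software's actual API.

```java
import java.util.Random;

// Sketch of the core Gibbs update for boolean-valued BN variables.
// The Node interface is hypothetical; it stands in for whatever the
// provided BN data structure actually offers.
public class GibbsCore {
    interface Node {
        boolean isEvidence();
        boolean getValue();
        void setValue(boolean v);
        double probGivenParents(boolean value); // P(node = value | parents' current values)
        Node[] children();
    }

    static final Random RNG = new Random();

    // One Gibbs step: resample x from P(x | Markov blanket).
    static void resample(Node x) {
        if (x.isEvidence()) return;             // evidence variables stay fixed
        double wTrue = weight(x, true);
        double wFalse = weight(x, false);
        x.setValue(RNG.nextDouble() < wTrue / (wTrue + wFalse));
    }

    // Unnormalized P(x = value | Markov blanket):
    //   P(x = value | parents(x)) * product over children y of
    //   P(y = y's current value | parents(y), with x temporarily set to value)
    static double weight(Node x, boolean value) {
        boolean saved = x.getValue();
        x.setValue(value);                      // temporarily fix x
        double w = x.probGivenParents(value);
        for (Node y : x.children())
            w *= y.probGivenParents(y.getValue());
        x.setValue(saved);                      // restore x
        return w;
    }
}
```

Counting how often each variable is true across many such updates (with evidence clamped) yields the "Fraction True" estimates reported in the exercise output below.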

What is Provided
- A brief introduction to BN reasoning
- Recommendations for supplementary readings
- Java software that parses a simple grammar describing BN structure, CPTs, and evidence, and builds a helpful BN data structure
- Exercises guiding the student to an empirical understanding of core BN reasoning concepts

Example Exercise – Intercausal Inference (Explaining Away)
Network specification:
  P(a) = {.20}
  P(b|a) = {.20, .80}
  P(c|a) = {.05, .20}
  P(d|b,c) = {.05, .80, .80, .80}
  P(e|c) = {.60, .80}
  evidence d T
BNGibbsSampler output after iteration 1000000 (Variable, Average Conditional Probability, Fraction True):
  a, 0.4248869857096668, 0.424857
  b, 0.8001614923088834, 0.800057
  c, 0.20001273834624336, 0.199574
  e, 0.6399147999906855, 0.640206
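For a network this small, the sampler's estimates can be checked exactly. Below is a minimal brute-force enumeration check in Java (a stand-alone sketch, not part of the provided software); it assumes the CPT entries are ordered with false parent values first, e.g. P(b|a) = {.20, .80} is read as P(b|~a) = .20 and P(b|a) = .80, which matches the sampler output above.

```java
// Hypothetical stand-alone check, not the assignment's code: exact inference
// by enumeration for the five-variable network above, with evidence d = true.
public class EnumerationCheck {
    public static void main(String[] args) {
        double pA = .20;                        // P(a)
        double[] pb = {.20, .80};               // P(b | a=F), P(b | a=T)
        double[] pc = {.05, .20};               // P(c | a=F), P(c | a=T)
        double[] pd = {.05, .80, .80, .80};     // P(d | b,c), indexed (b<<1)|c
        double[] pe = {.60, .80};               // P(e | c=F), P(e | c=T)

        double z = 0, sumA = 0, sumB = 0, sumC = 0, sumE = 0;
        for (int bits = 0; bits < 16; bits++) { // enumerate a,b,c,e; d fixed true
            boolean a = (bits & 8) != 0, b = (bits & 4) != 0,
                    c = (bits & 2) != 0, e = (bits & 1) != 0;
            double w = (a ? pA : 1 - pA)
                     * (b ? pb[a ? 1 : 0] : 1 - pb[a ? 1 : 0])
                     * (c ? pc[a ? 1 : 0] : 1 - pc[a ? 1 : 0])
                     * pd[(b ? 2 : 0) + (c ? 1 : 0)]          // evidence d = true
                     * (e ? pe[c ? 1 : 0] : 1 - pe[c ? 1 : 0]);
            z += w;                             // running normalizer P(d)
            if (a) sumA += w;
            if (b) sumB += w;
            if (c) sumC += w;
            if (e) sumE += w;
        }
        System.out.printf("P(a|d)=%.4f P(b|d)=%.4f P(c|d)=%.4f P(e|d)=%.4f%n",
                sumA / z, sumB / z, sumC / z, sumE / z);
        // Prints 0.4250, 0.8000, 0.2000, 0.6400, matching the sampler's estimates.
    }
}
```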

Example Exercise – Intercausal Inference (Explaining Away), with +c observed
Network specification:
  P(a) = {.20}
  P(b|a) = {.20, .80}
  P(c|a) = {.05, .20}
  P(d|b,c) = {.05, .80, .80, .80}
  P(e|c) = {.60, .80}
  evidence d c T T
BNGibbsSampler output after iteration 1000000 (Variable, Average Conditional Probability, Fraction True):
  a, 0.4992908000000346, 0.49861
  b, 0.4991660000000317, 0.498817
  e, 0.8000000000106631, 0.799731
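These estimates are close to the exact value of .5 for both a and b, which can be verified by hand. Once c is observed, d carries no further information about b, because P(d|b,c) = .80 whether or not b holds; so P(b|d,c) = P(b|c). By Bayes' rule, P(a|c) = (.20)(.20) / ((.20)(.20) + (.05)(.80)) = .04 / .08 = .5, and therefore P(b|c) = (.80)(.5) + (.20)(.5) = .5.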

Example Exercise – Intercausal Inference (Explaining Away), with +b observed
Network specification:
  P(a) = {.20}
  P(b|a) = {.20, .80}
  P(c|a) = {.05, .20}
  P(d|b,c) = {.05, .80, .80, .80}
  P(e|c) = {.60, .80}
  evidence d b T T
BNGibbsSampler output after iteration 1000000 (Variable, Average Conditional Probability, Fraction True):
  a, 0.49982754285114167, 0.5
  c, 0.1249383398139281, 0.124496
  e, 0.6248991999897608, 0.62497
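By the same reasoning, once b is observed d tells us nothing more about c, so P(c|d,b) = P(c|b): P(a|b) = (.80)(.20) / ((.80)(.20) + (.20)(.80)) = .5, giving P(c|b) = (.20)(.5) + (.05)(.5) = .125, as estimated above.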

Example Exercise – Intercausal Inference (Explaining Away): Summary
Given evidence of d (P(d) = 1):
Evidence    +d     +d, +c          +d, +b
P(b)        .8     .5              1 (evidence)
P(c)        .2     1 (evidence)    .125
Observing an alternative cause of d lowers belief in the other cause: +c drops P(b) from .8 to .5, and +b drops P(c) from .2 to .125; each cause "explains away" the evidence d.

Possible Uses and Extensions
- The complete exercises emphasize the core Gibbs sampling implementation and experiential learning of BN reasoning concepts
- Solutions are available for instructors
- Instructors could instead provide the Gibbs sampling implementation and focus simply on empirical observation of BN reasoning concepts
- The ease of BN specification makes it easy to assign additional BN exercises with various foci:
  - Representation focus
  - Qualitative/quantitative estimation focus
  - Estimation accuracy focus

Conclusion
This assignment provides:
- Easy parsing of a simple BN specification
- Automatic construction of a BN data structure, with helper methods that let students focus on the core implementation of Gibbs sampling
- Basic exercises that efficiently and experientially teach BN reasoning
- The possibility of simplification, extension, and various foci
http://modelai.gettysburg.edu/2017/mc2