Lecture on Bayesian Belief Networks (Basics)
Patrycja Gradowska
Open Risk Assessment Workshop, Kuopio, 2009

Lecture contents
- Brief introduction to BBN and an example
- Definition of BBN
- Purpose of BBN
- Construction of BBN
- Concluding remarks

A simple example Problem: Two colleagues, Norman and Martin, live in the same city and work together, but they come to work by completely different means: Norman usually comes by train, while Martin always drives. The railway workers sometimes go on strike. Goal: We want to predict whether Norman and Martin will be late for work.

A simple example Two causal relations:
- A strike of the railway can cause Norman to be late.
- A strike of the railway can cause Martin to be late.
IMPORTANT: These relations are NOT absolute. A strike of the railway does NOT guarantee that Norman and Martin will be late; it ONLY increases the probability (chance) of lateness.

What is a Bayesian Belief Network (BBN)? A Bayesian Belief Network (BBN) is a directed acyclic graph associated with a set of conditional probability distributions. A BBN is a set of nodes connected by directed edges, in which:
- nodes represent discrete or continuous random variables in the problem studied,
- directed edges represent direct or causal relationships between variables and do not form cycles,
- each node is associated with a conditional probability distribution that quantitatively expresses the strength of the relationship between that node and its parents.

BBN for late-for-work example The network has 3 discrete (Yes/No) variables: Train Strike (TS), Norman late (NL) and Martin late (ML), with directed edges TS → NL and TS → ML. The probability tables are:

P(TS):
  TS   Probability
  Y    0.1
  N    0.9

P(NL | TS):
            TS = Y   TS = N
  NL = Y    0.8      0.1
  NL = N    0.2      0.9

P(ML | TS):
            TS = Y   TS = N
  ML = Y    0.6      0.5
  ML = N    0.4      0.5
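
To make the tables concrete, here is a minimal Python sketch of the same network (the names P_TS, P_NL_given_TS and P_ML_given_TS are ours, not from the slides); the calculations on the following slides can all be reproduced from these three tables.

    # Prior and conditional probability tables of the late-for-work BBN.
    # Outcomes are 'Y' (yes) and 'N' (no).
    P_TS = {'Y': 0.1, 'N': 0.9}                  # P(TS)
    P_NL_given_TS = {'Y': {'Y': 0.8, 'N': 0.2},  # P(NL | TS = Y)
                     'N': {'Y': 0.1, 'N': 0.9}}  # P(NL | TS = N)
    P_ML_given_TS = {'Y': {'Y': 0.6, 'N': 0.4},  # P(ML | TS = Y)
                     'N': {'Y': 0.5, 'N': 0.5}}  # P(ML | TS = N)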

What do we need BBNs for? A Bayesian Belief Network:
- gives an intuitive representation of the problem that is easily understood by non-experts,
- incorporates the uncertainty related to the problem,
- allows us to calculate the probabilities of different scenarios (events) relevant to the problem and to predict the consequences of these scenarios.

BBN for late-for-work example The joint distribution of the network factorizes according to the graph:

P(ML, NL, TS) = P(ML | TS) ∙ P(NL | TS) ∙ P(TS)

For example, the joint probability P(ML = Y, NL = Y, TS = N) is

P(ML = Y, NL = Y, TS = N) = P(ML = Y | TS = N) ∙ P(NL = Y | TS = N) ∙ P(TS = N) = 0.5 ∙ 0.1 ∙ 0.9 = 0.045

Computing P(ML = m, NL = n, TS = t) for all eight combinations in the same way gives the full joint table:

  ML   NL   TS   Probability
  Y    Y    Y    0.048
  Y    N    Y    0.012
  N    Y    Y    0.032
  N    N    Y    0.008
  Y    Y    N    0.045
  Y    N    N    0.405
  N    Y    N    0.045
  N    N    N    0.405
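
Continuing the sketch above, the full joint table can be built with one loop over all eight outcome combinations:

    from itertools import product

    # Chain rule of the network: P(ML, NL, TS) = P(ML|TS) * P(NL|TS) * P(TS)
    joint = {}
    for ml, nl, ts in product('YN', repeat=3):
        joint[(ml, nl, ts)] = P_ML_given_TS[ts][ml] * P_NL_given_TS[ts][nl] * P_TS[ts]

    print(joint[('Y', 'Y', 'N')])  # 0.045, as on the slide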

BBN for late-for-work example The marginal probability P(ML = Y, NL = Y) is obtained from the joint distribution by summing over all values of TS (marginalization):

P(ML = Y, NL = Y) = Σ_t P(ML = Y, NL = Y, TS = t) = P(ML = Y, NL = Y, TS = Y) + P(ML = Y, NL = Y, TS = N) = 0.048 + 0.045 = 0.093

Doing this for every combination (m, n) gives the marginal distribution P(ML, NL):

  ML   NL   Probability
  Y    Y    0.093
  N    Y    0.077
  Y    N    0.417
  N    N    0.413
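
In the sketch, marginalizing TS out of the joint table is a short accumulation loop:

    # P(ML, NL) = sum over t of P(ML, NL, TS = t)
    P_ML_NL = {}
    for (ml, nl, ts), p in joint.items():
        P_ML_NL[(ml, nl)] = P_ML_NL.get((ml, nl), 0.0) + p

    print(P_ML_NL[('Y', 'Y')])  # ~0.093 (up to floating-point rounding)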

BBN for late-for-work example The marginal probability P(NL = Y) can be computed directly from the joint distribution:

P(NL = Y) = Σ_{m,t} P(ML = m, NL = Y, TS = t) = P(ML = Y, NL = Y, TS = Y) + P(ML = N, NL = Y, TS = Y) + P(ML = Y, NL = Y, TS = N) + P(ML = N, NL = Y, TS = N) = 0.048 + 0.032 + 0.045 + 0.045 = 0.17

OR, easier, from the marginal distribution P(ML, NL) computed above:

P(NL = Y) = Σ_m P(ML = m, NL = Y) = P(ML = Y, NL = Y) + P(ML = N, NL = Y) = 0.093 + 0.077 = 0.17

  NL   Probability
  Y    0.17
  N    0.83
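
The same marginalization step, applied once more to the P(ML, NL) table from the previous sketch, yields P(NL):

    # P(NL) = sum over m of P(ML = m, NL)
    P_NL = {}
    for (ml, nl), p in P_ML_NL.items():
        P_NL[nl] = P_NL.get(nl, 0.0) + p

    print(P_NL['Y'])  # ~0.17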

BBN for late-for-work example The conditional probability P(ML = Y | NL = Y) follows from Bayes' Theorem:

P(ML = Y | NL = Y) = P(ML = Y, NL = Y) / P(NL = Y) = 0.093 / 0.17 = 0.5471

Repeating this for all combinations gives the conditional distribution P(ML | NL):

            NL = Y   NL = N
  ML = Y    0.5471   0.5024
  ML = N    0.4529   0.4976

and the marginal distribution of ML:

  ML   Probability
  Y    0.51
  N    0.49

Bayes' Theorem also allows reasoning from effect back to cause. Given that Norman is late, the probability of a train strike is

P(TS = Y | NL = Y) = P(NL = Y | TS = Y) ∙ P(TS = Y) / P(NL = Y) = 0.8 ∙ 0.1 / 0.17 = 0.08 / 0.17 = 0.471

which is much larger than the prior probability P(TS = Y) = 0.1.
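
Both conditional calculations are one-liners given the tables computed in the sketches above:

    # P(ML = Y | NL = Y) = P(ML = Y, NL = Y) / P(NL = Y)
    print(P_ML_NL[('Y', 'Y')] / P_NL['Y'])  # ~0.5471

    # Diagnostic reasoning with Bayes' Theorem:
    # P(TS = Y | NL = Y) = P(NL = Y | TS = Y) * P(TS = Y) / P(NL = Y)
    print(P_NL_given_TS['Y']['Y'] * P_TS['Y'] / P_NL['Y'])  # ~0.471, up from the prior 0.1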

Building and quantifying a BBN
1. Identification and understanding of the problem (purpose, scope, boundaries, variables)
2. Identification of the relationships between variables and determination of the graphical structure of the BBN
3. Verification and validation of the structure
4. Quantification of the conditional probability distributions
5. Verification and validation of the model – test runs
6. Analysis of the problem via the conditionalization process
This is NOT easy in practice.

Additional remarks on BBNs
- There are other BBNs in addition to discrete ones, i.e. continuous BBNs and mixed discrete-continuous BBNs.
- BBNs are not only about causal relations between variables; they also capture probabilistic dependencies and can model mathematical (functional) relationships.
- BBNs can be really large and complex, but once a BBN is built, analyzing the problem is fun and simple using available BBN software, e.g. Netica, Hugin, UniNet.