
Reasoning with Bayesian Networks

Overview
Bayesian Belief Networks (BBNs) can reason with networks of propositions and associated probabilities. They are useful for many AI problems:
–Diagnosis
–Expert systems
–Planning
–Learning

Recall Bayes Rule
P(H | E) = P(E | H) P(H) / P(E)
Note the symmetry: we can compute the probability of a hypothesis given its evidence and vice versa.
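A quick numeric sketch of the rule; the disease/test numbers below are made up for illustration, not from the slides:

```python
# Bayes rule: P(H | E) = P(E | H) * P(H) / P(E)
# Illustrative numbers: H = "patient has the disease", E = "test is positive".
p_h = 0.01              # prior P(H)
p_e_given_h = 0.9       # P(E | H): true-positive rate
p_e_given_not_h = 0.05  # P(E | not H): false-positive rate

# Total probability: P(E) = P(E|H) P(H) + P(E|~H) P(~H)
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

p_h_given_e = p_e_given_h * p_h / p_e
print(round(p_h_given_e, 3))  # posterior P(H | E)
```

Even with a fairly accurate test, the small prior keeps the posterior modest; the symmetry of the rule is what lets us turn P(E | H) into P(H | E).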

Simple Bayesian Network
Two nodes: Smoking → Cancer.

Prior for Smoking:
–P(S=no) = 0.80
–P(S=light) = 0.15
–P(S=heavy) = 0.05

Cancer's conditional probability table gives P(C=none), P(C=benign), and P(C=malig) for each value of Smoking (no, light, heavy).
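The slide's tables can be sketched directly as nested dictionaries. The P(S) prior below is from the slide; the P(C | S) entries are made-up placeholders, since the slide's CPT values were not preserved in the transcript:

```python
# Prior over Smoking, from the slide.
p_s = {"no": 0.80, "light": 0.15, "heavy": 0.05}

# CPT P(Cancer | Smoking). These numbers are hypothetical placeholders
# with the right shape; each column must sum to 1.
p_c_given_s = {
    "no":    {"none": 0.96, "benign": 0.03, "malig": 0.01},
    "light": {"none": 0.88, "benign": 0.08, "malig": 0.04},
    "heavy": {"none": 0.60, "benign": 0.25, "malig": 0.15},
}

# Marginalize out Smoking: P(C=c) = sum_s P(C=c | S=s) P(S=s)
p_c = {c: sum(p_c_given_s[s][c] * p_s[s] for s in p_s)
       for c in ("none", "benign", "malig")}
print(p_c)
```

This is the simplest BBN computation: a one-step application of the law of total probability along a single edge.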

More Complex Bayesian Network
Nodes: Age, Gender, Exposure to Toxics, Smoking, Cancer, Serum Calcium, Lung Tumor.
Nodes represent variables; links represent "causal" relations:
–predispositions: Age, Gender, Exposure to Toxics, Smoking
–condition: Cancer
–observable symptoms: Serum Calcium, Lung Tumor

Independence
Age and Gender are independent:
–P(A | G) = P(A), i.e., A ⊥ G
–P(G | A) = P(G), i.e., G ⊥ A
Therefore P(A,G) = P(G | A) P(A) = P(G) P(A), and likewise P(A,G) = P(A | G) P(G) = P(A) P(G): the joint factors into the product of the marginals.
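The factorization can be verified mechanically: build a joint under the independence assumption and check that conditioning has no effect. The marginals below are hypothetical:

```python
# Illustrative marginals for Age and Gender.
p_age = {"le60": 0.7, "gt60": 0.3}
p_gender = {"male": 0.5, "female": 0.5}

# Joint under independence: P(A, G) = P(A) P(G).
joint = {(a, g): p_age[a] * p_gender[g] for a in p_age for g in p_gender}

def p_a_given_g(a, g):
    """Recover P(A=a | G=g) from the joint by conditioning."""
    p_g = sum(joint[(a2, g)] for a2 in p_age)  # marginal P(G=g)
    return joint[(a, g)] / p_g

# Independence means conditioning on Gender leaves P(Age) unchanged.
assert abs(p_a_given_g("gt60", "male") - p_age["gt60"]) < 1e-12
```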

Conditional Independence
Cancer is independent of Age and Gender given Smoking:
P(C | A,G,S) = P(C | S), i.e., C ⊥ A,G | S

Conditional Independence: Naïve Bayes
Serum Calcium is independent of Lung Tumor given Cancer: P(L | SC, C) = P(L | C). Unconditionally, Serum Calcium and Lung Tumor are dependent.
Naïve Bayes assumption: evidence (e.g., symptoms) is independent given the disease. This makes it easy to combine evidence.
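Combining evidence under the naïve Bayes assumption reduces to multiplying per-symptom likelihoods and renormalizing. All numbers below are illustrative:

```python
# Naive Bayes: P(C | e1, e2) is proportional to P(e1 | C) P(e2 | C) P(C),
# assuming the symptoms are independent given the class.
p_c = {"none": 0.90, "malig": 0.10}        # prior over Cancer
p_sc_high = {"none": 0.05, "malig": 0.60}  # P(SerumCalcium=high | C)
p_tumor = {"none": 0.02, "malig": 0.70}    # P(LungTumor=present | C)

# Multiply likelihoods with the prior, then normalize.
unnorm = {c: p_c[c] * p_sc_high[c] * p_tumor[c] for c in p_c}
z = sum(unnorm.values())
posterior = {c: v / z for c, v in unnorm.items()}
print(posterior)
```

Two individually weak pieces of evidence combine multiplicatively, which is why the posterior can swing far from the prior.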

Explaining Away
Exposure to Toxics and Smoking are independent, but Exposure to Toxics is dependent on Smoking given Cancer:
P(E=heavy | C=malignant) > P(E=heavy | C=malignant, S=heavy)
"Explaining away" is like abductive inference in that it moves from an observation to its possible causes or explanations.
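The inequality can be checked by enumeration on a two-cause model. All CPT numbers below are illustrative, not from the slides:

```python
# Explaining away, numerically. S (Smoking) and E (Exposure to Toxics)
# are independent a priori; both raise the chance of malignant cancer.
p_s = {"heavy": 0.2, "no": 0.8}   # P(Smoking)
p_e = {"heavy": 0.1, "no": 0.9}   # P(Exposure), independent of S
p_malig = {                        # P(C=malignant | S, E), hypothetical
    ("no", "no"): 0.01, ("heavy", "no"): 0.20,
    ("no", "heavy"): 0.20, ("heavy", "heavy"): 0.30,
}

def p_e_heavy(observed_s=None):
    """P(E=heavy | C=malignant [, S=observed_s]) by enumeration."""
    s_vals = [observed_s] if observed_s else list(p_s)
    num = sum(p_s[s] * p_e["heavy"] * p_malig[(s, "heavy")] for s in s_vals)
    den = sum(p_s[s] * p_e[e] * p_malig[(s, e)]
              for s in s_vals for e in p_e)
    return num / den

# Seeing malignant cancer raises belief in exposure above its prior...
assert p_e_heavy() > p_e["heavy"]
# ...but additionally observing heavy smoking "explains it away".
assert p_e_heavy("heavy") < p_e_heavy()
```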

Conditional Independence
Cancer is independent of Age and Gender given Exposure to Toxics and Smoking.
In general, a variable (node) is conditionally independent of its non-descendants given its parents.

Another Non-Descendant
Add a Diet node: Cancer is independent of Diet given Exposure to Toxics and Smoking. Again, a variable is conditionally independent of its non-descendants given its parents.

BBN Construction
The knowledge acquisition process for a BBN involves three steps:
–Choosing appropriate variables
–Deciding on the network structure
–Obtaining data for the conditional probability tables

(1) Choosing variables
Variables should have collectively exhaustive, mutually exclusive values (e.g., Error Occurred vs. No Error).
Variables should be values, not probabilities: model "Smoking", not "Risk of Smoking".

Heuristic: Knowable in Principle
Examples of good variables:
–Weather {Sunny, Cloudy, Rain, Snow}
–Gasoline: cents per gallon
–Temperature {≥ 100F, < 100F}
–User needs help on Excel Charting {Yes, No}
–User's personality {dominant, submissive}

(2) Structuring
Network structure corresponding to "causality" is usually good (here: Age, Gender, Exposure to Toxics, Smoking, Genetic Damage, Cancer, Serum Calcium, Lung Tumor). Initially the structure comes from the designer's knowledge, but it can be checked with data.

(3) The numbers
–Zeros and ones are often enough
–Order-of-magnitude accuracy is typical
–Sensitivity analysis can be used to decide the accuracy needed
–The second decimal usually doesn't matter
–Relative probabilities are important

Predictive Inference
How likely are elderly males to get malignant cancer?
P(C=malignant | Age>60, Gender=male)
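A sketch of this query by enumeration, on a cut-down Age, Gender → Smoking → Cancer chain; all CPT numbers are hypothetical:

```python
# Predictive inference: reason from causes (Age, Gender) toward effects.
p_smoke = {  # P(Smoking=heavy | Age, Gender), hypothetical
    ("gt60", "male"): 0.25, ("gt60", "female"): 0.10,
    ("le60", "male"): 0.20, ("le60", "female"): 0.15,
}
p_malig = {True: 0.15, False: 0.02}  # P(C=malig | Smoking heavy / not)

def p_malig_given(age, gender):
    """P(C=malignant | Age, Gender), summing out the Smoking node."""
    ph = p_smoke[(age, gender)]
    return ph * p_malig[True] + (1 - ph) * p_malig[False]

print(p_malig_given("gt60", "male"))
```

The evidence enters at the roots and the query flows "downstream" through the network, one sum per hidden variable.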

Predictive and diagnostic combined
How likely is an elderly male patient with high Serum Calcium to have malignant cancer?
P(C=malignant | Age>60, Gender=male, Serum Calcium=high)

Explaining away
If we see a lung tumor, the probabilities of heavy smoking and of exposure to toxics both go up. If we then observe heavy smoking, the probability of exposure to toxics goes back down.

Decision making
A decision is an irrevocable allocation of domain resources, and it should be made so as to maximize expected utility. View decision making in terms of:
–Beliefs/Uncertainties
–Alternatives/Decisions
–Objectives/Utilities

A Decision Problem
Should I have my party inside or outside?
–in, dry → Regret
–in, wet → Relieved
–out, dry → Perfect!
–out, wet → Disaster
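Picking the maximum-expected-utility action for this problem can be sketched as follows; the utilities and the rain probability are illustrative placeholders:

```python
# Party problem: choose the decision maximizing expected utility.
p_dry = 0.6  # hypothetical belief that it stays dry
utility = {  # hypothetical numerical scores for each outcome
    ("in", "dry"): 40,    # Regret
    ("in", "wet"): 70,    # Relieved
    ("out", "dry"): 100,  # Perfect!
    ("out", "wet"): 0,    # Disaster
}

def expected_utility(decision):
    """EU(d) = sum over weather of P(weather) * U(d, weather)."""
    return (p_dry * utility[(decision, "dry")]
            + (1 - p_dry) * utility[(decision, "wet")])

best = max(("in", "out"), key=expected_utility)
print(best, expected_utility(best))
```

Shifting p_dry or the Disaster penalty flips the decision, which is exactly the role the value function on the next slide plays.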

Value Function
A numerical score over all possible states of the world allows a BBN to be used to make decisions.

Netica
–Software for working with Bayesian belief networks and influence diagrams
–A commercial product, but free for small networks
–Includes a graphical editor, compiler, inference engine, etc.

Predispositions or causes

Conditions or diseases

Functional Node

Symptoms or effects (dyspnea is shortness of breath)