Bayesian Networks

Quiz: Probabilistic Reasoning
1. What is P(F), the probability that some creature can fly?
2. Creature b is a bumble bee. What's P(F|B), the probability that b can fly given that it's a bumble bee?
3. b has unfortunately met a malicious child, who has torn off b's wings. What is P(F|B,N), the probability that b can fly given that it has no wings?
4. b somehow makes its way onto a jumbo jet, where it survives by drinking juice spilled by passengers. What is P(F|B,N,L=j), the probability that b can fly given that it has no wings and its location is a jet?

Example
BN = (V, E, P)
V = a set of random variables
E = directed edges between them (cycles not allowed)
P = for every node in the network, a conditional probability distribution for that random variable, given its parents in the graph

[Network: Has diabetes? (D or ¬D) → Test was positive? (+ or -). The test is the observable node; diabetes is the unobservable node.]

P(D):     D: 0.01    ¬D: 0.99
P(T|D):   P(+|D) = 0.9    P(-|D) = 0.1    P(+|¬D) = 0.2    P(-|¬D) = 0.8

Simple probabilistic reasoning
You already know how to figure out:
P(D)    → stored in the Bayes Net
P(+|D)  → stored in the Bayes Net
P(D,+)  → multiply P(D) P(+|D)
P(+)    → apply marginalization to P(D,+)
P(D|+)  → apply Bayes' Rule
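To make those steps concrete, here is a minimal Python sketch (the variable names are mine, not from the slides) that plugs in the CPT numbers from the example network above:

```python
# A minimal sketch of the reasoning steps above, using the CPTs from the
# diabetes example: P(D) = 0.01, P(+|D) = 0.9, P(+|not D) = 0.2.
P_D = {True: 0.01, False: 0.99}          # P(D)
P_pos_given_D = {True: 0.9, False: 0.2}  # P(+ | D) and P(+ | not D)

P_D_and_pos = P_D[True] * P_pos_given_D[True]      # P(D, +) = 0.009

P_pos = sum(P_D[d] * P_pos_given_D[d]              # P(+) = 0.009 + 0.198 = 0.207
            for d in (True, False))

P_D_given_pos = P_D_and_pos / P_pos                # P(D | +) ~ 0.043, by Bayes' Rule

print(P_D_and_pos, P_pos, P_D_given_pos)
```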

Purpose behind Bayes Networks
Bayes Nets help figure out more difficult cases:
What's P(Battery dead | Car won't start, Battery is 5 years old)?
or P(Alternator broken | Car won't start, Oil light is on, Lights are dim)?

[Car-diagnosis network with nodes: Battery age, Battery dead, Fan belt broken, Alternator broken, Not charging, Battery flat, Battery meter, No oil, No gas, Fuel line blocked, Starter broken, Lights, Oil light, Gas gauge, Dipstick, Car won't start]

Types of Bayes Net Queries
Bayes Nets let you solve "queries", or probabilistic questions. There are different types of queries for a Bayes Net with random variables X1, …, XN:
1. Joint queries: What is P(car starts, oil light on)?
2. Conditional queries: What is P(alternator broken, battery light dim | oil light off, lights dim)?
3. Maximum a posteriori (MAP) queries: which value (true or false) of "Will car start?" makes this probability the biggest: P(Will car start? | battery is 5 years old, lights dim)?

The Bayes Net Equation
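The Bayes Net equation is the chain-rule factorization that the next slide applies: P(X1, …, XN) = ∏i P(Xi | parents(Xi)), i.e. the joint distribution over all the variables is the product, over every node, of that node's conditional distribution given its parents in the graph.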

Example
P(Diab, Test) = P(Diab | parents(Diab)) * P(Test | parents(Test))
              = P(Diab) * P(Test | Diab)

[Network: Has diabetes? (D or ¬D) → Test was positive? (+ or -)]
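For instance, plugging in the CPTs from the earlier example slide, P(D, Test=+) = P(D) · P(+|D) = 0.01 × 0.9 = 0.009.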

Quiz: Two-test Diabetes
1. What is P(Test1=+ | D)?
2. What is P(Test1=+ | D, Test2=+)?
3. What is P(D | Test1=+, Test2=+)?
4. What is P(D | Test1=+, Test2=-)?

[Network: Has diabetes? (D or ¬D) with two children, Test 1 was positive? (+ or -) and Test 2 was positive? (+ or -)]

P(D):      D: 0.01    ¬D: 0.99
P(T1|D):   P(+|D) = 0.9    P(-|D) = 0.1    P(+|¬D) = 0.2    P(-|¬D) = 0.8
P(T2|D):   P(+|D) = 0.9    P(-|D) = 0.1    P(+|¬D) = 0.2    P(-|¬D) = 0.8

Conditional Independence in a BN
In this BN, T1 ⊥ T2 | D. This means, e.g.: P(T1=+ | D, T2=+) is the same as P(T1=+ | D).

[Same two-test diabetes network as on the previous slide.]
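One way to see this numerically is to enumerate the full joint P(D, T1, T2) = P(D) P(T1|D) P(T2|D) and condition by brute force. Below is a rough Python sketch (the helper names joint and prob are mine, not from the slides) using the CPTs from the quiz slide:

```python
# Checks that T1 and T2 are conditionally independent given D by enumerating
# the joint P(D, T1, T2) = P(D) P(T1|D) P(T2|D).
from itertools import product

P_D = {True: 0.01, False: 0.99}
P_T_pos_given_D = {True: 0.9, False: 0.2}  # same CPT for Test 1 and Test 2

def joint(d, t1, t2):
    p_t1 = P_T_pos_given_D[d] if t1 else 1 - P_T_pos_given_D[d]
    p_t2 = P_T_pos_given_D[d] if t2 else 1 - P_T_pos_given_D[d]
    return P_D[d] * p_t1 * p_t2

def prob(query, given):
    """P(query | given); both are dicts such as {'D': True, 'T2': True}."""
    num = den = 0.0
    for d, t1, t2 in product((True, False), repeat=3):
        world = {'D': d, 'T1': t1, 'T2': t2}
        p = joint(d, t1, t2)
        if all(world[k] == v for k, v in given.items()):
            den += p
            if all(world[k] == v for k, v in query.items()):
                num += p
    return num / den

print(prob({'T1': True}, {'D': True}))              # P(T1=+ | D)       = 0.9
print(prob({'T1': True}, {'D': True, 'T2': True}))  # P(T1=+ | D, T2=+) = 0.9
```

Both prints give 0.9: once D is known, the outcome of Test 2 tells us nothing more about Test 1.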

Quiz: Two-test Diabetes
What is P(T1=+ | T2=+)?
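(To check your answer, the prob helper from the sketch above gives prob({'T1': True}, {'T2': True}) ≈ 0.230, compared with P(T1=+) = 0.207: a positive second test raises the probability of diabetes, which in turn raises the probability of a positive first test.)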

Absolute vs. Conditional Independence
Remember: T1 ⊥ T2 | D. Does this mean that T1 ⊥ T2? In other words, P(T1) =? P(T1 | T2)

Confounding Cause
1. What is P(R | S)?
2. What is P(R | H, S)?
3. What is P(R | H, ¬S)?
4. What is P(R | H)?

[Network: Sunny? (S or ¬S) → Happy? (H or ¬H) ← Raise? (R or ¬R)]

P(S):       S: 0.7     ¬S: 0.3
P(R):       R: 0.01    ¬R: 0.99
P(H|S,R):   P(H|S,R) = 1.0    P(H|S,¬R) = 0.7    P(H|¬S,R) = 0.9    P(H|¬S,¬R) = 0.1
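One way to check your answers is brute-force enumeration of the joint P(S, R, H) = P(S) P(R) P(H | S, R), as in this rough Python sketch (helper names are mine, not from the slides):

```python
# Enumerates P(S, R, H) = P(S) P(R) P(H | S, R) using the CPTs on this slide.
from itertools import product

P_S = 0.7
P_R = 0.01
P_H_given = {(True, True): 1.0, (True, False): 0.7,
             (False, True): 0.9, (False, False): 0.1}   # keyed by (S, R)

def joint(s, r, h):
    p_h = P_H_given[(s, r)] if h else 1 - P_H_given[(s, r)]
    return (P_S if s else 1 - P_S) * (P_R if r else 1 - P_R) * p_h

def prob(query, given):
    """P(query | given); both are dicts such as {'R': True, 'H': True}."""
    num = den = 0.0
    for s, r, h in product((True, False), repeat=3):
        world = {'S': s, 'R': r, 'H': h}
        p = joint(s, r, h)
        if all(world[k] == v for k, v in given.items()):
            den += p
            if all(world[k] == v for k, v in query.items()):
                num += p
    return num / den

print(prob({'R': True}, {'S': True}))                # P(R | S)     = 0.01
print(prob({'R': True}, {'H': True, 'S': True}))     # P(R | H, S)  ~ 0.0142
print(prob({'R': True}, {'H': True, 'S': False}))    # P(R | H, ¬S) ~ 0.0833
print(prob({'R': True}, {'H': True}))                # P(R | H)     ~ 0.0185
```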

Absolute vs. Conditional Independence
Remember: R ⊥ S. Does this mean that R ⊥ S | H? In other words, P(R | H) =? P(R | H, S)
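The enumeration above shows the answer is no: P(R | H) ≈ 0.0185 but P(R | H, S) ≈ 0.0142. Once H is observed, learning that it is sunny "explains away" part of the evidence for a raise, so R and S become dependent given H.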

D-Separation
D-separation is the technical method for determining conditional independence in a BN.

[Figure: examples of active vs. inactive triplets. A chain (A → B → C) or fork (A ← B → C) is active when the middle node is unobserved and inactive when it is known; a collider (A → B ← C) is inactive unless the middle node, or one of its descendants, is known.]

D-Separation
Node A is d-separated (short for "directionally separated") from node B if all paths from A to B contain at least one inactive triplet.
A ⊥ B | K1, …, Km  ⇔  nodes A and B are d-separated when nodes K1, …, Km are known.
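This definition can be applied mechanically. The following is a rough Python sketch (not from the slides; all names are mine) that enumerates the trails between two nodes and applies the triplet rules above: a chain or fork is inactive when its middle node is known, and a collider is inactive unless its middle node or one of its descendants is known.

```python
# A rough sketch of a d-separation test. Edges point parent -> child;
# `parents` maps each node to a list of its parents.
def d_separated(a, b, known, parents):
    known = set(known)
    children = {}
    for child, ps in parents.items():
        for p in ps:
            children.setdefault(p, []).append(child)

    def descendants(node):
        seen, stack = set(), [node]
        while stack:
            for c in children.get(stack.pop(), ()):
                if c not in seen:
                    seen.add(c)
                    stack.append(c)
        return seen

    def neighbors(n):
        return set(parents.get(n, ())) | set(children.get(n, ()))

    def triple_active(x, y, z):
        collider = x in parents.get(y, ()) and z in parents.get(y, ())
        if collider:  # active iff y, or a descendant of y, is known
            return y in known or bool(descendants(y) & known)
        return y not in known  # chain or fork: active iff y is unobserved

    def active_path(path):
        # A path that reaches b is active iff every triplet along it is active.
        if path[-1] == b:
            return all(triple_active(path[i], path[i + 1], path[i + 2])
                       for i in range(len(path) - 2))
        return any(active_path(path + [n])
                   for n in neighbors(path[-1]) if n not in path)

    return not active_path([a])

# Example: the two-test diabetes network (D is the parent of T1 and T2).
parents = {'D': [], 'T1': ['D'], 'T2': ['D']}
print(d_separated('T1', 'T2', [], parents))     # False: T1 and T2 are dependent
print(d_separated('T1', 'T2', ['D'], parents))  # True:  T1 ⊥ T2 | D
```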

D-Separation Quiz 1
C ⊥ A?
C ⊥ A | B?
C ⊥ D?
C ⊥ D | A?
E ⊥ C | D?

[Graph over nodes A, B, C, D, E; edges as shown in the figure.]

D-Separation Quiz 2
A ⊥ E?
A ⊥ E | B?
A ⊥ E | C?
A ⊥ B?
A ⊥ B | C?

[Graph over nodes A, B, C, D, E; edges as shown in the figure.]

D-Separation Quiz
F ⊥ A?
F ⊥ A | D?
F ⊥ A | G?
F ⊥ A | H?

[Graph over nodes A, B, C, D, E, F, G, H; edges as shown in the figure.]

Counting BN Parameters
A complete joint distribution over 5 binary variables would require 2^5 - 1 = 31 parameters. This BN requires only 10 parameters.

[Five-node network over A, B, C, D, E; structure as shown in the figure.]
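For binary variables, a node with k parents needs 2^k CPT entries (one free probability per parent configuration), so the total is just a sum over nodes. Here is a minimal sketch; the 5-node structure below is an assumption chosen to match the 10-parameter total quoted above, since the figure's edges are not given in the text.

```python
# For binary variables, a node with k parents needs 2**k CPT entries.
def num_parameters(parents):
    return sum(2 ** len(ps) for ps in parents.values())

# Assumed structure (not from the slides): A, B root nodes; C depends on A, B;
# D and E each depend on C. This gives 1 + 1 + 4 + 2 + 2 = 10 parameters.
example = {'A': [], 'B': [], 'C': ['A', 'B'], 'D': ['C'], 'E': ['C']}
print(num_parameters(example))  # 10

# The full joint over the same 5 binary variables needs 2**5 - 1 = 31 numbers.
```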

Quiz
A full joint distribution over 6 binary variables requires 2^6 - 1 = 63 parameters. How many parameters does this network require?

[Six-node network over A, B, C, D, E, F; structure as shown in the figure.]

Quiz
A full joint distribution over 7 binary variables requires 2^7 - 1 = 127 parameters. How many parameters does this network require?

[Seven-node network over A, B, C, D, E, F, G; structure as shown in the figure.]

Quiz
A full joint distribution over 16 binary variables requires 2^16 - 1 = 65,535 parameters. How many parameters does this network require?

[Car-diagnosis network with nodes: Battery age, Battery dead, Fan belt broken, Alternator broken, Not charging, Battery flat, Battery meter, No oil, No gas, Fuel line blocked, Starter broken, Lights, Oil light, Gas gauge, Dipstick, Car won't start]