BIOL 301 Guest Lecture: Reasoning Under Uncertainty (Intro to Bayes Networks) Simon D. Levy CSCI Department 8 April 2010.


Review of Bayes’ Rule (Rev. Thomas Bayes)

From the Product Rule:
P(A|B) = P(A & B) / P(B)
P(A & B) = P(A|B) * P(B) = P(B|A) * P(A)

We derive Bayes’ Rule by substitution:
P(A|B) = P(A & B) / P(B) = P(B|A) * P(A) / P(B)
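The substitution above can be checked numerically. This is a minimal sketch; the prior and likelihood values below are illustrative assumptions, not numbers from the slides.

```python
# Illustrative (assumed) numbers: a rare condition A and a test result B.
p_a = 0.01            # prior P(A)
p_b_given_a = 0.9     # likelihood P(B|A)
p_b_given_not_a = 0.05

# Total probability: P(B) = P(B|A)P(A) + P(B|~A)P(~A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

# Bayes' Rule: P(A|B) = P(B|A) * P(A) / P(B)
p_a_given_b = p_b_given_a * p_a / p_b
print(round(p_a_given_b, 4))  # 0.1538
```

Note that even a strong likelihood (0.9) leaves the posterior small when the prior is small, which is exactly the kind of trade-off Bayes’ Rule makes explicit.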

Real-World Problems May Involve Many Variables


Variables Are Typically Observed Simultaneously (Confounded)

Fever  Ache  Virus    P
No     No    No     .950
No     No    Yes    .002
No     Yes   No     .032
No     Yes   Yes    .002
Yes    No    No     .002
Yes    No    Yes    .001
Yes    Yes   No     .010
Yes    Yes   Yes    .001

So how do we compute P(V=Yes), P(F=Yes & A=No), etc.?

Marginalization

P(V=Yes): sum the rows where Virus = Yes.

Fever  Ache  Virus    P
No     No    Yes    .002
No     Yes   Yes    .002
Yes    No    Yes    .001
Yes    Yes   Yes    .001
                   ______
            Sum =   .006

Marginalization

P(F=Yes & A=No): sum the rows where Fever = Yes and Ache = No.

Fever  Ache  Virus    P
Yes    No    No     .002
Yes    No    Yes    .001
                   ______
            Sum =   .003
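Both marginal queries amount to summing the matching rows of the joint table. A minimal sketch, using the table from the slides (the `marginal` helper is an illustrative name, not from the original):

```python
# The full joint distribution from the slide, keyed by (Fever, Ache, Virus).
joint = {
    ('No',  'No',  'No'): .950, ('No',  'No',  'Yes'): .002,
    ('No',  'Yes', 'No'): .032, ('No',  'Yes', 'Yes'): .002,
    ('Yes', 'No',  'No'): .002, ('Yes', 'No',  'Yes'): .001,
    ('Yes', 'Yes', 'No'): .010, ('Yes', 'Yes', 'Yes'): .001,
}

def marginal(**fixed):
    """Sum joint entries consistent with the fixed variable values."""
    names = ('Fever', 'Ache', 'Virus')
    return sum(p for assign, p in joint.items()
               if all(assign[names.index(k)] == v for k, v in fixed.items()))

print(round(marginal(Virus='Yes'), 3))             # 0.006
print(round(marginal(Fever='Yes', Ache='No'), 3))  # 0.003
```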

Combinatorial Explosion (The “Curse of Dimensionality”)

Assuming (unrealistically) only two values (Yes/No) per variable, a full joint table needs 2^n rows for n variables:

# Variables    # of Rows in Table
    20              1,048,576
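The 2^n growth is easy to check:

```python
# Rows needed for a full joint table over n two-valued variables: 2**n.
for n in (2, 5, 10, 20):
    print(f"{n} variables -> {2**n:,} rows")
# 20 variables already require 1,048,576 rows.
```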

Solution: Local Causality + Belief Propagation

Local Causality

Recover Joint From Prior & Posterior

P(B) = .001    P(E) = .002

B  E   P(A|B,E)
T  T     .95
T  F     .94
F  T     .29
F  F     .001

B E A    Prob
T T T    .001 * .002 * .95  = .0000019
T T F    .001 * .002 * .05  = .0000001
T F T    .001 * .998 * .94  = .00093812
T F F    .001 * .998 * .06  = .00005988
F T T    .999 * .002 * .29  = .00057942
F T F    .999 * .002 * .71  = .00141858
F F T    .999 * .998 * .001 = .000997002
F F F    .999 * .998 * .999 = .996004998
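The table above exploits the factorization P(B,E,A) = P(B) * P(E) * P(A|B,E), which holds because B and E have no parents in the network. A sketch of that computation using the slide's numbers:

```python
# CPT numbers from the slide: Burglary (B), Earthquake (E), Alarm (A).
p_b = {True: .001, False: .999}   # P(Burglary)
p_e = {True: .002, False: .998}   # P(Earthquake)
p_a = {(True, True): .95, (True, False): .94,    # P(Alarm=T | B, E)
       (False, True): .29, (False, False): .001}

# Joint P(B,E,A) = P(B) * P(E) * P(A|B,E) for every assignment.
joint = {}
for b in (True, False):
    for e in (True, False):
        for a in (True, False):
            pa = p_a[(b, e)] if a else 1 - p_a[(b, e)]
            joint[(b, e, a)] = p_b[b] * p_e[e] * pa

print(sum(joint.values()))        # sums to 1: a valid distribution
print(joint[(True, True, True)])  # .001 * .002 * .95
```

Ten numbers (two priors plus a four-row CPT) reconstruct the full eight-row joint, which is the compactness payoff of local causality.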

Belief Propagation

Consider just the chain B → A → J. Summing over the intermediate variable A:

P(J=T | B=T) = Σ_a P(J=T | A=a) * P(A=a | B=T)

Then use Bayes’ Rule and marginalization to answer more sophisticated queries like P(B=T | J=F & E=F & M=T).
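A minimal sketch of propagating evidence along the chain B → A → J. The slides give no CPT for this chain, so the numbers below are illustrative assumptions:

```python
# Assumed (illustrative) CPTs for the chain B -> A -> J.
p_a_given_b = {True: .94, False: .01}   # P(A=T | B=b)
p_j_given_a = {True: .90, False: .05}   # P(J=T | A=a)

def p_j_true_given_b(b):
    # P(J=T | B=b) = sum over a of P(J=T | A=a) * P(A=a | B=b)
    return sum(p_j_given_a[a] * (p_a_given_b[b] if a else 1 - p_a_given_b[b])
               for a in (True, False))

print(round(p_j_true_given_b(True), 4))  # 0.849
```

The key point is the sum over the intermediate variable: evidence about B reaches J only through A, so A must be marginalized out.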

Multiply-Connected Networks

Cloudy → Sprinkler, Cloudy → Rain, Sprinkler → Wet Grass, Rain → Wet Grass

Clustering (“Mega Nodes”)

Merge Sprinkler and Rain into a single cluster node:

Cloudy → (Sprinkler, Rain) → Wet Grass

Junction Tree Algorithm (Huang & Darwiche 1994)

[Example DAG over nodes A–H]

“Moralize”

[DAG over A–H, before and after moralization]
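Moralizing means connecting (“marrying”) the parents of each node and then dropping edge directions. A sketch under stated assumptions: the edge list below is reconstructed from the standard Huang & Darwiche eight-node example, so treat the exact graph as an assumption.

```python
def moralize(parents):
    """parents: dict node -> set of parent nodes.
    Returns the undirected edge set of the moral graph."""
    edges = set()
    for child, ps in parents.items():
        for p in ps:                        # keep original edges, undirected
            edges.add(frozenset((p, child)))
        ps = sorted(ps)
        for i in range(len(ps)):            # marry co-parents pairwise
            for j in range(i + 1, len(ps)):
                edges.add(frozenset((ps[i], ps[j])))
    return edges

# Assumed reconstruction of the Huang & Darwiche example DAG over A-H.
g = {'A': set(), 'B': {'A'}, 'C': {'A'}, 'D': {'B'}, 'E': {'C'},
     'F': {'D', 'E'}, 'G': {'C'}, 'H': {'E', 'G'}}

moral = moralize(g)
print(frozenset(('D', 'E')) in moral)  # True: parents of F are married
print(frozenset(('E', 'G')) in moral)  # True: parents of H are married
```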


Triangulate

[Moralized graph over A–H, before and after triangulation]


Junction tree: cliques joined by separators

ABD —AD— ADE —DE— DEF
           |
           AE
           |
ACE —CE— CEG —EG— EGH

“Message-Passing”: Observe A=T.

“Message-Passing”: Pick a cluster containing A (e.g., ABD).

“Message-Passing”: Pass messages outward from that cluster to propagate the evidence through the tree.