
Summary: Belief Networks

Objective: a probabilistic knowledge base plus an inference engine that computes Prob(formula | "all evidence collected so far").

Belief (Bayes) networks:
- Directed graph; arcs denote direct causal relationships
- For each root node, provide an unconditional (prior) probability
- For all remaining nodes, provide a conditional probability table, Prob(Child | Parent1, Parent2, Parent3), covering all combinations of parent values
- Represent uncertain evidence by adding a new node

Inference:
- Polytree case
- Multiply connected networks => clustering

Complete Bayes Network

[Figure: the burglary network. Burglary and Earthquake are the parents of Alarm; Alarm is the parent of JohnCalls and MaryCalls. Priors: P(B) = .001, P(E) = .002. Each non-root node carries a conditional probability table indexed by its parents' truth values.]
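This structure maps directly onto a data structure. Below is a minimal Python sketch (mine, not from the slides): each node stores its parent list and a CPT mapping tuples of parent truth values to P(node = True | parents). Only P(B) = .001 and P(E) = .002 are visible on the slide; the remaining CPT entries are the standard Russell & Norvig values for this example and are used purely for illustration.

```python
# Minimal sketch of the burglary network as plain Python dicts.
# Each CPT maps a tuple of parent truth values to P(node = True | parents).
# NOTE: only P(B) = .001 and P(E) = .002 come from the slide; the other
# entries are the standard textbook (Russell & Norvig) values.

burglary_net = {
    "Burglary":   {"parents": (), "cpt": {(): 0.001}},
    "Earthquake": {"parents": (), "cpt": {(): 0.002}},
    "Alarm": {
        "parents": ("Burglary", "Earthquake"),
        "cpt": {(True, True): 0.95, (True, False): 0.94,
                (False, True): 0.29, (False, False): 0.001},
    },
    "JohnCalls": {"parents": ("Alarm",),
                  "cpt": {(True,): 0.90, (False,): 0.05}},
    "MaryCalls": {"parents": ("Alarm",),
                  "cpt": {(True,): 0.70, (False,): 0.01}},
}

def node_prob(net, node, value, assignment):
    """P(node = value | parents), reading parent values from `assignment`."""
    parent_vals = tuple(assignment[p] for p in net[node]["parents"])
    p_true = net[node]["cpt"][parent_vals]
    return p_true if value else 1.0 - p_true
```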

Conditional Independence in Bayes Nets

- If a set E d-separates the paths between X and Y, then X and Y are conditionally independent given E
- A set E d-separates X and Y if every undirected path between X and Y has a node Z satisfying one of three conditions (spelled out on the next slide)

[Figure: the three d-separation configurations for a node Z on a path between X and Y: chain, common cause, and common effect.]

Structural Relationships and Independence

The basic independence assumption (simplified version):
- Two nodes X and Y are probabilistically independent conditioned on E if every undirected path from X to Y is d-separated by E
- Every undirected path from X to Y is blocked by E if there is a node Z on the path for which one of three conditions holds:
  - Z is in E, and Z has one incoming arrow on the path and one outgoing arrow (chain)
  - Z is in E, and both arrows lead out of Z (common cause)
  - Neither Z nor any descendant of Z is in E, and both arrows lead into Z (common effect)
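These three conditions translate directly into a predicate on a single path. The sketch below is mine (the representation and helper names are assumptions, not from the slides): the DAG is a dict mapping each node to the set of its parents, a path is a list of node names, and the evidence is a set of observed nodes. Full d-separation of X and Y then means every undirected path between them is blocked.

```python
# Minimal sketch of the path-blocking test for d-separation.
# `parents` maps every node in the DAG to the set of its parents.

def descendants(parents, node):
    """All nodes reachable from `node` by following arrows forward."""
    children = {n: {c for c, ps in parents.items() if n in ps} for n in parents}
    found, stack = set(), [node]
    while stack:
        for c in children[stack.pop()]:
            if c not in found:
                found.add(c)
                stack.append(c)
    return found

def path_blocked(parents, path, evidence):
    """True if the evidence set blocks this undirected path (list of nodes)."""
    for i in range(1, len(path) - 1):
        left, z, right = path[i - 1], path[i], path[i + 1]
        arrows_in = (left in parents[z]) + (right in parents[z])
        if arrows_in == 2:
            # Common effect: blocked unless Z or a descendant of Z is observed.
            if z not in evidence and not (descendants(parents, z) & evidence):
                return True
        else:
            # Chain or common cause: blocked exactly when Z is observed.
            if z in evidence:
                return True
    return False
```

For example, in the burglary network the path JohnCalls - Alarm - MaryCalls is blocked once Alarm is observed (both arrows lead out of Alarm, and Alarm is in E), so the two callers are conditionally independent given the alarm.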

Inference

- Given exact values for evidence variables
- Compute the posterior probability of the query variable

[Figure: the burglary network again, with its priors and conditional probability tables.]

Kinds of inference:
- Diagnostic: from effects to causes
- Causal: from causes to effects
- Intercausal: between causes of a common effect ("explaining away")

Algorithm

- In general: NP-complete
- Easy for polytrees, i.e., networks with only one undirected path between any two nodes
- Express P(X|E) by:
  1. Recursively passing support from ancestors down ("causal support")
  2. Recursively calculating the contribution from descendants up ("evidential support")
- Speed: linear in the number of nodes (in a polytree)
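The two-pass polytree algorithm itself is too long for a short sketch, but the posterior it computes can be checked by brute-force enumeration over the full joint distribution. Note this is the exponential baseline that the linear-time polytree algorithm improves on, not the slides' algorithm. A minimal sketch, reusing `burglary_net` and `node_prob` from the earlier block:

```python
from itertools import product

def enumeration_ask(net, query, evidence):
    """P(query = True | evidence) by summing the full joint distribution."""
    hidden = [n for n in net if n != query and n not in evidence]

    def joint_sum(fixed):
        total = 0.0
        for values in product([True, False], repeat=len(hidden)):
            a = dict(fixed, **dict(zip(hidden, values)))
            # Joint probability of the complete assignment `a`.
            p = 1.0
            for n in net:
                p *= node_prob(net, n, a[n], a)
            total += p
        return total

    p_true = joint_sum({**evidence, query: True})
    p_false = joint_sum({**evidence, query: False})
    return p_true / (p_true + p_false)

# Diagnostic query: with the illustrative CPT values above, this prints ~0.284.
print(enumeration_ask(burglary_net, "Burglary",
                      {"JohnCalls": True, "MaryCalls": True}))
```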

Simplest Causal Case

- Suppose we know Burglary occurred
- Want the probability of the alarm: P(A|B) = 0.95

[Figure: two-node network Burglary -> Alarm, with P(B) = .001, P(A|B) = .95, P(A|~B) = .01.]

Simplest Diagnostic Case

- Suppose we know the Alarm is ringing and want to know: was there a Burglary?
- I.e., we want P(B|A)

[Figure: the same two-node network Burglary -> Alarm.]

P(B|A) = P(A|B) P(B) / P(A)

But we don't know P(A).

1 = P(B|A) + P(~B|A)
1 = P(A|B)P(B)/P(A) + P(A|~B)P(~B)/P(A)
1 = [P(A|B)P(B) + P(A|~B)P(~B)] / P(A)
P(A) = P(A|B)P(B) + P(A|~B)P(~B)

P(B|A) = P(A|B) P(B) / [P(A|B)P(B) + P(A|~B)P(~B)]
       = .95 * .001 / [.95 * .001 + .01 * .999]
       = 0.087
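The same arithmetic in a few lines of Python (a trivial sketch):

```python
p_b = 0.001             # P(B): prior probability of burglary
p_a_given_b = 0.95      # P(A | B)
p_a_given_not_b = 0.01  # P(A | ~B)

# P(A) by the total-probability identity derived above.
p_a = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)
# Bayes rule.
p_b_given_a = p_a_given_b * p_b / p_a
print(round(p_b_given_a, 3))  # -> 0.087
```

Note the result: even with the alarm ringing, a burglary remains unlikely, because the prior P(B) is so small.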

General Case

[Figure: node X in a polytree, with parents U1 ... Um above it (the evidence Ex+) and children Y1 ... Yn below it, each child Yj having additional parents Z1j ... Znj (the evidence Ex-).]

- Express P(X|E) in terms of the contributions of Ex+ and Ex-
- Compute the contribution of Ex+ by computing the effect of the parents of X (recursion!)
- Compute the contribution of Ex- by ...

Using Probabilities to Make Decisions

- Computing the probabilities of an event is the easy part of the problem
- Making decisions is the hard part...
- The probability of winning the lottery is tiny. Should I play?
- The probability of the professor slipping and breaking his head is tiny but nonzero. Should I come to the exam?

Maximum Expected Utility

- Utility function: states -> real numbers
- Expected utility: EU(A|E) = Σ_i P(Result_i(A) | E, Do(A)) * U(Result_i(A))
- Principle of Maximum Expected Utility: you're stupid if you don't choose the action that maximizes expected utility ("stupid" is a well-defined mathematical term!)
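In code, the formula is a one-liner once an action's outcomes are written as (probability, utility) pairs; that representation is my assumption, not the slides'. A minimal sketch:

```python
def expected_utility(outcomes):
    """EU(A|E) = sum_i P(Result_i(A) | E, Do(A)) * U(Result_i(A))."""
    return sum(p * u for p, u in outcomes)

def best_action(actions):
    """MEU principle: pick the action whose expected utility is highest."""
    return max(actions, key=lambda name: expected_utility(actions[name]))
```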

Flying to Paris

- If I fly through Chicago, it takes 11 hours; but with probability 0.3 I'll miss the connection and have to wait 4 hours
- If I fly through SF, it takes 13 hours; but with probability 0.05 I'll miss the connection and have to wait 24 hours
- Utility (negative total hours):
  - Chicago: -11 * 0.7 - 15 * 0.3 = -12.2
  - SF: -13 * 0.95 - 37 * 0.05 = -14.2
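Plugging these numbers into the sketch from the previous slide reproduces the utilities and picks the route (-15 and -37 are the total hours when the connection is missed):

```python
# Outcomes as (probability, utility) pairs; utility = negative total hours.
actions = {
    "Chicago": [(0.7, -11), (0.3, -(11 + 4))],     # missing adds 4 hours
    "SF":      [(0.95, -13), (0.05, -(13 + 24))],  # missing adds 24 hours
}

for name in actions:
    print(name, round(expected_utility(actions[name]), 1))
# Chicago -12.2
# SF -14.2

print(best_action(actions))  # -> Chicago
```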

Decision Tree for Paris Travel

[Figure: decision tree with a choice node branching to SF and Chicago; each branch ends in a chance node splitting into "miss connection" and "don't miss" outcomes.]