CSCI 121 Special Topics: Bayesian Networks, Lecture #1: Reasoning Under Uncertainty

Uncertainty

Traditional models of reasoning (human and computer) use "all-or-nothing" (discrete) variables and rules:

  Hungry(Fido)
  Toothache(Simon)
  Toothache(X) → Cavity(X)

Reality is usually more complicated:

  Toothache(X) → Cavity(X)     70% of the time
  Toothache(X) → Gingivitis(X) 20% of the time

Uncertainty

In general, all-or-nothing rules fail for three reasons:

1. Laziness – we don't have enough time or resources to list all such rules for a given domain.
2. Theoretical ignorance – we don't have a complete theory of the domain.
3. Practical ignorance – even with all the rules and a perfect theory, we can't make all the necessary observations.

Uncertainty and Rational Decisions

Utility – how useful is a particular outcome to the agent?
Probability – how likely is a particular outcome?

Utility + Probability = Decision Theory

E.g., lottery: high utility ($$$) × extremely low probability → bad decision!
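
To make the lottery example concrete, here is a minimal sketch in Python; the ticket price, prize, and odds are invented numbers for illustration, not figures from the lecture:

    # Expected utility = sum over outcomes of P(outcome) * utility(outcome).
    # Hypothetical lottery: $2 ticket, $10,000,000 prize, 1-in-100,000,000 odds.
    p_win = 1e-8
    prize = 10_000_000
    ticket_price = 2

    expected_value = p_win * prize - ticket_price   # losing pays nothing
    print(expected_value)   # -1.9: on average you lose almost the whole ticket price

Even a huge prize can't compensate for a small enough probability of winning, which is why buying the ticket is a bad decision.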

Basic Probability

Prior probability – how likely is something, without any other knowledge?
  P(cavity) = 0.05

Conditional (posterior) probability – how likely is something, once you know something else?
  P(toothache|cavity) = 0.7

Product Rule:
  P(A|B) = P(A & B) / P(B)
  P(A & B) = P(A|B) * P(B) = P(B|A) * P(A)
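
A minimal sketch of the Product Rule in Python. The joint table here is a made-up example, chosen so that its marginals reproduce the numbers above:

    # Joint distribution P(Toothache, Cavity); the four entries sum to 1.
    joint = {(True, True):   0.035,   # toothache and cavity
             (True, False):  0.065,   # toothache, no cavity
             (False, True):  0.015,   # cavity, no toothache
             (False, False): 0.885}   # neither

    p_cavity = joint[(True, True)] + joint[(False, True)]       # marginal P(cavity) = 0.05
    p_toothache_given_cavity = joint[(True, True)] / p_cavity   # P(A|B) = P(A & B) / P(B)
    print(p_toothache_given_cavity)                             # 0.7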

Basic Probability

Probability distribution: all possible values of a given variable, and their probabilities (which sum to 1):
  cavity = 0.8; gingivitis = 0.1; abscess = 0.05; ? = 0.05

Joint probability: how likely is it that two things occur (are observed) together?
  rainy & cloudy = 0.3; cloudy & cool = 0.4
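
In code, a distribution is just a mapping from values to probabilities, and the sum-to-1 requirement is easy to check. A quick sketch with the numbers above (the name 'other' stands in for the slide's "?"):

    diagnosis = {'cavity': 0.8, 'gingivitis': 0.1, 'abscess': 0.05, 'other': 0.05}
    assert abs(sum(diagnosis.values()) - 1.0) < 1e-9   # probabilities must sum to 1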

Axioms of Probability

1) All probabilities are between 0 and 1.
2) Necessarily true propositions (A V ~A) have probability 1; necessarily false propositions (A & ~A) have probability 0.
3) P(A V B) = P(A) + P(B) – P(A & B)

Axioms of Probability

P(A V B) = P(A) + P(B) – P(A & B)

[Venn diagram: overlapping circles A and B, with intersection A & B]

E.g., in Los Angeles, suppose P(sunny) = 0.8 and P(warm) = 0.7. Since a probability can never exceed 1, we can't just add these to get P(sunny V warm). We need to subtract P(sunny & warm) = 0.6, giving P(sunny V warm) = 0.8 + 0.7 – 0.6 = 0.9.
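
The same arithmetic as a quick numeric check in Python (the three input probabilities are the ones above):

    p_sunny, p_warm, p_sunny_and_warm = 0.8, 0.7, 0.6
    p_sunny_or_warm = p_sunny + p_warm - p_sunny_and_warm   # inclusion-exclusion
    print(p_sunny_or_warm)                                  # 0.9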

Bayes’ Rule

From the Product Rule:
  P(A|B) = P(A & B) / P(B)
  P(A & B) = P(A|B) * P(B) = P(B|A) * P(A)

We derive Bayes’ Rule by substitution:
  P(A|B) = P(A & B) / P(B) = P(B|A) * P(A) / P(B)

(Rev. Thomas Bayes, 1701–1761)
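
A sketch of Bayes’ Rule applied to the dental example. P(cavity) = 0.05 and P(toothache|cavity) = 0.7 come from the earlier slide; P(toothache|~cavity) is an assumed number, needed so that P(toothache) can be computed by total probability:

    p_cavity = 0.05                  # prior, from the earlier slide
    p_tooth_given_cavity = 0.7       # likelihood, from the earlier slide
    p_tooth_given_no_cavity = 0.05   # assumption for this sketch

    # Total probability: P(t) = P(t|c)P(c) + P(t|~c)P(~c)
    p_tooth = (p_tooth_given_cavity * p_cavity
               + p_tooth_given_no_cavity * (1 - p_cavity))

    # Bayes' Rule: P(c|t) = P(t|c) * P(c) / P(t)
    p_cavity_given_tooth = p_tooth_given_cavity * p_cavity / p_tooth
    print(round(p_cavity_given_tooth, 3))   # 0.424

Note how the posterior (about 42%) is much higher than the 5% prior, yet still far from certain: a toothache is evidence for a cavity, not proof.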

Bayesian (“Belief”) Nets

[Network diagram: Burglary and Earthquake are parents of Alarm; Alarm is the parent of JohnCalls and MaryCalls]

  P(B) = .001    P(E) = .002

  B E | P(A)        A | P(J)        A | P(M)
  T T | .95         T | .90         T | .70
  T F | .94         F | .05         F | .01
  F T | .29
  F F | .001

Bayesian Nets

Using techniques developed fairly recently (Pearl 1982), we can ask, e.g., “How likely is there to be a burglary, given that John has called?”

We can also learn the relationships – creating “hidden” variables and probability tables – based on observations.

Currently a “hot topic” in AI.
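
A minimal sketch of exact inference by enumeration on the burglary network above, answering exactly that query. This is the brute-force method (sum the full joint over all hidden variables); Pearl's algorithms are far more efficient on large networks, but give the same answer here. The CPT numbers are the ones from the previous slide:

    import itertools

    # CPTs from the burglary network
    P_B = 0.001                                          # P(Burglary)
    P_E = 0.002                                          # P(Earthquake)
    P_A = {(True, True): 0.95, (True, False): 0.94,
           (False, True): 0.29, (False, False): 0.001}   # P(Alarm | B, E)
    P_J = {True: 0.90, False: 0.05}                      # P(JohnCalls | Alarm)
    P_M = {True: 0.70, False: 0.01}                      # P(MaryCalls | Alarm)

    def joint(b, e, a, j, m):
        """Full joint probability, factored according to the network structure."""
        p = (P_B if b else 1 - P_B) * (P_E if e else 1 - P_E)
        p *= P_A[(b, e)] if a else 1 - P_A[(b, e)]
        p *= P_J[a] if j else 1 - P_J[a]
        p *= P_M[a] if m else 1 - P_M[a]
        return p

    # P(Burglary | JohnCalls = true): sum the joint over the hidden
    # variables (Earthquake, Alarm, MaryCalls), then normalize.
    totals = {b: sum(joint(b, e, a, True, m)
                     for e, a, m in itertools.product((True, False), repeat=3))
              for b in (True, False)}
    print(round(totals[True] / (totals[True] + totals[False]), 4))   # 0.0163

So a call from John raises the probability of a burglary from the 0.1% prior to about 1.6% – substantial evidence, but still far from a sure thing.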