Uncertainty

Problem with the logical approach: we do not always know the complete truth about the environment.
Example: Leave(t) = leave for the airport t minutes before the flight.
Query: will Leave(t) get me there on time?

Problems

Why can't we determine t exactly?
- Partial observability: road state, other drivers' plans
- Uncertainty in action outcomes: flat tire
- Immense complexity of modeling and predicting traffic

Problems

Three specific issues:
- Laziness: too much work to list all antecedents or consequents
- Theoretical ignorance: not enough information on how the world works
- Practical ignorance: even if we know all the "physics", we may not have all the facts

What happens with a purely logical approach?

Either it risks falsehood:
  "Leave(45) will get me there on time"
...or it leads to conclusions too weak to act on:
  "Leave(45) will get me there on time if there's no snow and there's no train crossing Route 19 and my tires remain intact and..."
Leave(1440) might work fine, but then I'd have to spend the night in the airport.

Solution: Probability

Given the available evidence, Leave(35) will get me there on time with probability 0.04.
- Probability addresses uncertainty, not degree of truth.
- Degree of truth is handled by fuzzy logic: IsSnowing is true to degree 0.2.
- Probabilities summarize the effects of laziness and ignorance.
We will use a combination of probabilities and utilities to make decisions.

Subjective or Bayesian probability

We make probability estimates based on knowledge about the world:
  P(Leave(45) | No Snow) = 0.55
This is a probability assessment given that the world is a certain way.
Probabilities change with new information:
  P(Leave(45) | No Snow, 5 AM) = 0.75

Making decisions under uncertainty

Suppose I believe the following:
  P(Leave(35) gets me there on time | ...) = 0.04
  P(Leave(45) gets me there on time | ...) = 0.55
  P(Leave(60) gets me there on time | ...) = 0.95
  P(Leave(1440) gets me there on time | ...) =
Which action do I choose? It depends on my preferences for missing the flight vs. eating in the airport, etc. Decision theory takes into account both utilities and probabilities.
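The choice among the Leave(t) options above can be sketched as a maximum-expected-utility decision. The probabilities are the ones on this slide; the utility numbers (U_MAKE_FLIGHT, U_MISS_FLIGHT, WAIT_COST) are hypothetical values invented purely for illustration.

```python
# Sketch: choosing a departure time by maximizing expected utility.
# Probabilities are from the slide; the utilities below are made up.
p_on_time = {35: 0.04, 45: 0.55, 60: 0.95}

U_MAKE_FLIGHT = 100   # hypothetical utility of catching the flight
U_MISS_FLIGHT = -500  # hypothetical utility of missing it
WAIT_COST = 0.2       # hypothetical cost per minute of leaving early

def expected_utility(t):
    p = p_on_time[t]
    return p * U_MAKE_FLIGHT + (1 - p) * U_MISS_FLIGHT - WAIT_COST * t

best = max(p_on_time, key=expected_utility)
print(best)  # 60
```

With these (invented) utilities, Leave(35) and Leave(45) have negative expected utility because a missed flight is so costly, so the agent picks Leave(60); change the utilities and the chosen action changes too.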

Axioms of Probability

For any propositions A and B:
  1. 0 <= P(A) <= 1
  2. P(True) = 1 and P(False) = 0
  3. P(A v B) = P(A) + P(B) - P(A ^ B)

Example: A = computer science major, B = born in Minnesota. Someone who is both must be counted only once, which is exactly what subtracting P(A ^ B) does.
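A minimal check of the inclusion-exclusion axiom on the major/Minnesota example, using an invented five-person toy population:

```python
# Verify P(A or B) = P(A) + P(B) - P(A and B) by counting.
# A = computer science major, B = born in Minnesota.
# The population below is invented for illustration.
students = [
    {"cs_major": True,  "born_mn": True},
    {"cs_major": True,  "born_mn": False},
    {"cs_major": False, "born_mn": True},
    {"cs_major": False, "born_mn": False},
    {"cs_major": False, "born_mn": False},
]
n = len(students)
p_a = sum(s["cs_major"] for s in students) / n
p_b = sum(s["born_mn"] for s in students) / n
p_ab = sum(s["cs_major"] and s["born_mn"] for s in students) / n
p_a_or_b = sum(s["cs_major"] or s["born_mn"] for s in students) / n

# Both sides come out to 3/5 on this population.
assert abs(p_a_or_b - (p_a + p_b - p_ab)) < 1e-12
```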

Notation and Concepts

Unconditional (prior) probability:
  P(Cavity) = 0.1
  P(Weather = Sunny) = 0.55
These correspond to belief prior to the arrival of any new evidence.
Weather is a multivalued random variable: it could be one of <sunny, rain, cloudy, snow>.
P(Cavity) is shorthand for P(Cavity = true).

Probability Distributions

A probability distribution gives probability values for all values of a random variable:
  P(Weather) = a vector of four numbers, one per possible weather value
It must be normalized: the values sum to 1.
A joint probability distribution gives probability values for all combinations of random variables:
  P(Weather, Cavity) = a 4 x 2 matrix of values
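One way to sketch the 4 x 2 joint table P(Weather, Cavity) in code. The individual joint entries below are invented, but they are chosen so the marginals agree with the slides' P(Weather = Sunny) = 0.55 and P(Cavity) = 0.1:

```python
# A joint distribution P(Weather, Cavity) as a dict keyed by value pairs.
# Entries are invented; only the structure and normalization matter.
weather_vals = ["sunny", "rain", "cloudy", "snow"]
joint = {
    ("sunny", True): 0.06,   ("sunny", False): 0.49,
    ("rain", True): 0.02,    ("rain", False): 0.18,
    ("cloudy", True): 0.015, ("cloudy", False): 0.135,
    ("snow", True): 0.005,   ("snow", False): 0.095,
}

# A distribution must be normalized: all entries sum to 1.
assert abs(sum(joint.values()) - 1.0) < 1e-9

# Summing out one variable (marginalization) recovers the other's distribution.
p_weather = {w: joint[(w, True)] + joint[(w, False)] for w in weather_vals}
p_cavity = sum(joint[(w, True)] for w in weather_vals)
```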

Posterior Probabilities

Conditional (posterior) probability:
  P(Cavity | Toothache) = 0.8
For conditional distributions, P(Weather | Earthquake) is a 4 x 2 table: one column of four weather probabilities for Earthquake = false and one for Earthquake = true.

Posterior Probabilities

More knowledge does not change previous knowledge, but it may render old evidence unnecessary:
  P(Cavity | Toothache, Cavity) = 1
New evidence may be irrelevant:
  P(Cavity | Toothache, Schiller in Mexico) = 0.8

Definition of Conditional Probability

  P(A | B) = P(A ^ B) / P(B), provided P(B) > 0

Two ways to think about it: as the fraction of the B-worlds that are also A-worlds, or via the product rule P(A ^ B) = P(A | B) P(B).

Definition of Conditional Probability

Another way to think about it: the product rule
  P(A ^ B) = P(A | B) P(B)
Sanity check: why isn't it just P(A ^ B) = P(A) P(B)? Because that holds only when A and B are independent.
The general version holds for whole probability distributions:
  P(Weather, Cavity) = P(Weather | Cavity) P(Cavity)
This is a 4 x 2 set of equations, one per combination of values.
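The definition can be exercised on a tiny joint distribution over Cavity and Toothache. The joint entries are invented, but chosen so the marginals match the slides: P(Cavity) = 0.1 and P(Cavity | Toothache) = 0.8.

```python
# P(A | B) = P(A and B) / P(B), computed from a small joint distribution.
# Joint entries are invented; they are consistent with the slides'
# P(Cavity) = 0.1 and P(Cavity | Toothache) = 0.8.
joint = {
    ("cavity", "toothache"): 0.04,
    ("cavity", "no_toothache"): 0.06,
    ("no_cavity", "toothache"): 0.01,
    ("no_cavity", "no_toothache"): 0.89,
}
p_toothache = joint[("cavity", "toothache")] + joint[("no_cavity", "toothache")]
p_cavity_given_toothache = joint[("cavity", "toothache")] / p_toothache
print(round(p_cavity_given_toothache, 6))  # 0.8
```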

Bayes' Rule

The product rule, written both ways, P(A ^ B) = P(A | B) P(B) = P(B | A) P(A), gives Bayes' rule:
  P(A | B) = P(B | A) P(A) / P(B)
Bayes' rule is extremely useful for inferring the probability of a diagnosis when the probability of the cause is known.

Bayes' Rule example

Does my car need a new drive axle?
If a car needs a new drive axle, with 30% probability the car jerks around:
  P(jerks | needs axle) = 0.3
Unconditional probabilities:
  P(jerks) = 1/1000
  P(needs axle) = 1/10,000
Then:
  P(needs axle | jerks) = P(jerks | needs axle) P(needs axle) / P(jerks)
                        = (0.3 x 1/10,000) / (1/1000) = 0.03
Conclusion: 3 of every 100 cars that jerk need an axle.
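The same drive-axle calculation, done in code with the numbers from this slide:

```python
# Bayes' rule: P(needs axle | jerks)
#            = P(jerks | needs axle) * P(needs axle) / P(jerks)
p_jerks_given_axle = 0.3        # causal knowledge
p_jerks = 1 / 1000              # unconditional probability of jerking
p_needs_axle = 1 / 10_000       # unconditional probability of a bad axle

p_axle_given_jerks = p_jerks_given_axle * p_needs_axle / p_jerks
print(round(p_axle_given_jerks, 6))  # 0.03
```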

Not a dumb question

Question: Why should I have to provide an estimate of P(B | A) in order to get P(A | B)? Why not just estimate P(A | B) directly and be done with the whole thing?

Not a dumb question

Answer: Diagnostic knowledge is often more tenuous than causal knowledge.
Suppose drive axles start to go bad in an "epidemic", e.g. poor construction in a major drive-axle brand two years ago is now haunting us:
- P(needs axle) goes way up and is easy to measure.
- P(needs axle | jerks) should (and does) go up accordingly, but how would we estimate it directly?
- P(jerks | needs axle) is based on causal information and doesn't change.