Chapter 10: Using Uncertain Knowledge


Chapter 10: Using Uncertain Knowledge Introduction Probability D. Poole, A. Mackworth, and R. Goebel, Computational Intelligence: A Logical Approach, Oxford University Press, January 1998

Introduction Agents don’t have complete knowledge about the world, yet they must make decisions in the face of that uncertainty; it isn’t enough to assume what the world is like (the agent is not omniscient, but must still act in an uncertain world). Example: wearing a seat belt (a logical agent can deduce neither "accident" nor "no accident", so logic alone cannot settle the decision). An agent needs to reason about its uncertainty, and when it acts under uncertainty it is gambling: this is where probability comes in. Dr Djamel Bouchaffra CS 426 Artificial Intelligence, Ch 10: Using Uncertain Knowledge

Probability: Frequentists vs. Subjectivists Probability is an agent’s measure of belief in some proposition: subjective probability is a measure of belief of an agent given its knowledge. Example: your probability of a bird flying is your measure of belief in the flying ability of an individual, based only on the knowledge that the individual is a bird. Other agents may have different probabilities, as they may have had different experiences with birds or different knowledge about this particular bird. An agent’s belief in a bird’s flying ability is affected by what the agent knows (knowledge = episteme) about that bird: uncertainty is epistemological (about the agent’s knowledge) rather than ontological (about the essence of nature).

Numerical Measures of Belief Belief in a proposition f can be measured in terms of a number between 0 and 1: this is the probability of f. A probability of 0 means that f is believed to be definitely false; a probability of 1 means that f is believed to be definitely true. Using 0 and 1 is purely a convention. That f has a probability between 0 and 1 doesn’t mean f is true to some degree; it means you are ignorant of its truth value. Probability is a measure of your ignorance.

Random Variables A random variable is a term in a language that can take one of a number of different values. The domain of a variable x, written dom(x), is the set of values x can take. A tuple of random variables <x1, …, xn> is a complex random variable with domain dom(x1) × … × dom(xn). The assignment x = v means variable x has value v. A proposition is a Boolean formula made from assignments of values to variables.

Possible World Semantics A possible world (an interpretation with more structure: a measure is assigned to it) specifies an assignment of one value to each random variable. w ⊨ x = v means variable x is assigned value v in world w (compare w ⊨ f: f is true in world w). Logical connectives have their standard meaning: w ⊨ α ∧ β if w ⊨ α and w ⊨ β; w ⊨ α ∨ β if w ⊨ α or w ⊨ β; w ⊨ ¬α if w ⊭ α.
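As an illustration of the two slides above, possible worlds over a small set of random variables can be enumerated directly. This is a minimal sketch; the variable names and domains (Bird, Flies) are invented for the example, not taken from the text.

```python
from itertools import product

# Hypothetical random variables and their domains (made-up example).
dom = {"Bird": (True, False), "Flies": (True, False)}

# A possible world assigns one value from its domain to every variable.
worlds = [dict(zip(dom, values)) for values in product(*dom.values())]

# A proposition is a Boolean formula of assignments; connectives keep
# their standard meaning, e.g. w |= (Bird = true) AND (Flies = true).
def holds(w):
    return w["Bird"] and w["Flies"]

print(len(worlds))                    # 4 worlds for two Boolean variables
print(sum(holds(w) for w in worlds))  # exactly 1 world satisfies it
```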

Semantics of Probability For a finite number of variables with finite domains: define a nonnegative measure μ(w) for each world w so that the measures of the possible worlds sum to 1. The measure specifies how much you think world w is like the real world. The probability of a proposition f is defined by (discrete case): P(f) = Σ_{w ⊨ f} μ(w).
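The discrete-case definition, P(f) as the sum of μ(w) over the worlds in which f holds, can be computed directly for a toy measure. The numbers below are invented purely for illustration.

```python
from itertools import product

# Worlds as (bird, flies) pairs; mu is a hypothetical measure that is
# nonnegative and sums to 1 (the numbers are made up).
worlds = list(product((True, False), repeat=2))
mu = {
    (True, True): 0.008,    # flying birds
    (True, False): 0.002,   # non-flying birds
    (False, True): 0.000,   # non-birds that "fly"
    (False, False): 0.990,  # everything else
}
assert abs(sum(mu.values()) - 1.0) < 1e-9  # measures sum to 1

def prob(f):
    """P(f): sum the measures of the worlds in which proposition f holds."""
    return sum(mu[w] for w in worlds if f(w))

bird = lambda w: w[0]
print(round(prob(bird), 3))   # 0.01
```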

Axioms of Probability Four axioms define what follows from a set of probabilities: Axiom 1: P(f) = P(g) if f ↔ g is a tautology; that is, logically equivalent formulae have the same probability. Axiom 2: 0 ≤ P(f) for any formula f. Axiom 3: P(τ) = 1 if τ is a tautology. Axiom 4: P(f ∨ g) = P(f) + P(g) if ¬(f ∧ g) is a tautology (i.e., f and g are mutually exclusive). These axioms are sound and complete with respect to the semantics.
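Under the possible-worlds semantics, the axioms hold by construction. A quick check of Axiom 4 (additivity for mutually exclusive formulae) on an invented measure:

```python
# Toy measure over (bird, flies) worlds; invented numbers summing to 1.
mu = {(True, True): 0.008, (True, False): 0.002,
      (False, True): 0.000, (False, False): 0.990}

def prob(f):
    return sum(p for w, p in mu.items() if f(w))

# Axiom 4: f = "bird and flies" and g = "bird and not flies" are
# mutually exclusive, so P(f or g) = P(f) + P(g).
f = lambda w: w[0] and w[1]
g = lambda w: w[0] and not w[1]
print(abs(prob(lambda w: f(w) or g(w)) - (prob(f) + prob(g))) < 1e-12)  # True
```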

Conditioning Probabilistic conditioning specifies how to revise beliefs based on new information. You build a probabilistic model taking all background information into account; this gives the prior probability. All other information must be conditioned on. If evidence e is all of the information obtained subsequently, the conditional probability P(h | e) of h given e is the posterior probability of h (after the new evidence has emerged). (Medical diagnosis: e = symptoms and h = diseases; the priors P(disease) are assessed before seeing the patient.) Note that P(e → f) ≠ P(f | e) in general (Nilsson vs. Pearl: probabilistic logic revisited): P(e → f) = P(¬e ∨ f) is the measure of the interpretations in which f is true or e is false. Imagine a domain where birds are relatively rare and non-flying birds are a small proportion of the population: P(flies | bird) may be low, while P(bird → flies) = P(¬bird ∨ flies) ≈ P(¬bird) is high, simply because birds are rare.
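The gap between P(e → f) and P(f | e) shows up immediately on a small example. This is a sketch with an invented measure; birds are rare and mostly do not fly here, to make the contrast stark.

```python
# Hypothetical measure over (bird, flies) worlds (numbers invented).
mu = {
    (True, True): 0.002,    # flying birds
    (True, False): 0.008,   # non-flying birds
    (False, True): 0.000,
    (False, False): 0.990,
}

def prob(f):
    return sum(p for w, p in mu.items() if f(w))

bird = lambda w: w[0]
flies = lambda w: w[1]

# P(flies | bird): conditioning renormalizes within the bird-worlds.
p_cond = prob(lambda w: bird(w) and flies(w)) / prob(bird)

# P(bird -> flies) = P(not bird OR flies): worlds where f holds or e fails.
p_impl = prob(lambda w: (not bird(w)) or flies(w))

print(round(p_cond, 3))   # 0.2   (low)
print(round(p_impl, 3))   # 0.992 (high, driven by the rarity of birds)
```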

Semantics of Conditional Probability Evidence e rules out the possible worlds incompatible with e and induces a new measure, μ_e, over possible worlds: μ_e(w) = μ(w) / P(e) if w ⊨ e, and μ_e(w) = 0 if w ⊭ e (so the surviving worlds are renormalized to sum to 1). The conditional probability of formula h given evidence e is: P(h | e) = Σ_{w ⊨ h} μ_e(w) = P(h ∧ e) / P(e).
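The induced measure μ_e (zero out worlds incompatible with the evidence, then renormalize by P(e)) can be written out explicitly. Same invented numbers as a sketch:

```python
# Hypothetical measure over (bird, flies) worlds (numbers invented).
mu = {
    (True, True): 0.002,
    (True, False): 0.008,
    (False, True): 0.000,
    (False, False): 0.990,
}
e = lambda w: w[0]                  # evidence: bird

# mu_e: worlds incompatible with e get measure 0; the rest are scaled
# by 1/P(e) so the surviving worlds again sum to 1.
p_e = sum(p for w, p in mu.items() if e(w))
mu_e = {w: (p / p_e if e(w) else 0.0) for w, p in mu.items()}

print(round(sum(mu_e.values()), 3))   # 1.0: renormalized to 1
print(round(mu_e[(True, True)], 3))   # 0.2 = P(flies | bird)
```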

Properties of Conditional Probabilities Chain rule: P(f1 ∧ f2 ∧ … ∧ fn) = P(f1) × P(f2 | f1) × P(f3 | f1 ∧ f2) × … × P(fn | f1 ∧ … ∧ fn−1).
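The chain rule, P(f1 ∧ … ∧ fn) = P(f1) × P(f2 | f1) × … × P(fn | f1 ∧ … ∧ fn−1), can be checked on a tiny joint distribution; the probabilities below are invented and the conditional factors telescope by construction.

```python
from itertools import product

# Invented joint probabilities over three Boolean variables (a, b, c);
# the eight numbers sum to 1.
joint = dict(zip(product((True, False), repeat=3),
                 (0.10, 0.05, 0.15, 0.20, 0.05, 0.10, 0.05, 0.30)))

def prob(f):
    return sum(p for w, p in joint.items() if f(w))

# Chain rule instance: P(a & b & c) = P(a) * P(b | a) * P(c | a & b)
a   = lambda w: w[0]
ab  = lambda w: w[0] and w[1]
abc = lambda w: w[0] and w[1] and w[2]

lhs = prob(abc)
rhs = prob(a) * (prob(ab) / prob(a)) * (prob(abc) / prob(ab))
print(abs(lhs - rhs) < 1e-12)   # True: the conditional factors telescope
```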

Bayes’ Theorem The chain rule and the commutativity of conjunction (h ∧ e is equivalent to e ∧ h) give us: P(h ∧ e) = P(h | e) × P(e) = P(e | h) × P(h). If P(e) ≠ 0, you can divide the right-hand sides by P(e): P(h | e) = P(e | h) × P(h) / P(e). This is Bayes’ theorem.
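Bayes' theorem, P(h | e) = P(e | h) × P(h) / P(e), follows mechanically from the two factorizations of P(h ∧ e); here it is verified on invented numbers:

```python
# Invented joint over (h, e): P(h&e), P(h&~e), P(~h&e), P(~h&~e).
p_he, p_hne, p_nhe, p_nhne = 0.08, 0.02, 0.10, 0.80

p_h = p_he + p_hne           # marginal P(h)  = 0.10
p_e = p_he + p_nhe           # marginal P(e)  = 0.18
p_e_given_h = p_he / p_h     # P(e | h)
p_h_given_e = p_he / p_e     # P(h | e), by the definition of conditioning

# Bayes' theorem recovers the same posterior from the causal direction.
bayes = p_e_given_h * p_h / p_e
print(abs(p_h_given_e - bayes) < 1e-12)   # True
```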

Why is Bayes’ theorem interesting? Often you have causal knowledge: P(symptom | disease), P(light is off | status of switches and switch positions), P(alarm | fire), P(image looks like this | a tree is in front of a car), and you want to do evidential reasoning: P(disease | symptom), P(status of switches | light is off and switch positions), P(fire | alarm), P(a tree is in front of a car | image looks like this).
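Turning causal knowledge into evidential reasoning is exactly a Bayes computation. A sketch of the medical-diagnosis case with invented numbers (prevalence, sensitivity, and false-positive rate are all made up for illustration):

```python
# Hypothetical diagnostic numbers (invented for the example).
p_disease = 0.01          # prior P(disease), before seeing the patient
p_sym_given_d = 0.9       # causal knowledge: P(symptom | disease)
p_sym_given_nd = 0.05     # false-positive rate: P(symptom | no disease)

# P(symptom) by total probability, then Bayes for the evidential direction.
p_sym = p_sym_given_d * p_disease + p_sym_given_nd * (1 - p_disease)
p_d_given_sym = p_sym_given_d * p_disease / p_sym

print(round(p_d_given_sym, 3))   # 0.154: still low, because the prior is low
```

Even with a symptom that is nine times more likely under the disease, the posterior stays modest because the disease is rare: the prior dominates.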