Uncertain Knowledge Representation
CPSC 386 Artificial Intelligence
Ellen Walker, Hiram College

Reasoning Under Uncertainty
We have no way of getting complete information, e.g., limited sensors.
We don’t have time to wait for complete information, e.g., driving.
We can’t know the result of an action until after having done it, e.g., rolling a die.
There are too many unlikely events to consider, e.g., “I will drive home, unless my car breaks down, or unless a natural disaster destroys the roads…”

But…
A decision must be made! No intelligent system can afford to consider all eventualities, wait until all the data is in and complete, or try all possibilities to see what happens.

So…
We must be able to reason about the likelihood of an event.
We must be able to consider partial information as part of larger decisions.
But:
– We are lazy (too many options to list, most of them unlikely).
– We are ignorant:
  There is no complete theory to work from.
  All possible observations haven’t been made.

Quick Overview of Reasoning Systems
Logic
– True or false, nothing in between. No uncertainty.
Non-monotonic logic
– True or false, but new information can change it.
Probability
– Degree of belief, but in the end it’s either true or false.
Fuzzy
– Degree of belief; allows overlapping of true and false states.

Examples
Logic
– Rain is precipitation.
Non-monotonic
– It is currently raining.
Probability
– It will rain tomorrow (70% chance).
Fuzzy
– It is raining (0.5 hard / 0.8 very hard / 0.2 a little).

Non-Monotonic Logic
Once true (or false) does not mean always true (or false).
As information arrives, truth values can change (Penelope is a bird; a penguin; a magic penguin).
Implementations (you are not responsible for the details):
– Circumscription: Bird(x) and not abnormal(x) -> flies(x). We can assume not abnormal(x) unless we know abnormal(x).
– Default logic: “x is true given that x does not conflict with anything we already know.”

Truth Maintenance Systems
These systems allow truth values to be changed during reasoning (belief revision).
When we retract a fact, we must also retract any other fact that was derived from it:
– Penelope is a bird. (can fly)
– Penelope is a penguin. (cannot fly)
– Penelope is magical. (can fly)
– Retract “Penelope is magical”. (cannot fly)
– Retract “Penelope is a penguin”. (can fly)

Types of TMS
Justification-based TMS (JTMS)
– For each fact, track its justification (the facts and rules from which it was derived).
– When a fact is retracted, retract all facts whose justifications lead back to it, unless they have independent justifications.
– Each sentence is labeled IN or OUT.
Assumption-based TMS (ATMS)
– Represent all possible states simultaneously.
– When a fact is retracted, change state sets.
– For each fact, keep the list of assumptions that make it true; each world state is a set of assumptions.

TMS Example (Quine & Ullian, 1978)
Abbot, Babbit, and Cabot are murder suspects:
– Abbot’s alibi: at a hotel (register)
– Babbit’s alibi: visiting his brother-in-law (testimony)
– Cabot’s alibi: watching a ski race
Who committed the murder?
New evidence comes in…
– A TV video shows Cabot at the ski race.
Now, who committed the murder?

JTMS Example
Each belief has justifications (+ and –). We mark each fact as IN or OUT.
Suspect Abbot (IN)
  + Beneficiary Abbot (IN)
  – Alibi Abbot (OUT)
(Abbot is a suspect because his + support, being a beneficiary, is IN, and his – support, the alibi, is OUT.)

Revised Justification
Suspect Abbot (OUT)
  + Beneficiary Abbot (IN)
  – Alibi Abbot (IN)
Alibi Abbot (IN)
  + Registered (IN)
  + Far Away (IN)
  – Forged (OUT)
(Once the hotel register is believed genuine, the alibi comes IN, so Suspect Abbot goes OUT.)
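To make the IN/OUT labeling concrete, here is a minimal JTMS sketch in Python. This is not course code; the (+, –) justification structure is simply our reading of the two diagrams above, and the relaxation loop recomputes labels until they stabilize.

```python
# Minimal JTMS labeling sketch (illustrative; not from the course).
# A node is IN if it is a premise, or if some justification has all of
# its "+" supports IN and all of its "-" supports OUT.

class Node:
    def __init__(self, name, premise=False):
        self.name = name
        self.premise = premise      # premises are believed unconditionally
        self.justifications = []    # list of (plus_supports, minus_supports)

def label(nodes):
    """Recompute IN/OUT labels until nothing changes (a fixpoint)."""
    status = {n.name: n.premise for n in nodes}
    changed = True
    while changed:
        changed = False
        for n in nodes:
            new = n.premise or any(
                all(status[p] for p in plus) and all(not status[m] for m in minus)
                for plus, minus in n.justifications)
            if new != status[n.name]:
                status[n.name], changed = new, True
    return status

# The Abbot network as read off the diagrams above:
registered  = Node("Registered", premise=True)
far_away    = Node("FarAway", premise=True)
forged      = Node("Forged")            # nothing supports it, so OUT
alibi       = Node("AlibiAbbot")
alibi.justifications.append((["Registered", "FarAway"], ["Forged"]))
beneficiary = Node("BeneficiaryAbbot", premise=True)
suspect     = Node("SuspectAbbot")
suspect.justifications.append((["BeneficiaryAbbot"], ["AlibiAbbot"]))

print(label([registered, far_away, forged, alibi, beneficiary, suspect]))
# AlibiAbbot comes out IN, so SuspectAbbot comes out OUT, as on the slide.
```

Retracting the register (creating that node as a non-premise) would flip AlibiAbbot back OUT and SuspectAbbot back IN; this is exactly the belief revision the JTMS automates.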

ATMS Example (Partial)
List all possible assumptions (e.g., A1: the register was forged; A2: the register was not forged).
Consider all possible facts (e.g., Abbot was at the hotel).
For each fact, determine all possible sets of assumptions that would make it valid.
– E.g., “Abbot was at the hotel” holds in all sets that include A2 but not A1.

JTMS vs. ATMS
JTMS is sequential:
– With each new fact, update the current set of beliefs.
ATMS is “pre-compiled”:
– Determine the correct set of beliefs for each fact in advance.
– When you have actual facts, find the set of beliefs that is consistent with all of them (the intersection of the sets for each fact).

Probability
The likelihood of an event occurring, represented as the ratio of observed occurrences of the event to total observations.
E.g.:
– I have a bag containing red and black balls.
– I pull 8 balls from the bag (replacing the ball each time); 6 are red and 2 are black.
– So I estimate that 75% of the balls are red and 25% are black.
– The probability of the next ball being red is then 75%.
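A quick sketch of this frequency interpretation in Python (the 75/25 mix is the slide’s; the simulation itself is just for illustration):

```python
import random

# Estimate P(red) as the relative frequency of red over many draws
# with replacement. The bag's true mix (75% red) is taken from the slide.
bag = ["red"] * 3 + ["black"]

draws = [random.choice(bag) for _ in range(10_000)]
print(draws.count("red") / len(draws))   # close to 0.75 for large samples
```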

More Examples
There are 52 cards in a deck, 4 suits (2 red, 2 black).
– What is the probability of picking a red card? (26 red cards) / (52 cards) = 50%.
– What is the probability of picking 2 red cards? 50% for the first card, and (25 red cards) / (51 cards) for the second. Multiply for the total result: (26 × 25) / (52 × 51).
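Checking that arithmetic exactly (a throwaway snippet, not course code):

```python
from fractions import Fraction

# The two-red-cards computation from the slide, done with exact fractions.
p_first = Fraction(26, 52)        # 50% for the first card
p_second = Fraction(25, 51)       # 25 red cards remain out of 51
print(p_first * p_second, float(p_first * p_second))   # 25/102 ~ 0.245
```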

Basic Probability Notation
Proposition
– An assertion like “the card is red”.
Random variable
– Describes an event we want to know the outcome of, like ColorOfCardPicked.
– Its domain is a set of values such as {red, black}.
Unconditional (prior) probability, P(A)
– In the absence of other information.
Conditional probability, P(A | B)
– Based on specific prior knowledge.

Some Important Equations
P(true) = 1; P(false) = 0
0 <= P(a) <= 1
– All probabilities are between 0 and 1, inclusive.
P(a v b) = P(a) + P(b) – P(a ^ b)
We can derive others:
– P(a v ~a) = 1
– P(a ^ ~a) = 0
– P(~a) + P(a) = 1
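A sanity check of the inclusion-exclusion rule on a finite sample space; the choice of events (a = “the card is red”, b = “the card is a heart”) is ours:

```python
# Verify P(a v b) = P(a) + P(b) - P(a ^ b) by counting over a 52-card deck.
deck = [(suit, rank) for suit in ("hearts", "diamonds", "spades", "clubs")
        for rank in range(1, 14)]

def p(event):                         # probability as a counting ratio
    return sum(1 for card in deck if event(card)) / len(deck)

a = lambda c: c[0] in ("hearts", "diamonds")   # red card
b = lambda c: c[0] == "hearts"                 # heart
lhs = p(lambda c: a(c) or b(c))
rhs = p(a) + p(b) - p(lambda c: a(c) and b(c))
print(lhs, rhs)                                # 0.5 0.5
```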

Conditional and Unconditional Probabilities in the Example
Unconditional
– P(Color2ndCard = red) = 50%, with no other knowledge.
Conditional
– P(Color2ndCard = red | Color1stCard = red) = 25/51
– Knowing the first card was red gives more information (a lower likelihood of the 2nd card being red).
– The bar is read “given”.

Computing Conditional Probabilities
P(A|B) = P(A ^ B) / P(B)
The probability that the 2nd card is red given that the 1st card was red is (the probability that both cards are red) / (the probability that the 1st card is red).
P(CarWontStart | NoGas) = P(CarWontStart ^ NoGas) / P(NoGas)
P(NoGas | CarWontStart) = P(CarWontStart ^ NoGas) / P(CarWontStart)
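The definition can be checked by brute-force enumeration of the card example; this is a sketch of ours, not course code:

```python
from itertools import permutations

# Enumerate all ordered two-card draws and apply P(A|B) = P(A ^ B) / P(B).
deck = ["red"] * 26 + ["black"] * 26
samples = [(deck[i], deck[j]) for i, j in permutations(range(52), 2)]

def conditional(samples, A, B):
    """P(A|B) computed by counting: restrict to B, then count A."""
    given = [s for s in samples if B(s)]
    return sum(1 for s in given if A(s)) / len(given)

print(conditional(samples,
                  A=lambda s: s[1] == "red",    # 2nd card red
                  B=lambda s: s[0] == "red"))   # given 1st card red
# 0.4901..., i.e. exactly 25/51
```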

Product Rule and Independence
P(A ^ B) = P(A|B) * P(B)
– (just an algebraic manipulation of the definition above)
Two events are independent if P(A|B) = P(A).
– E.g., two consecutive coin flips are independent.
If events are independent, we can multiply their probabilities:
– P(A ^ B) = P(A) * P(B) when A and B are independent.
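The coin-flip case, checked by enumeration (again just an illustrative sketch):

```python
from itertools import product

# For two fair coin flips, P(A ^ B) equals P(A) * P(B): the flips are independent.
outcomes = list(product("HT", repeat=2))        # 4 equally likely outcomes
p = lambda e: sum(1 for o in outcomes if e(o)) / len(outcomes)

A = lambda o: o[0] == "H"                       # first flip is heads
B = lambda o: o[1] == "H"                       # second flip is heads
print(p(lambda o: A(o) and B(o)), p(A) * p(B))  # 0.25 0.25
```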

Back to Conditional Probabilities
P(CarWontStart | NoGas)
– This predicts a symptom based on an underlying cause.
– These probabilities can be generated empirically (drain N gas tanks, see how many cars start).
P(NoGas | CarWontStart)
– This is a good example of diagnosis: we have a symptom and want to predict the cause.
– We can’t measure these directly.

Bayes’ Rule
P(A ^ B) = P(A|B) * P(B) = P(B|A) * P(A)
So P(A|B) = P(B|A) * P(A) / P(B)
This allows us to compute diagnostic probabilities from causal probabilities and prior probabilities!
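As a one-line function, applied to the car example above (all three numbers here are made up for illustration):

```python
def bayes(p_b_given_a, p_a, p_b):
    """Diagnostic from causal: P(A|B) = P(B|A) * P(A) / P(B)."""
    return p_b_given_a * p_a / p_b

# Hypothetical numbers: P(CarWontStart | NoGas) = 0.95, P(NoGas) = 0.02,
# P(CarWontStart) = 0.05. Then the diagnostic direction comes for free:
print(bayes(0.95, 0.02, 0.05))   # P(NoGas | CarWontStart) = 0.38
```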

Bayes’ Rule for Diagnosis
P(measles) = 0.1; P(chickenpox) = 0.4; P(allergy) = 0.6
P(spots | measles) = 1.0
P(spots | chickenpox) = 0.5
P(spots | allergy) = 0.2
(assume the diseases are mutually exclusive: no one has more than one)
What is P(measles | spots)?

P(spots)
P(spots) was not given. We can estimate it with the following (unlikely) assumptions:
– The three listed diseases are mutually exclusive; no one will have two or more.
– There are no other causes or co-factors for spots, i.e., P(spots | none-of-the-above) = 0.
Then we can say that:
– P(spots) = P(spots ^ measles) + P(spots ^ chickenpox) + P(spots ^ allergy) = 1.0 × 0.1 + 0.5 × 0.4 + 0.2 × 0.6 = 0.42
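Plugging the slide’s numbers in:

```python
# Total probability for P(spots) over the (assumed mutually exclusive)
# causes, then Bayes' rule for the diagnostic probability.
priors = {"measles": 0.1, "chickenpox": 0.4, "allergy": 0.6}
p_spots_given = {"measles": 1.0, "chickenpox": 0.5, "allergy": 0.2}

p_spots = sum(p_spots_given[d] * priors[d] for d in priors)
print(p_spots)                                            # 0.42
print(p_spots_given["measles"] * priors["measles"] / p_spots)
# P(measles | spots) ~ 0.238
```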

Combining Evidence
Multiple sources of evidence leading to the same conclusion:
  ache → flu ← fever
Chain of evidence leading to a conclusion:
  thermometer reading → fever → flu
(A sketch of combining independent evidence follows below.)
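One standard way to combine several pieces of evidence for the same conclusion (not necessarily the method this course develops) is the naive Bayes assumption: treat the symptoms as conditionally independent given flu / not-flu. All the numbers in this sketch are made up.

```python
# Naive Bayes combination of independent symptoms (illustrative numbers).
def naive_bayes(prior, likes_if_true, likes_if_false):
    """P(flu | all symptoms), normalizing over the flu / not-flu cases."""
    p_true, p_false = prior, 1 - prior
    for lt, lf in zip(likes_if_true, likes_if_false):
        p_true *= lt                 # P(symptom_i | flu)
        p_false *= lf                # P(symptom_i | ~flu)
    return p_true / (p_true + p_false)

# Hypothetical: P(flu)=0.05; P(ache|flu)=0.9 vs P(ache|~flu)=0.1;
# P(fever|flu)=0.8 vs P(fever|~flu)=0.05.
print(naive_bayes(0.05, [0.9, 0.8], [0.1, 0.05]))   # ~0.88
```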