Non-monotonic Reasoning
Presentation transcript:

Non-monotonic Reasoning
- Are we having a pop quiz today?
  - You assume not.
  - But can you prove it?
- In commonsense reasoning, we
  - often jump to conclusions,
  - can't always list the assumptions we made,
  - need to retract conclusions when we get more information.
- In first-order logic, the conclusion set grows monotonically: adding new facts can never invalidate a conclusion already drawn.

The Closed World Assumption
- KB contains: Student(Joe), Student(Mary)
- Query: Student(Fred)?
- Intuitively, no; but we can't prove it.
- Solution: when appropriate, close the predicate Student:
  ∀X Student(X) → X=Joe ∨ X=Mary
- Closing can be subtle when multiple predicates are involved:
  ∀X In(X) ∨ Out(X)
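A minimal Python sketch of the distinction; the data and function names are illustrative, not from the slides. Under the open world the query Student(Fred) is merely unprovable, while closing the predicate turns "unprovable" into "false":

```python
# KB: Student(Joe), Student(Mary)
students = {"Joe", "Mary"}

def query_open_world(x):
    """Open world: True if provable, None if unknown (can't prove the negation)."""
    return True if x in students else None

def query_closed_world(x):
    """Closed world: whatever is not provable is taken to be false."""
    return x in students

print(query_open_world("Fred"))    # None  -- intuitively no, but not provable
print(query_closed_world("Fred"))  # False -- the predicate Student is closed
```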

More on CWA
- Negation as failure:
  ∀x,y,z edge(x,z) ∧ path(z,y) → path(x,y)
  ∀x,y edge(x,y) → path(x,y)
  edge(A,B), edge(B,C), edge(A,D)
  - Conclude: ¬path(C,D), because path(C,D) cannot be derived (see the sketch below).
- Domain-closure assumption: the only objects in the universe are those named by constants in the KB.
- Unique-names assumption: distinct constants denote distinct objects in the universe (already assumed in Description Logics and Databases).
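The sketch below is a hypothetical Python rendering of the example: it computes the least fixpoint of the two path rules, then applies negation as failure by treating any path atom that was not derived as false.

```python
edges = {("A", "B"), ("B", "C"), ("A", "D")}

def derive_paths(edges):
    """Least fixpoint of: edge(x,y) -> path(x,y); edge(x,z) & path(z,y) -> path(x,y)."""
    paths = set(edges)                 # base case: every edge is a path
    changed = True
    while changed:                     # iterate the recursive rule to fixpoint
        changed = False
        for (x, z) in edges:
            for (z2, y) in list(paths):
                if z == z2 and (x, y) not in paths:
                    paths.add((x, y))
                    changed = True
    return paths

paths = derive_paths(edges)
# path(C,D) was never derived, so negation as failure concludes ¬path(C,D).
print(("C", "D") in paths)   # False
```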

Default Rules
- A default rule, written with the justification C(Flies(X)), read "Flies(X) is consistent":

    Bird(X)   C(Flies(X))
    ---------------------
          Flies(X)

- Application of default rules: the order matters!

    Liberal(X)   C(Dem(X))      Hunter(X)   C(Rep(X))
    ----------------------      ---------------------
           Dem(X)                      Rep(X)

  ∀X ¬(Dem(X) ∧ Rep(X))
  Liberal(Tom), Hunter(Tom)

  Firing the first default yields Dem(Tom) and blocks the second, since Rep(Tom) is no longer consistent; firing them in the opposite order yields Rep(Tom) and blocks Dem(Tom).
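A toy Python sketch of this order-dependence; the consistency check is hard-coded to the ∀X ¬(Dem(X) ∧ Rep(X)) constraint, and everything else is illustrative:

```python
facts = {"Liberal(Tom)", "Hunter(Tom)"}

def consistent(conclusion, beliefs):
    """Hard constraint: nobody is both a Democrat and a Republican."""
    blockers = {"Dem(Tom)": "Rep(Tom)", "Rep(Tom)": "Dem(Tom)"}
    return blockers.get(conclusion) not in beliefs

def apply_defaults(defaults):
    """Fire each default in order if its prerequisite holds and its conclusion is consistent."""
    beliefs = set(facts)
    for prerequisite, conclusion in defaults:
        if prerequisite in beliefs and consistent(conclusion, beliefs):
            beliefs.add(conclusion)
    return beliefs

dem_first = [("Liberal(Tom)", "Dem(Tom)"), ("Hunter(Tom)", "Rep(Tom)")]
print(apply_defaults(dem_first))        # contains Dem(Tom), not Rep(Tom)
print(apply_defaults(dem_first[::-1]))  # contains Rep(Tom), not Dem(Tom)
```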

Minimal Models: Circumscription
- Consider only models in which the extension of some predicates is minimized.
  ∀X (Bird(X) ∧ ¬Abnormal(X)) → Flies(X)
- Some predicates are distinguished as "abnormal".
- An interpretation I1 is preferred to I2 if:
  - I1 and I2 agree on the extensions of all objects, functions, and non-abnormal predicates.
  - The extension of Abnormal in I1 is a strict subset of its extension in I2.
- KB |= S if S is satisfied in every minimal model of KB (I is minimal if no I2 is preferred to it).
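A brute-force sketch of the idea for a one-element domain; the individual Tweety and the setup are assumed for illustration, not from the slides. Enumerate all interpretations, keep the models of the KB, and prefer those whose Abnormal extension is smallest; in every minimal model, Tweety flies.

```python
from itertools import product

def is_model(abnormal, flies):
    """KB: Bird(Tweety) and (Bird(X) & ~Abnormal(X)) -> Flies(X), domain = {Tweety}."""
    bird = True
    return (not (bird and not abnormal)) or flies

models = [(ab, fl) for ab, fl in product([False, True], repeat=2)
          if is_model(ab, fl)]

# Minimal models: no other model has a strictly smaller Abnormal extension.
minimal = [(ab, fl) for (ab, fl) in models
           if not any(ab2 < ab for (ab2, _) in models)]
print(minimal)   # [(False, True)] -- Flies(Tweety) holds in every minimal model
```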

But Uncertainty is Everywhere
- Medical knowledge in logic?
  - Toothache → Cavity
- Problems
  - Too many exceptions to any logical rule
    - Hard to code accurate rules, hard to use them.
  - Doctors have no complete theory for the domain
  - Don't know the full state of a given patient
- Uncertainty is ubiquitous in any problem-solving domain (except maybe puzzles)
- An agent has degrees of belief, not certain knowledge

Ways to Represent Uncertainty
- Disjunction
  - If information is correct but incomplete, your knowledge might be of the form
    - I am in either s3, or s19, or s55
    - If I am in s3 and execute a15, I will transition either to s92 or to s63
  - What we can't represent
    - There is very unlikely to be a full fuel drum at the depot this time of day
    - When I execute pickup(?Obj) I am almost always holding the object afterwards
    - The smoke alarm tells me there's a fire in my kitchen, but sometimes it's wrong

Numerical Representations of Uncertainty
- Interval-based methods
  - 0.4 ≤ prob(p) ≤ 0.6
- Fuzzy methods
  - D(tall(john)) = 0.8
- Certainty factors
  - Used in the MYCIN expert system
- Probability theory
  - Where do numeric probabilities come from?
  - Two interpretations of probabilistic statements:
    - Frequentist: based on observing a set of similar events.
    - Subjective: a person's degree of belief in a proposition.

KR with Probabilities
- Our knowledge about the world is a distribution of the form prob(s), for s ∈ S (S is the set of all states).
- ∀s ∈ S, 0 ≤ prob(s) ≤ 1
- Σ_{s ∈ S} prob(s) = 1
- For subsets S1 and S2,
  prob(S1 ∪ S2) = prob(S1) + prob(S2) - prob(S1 ∩ S2)
- Note we can equivalently talk about propositions:
  prob(p ∨ q) = prob(p) + prob(q) - prob(p ∧ q)
  where prob(p) means Σ_{s ∈ S, p holds in s} prob(s)
- prob(TRUE) = 1
- prob(FALSE) = 0
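These axioms are easy to sanity-check in code. The sketch below uses a hypothetical four-state world with made-up numbers:

```python
# prob(s) for each state; an invented distribution over S = {s1, s2, s3, s4}.
prob = {"s1": 0.5, "s2": 0.2, "s3": 0.2, "s4": 0.1}
assert abs(sum(prob.values()) - 1.0) < 1e-9   # states sum to 1

def p(states):
    """prob of a proposition = sum of prob(s) over the states where it holds."""
    return sum(prob[s] for s in states)

P = {"s1", "s2"}   # states where proposition p holds
Q = {"s2", "s3"}   # states where proposition q holds

# prob(p or q) = prob(p) + prob(q) - prob(p and q)
assert abs(p(P | Q) - (p(P) + p(Q) - p(P & Q))) < 1e-9
print(p(P | Q))    # 0.9
```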

Probability As "Softened Logic"
- "Statements of fact"
  - Prob(TB) = 0.06
- Soft rules
  - TB → cough
  - Prob(cough | TB) = 0.9
- (Causative versus diagnostic rules)
  - Prob(cough | TB) = 0.9
  - Prob(TB | cough) = 0.05
- Probabilities allow us to reason about
  - Possibly inaccurate observations
  - Omitted qualifications to our rules that are (either epistemologically or practically) necessary

Probabilistic Knowledge Representation and Updating
- Prior probabilities:
  - Prob(TB) (probability that the population as a whole, or the population under observation, has the disease)
- Conditional probabilities:
  - Prob(TB | cough)
    - updated belief in TB given a symptom
  - Prob(TB | test=neg)
    - updated belief based on a possibly imperfect sensor
  - Prob("TB tomorrow" | "treatment today")
    - reasoning about a treatment (action)
- The basic update:
  Prob(H) → Prob(H|E1) → Prob(H|E1, E2) → ...
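A sketch of the basic update on the TB example. Prob(TB) = 0.06 and Prob(cough | TB) = 0.9 come from the previous slide; every other number, and the assumption that cough and the test are conditionally independent given TB, are invented for illustration:

```python
from itertools import product

# Joint P(tb, cough, pos_test) built from assumed conditional probabilities.
joint = {}
for tb, cough, pos in product([True, False], repeat=3):
    p_tb = 0.06 if tb else 0.94                                    # prior (slide)
    p_cough = (0.9 if cough else 0.1) if tb else (0.2 if cough else 0.8)
    p_pos = (0.7 if pos else 0.3) if tb else (0.05 if pos else 0.95)
    joint[(tb, cough, pos)] = p_tb * p_cough * p_pos

def prob_tb(evidence):
    """Prob(TB | evidence), where evidence maps variable index -> observed value."""
    match = lambda state: all(state[i] == v for i, v in evidence.items())
    den = sum(p for state, p in joint.items() if match(state))
    num = sum(p for state, p in joint.items() if state[0] and match(state))
    return num / den

print(prob_tb({}))                   # Prob(TB)                 = 0.06
print(prob_tb({1: True}))            # Prob(TB | cough)         ~ 0.22
print(prob_tb({1: True, 2: True}))   # Prob(TB | cough, pos)    ~ 0.80
```

Each new piece of evidence is folded in by conditioning, and belief in TB rises accordingly.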

Basics
- Random variable takes values
  - Cavity: yes or no
- Joint Probability Distribution
  [The slide showed a 2x2 joint table over Cavity/¬Cavity and Ache/¬Ache; its entries did not survive the transcript.]
- Unconditional probability ("prior probability")
  - P(A)
  - P(Cavity) = 0.1
- Conditional probability
  - P(A|B)
  - P(Cavity | Toothache) = 0.8

Bayes Rule
- P(B|A) = P(A|B) P(B) / P(A)
- Example: A = red spots, B = measles. We know P(A|B), but want P(B|A).
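A worked instance of the rule on the measles example; all three input numbers are assumed for illustration, since the slide gives none:

```python
p_spots_given_measles = 0.95   # P(A|B), assumed
p_measles = 0.01               # P(B), assumed prior
p_spots = 0.05                 # P(A), assumed marginal

# P(B|A) = P(A|B) P(B) / P(A)
p_measles_given_spots = p_spots_given_measles * p_measles / p_spots
print(p_measles_given_spots)   # 0.19 -- red spots raise P(measles) from 1% to 19%
```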

Conditional Independence
- "A and P are independent"
  - P(A) = P(A | P) and P(P) = P(P | A)
  - Can be determined directly from the JPD
  - Powerful, but rare (i.e., not true here)
- "A and P are independent given C"
  - P(A|P,C) = P(A|C) and P(P|C) = P(P|A,C)
  - Still powerful, and also common
  - E.g. suppose
    - Cavities cause aches
    - Cavities cause the probe to catch
[The slide showed a JPD table over C (Cavity), A (Ache), P (Probe) and a diagram with arrows Cavity → Ache and Cavity → Probe; the probability entries did not survive the transcript.]

Conditional Independence
- "A and P are independent given C"
- P(A | P,C) = P(A | C) and also P(P | A,C) = P(P | C)

  C A P | Prob
  ------+------
  F F F |   -
  F F T |   -
  F T F |   -
  F T T |   -
  T F F | 0.012
  T F T | 0.048
  T T F | 0.008
  T T T | 0.032

  [Only the 0.032 entry survived the transcript; the remaining C=T entries are recovered from the calculations on the next two slides, and the C=F entries are lost.]

Suppose C=True:
P(A|P,C) = 0.032 / (0.032 + 0.048) = 0.032 / 0.080 = 0.4

P(A|C) = (0.032 + 0.008) / (0.032 + 0.008 + 0.048 + 0.012) = 0.04 / 0.1 = 0.4
So P(A|P,C) = P(A|C): once the cavity status C is known, learning the probe result P tells us nothing more about the ache A.
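The same check can be run numerically, as in the sketch below. The C=T entries of the joint are the ones derived above; the C=F entries are invented so the table sums to 1 (the transcript lost them), and the check only touches the C=T half:

```python
joint = {  # (C, A, P) -> prob
    (True,  True,  True):  0.032,
    (True,  True,  False): 0.008,
    (True,  False, True):  0.048,
    (True,  False, False): 0.012,
    (False, True,  True):  0.010,   # assumed
    (False, True,  False): 0.090,   # assumed
    (False, False, True):  0.015,   # assumed
    (False, False, False): 0.785,   # assumed
}

def p(pred):
    """Sum the joint over the (C, A, P) states selected by pred."""
    return sum(v for state, v in joint.items() if pred(*state))

p_a_given_pc = p(lambda c, a, pr: c and a and pr) / p(lambda c, a, pr: c and pr)
p_a_given_c  = p(lambda c, a, pr: c and a) / p(lambda c, a, pr: c)
print(round(p_a_given_pc, 3), round(p_a_given_c, 3))   # 0.4 0.4, as computed above
```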

Summary so Far
- Bayesian updating
  - Probabilities as degrees of belief (subjective)
  - Belief updating by conditioning:
    Prob(H) → Prob(H|E1) → Prob(H|E1, E2) → ...
  - Basic form of Bayes' rule:
    Prob(H | E) = Prob(E | H) Prob(H) / Prob(E)
  - Conditional independence
    - Knowing the value of Cavity renders Probe Catching probabilistically independent of Ache
    - General form of this relationship: knowing the values of all the variables in some separator set S renders the variables in set A independent of the variables in set B: Prob(A | B, S) = Prob(A | S)
    - Graphical representation...

Computational Models for Probabilistic Reasoning
- What we want
  - a "probabilistic knowledge base" where domain knowledge is represented by propositions and by unconditional and conditional probabilities
  - an inference engine that will compute Prob(formula | "all evidence collected so far")
- Problems
  - elicitation: what parameters do we need to ensure a complete and consistent knowledge base?
  - computation: how do we compute the probabilities efficiently?
- Belief nets ("Bayes nets") = answer to both problems
  - a representation that makes structure (dependencies and independencies) explicit
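A glimpse of why the factored representation helps, in the spirit of the Cavity example. The network structure (Cavity → Ache, Cavity → Probe) and the C=True rows match the joint used above, but the conditional probabilities for ¬Cavity are assumed:

```python
from itertools import product

# Three small conditional tables instead of one 8-entry joint.
p_cavity = {True: 0.1, False: 0.9}    # P(Cavity), from the slides
p_ache   = {True: 0.4, False: 0.05}   # P(Ache=T | Cavity); False row assumed
p_probe  = {True: 0.8, False: 0.1}    # P(Probe=T | Cavity); False row assumed

def joint(c, a, pr):
    """P(C, A, P) = P(C) P(A|C) P(P|C), exploiting the conditional independence."""
    pa = p_ache[c] if a else 1 - p_ache[c]
    pp = p_probe[c] if pr else 1 - p_probe[c]
    return p_cavity[c] * pa * pp

# Inference by enumeration: Prob(Cavity | Ache=True).
num = sum(joint(True, True, pr) for pr in (True, False))
den = sum(joint(c, True, pr) for c, pr in product((True, False), repeat=2))
print(round(num / den, 3))   # ~0.471
```

The dependency structure keeps the parameter count linear in the number of variables here, which is exactly what the elicitation and computation problems above ask for.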