Non-monotonic Reasoning

- Are we having a pop quiz today?
  - You assume not.
  - But can you prove it?
- In commonsense reasoning, we
  - often jump to conclusions,
  - can't always list the assumptions we made,
  - need to retract conclusions when we get more information.
- In first-order logic, the conclusion set grows monotonically.

The Closed World Assumption

- KB contains: Student(Joe), Student(Mary)
- Query: Student(Fred)?
- Intuitively, no; but we can't prove it.
- Solution: when appropriate, close the predicate Student:
  ∀X Student(X) → (X = Joe ∨ X = Mary)
- Closing can be subtle when multiple predicates are involved:
  ∀X In(X) ∨ Out(X)

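The closure step is easy to mimic in code. Below is a minimal Python sketch, not from any real system: the KB, the CLOSED set, and the ask helper are all illustrative. A query on a closed predicate that cannot be proven is answered "no"; an open predicate stays "unknown".

    # Query answering under the Closed World Assumption (illustrative).
    KB = {("Student", "Joe"), ("Student", "Mary")}
    CLOSED = {"Student"}          # predicates we have chosen to close

    def ask(predicate, constant):
        if (predicate, constant) in KB:
            return "yes"
        # Under CWA, failure to prove a closed predicate counts as "no".
        return "no" if predicate in CLOSED else "unknown"

    print(ask("Student", "Joe"))    # yes
    print(ask("Student", "Fred"))   # no (only because Student is closed)
    print(ask("Tall", "Fred"))      # unknown (Tall is not closed)
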
More on CWA

- Negation as failure:
  ∀x,y,z edge(x,z) ∧ path(z,y) → path(x,y)
  ∀x,y edge(x,y) → path(x,y)
  edge(A,B), edge(B,C), edge(A,D)
  - Conclude: ¬path(C,D), since path(C,D) cannot be proven.
- Domain-closure assumption: the universe contains only the objects named by constants in the KB.
- Unique-names assumption: every constant is mapped to a different object in the universe (already assumed in Description Logics and Databases).

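A small Python sketch of negation as failure on the path example above (a hand-rolled fixpoint computation, not a real Prolog engine): derive every provable path atom, then treat anything underivable as false.

    # Negation as failure over the slide's path rules.
    edges = {("A", "B"), ("B", "C"), ("A", "D")}

    def paths(edges):
        path = set(edges)                  # edge(x,y) -> path(x,y)
        changed = True
        while changed:                     # edge(x,z) & path(z,y) -> path(x,y)
            changed = False
            for (x, z) in edges:
                for (z2, y) in list(path):
                    if z == z2 and (x, y) not in path:
                        path.add((x, y))
                        changed = True
        return path

    provable = paths(edges)
    print(("C", "D") in provable)   # False -> by NAF, conclude not path(C,D)
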
Default Rules

- Bird(X) ∧ C(Flies(X)) → Flies(X)
  (C(Flies(X)) means: Flies(X) is consistent with what is known.)
- Application of default rules: the order matters!
  Liberal(X) ∧ C(Dem(X)) → Dem(X)
  Hunter(X) ∧ C(Rep(X)) → Rep(X)
  ∀X ¬(Dem(X) ∧ Rep(X))
  Liberal(Tom), Hunter(Tom)

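The order sensitivity can be seen in a few lines. The Python sketch below is illustrative only (the consistent check hard-codes the single constraint): applying the two defaults in different orders yields different belief sets.

    # Order-dependent default application on the Tom example.
    facts = {"Liberal(Tom)", "Hunter(Tom)"}

    def consistent(literal, beliefs):
        # Hard constraint: nobody is both Dem and Rep.
        if literal == "Dem(Tom)" and "Rep(Tom)" in beliefs: return False
        if literal == "Rep(Tom)" and "Dem(Tom)" in beliefs: return False
        return True

    def apply_defaults(order):
        beliefs = set(facts)
        for pre, concl in order:
            if pre in beliefs and consistent(concl, beliefs):
                beliefs.add(concl)
        return beliefs

    d1 = ("Liberal(Tom)", "Dem(Tom)")
    d2 = ("Hunter(Tom)", "Rep(Tom)")
    print(apply_defaults([d1, d2]))  # contains Dem(Tom), not Rep(Tom)
    print(apply_defaults([d2, d1]))  # contains Rep(Tom), not Dem(Tom)
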
Minimal Models: Circumscription

- Consider only models in which the extensions of some predicates are minimized.
  ∀X (Bird(X) ∧ ¬abnormal(X)) → Flies(X)
- Some predicates are distinguished as "abnormal".
- An interpretation I1 is preferred to I2 if:
  - I1 and I2 agree on the extensions of all objects, functions, and non-abnormal predicates;
  - the extension of abnormal in I1 is a strict subset of its extension in I2.
- KB |= S if S is satisfied in every minimal model of KB (I is minimal if no I2 is preferred to it).

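A brute-force illustration of the preference ordering, assuming a one-object domain with Bird(Tweety) (the domain and names are illustrative, and the sketch minimizes only abnormal, gliding over the "agree elsewhere" condition): enumerate the interpretations that satisfy the rule, then keep those whose abnormal extension is not strictly larger than another's.

    # Circumscription by enumeration over one bird (illustrative).
    from itertools import product

    def satisfies(abnormal, flies):
        bird = True                                    # KB: Bird(Tweety)
        return (not (bird and not abnormal)) or flies  # the default rule

    models = [(ab, fl) for ab, fl in product([False, True], repeat=2)
              if satisfies(ab, fl)]

    # Minimal models: no model with a strictly smaller abnormal extension.
    minimal = [m for m in models
               if not any(m2[0] < m[0] for m2 in models)]
    print(minimal)   # [(False, True)]: in every minimal model Tweety flies
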
But Uncertainty is Everywhere

- Medical knowledge in logic?
  - Toothache → Cavity
- Problems:
  - Too many exceptions to any logical rule
    - Hard to code accurate rules, hard to use them.
  - Doctors have no complete theory for the domain.
  - We don't know the state of a given patient.
- Uncertainty is ubiquitous in any problem-solving domain (except maybe puzzles).
- An agent has degrees of belief, not certain knowledge.

Ways to Represent Uncertainty

- Disjunction
  - If information is correct but incomplete, your knowledge might be of the form:
    - I am in either s3, or s19, or s55.
    - If I am in s3 and execute a15, I will transition either to s92 or to s63.
  - What we can't represent:
    - There is very unlikely to be a full fuel drum at the depot this time of day.
    - When I execute pickup(?Obj), I am almost always holding the object afterwards.
    - The smoke alarm tells me there's a fire in my kitchen, but sometimes it's wrong.

Numerical Representations of Uncertainty

- Interval-based methods
  - 0.4 ≤ prob(p) ≤ 0.6
- Fuzzy methods
  - D(tall(john)) = 0.8
- Certainty factors
  - Used in the MYCIN expert system
- Probability theory
  - Where do numeric probabilities come from?
  - Two interpretations of probabilistic statements:
    - Frequentist: based on observing a set of similar events.
    - Subjective: a person's degree of belief in a proposition.

KR with Probabilities

- Our knowledge about the world is a distribution of the form prob(s), for s ∈ S (S is the set of all states).
- ∀s ∈ S, 0 ≤ prob(s) ≤ 1
- Σ_{s ∈ S} prob(s) = 1
- For subsets S1 and S2: prob(S1 ∪ S2) = prob(S1) + prob(S2) − prob(S1 ∩ S2)
- Note we can equivalently talk about propositions:
  prob(p ∨ q) = prob(p) + prob(q) − prob(p ∧ q),
  where prob(p) = Σ_{s ∈ S | p holds in s} prob(s)
- prob(TRUE) = 1
- prob(FALSE) = 0

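These axioms are easy to check numerically. The sketch below uses the Cavity/Ache joint distribution from the Basics slide later in this deck; the P helper is illustrative.

    # Checking the probability axioms on a four-state world: states are
    # (cavity, ache) pairs, numbers from the Basics slide.
    prob = {(True, True): 0.04, (True, False): 0.06,
            (False, True): 0.01, (False, False): 0.89}

    assert all(0 <= p <= 1 for p in prob.values())
    assert abs(sum(prob.values()) - 1.0) < 1e-9

    def P(holds):
        """prob of a proposition = sum of prob(s) over states where it holds."""
        return sum(p for s, p in prob.items() if holds(s))

    cavity = lambda s: s[0]
    ache   = lambda s: s[1]

    # prob(p or q) = prob(p) + prob(q) - prob(p and q)
    lhs = P(lambda s: cavity(s) or ache(s))
    rhs = P(cavity) + P(ache) - P(lambda s: cavity(s) and ache(s))
    assert abs(lhs - rhs) < 1e-9
    print(P(cavity), P(ache), lhs)   # prints approx. 0.1, 0.05, 0.11
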
Probability as "Softened Logic"

- "Statements of fact"
  - Prob(TB) = 0.06
- Soft rules
  - TB → cough
  - Prob(cough | TB) = 0.9
- (Causative versus diagnostic rules)
  - Prob(cough | TB) = 0.9
  - Prob(TB | cough) = 0.05
- Probabilities allow us to reason about:
  - possibly inaccurate observations;
  - omitted qualifications to our rules that are (either epistemologically or practically) necessary.

Probabilistic Knowledge Representation and Updating

- Prior probabilities:
  - Prob(TB): probability that the population as a whole, or the population under observation, has the disease
- Conditional probabilities:
  - Prob(TB | cough): updated belief in TB given a symptom
  - Prob(TB | test=neg): updated belief based on a possibly imperfect sensor
  - Prob("TB tomorrow" | "treatment today"): reasoning about a treatment (action)
- The basic update:
  Prob(H) → Prob(H | E1) → Prob(H | E1, E2) → ...

Basics

- A random variable takes values
  - Cavity: yes or no
- Joint probability distribution:

               Cavity   ¬Cavity
      Ache      0.04      0.01
      ¬Ache     0.06      0.89

- Unconditional probability ("prior probability")
  - P(A)
  - P(Cavity) = 0.1
- Conditional probability
  - P(A | B)
  - P(Cavity | Toothache) = 0.8

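Both numbers fall out of the joint table by summing and renormalizing; a short Python sketch (the variable names are illustrative):

    # Prior and conditional from the slide's joint table.
    joint = {(True, True): 0.04, (True, False): 0.06,
             (False, True): 0.01, (False, False): 0.89}  # (cavity, ache)

    p_cavity = sum(p for (c, a), p in joint.items() if c)   # 0.04 + 0.06 = 0.10
    p_ache   = sum(p for (c, a), p in joint.items() if a)   # 0.04 + 0.01 = 0.05
    p_cavity_given_ache = joint[(True, True)] / p_ache      # 0.04 / 0.05 = 0.80

    print(p_cavity, p_cavity_given_ache)
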
Bayes Rule

- P(B|A) = P(A|B) P(B) / P(A)
- Example: A = red spots, B = measles.
  We know P(A|B), but want P(B|A).

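A worked instance of the rule on the measles example. The slide gives no numbers, so the three inputs below are assumed purely for illustration:

    # Bayes rule on the red-spots/measles example (all inputs assumed).
    p_spots_given_measles = 0.90   # P(A|B), assumed
    p_measles             = 0.001  # P(B),   assumed
    p_spots               = 0.05   # P(A),   assumed

    p_measles_given_spots = p_spots_given_measles * p_measles / p_spots
    print(p_measles_given_spots)   # 0.018: a rare disease stays unlikely
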
Conditional Independence

- "A and P are independent"
  - P(A) = P(A | P) and P(P) = P(P | A)
  - Can be determined directly from the JPD
  - Powerful, but rare (i.e., not true here)
- "A and P are independent given C"
  - P(A | P,C) = P(A | C) and P(P | C) = P(P | A,C)
  - Still powerful, and also common
  - E.g., suppose
    - cavities cause aches,
    - cavities cause the probe to catch.

      C   A   P   Prob
      F   F   F   0.534
      F   F   T   0.356
      F   T   F   0.006
      F   T   T   0.004
      T   F   F   0.048
      T   F   T   0.012
      T   T   F   0.032
      T   T   T   0.008

  [Diagram: Cavity → Ache, Cavity → Probe]

Conditional Independence

- "A and P are independent given C"
- P(A | P,C) = P(A | C), and also P(P | A,C) = P(P | C)

      C   A   P   Prob
      F   F   F   0.534
      F   F   T   0.356
      F   T   F   0.006
      F   T   T   0.004
      T   F   F   0.012
      T   F   T   0.048
      T   T   F   0.008
      T   T   T   0.032

Suppose C = True:

P(A | P,C) = 0.032 / (0.032 + 0.048) = 0.032 / 0.080 = 0.4

P(A | C) = (0.032 + 0.008) / (0.048 + 0.012 + 0.032 + 0.008) = 0.04 / 0.1 = 0.4

So P(A | P,C) = P(A | C): A and P are indeed independent given C.

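The same arithmetic, done mechanically over the table above (the P helper is illustrative); it confirms P(A | P,C) = P(A | C) = 0.4:

    # Verifying conditional independence from the joint; rows are (C, A, P).
    joint = {(False, False, False): 0.534, (False, False, True): 0.356,
             (False, True,  False): 0.006, (False, True,  True): 0.004,
             (True,  False, False): 0.012, (True,  False, True): 0.048,
             (True,  True,  False): 0.008, (True,  True,  True): 0.032}

    def P(event, given=lambda r: True):
        num = sum(p for r, p in joint.items() if event(r) and given(r))
        den = sum(p for r, p in joint.items() if given(r))
        return num / den

    A = lambda r: r[1]
    print(P(A, given=lambda r: r[2] and r[0]))  # P(A|P,C) = 0.4
    print(P(A, given=lambda r: r[0]))           # P(A|C)   = 0.4
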
Summary so Far

- Bayesian updating
  - Probabilities as degrees of belief (subjective)
  - Belief updating by conditioning (traced in the sketch below):
    Prob(H) → Prob(H | E1) → Prob(H | E1, E2) → ...
  - Basic form of Bayes' rule:
    Prob(H | E) = Prob(E | H) Prob(H) / Prob(E)
  - Conditional independence
    - Knowing the value of Cavity renders Probe-catching probabilistically independent of Ache.
    - General form of this relationship: knowing the values of all the variables in some separator set S renders the variables in set A independent of the variables in set B: Prob(A | B, S) = Prob(A | S).
    - Graphical representation...

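The conditioning chain can be traced on the three-variable joint above, with H = Cavity, E1 = Ache, E2 = Probe (helper code as before, illustrative):

    # Sequential conditioning: Prob(H) -> Prob(H|E1) -> Prob(H|E1,E2).
    joint = {(False, False, False): 0.534, (False, False, True): 0.356,
             (False, True,  False): 0.006, (False, True,  True): 0.004,
             (True,  False, False): 0.012, (True,  False, True): 0.048,
             (True,  True,  False): 0.008, (True,  True,  True): 0.032}

    def P(event, given=lambda r: True):
        den = sum(p for r, p in joint.items() if given(r))
        return sum(p for r, p in joint.items() if event(r) and given(r)) / den

    C = lambda r: r[0]
    print(P(C))                                 # prior:             0.1
    print(P(C, given=lambda r: r[1]))           # after E1 = Ache:   0.8
    print(P(C, given=lambda r: r[1] and r[2]))  # after E2 = Probe: ~0.89
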
Computational Models for Probabilistic Reasoning

- What we want:
  - a "probabilistic knowledge base" where domain knowledge is represented by propositions and by unconditional and conditional probabilities
  - an inference engine that will compute Prob(formula | "all evidence collected so far")
- Problems:
  - elicitation: what parameters do we need to ensure a complete and consistent knowledge base?
  - computation: how do we compute the probabilities efficiently?
- Belief nets ("Bayes nets") answer both problems:
  - a representation that makes structure (dependencies and independencies) explicit (sketched below)

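As a preview, here is a minimal sketch of the belief-net idea on the Cavity example: store one prior and two small conditional tables instead of the eight-row joint, and recover any joint entry as a product. This factorization is valid only because of the conditional independence verified above; the CPT numbers are the ones implied by that table (rounded).

    # Factored representation: P(C, A, Pr) = P(C) * P(A|C) * P(Pr|C).
    p_c = 0.1                                    # P(Cavity)
    p_a_given_c  = {True: 0.4, False: 0.0111}    # P(Ache | Cavity), rounded
    p_pr_given_c = {True: 0.8, False: 0.4}       # P(Probe | Cavity)

    def joint(c, a, pr):
        pc  = p_c if c else 1 - p_c
        pa  = p_a_given_c[c] if a else 1 - p_a_given_c[c]
        ppr = p_pr_given_c[c] if pr else 1 - p_pr_given_c[c]
        return pc * pa * ppr

    print(joint(True, True, True))     # 0.032, matching the table
    print(joint(False, False, False))  # ~0.534, matching the table
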