1 Reasoning in Uncertain Situations 8a
8.0 Introduction
8.1 Logic-Based Abductive Inference
8.2 Abduction: Alternatives to Logic
8.3 The Stochastic Approach to Uncertainty
8.4 Epilogue and References
8.5 Exercises
Note: the material for Section 8.1 is enhanced.
Additional references for the slides: Jean-Claude Latombe’s CS121 slides: robotics.stanford.edu/~latombe/cs121

2 Chapter Objectives
Learn about the issues in dynamic knowledge bases
Learn about adapting logic inference to uncertain worlds
Learn about probabilistic reasoning
Learn about alternative theories for reasoning under uncertainty
The agent model: Can solve problems under uncertainty

3 Uncertain agent
[Diagram: an agent embedded in its environment, connected through sensors and actuators; question marks on the model, sensors, and actuators indicate uncertainty.]

4 Types of Uncertainty
Uncertainty in prior knowledge
E.g., some causes of a disease are unknown and are not represented in the background knowledge of a medical-assistant agent.

5 Types of Uncertainty
Uncertainty in actions
E.g., to deliver this lecture:
I must be able to come to school
the heating system must be working
my computer must be working
the LCD projector must be working
I must not have become paralytic or blind
As we discussed last time, actions are represented with relatively short lists of preconditions, while these lists are in fact arbitrarily long. It is not efficient (or even possible) to list all the possibilities.

6 Types of Uncertainty
Uncertainty in perception
E.g., sensors do not return exact or complete information about the world; a robot never knows its position exactly.
(Image courtesy R. Chatila)

7 Sources of uncertainty
Laziness (efficiency)
Ignorance
What we call uncertainty is a summary of all that is not explicitly taken into account in the agent’s knowledge base (KB).

8 Assumptions of reasoning with predicate logic (1)
(1) Predicate descriptions must be sufficient with respect to the application domain.
Each fact is known to be either true or false. But what does lack of information mean?
Closed world assumption, assumption-based reasoning:
PROLOG: if a fact cannot be proven to be true, assume that it is false.
HUMAN: if a fact cannot be proven to be false, assume it is true.
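
To make the PROLOG convention concrete, here is a minimal Python sketch (an added illustration, not part of the original slides) of the closed-world assumption: any fact that cannot be proven from the knowledge base is treated as false. The facts used are made up.

# Minimal sketch of the closed-world assumption (CWA).
# Facts are ground atoms stored in a set; anything not in the set is taken as false.

kb = {("bird", "tweety"), ("bird", "opus")}

def holds(predicate, *args):
    """Under the CWA, a query succeeds only if the fact is recorded in the KB."""
    return (predicate, *args) in kb

print(holds("bird", "tweety"))    # True: the fact is in the KB
print(holds("penguin", "opus"))   # False: not provable, so assumed false (CWA)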

9 Assumptions of reasoning with predicate logic (2)
(2) The information base must be consistent.
Human reasoning: keep alternative (possibly conflicting) hypotheses. Eliminate them as new evidence comes in.

10 Assumptions of reasoning with predicate logic (3)
(3) Known information grows monotonically through the use of inference rules.
Need mechanisms to:
add information based on assumptions (nonmonotonic reasoning), and
delete inferences based on these assumptions in case later evidence shows that the assumption was incorrect (truth maintenance).

11 Questions
How to represent uncertainty in knowledge?
How to perform inferences with uncertain knowledge?
Which action to choose under uncertainty?

12 Approaches to handling uncertainty
Default reasoning [Optimistic]: non-monotonic logic
Worst-case reasoning [Pessimistic]: adversarial search
Probabilistic reasoning [Realist]: probability theory

13 Default Reasoning
Rationale: The world is fairly normal. Abnormalities are rare.
So, an agent assumes normality until there is evidence to the contrary.
E.g., if an agent sees a bird X, it assumes that X can fly, unless it has evidence that X is a penguin, an ostrich, a dead bird, a bird with broken wings, …

14 Modifying logic to support nonmonotonic inference
p(X) ∧ unless q(X) → r(X)
If we believe p(X) is true, and do not believe q(X) is true (either unknown or believed to be false), then we can infer r(X).
Later, if we find out that q(X) is true, r(X) must be retracted.
“unless” is a modal operator: it deals with belief rather than truth.

15 Modifying logic to support nonmonotonic inference (cont’d)
p(X) ∧ unless q(X) → r(X)   in KB
p(Z)   in KB
r(W) → s(W)   in KB
¬q(X)   q(X) is not in KB
r(X)   inferred
s(X)   inferred
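
The following Python sketch (an added illustration; the rule encoding is my own, hypothetical one) operationalizes the example above: r(X) is inferred while q(X) is absent from the KB, and the inference disappears once q(X) is asserted.

# Added sketch of nonmonotonic inference with an "unless" condition.
# Rules:  p(X) ^ unless q(X) -> r(X)   and   r(W) -> s(W)

kb = set()        # base facts currently believed
inferred = set()  # facts derived by the rules (kept separate so they can be retracted)

def run_rules():
    """Recompute every inference from the current base facts."""
    inferred.clear()
    changed = True
    while changed:
        changed = False
        for pred, x in list(kb | inferred):
            # p(X) ^ unless q(X) -> r(X): fire only while q(X) is not in the KB
            if pred == "p" and ("q", x) not in kb and ("r", x) not in inferred:
                inferred.add(("r", x))
                changed = True
            # r(W) -> s(W)
            if pred == "r" and ("s", x) not in inferred:
                inferred.add(("s", x))
                changed = True

kb.add(("p", "z"))
run_rules()
print(inferred)        # {('r', 'z'), ('s', 'z')}: r(z) and s(z) are inferred

kb.add(("q", "z"))     # later we learn that q(z) is true after all
run_rules()
print(inferred)        # set(): r(z) and s(z) have been retracted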

16 Example
If it is snowing and unless there is an exam tomorrow, I can go skiing.
It is snowing.
Whenever I go skiing, I stop by the Chalet to drink hot chocolate.
I did not check my calendar, but I don’t remember an exam scheduled for tomorrow, so conclude: I’ll go skiing.
Then conclude: I’ll drink hot chocolate.

17 “Abnormality”
p(X) ∧ unless ab p(X) → q(X)
ab: abnormal
Examples:
If X is a bird, it will fly unless it is abnormal. (abnormal: broken wing, sick, trapped, ostrich, ...)
If X is a car, it will run unless it is abnormal. (abnormal: flat tire, broken engine, no gas, …)
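
As an added, illustrative sketch of the ab-predicate pattern (not code from the slides), the bird example can be encoded so that flying is concluded only while no abnormality for X is recorded.

# Added sketch of the abnormality pattern: bird(X) ^ unless ab_bird(X) -> flies(X)

kb = {("bird", "tweety"), ("bird", "opus")}

def flies(x):
    """X is assumed to fly if it is a bird and no abnormality for X is recorded."""
    return ("bird", x) in kb and ("ab_bird", x) not in kb

print(flies("tweety"))           # True: no abnormality is known
kb.add(("ab_bird", "opus"))      # we learn that opus has a broken wing
print(flies("opus"))             # False: the default conclusion is withdrawn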

18 Another modal operator: M
p(X) ∧ M q(X) → r(X)
If we believe p(X) is true, and q(X) is consistent with everything else, then we can infer r(X).
“M” is a modal operator for “is consistent.”

19 Example
∀X good_student(X) ∧ M study_hard(X) → graduates(X)
How to make sure that study_hard(X) is consistent?
Negation as failure proof: Try to prove ¬study_hard(X); if that is not possible, assume X does study hard.
Tried but failed proof: Try to prove study_hard(X), but use a heuristic or a time/memory limit. When the limit expires, if no evidence to the contrary is found, declare it proven.

20 Potentially conflicting results
∀X good_student(X) ∧ M study_hard(X) → graduates(X)
∀X good_student(X) ∧ M ¬study_hard(X) → ¬graduates(X)
good_student(peter)
If the KB does not contain information about study_hard(peter), both graduates(peter) and ¬graduates(peter) will be inferred!
Solutions: autoepistemic logic, default logic, inheritance search, more rules, ...
∀Y party_person(Y) → ¬study_hard(Y)
party_person(peter)
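
Here is a small added sketch of M read as negation as failure; the not_ prefix for negative literals is an ad hoc convention of this illustration, not the text's notation. With nothing known about study_hard(peter), both mirrored defaults fire, reproducing the conflict above.

# Added sketch: M read as negation as failure.
# "M phi" succeeds as long as the negation of phi cannot be proven from the KB.

kb = {("good_student", "peter")}   # nothing is recorded about study_hard(peter)

def M(negated_fact):
    """phi is taken to be consistent if its negation is not a provable KB fact."""
    return negated_fact not in kb

conclusions = set()

# forall X: good_student(X) ^ M study_hard(X) -> graduates(X)
if ("good_student", "peter") in kb and M(("not_study_hard", "peter")):
    conclusions.add(("graduates", "peter"))

# forall X: good_student(X) ^ M not_study_hard(X) -> not_graduates(X)
if ("good_student", "peter") in kb and M(("study_hard", "peter")):
    conclusions.add(("not_graduates", "peter"))

print(conclusions)   # both graduates(peter) and not_graduates(peter) are derived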

21 Truth Maintenance Systems
They are also known as reason maintenance systems, or justification networks.
In essence, they are dependency graphs where rounded rectangles denote predicates, and half circles represent facts or “and”s of facts.
Base (given) facts: p is in the KB.
ANDed facts: p ∧ q → r.
[Diagram: a node for the base fact p, and a half circle joining p and q to the inferred node r.]

22 How to retract inferences
In traditional logic knowledge bases, inferences made by the system might have to be retracted as new (conflicting) information comes in.
In knowledge bases with uncertainty, inferences might have to be retracted even with non-conflicting new information.
We need an efficient way to keep track of which inferences must be retracted.

23 Example
When p, q, s, x, and y are given, all of r, t, z, and u can be inferred.
[Dependency graph over the base nodes p, q, s, x, y and the inferred nodes r, t, z, u.]

24 Example (cont’d)
If p is retracted, both r and u must be retracted. (Compare this to chronological backtracking.)
[Same dependency graph, with r and u removed.]

25 Example (cont’d)
If x is retracted (in the case before the previous slide), z must be retracted.
[Same dependency graph, with z removed.]
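
A sketch of how retraction can be propagated through such a dependency network. The justification structure below is hypothetical (the slides only show a diagram), but it is chosen to match the behaviour just described: retracting p removes r and u, while retracting x removes only z.

# Added sketch of retraction in a dependency network.

justifications = {
    "r": {"p", "q"},   # r was inferred from p and q
    "u": {"r", "s"},   # u was inferred from r and s
    "z": {"x", "y"},   # z was inferred from x and y
    "t": {"s", "y"},   # t was inferred from s and y
}
believed = {"p", "q", "s", "x", "y", "r", "t", "z", "u"}

def retract(fact):
    """Remove a fact and, transitively, every inference justified by it."""
    if fact not in believed:
        return
    believed.discard(fact)
    for derived, support in justifications.items():
        if fact in support:
            retract(derived)

retract("p")
print(sorted(believed))   # ['q', 's', 't', 'x', 'y', 'z']: r and u are gone
# Starting again from the full set, retract("x") would remove only z (slide 25).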

26 Nonmonotonic reasoning using TMSs
p ∧ M q → r
[Justification network: r is supported by p on the IN list and ¬q on the OUT list.]
IN means “IN the knowledge base.” OUT means “OUT of the knowledge base.”
The conditions that must be IN must be proven. For the conditions that are in the OUT list, non-existence in the KB is sufficient.
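
An added sketch of this single nonmonotonic justification for p ∧ M q → r: r is labeled IN while p is IN and ¬q is absent from the KB, and it flips to OUT as soon as ¬q is asserted. The string "not_q" stands for ¬q in this illustration.

# Added sketch: r is IN when every node on its IN-list is in the KB
# and every node on its OUT-list is absent from the KB.

kb = set()                           # facts currently IN
in_list, out_list = {"p"}, {"not_q"}

def label_r():
    supported = in_list <= kb and not (out_list & kb)
    return "IN" if supported else "OUT"

kb.add("p")
print(label_r())    # IN: p is IN and not_q is OUT of the KB

kb.add("not_q")     # new information: q is false
print(label_r())    # OUT: more facts, yet fewer inferences (nonmonotonicity)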

27 Nonmonotonic reasoning using TMSs
If p is given, i.e., it is IN, then r is also IN.
[Network: p is IN, ¬q is OUT, so r is labeled IN.]

28 Nonmonotonic reasoning using TMSs
If ¬q is now given, r must be retracted; it becomes OUT.
Note that when ¬q is given, the knowledge base contains more facts, but the set of inferences shrinks (hence the name nonmonotonic reasoning).
[Network: p is IN, ¬q is now IN, so r is labeled OUT.]

29 A justification network to believe that Pat studies hard
∀X good_student(X) ∧ M study_hard(X) → study_hard(X)
good_student(pat)
[Network: good_student(pat) is IN, ¬study_hard(pat) is OUT, so study_hard(pat) is labeled IN.]

30 It is still justifiable that Pat studies hard
∀X good_student(X) ∧ M study_hard(X) → study_hard(X)
∀Y party_person(Y) → ¬study_hard(Y)
good_student(pat)
[Network: good_student(pat) is IN, party_person(pat) is OUT, so ¬study_hard(pat) stays OUT and study_hard(pat) remains IN.]

31 “Pat studies hard” is no longer justifiable
∀X good_student(X) ∧ M study_hard(X) → study_hard(X)
∀Y party_person(Y) → ¬study_hard(Y)
good_student(pat)
party_person(pat)
[Network: party_person(pat) is now IN, so ¬study_hard(pat) becomes IN and study_hard(pat) becomes OUT.]
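
To tie the three Pat slides together, here is a compact added sketch (my own encoding, not from the slides) of the two justifications: study_hard(pat) stays IN while ¬study_hard(pat) is OUT, and asserting party_person(pat) brings ¬study_hard(pat) IN, which forces study_hard(pat) OUT.

# Added sketch of the Pat network. Each derived node has an IN-list and an OUT-list;
# it is labeled IN when all IN-list nodes are believed and all OUT-list nodes are not.
# A real JTMS propagates label changes incrementally; this naive relabeling loop is
# enough for this small example.

premises = {"good_student(pat)"}    # base facts currently given

justifications = {
    # good_student(X) ^ M study_hard(X) -> study_hard(X)
    "study_hard(pat)": ({"good_student(pat)"}, {"not_study_hard(pat)"}),
    # party_person(Y) -> not study_hard(Y)
    "not_study_hard(pat)": ({"party_person(pat)"}, set()),
}

def in_nodes():
    """Relabel derived nodes from the premises until the labels stabilize."""
    believed = set(premises)
    changed = True
    while changed:
        changed = False
        for node, (in_list, out_list) in justifications.items():
            should_be_in = in_list <= believed and not (out_list & believed)
            if should_be_in and node not in believed:
                believed.add(node)
                changed = True
            elif not should_be_in and node in believed:
                believed.remove(node)
                changed = True
    return believed

print("study_hard(pat)" in in_nodes())   # True: still justifiable (slides 29-30)

premises.add("party_person(pat)")        # new premise arrives
print("study_hard(pat)" in in_nodes())   # False: no longer justifiable (slide 31)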

32 Notes
We looked at JTMSs (Justification-Based Truth Maintenance Systems).
“Predicate” nodes in JTMSs are pure text; there is not even any information about “¬”.
With LTMSs (Logic-Based Truth Maintenance Systems), “¬” has the same semantics as in logic. So what we covered was technically LTMSs.
We will not cover ATMSs (Assumption-Based Truth Maintenance Systems).
Did you know that TMSs were first developed for Intelligent Tutoring Systems (ITSs)?