Artificial Intelligence

Presentation transcript:

Artificial Intelligence, CS 165A
Thursday, November 8, 2007

Today:
- Inference in FOL (Ch. 9)
- Brief midterm review

What we've been talking about
- Complete and sound inference procedures
- New inference rules:
  - Universal Instantiation (gets rid of ∀)
  - Existential Instantiation (gets rid of ∃)
  - Existential Introduction (adds ∃)
  - Generalized Modus Ponens
  - Generalized (First-Order) Resolution: complete but semidecidable
- Unification: finds the substitution(s) necessary to make two sentences match
- Conjunctive normal form (CNF)

FOL example
Domain elements: {Bob, Alice, Carol, Ted}
∀x Sleepy(x) means Sleepy(Bob) ∧ Sleepy(Alice) ∧ Sleepy(Carol) ∧ Sleepy(Ted)
∃x Hungry(x) means Hungry(Bob) ∨ Hungry(Alice) ∨ Hungry(Carol) ∨ Hungry(Ted)
Universal Instantiation: from ∀x Sleepy(x), conclude Sleepy(Alice)
Existential Instantiation: from ∃x Hungry(x), conclude Hungry(k), where k is a constant (not a variable!)
Existential Introduction: from AtHome(Ted), conclude ∃x AtHome(x)

Two versions of Generalized Resolution

Disjunctions: for literals pi and qi, where UNIFY(pj, ¬qk) = θ:

  p1 ∨ ... ∨ pj ∨ ... ∨ pm,   q1 ∨ ... ∨ qk ∨ ... ∨ qn
  ⊢ SUBST(θ, p1 ∨ ... ∨ pj-1 ∨ pj+1 ∨ ... ∨ pm ∨ q1 ∨ ... ∨ qk-1 ∨ qk+1 ∨ ... ∨ qn)

Implications: for atomic sentences pi, qi, ri, si, where UNIFY(pj, qk) = θ:

  p1 ∧ ... ∧ pj ∧ ... ∧ pm ⇒ r1 ∨ ... ∨ rn,   s1 ∧ ... ∧ su ⇒ q1 ∨ ... ∨ qk ∨ ... ∨ qv
  ⊢ SUBST(θ, (p1 ∧ ... ∧ pj-1 ∧ pj+1 ∧ ... ∧ pm ∧ s1 ∧ ... ∧ su) ⇒ (r1 ∨ ... ∨ rn ∨ q1 ∨ ... ∨ qk-1 ∨ qk+1 ∨ ... ∨ qv))

Thursday quiz
Note: Everyone gets a 100% for last Thursday!
1. If the domain is the natural numbers (0, 1, 2, ...), what is a general unifier (if one exists) for these two atomic sentences?
   GreaterThan(x, 7)
   GreaterThan(Squared(y), y)
   E.g., θ = { a/13, b/4 }
2. What new (to us) dimension does situation calculus allow us to represent and reason about?

Conjunctive normal form (CNF)
First we must convert all sentences to Conjunctive Normal Form (CNF).
A CNF sentence is a disjunction of literals.
Literals: possibly negated propositions, variables, constants, or predicates.
So each sentence in the KB is a disjunction of literals, e.g.:
  P(x) ∨ Q(y)
  Dog(x) ∨ Cat(y)
  Big(x) ∨ Slow(x) ∨ Young(x)
The KB itself is a conjunction of disjunctions of literals, e.g.:
  (P(x) ∨ Q(y)) ∧ (Dog(x) ∨ Cat(y)) ∧ (Big(x) ∨ Slow(x) ∨ Young(x))

Another "canonical" form: INF
Implicative Normal Form (INF): each sentence in the KB is an implication with a conjunction of atoms on the left and a disjunction of atoms on the right, e.g.:
  P(x) ⇒ Q(x)
  P(x) ∧ Q(x) ⇒ R(x) ∨ S(x)
INF and CNF are logically equivalent.

Conversion to Conjunctive Normal Form (CNF)
1. Eliminate biconditionals: replace (P ⇔ Q) with (P ⇒ Q) ∧ (Q ⇒ P)
2. Eliminate implications: replace (P ⇒ Q) with (¬P ∨ Q)
3. Move ¬ inwards: ¬¬P ≡ P, ¬∀x P ≡ ∃x ¬P, ¬∃x P ≡ ∀x ¬P, ¬(P ∧ Q) ≡ (¬P ∨ ¬Q), ¬(P ∨ Q) ≡ (¬P ∧ ¬Q)
4. Standardize variables apart: ∀x P(x) ∨ ∃x Q(x) becomes ∀x1 P(x1) ∨ ∃x2 Q(x2) [give all variables different names]
5. Move quantifiers left in order: ∀x P(x) ∨ ∃y Q(y) becomes ∀x ∃y (P(x) ∨ Q(y))
6. Eliminate ∃ by "Skolemization" (coming)
7. Drop universal quantifiers
8. Distribute ∨ over ∧, e.g.: (P ∧ Q) ∨ R becomes (P ∨ R) ∧ (Q ∨ R) [What about (P ∨ Q) ∧ R?]
9. Flatten nesting: (P ∨ Q) ∨ R becomes P ∨ Q ∨ R
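
The propositional core of these steps is mechanical enough to run. Below is a minimal sketch, not the textbook's code: formulas are nested tuples (a bare string is an atom; ('not', p), ('and', p, q), ('or', p, q), ('implies', p, q), ('iff', p, q) build compound sentences), the function names are made up, and the quantifier steps (standardizing apart, Skolemization) are omitted.

```python
def eliminate_iff_implies(f):
    """Steps 1-2: rewrite <=> and => in terms of and/or/not."""
    if isinstance(f, str):
        return f
    op, *args = f
    args = [eliminate_iff_implies(a) for a in args]
    if op == 'iff':
        p, q = args
        return ('and', ('or', ('not', p), q), ('or', ('not', q), p))
    if op == 'implies':
        p, q = args
        return ('or', ('not', p), q)
    return (op, *args)

def move_not_inwards(f):
    """Step 3: push negations down to the literals (double negation, De Morgan)."""
    if isinstance(f, str):
        return f
    op, *args = f
    if op == 'not':
        (g,) = args
        if isinstance(g, str):
            return f                                  # already a literal
        gop, *gargs = g
        if gop == 'not':
            return move_not_inwards(gargs[0])         # ~~P  =>  P
        if gop == 'and':                              # ~(P and Q)  =>  ~P or ~Q
            return ('or', *[move_not_inwards(('not', a)) for a in gargs])
        if gop == 'or':                               # ~(P or Q)  =>  ~P and ~Q
            return ('and', *[move_not_inwards(('not', a)) for a in gargs])
    return (op, *[move_not_inwards(a) for a in args])

def distribute_or_over_and(f):
    """Step 8: (P and Q) or R  =>  (P or R) and (Q or R)."""
    if isinstance(f, str):
        return f
    op, *args = f
    args = [distribute_or_over_and(a) for a in args]
    if op == 'or':
        p, q = args                                   # binary connectives assumed
        for a, b in ((p, q), (q, p)):                 # conjunct on either side
            if isinstance(a, tuple) and a[0] == 'and':
                return ('and', *[distribute_or_over_and(('or', c, b)) for c in a[1:]])
    return (op, *args)

def to_cnf(f):
    return distribute_or_over_and(move_not_inwards(eliminate_iff_implies(f)))

# (P <=> Q) becomes (~P or Q) and (~Q or P):
print(to_cnf(('iff', 'P', 'Q')))
```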

Conversion to Implicative Normal Form (INF)
First convert to CNF. Then convert disjunctions to implications: negative literals go to the left, positive literals to the right:
  ¬P ∨ ¬Q ∨ R ∨ S becomes P ∧ Q ⇒ R ∨ S
  ¬P ∨ ¬Q becomes P ∧ Q ⇒ False
  R ∨ S becomes True ⇒ R ∨ S
Remember that P(x) is logically equivalent to True ⇒ P(x), and ¬P(x) is logically equivalent to P(x) ⇒ False.
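
The CNF-to-INF step is just sorting literals by sign. A small illustrative sketch, under the same assumed encoding as above (a clause is a list of literals; a negative literal is ('not', atom), a positive literal a plain string):

```python
def clause_to_inf(clause):
    """Render a CNF clause in implicative normal form."""
    premises    = [lit[1] for lit in clause if isinstance(lit, tuple)]  # ~P goes left
    conclusions = [lit for lit in clause if isinstance(lit, str)]       # P goes right
    lhs = ' and '.join(premises) or 'True'
    rhs = ' or '.join(conclusions) or 'False'
    return f'{lhs} => {rhs}'

print(clause_to_inf([('not', 'P'), ('not', 'Q'), 'R', 'S']))  # P and Q => R or S
print(clause_to_inf([('not', 'P'), ('not', 'Q')]))            # P and Q => False
print(clause_to_inf(['R', 'S']))                              # True => R or S
```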

Skolemization
To "Skolemize" is to remove existential quantifiers by elimination.
Existential elimination is straightforward when ∃x is on the outside:
  ∃x Sleepy(x) becomes Sleepy(RipVanWinkle) [If what?]
It is more complicated inside a universal quantifier:
  ∀x Person(x) ⇒ ∃y Heart(y) ∧ Has(x, y)   "Everyone has a heart"
Not: ∀x Person(x) ⇒ Heart(H1) ∧ Has(x, H1)   "Everyone has the heart H1"
Rather, introduce a "Skolem function" H(x):
  ∀x Person(x) ⇒ Heart(H(x)) ∧ Has(x, H(x))
where H() does not appear elsewhere in the KB.
Arguments of the Skolem function: all enclosing universally quantified variables.

Skolemization examples
By Existential Elimination, ∃y MajorsIn(y, Sociology) becomes MajorsIn(Somedude, Sociology).
But it's not true that ∀x ∃y MajorsIn(y, x) becomes ∀x MajorsIn(Somedude, x).
Rather, ∀x ∃y MajorsIn(y, x) becomes ∀x MajorsIn(P(x), x), where P(x) refers to a different constant for every x.
  ∀x ∃y Student(x) ∧ TakesCourses(x) ⇒ KnowsAbout(x, y)
becomes
  ∀x Student(x) ∧ TakesCourses(x) ⇒ KnowsAbout(x, S(x))
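
The rule "replace each existential variable by a new function of the enclosing universal variables" can be written down directly. A minimal sketch, assuming formulas already in negation normal form with variables standardized apart, atoms as tuples like ('Person', 'x'), and quantifiers as ('forall', var, body) / ('exists', var, body); all names are illustrative:

```python
import itertools

_fresh = itertools.count()

def substitute(f, var, term):
    """Replace every occurrence of variable `var` in f by `term`."""
    if f == var:
        return term
    if isinstance(f, tuple):
        return tuple(substitute(a, var, term) for a in f)
    return f

def skolemize(f, universals=()):
    """Replace each existential variable with a Skolem function of the
    enclosing universally quantified variables (a constant if there are none)."""
    op = f[0]
    if op == 'forall':
        _, var, body = f
        return ('forall', var, skolemize(body, universals + (var,)))
    if op == 'exists':
        _, var, body = f
        sk = ('Sk%d' % next(_fresh),) + universals    # Skolem term Sk_i(x1, ..., xn)
        return skolemize(substitute(body, var, sk), universals)
    if op in ('and', 'or', 'not'):
        return (op,) + tuple(skolemize(a, universals) for a in f[1:])
    return f                                          # an atom: nothing to do

# "Everyone has a heart", with => already eliminated:
# forall x. ~Person(x) or exists y. Heart(y) and Has(x, y)
f = ('forall', 'x', ('or', ('not', ('Person', 'x')),
                     ('exists', 'y', ('and', ('Heart', 'y'), ('Has', 'x', 'y')))))
print(skolemize(f))
# ('forall', 'x', ('or', ('not', ('Person', 'x')),
#                  ('and', ('Heart', ('Sk0', 'x')), ('Has', 'x', ('Sk0', 'x')))))
```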

Simple examples
How to put these into CNF and INF?

  Sentence          CNF                 INF
  P(x) ⇒ Q(x)       ¬P(x) ∨ Q(x)        P(x) ⇒ Q(x)
  P(x)              P(x)                True ⇒ P(x)
  P(x) ∨ Q(x)       P(x) ∨ Q(x)         True ⇒ P(x) ∨ Q(x)
  P(x) ∨ Q(y)       P(x) ∨ Q(y)         True ⇒ P(x) ∨ Q(y)
  P(x) ∨ ¬Q(y)      P(x) ∨ ¬Q(y)        Q(y) ⇒ P(x)
  P(x) ∧ Q(x)       P(x), Q(x)          True ⇒ P(x), True ⇒ Q(x)

In practice, we can leave out the "True ⇒" in these sentences.

Moving ¬ inwards
¬∀x S(x) (is equivalent to…) ∃x ¬S(x)
¬∃x Hungry(x) ≡ ∀x ¬Hungry(x)
¬(∀x S(x) ∨ ∃x Hungry(x)) ≡ ∃x ¬S(x) ∧ ∀x ¬Hungry(x)
¬(P ∧ Q) ≡ (¬P ∨ ¬Q)
¬(P ∨ Q) ≡ (¬P ∧ ¬Q) [In CNF that's two sentences!]

Generalized (first-order) resolution
Disjunctions: for literals pi and qi, where UNIFY(pj, ¬qk) = θ.
Note: pj and qk do not unify; rather, pj and ¬qk unify.
Examples:
  pj: Engineer(x)          ¬qk: Engineer(Bill)    θ = { x/Bill }
  pj: Loves(x, Broccoli)   ¬qk: Loves(Joe, y)     θ = { x/Joe, y/Broccoli }
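
UNIFY itself is the usual recursive algorithm. A sketch in the AIMA style, with assumed conventions (lowercase strings are variables, capitalized strings are constants, tuples are compound terms or atoms); it returns a substitution as a dict, in "triangular" form where bindings may mention other bound variables:

```python
def is_var(t):
    return isinstance(t, str) and t[:1].islower()

def unify(x, y, theta=None):
    """Most general unifier of x and y as a dict {var: term}, or None."""
    if theta is None:
        theta = {}
    if x == y:
        return theta
    if is_var(x):
        return unify_var(x, y, theta)
    if is_var(y):
        return unify_var(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
        for a, b in zip(x, y):
            theta = unify(a, b, theta)
            if theta is None:
                return None
        return theta
    return None

def unify_var(var, t, theta):
    if var in theta:
        return unify(theta[var], t, theta)
    if is_var(t) and t in theta:
        return unify(var, theta[t], theta)
    if occurs_in(var, t, theta):
        return None                     # occur check: x vs. Squared(x) must fail
    return {**theta, var: t}

def occurs_in(var, t, theta):
    if var == t:
        return True
    if is_var(t) and t in theta:
        return occurs_in(var, theta[t], theta)
    if isinstance(t, tuple):
        return any(occurs_in(var, a, theta) for a in t)
    return False

print(unify(('Engineer', 'x'), ('Engineer', 'Bill')))            # {'x': 'Bill'}
print(unify(('Loves', 'x', 'Broccoli'), ('Loves', 'Joe', 'y')))  # {'x': 'Joe', 'y': 'Broccoli'}
print(unify(('GreaterThan', 'x', 7), ('GreaterThan', ('Squared', 'y'), 'y')))
# {'x': ('Squared', 'y'), 'y': 7} -- resolving the chain gives x = Squared(7)
```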

Generalized resolution example
KB:
  Rich(x) ⇒ Famous(x)
  Rich(x) ⇒ Content(x)
  Famous(x) ⇒ Happy(x)
  Content(x) ⇒ Happy(x)
  Rich(Bob)
Is Bob happy? ASK(KB, Happy(Bob))
GMP can't answer this; GR can. First put the KB in CNF.
KB in CNF:
  ¬Rich(x) ∨ Famous(x)
  ¬Rich(x) ∨ Content(x)
  ¬Famous(x) ∨ Happy(x)
  ¬Content(x) ∨ Happy(x)
  Rich(Bob)
and the negated query: ¬Happy(Bob)
What's this method called? Proof by contradiction (refutation).

Example (cont.)
Resolving from the CNF clauses above:
  ¬Rich(x) ∨ Content(x) and ¬Content(x) ∨ Happy(x) resolve, θ = { }, giving ¬Rich(x) ∨ Happy(x)
  ¬Rich(x) ∨ Happy(x) and Rich(Bob) resolve, θ = {x/Bob}, giving Happy(Bob)
  Happy(Bob) and ¬Happy(Bob) resolve, θ = { }, giving False
QED
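
The whole refutation can be reproduced mechanically. A tiny sketch of resolution by refutation, with the KB propositionalized under {x/Bob} so that plain propositional resolution suffices; clauses are frozensets of literal strings, with a leading '~' marking negation (encoding and names are illustrative):

```python
def negate(lit):
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolve(c1, c2):
    """All resolvents of two clauses (cancel one complementary pair)."""
    return [(c1 - {lit}) | (c2 - {negate(lit)}) for lit in c1 if negate(lit) in c2]

def refute(clauses):
    """True iff the clause set is unsatisfiable, i.e. the empty clause is derivable."""
    clauses = set(clauses)
    while True:
        new = set()
        for a in list(clauses):
            for b in list(clauses):
                if a == b:
                    continue
                for r in resolve(a, b):
                    if not r:
                        return True      # derived the empty clause: contradiction
                    new.add(frozenset(r))
        if new <= clauses:
            return False                 # nothing new: query not entailed
        clauses |= new

kb_and_negated_query = [frozenset(c) for c in (
    {'~Rich(Bob)', 'Famous(Bob)'},
    {'~Rich(Bob)', 'Content(Bob)'},
    {'~Famous(Bob)', 'Happy(Bob)'},
    {'~Content(Bob)', 'Happy(Bob)'},
    {'Rich(Bob)'},
    {'~Happy(Bob)'},                     # negated query
)]
print(refute(kb_and_negated_query))      # True: the KB entails Happy(Bob)
```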

Forward and Backward Chaining
A reasoning program needs:
- A language for representing knowledge – First-Order Logic
- Rules – just one, Generalized Resolution
- A control mechanism – ???
Two general approaches to control:
- Start with the KB and generate new conclusions (which can enable more inferences to be made), e.g., when a new fact is added to the KB
- Given what we want to prove, find implication sentences that would allow us to conclude it, and attempt to establish their premises, e.g., when a goal is to be proved

[Diagram: forward chaining works from the start S toward the goal G; backward chaining works from the goal G back toward S]

Forward and Backward Chaining
Forward chaining: data driven, or data directed
- A new version of TELL(KB, p): add the sentence p, then apply inference rules to the updated KB until no more rules apply ("chaining", as in "chain reaction")
Backward chaining: goal oriented
- ASK(KB, q)
- If SUBST(θ, q) is in KB, return q' = SUBST(θ, q)
- Else, find implication sentences p ⇒ q, then set p as a subgoal
- Keep doing this, working "backwards": if p is not in KB, look for r ⇒ p, then set r as a subgoal, etc.
Backward chaining is the basis for logic programming (e.g., Prolog). A sketch of both control regimes follows.
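
Both regimes are easy to see in miniature for propositional definite clauses (rules of the form premises ⇒ conclusion, plus atomic facts). A minimal sketch; the rule set and names are made up for illustration:

```python
def forward_chain(rules, facts, query):
    """Data-directed: fire rules until nothing new is derivable."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if conclusion not in facts and all(p in facts for p in premises):
                facts.add(conclusion)    # a new fact may enable further rules
                changed = True
    return query in facts

def backward_chain(rules, facts, goal, seen=()):
    """Goal-directed: to prove `goal`, find a rule concluding it and
    recursively establish that rule's premises as subgoals."""
    if goal in facts:
        return True
    if goal in seen:                     # guard against cyclic rule chains
        return False
    return any(all(backward_chain(rules, facts, p, seen + (goal,)) for p in premises)
               for premises, conclusion in rules if conclusion == goal)

rules = [(['Rich(Bob)'], 'Content(Bob)'),
         (['Content(Bob)'], 'Happy(Bob)')]
print(forward_chain(rules, {'Rich(Bob)'}, 'Happy(Bob)'))   # True
print(backward_chain(rules, {'Rich(Bob)'}, 'Happy(Bob)'))  # True
```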

Midterm Review
Tuesday during class. Please be on time!
Closed book exam:
- You may bring one sheet of paper with notes (8.5" x 11", both sides)
- Test paper will be supplied (you don't need to bring your own)
- A calculator is not necessary
What are you responsible for?
- Possibly anything that has been covered in the reading (textbook through Ch. 8 and articles), lectures, discussion sessions, and homework assignments
- But you should probably focus on the lectures…

You will be given…
Inference rules for:
- Propositional logic: modus ponens, and-elimination, and-introduction, or-introduction, double-negation elimination, unit resolution, resolution
- FOL: universal elimination, existential elimination, existential introduction, generalized modus ponens, generalized (first-order) resolution
The procedure to convert logic sentences to CNF
These are posted on the course web.

Examples of things you might be asked
- Which search algorithm(s) would be most appropriate in this particular situation?
- What are the main difference(s) between these two search algorithms?
- What does it mean for a search algorithm to be optimal? To be complete?
- Are these heuristics admissible or not?
- Do {iterative deepening search, A*, minimax, expectimax, …} on this problem.
- Show the truth table for this propositional logic sentence.
- Are these logic sentences satisfiable, unsatisfiable, or valid?
- What does it mean for an inference procedure to be optimal? To be complete?
- Does the KB entail this sentence S?
- How are these terms unified?
- Apply certain inference rules to this KB.