ITCS 3153 Artificial Intelligence, Lecture 15: First-Order Logic (Chapter 9)



Forward Chaining
Remember this from propositional logic?
– Start with atomic sentences in the KB
– Apply Modus Ponens: add new sentences to the KB, and stop when no new sentences can be derived
– Hopefully the sentence you are looking for appears among the generated sentences
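The loop above can be sketched for propositional Horn clauses. This is a minimal illustration, not the textbook's pseudocode; the function name, rule encoding, and symbols are made up:

```python
# Propositional forward chaining over Horn clauses.
# A rule is (premises, conclusion); facts and the query are symbol strings.
def pl_forward_chain(kb_facts, kb_rules, query):
    known = set(kb_facts)
    changed = True
    while changed:                       # repeat until no new sentences
        changed = False
        for premises, conclusion in kb_rules:
            if conclusion not in known and all(p in known for p in premises):
                known.add(conclusion)    # Modus Ponens adds a new atom
                changed = True
    return query in known

rules = [(["P", "Q"], "R"), (["R"], "S")]
print(pl_forward_chain(["P", "Q"], rules, "S"))  # True
```

Note that the loop only terminates because the set of propositional symbols is finite; lifting this to first-order logic is what the rest of the lecture addresses.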

Lifting forward chaining
First-order definite clauses (all sentences are written this way to simplify processing):
– a disjunction of literals of which exactly one is positive
– equivalently, a clause is either atomic or an implication whose antecedent is a conjunction of positive literals and whose consequent is a single positive literal

Example
The law says it is a crime for an American to sell weapons to hostile nations. The country Nono, an enemy of America, has some missiles, and all of its missiles were sold to it by Colonel West, who is American.
We will prove that West is a criminal.

Example
It is a crime for an American to sell weapons to hostile nations
Nono… has some missiles
– Owns(Nono, M1)
– Missile(M1)
All of its missiles were sold to it by Colonel West

Example
We also need to know that missiles are weapons
and we must know that an enemy of America counts as “hostile”
“West, who is American”
The country Nono, an enemy of America
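Assembling the English statements above, the standard formalization of this knowledge base as first-order definite clauses is:

```latex
American(x) \land Weapon(y) \land Sells(x,y,z) \land Hostile(z) \Rightarrow Criminal(x) \\
Missile(x) \land Owns(Nono, x) \Rightarrow Sells(West, x, Nono) \\
Missile(x) \Rightarrow Weapon(x) \\
Enemy(x, America) \Rightarrow Hostile(x) \\
Owns(Nono, M_1) \qquad Missile(M_1) \qquad American(West) \qquad Enemy(Nono, America)
```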

Forward chaining
Starting from the facts:
– find all rules with satisfied premises
– add their conclusions to the known facts
– repeat until the query is answered or no new facts are added

First iteration of forward chaining
Look at the implication sentences first; their premises must be satisfied by known facts.
We can satisfy Missile(x) ^ Owns(Nono, x) => Sells(West, x, Nono)
– by substituting {x/M1}
– and adding Sells(West, M1, Nono) to the KB

First iteration of forward chaining
We can satisfy Missile(x) => Weapon(x)
– with {x/M1}
– and Weapon(M1) is added
We can satisfy Enemy(x, America) => Hostile(x)
– with {x/Nono}
– and Hostile(Nono) is added

Second iteration of forward chaining
We can satisfy the crime rule
– with {x/West, y/M1, z/Nono}
– and Criminal(West) is added
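The iterations above can be reproduced with a small forward chainer. This is a sketch, not the textbook's FOL-FC-ASK: the tuple encoding of atoms and the names `forward_chain` and `unify` are my own.

```python
# Atoms are tuples like ("Missile", "M1"); variables are lowercase strings.
def is_var(t):
    return isinstance(t, str) and t[0].islower()

def unify(pattern, fact, subst):
    """Try to extend subst so pattern matches the ground fact; None on failure."""
    if len(pattern) != len(fact):
        return None
    subst = dict(subst)
    for p, f in zip(pattern, fact):
        p = subst.get(p, p)
        if is_var(p):
            subst[p] = f
        elif p != f:
            return None
    return subst

def substitute(atom, subst):
    return tuple(subst.get(t, t) for t in atom)

def forward_chain(facts, rules):
    """Naive forward chaining for function-free first-order definite clauses."""
    facts = set(facts)
    while True:
        new = set()
        for premises, conclusion in rules:
            # find every substitution that satisfies all premises at once
            substs = [{}]
            for prem in premises:
                substs = [s2 for s in substs for f in facts
                          if (s2 := unify(prem, f, s)) is not None]
            for s in substs:
                c = substitute(conclusion, s)
                if c not in facts:
                    new.add(c)
        if not new:            # fixed point reached
            return facts
        facts |= new

rules = [
    ([("American","x"), ("Weapon","y"), ("Sells","x","y","z"), ("Hostile","z")],
     ("Criminal","x")),
    ([("Missile","x"), ("Owns","Nono","x")], ("Sells","West","x","Nono")),
    ([("Missile","x")], ("Weapon","x")),
    ([("Enemy","x","America")], ("Hostile","x")),
]
facts = [("Owns","Nono","M1"), ("Missile","M1"),
         ("American","West"), ("Enemy","Nono","America")]

result = forward_chain(facts, rules)
print(("Criminal", "West") in result)  # True
```

As on the slides, Sells(West, M1, Nono), Weapon(M1), and Hostile(Nono) appear on the first pass, and Criminal(West) on the second.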

Analyze this algorithm
Sound?
– Does it derive only sentences that are entailed?
– Yes, because only Modus Ponens is used, and it is sound.
Complete?
– Does it answer every query whose answer is entailed by the KB?
– Yes, if the clauses are definite clauses.

Proving completeness
Assume the KB contains only sentences with no function symbols.
What is the maximum number of iterations through the algorithm? It depends on the number of facts that can be added:
– let k be the arity, the maximum number of arguments of any predicate
– let p be the number of predicates
– let n be the number of constant symbols
There are at most p·n^k distinct ground facts, so a fixed point is reached after at most that many iterations.
A proof by contradiction then shows that the final KB is complete.
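As a worked instance of the bound (the numbers are illustrative, not from the slides):

```python
# Upper bound on distinct ground facts: p * n**k
p, n, k = 4, 5, 3       # e.g. 4 predicates, 5 constants, maximum arity 3
bound = p * n**k
print(bound)            # 500: forward chaining must reach a fixed point
                        # within 500 iterations on such a KB
```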

Complexity of this algorithm
Three sources of complexity:
– the inner loop requires finding all unifiers such that the premise of a rule unifies with facts in the database; this “pattern matching” is expensive
– every rule must be rechecked on every iteration to see whether its premises are satisfied
– many facts are generated that are irrelevant to the goal

Pattern matching
Conjunct ordering: Missile(x) ^ Owns(Nono, x) => Sells(West, x, Nono)
– Look at all items owned by Nono, call them X; for each element x in X, check whether it is a missile
– Or: look for all missiles, call them X; for each element x in X, check whether it is owned by Nono
Finding the optimal ordering is NP-hard, similar to the matrix-chain multiplication ordering problem.
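Why the ordering matters can be made concrete. The data below is hypothetical (the slides give no sizes): suppose Nono owns 3 items while the KB knows 10,000 missiles.

```python
# Hypothetical KB sizes for the conjunct-ordering comparison.
owns_nono = ["M1", "Car1", "Boat1"]
missiles = {f"M{i}" for i in range(1, 10001)}    # M1 .. M10000

# Ordering 1: enumerate Owns(Nono, x) first, then filter by Missile(x)
checks_owns_first = len(owns_nono)               # 3 membership checks

# Ordering 2: enumerate Missile(x) first, then filter by Owns(Nono, x)
checks_missiles_first = len(missiles)            # 10000 membership checks

sells = [x for x in owns_nono if x in missiles]  # bindings for Sells(West, x, Nono)
print(sells, checks_owns_first, checks_missiles_first)
```

Both orderings find the same single binding {x/M1}, but the first examines 3 candidates and the second 10,000.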

Incremental forward chaining
Pointless (redundant) repetition:
– some rules generate new information; this information may permit unification of existing rules
– some rules generate preexisting information; we need not revisit the unification of the existing rules
Key observation: every new fact inferred on iteration t must be derived from at least one new fact inferred on iteration t−1.

Irrelevant facts
Some facts are irrelevant and waste computation in the forward-chaining algorithm.
What if the Nono example included lots of facts about food preferences?
– they are not related to the conclusions drawn about the sale of weapons
– how can we eliminate them? Backward chaining is one way

Magic Set
Rewriting the rule set (sounds dangerous!)
Add elements to premises that restrict the candidates that will match:
– the added elements are based on the desired goal
– let goal = Criminal(West)
– Magic(x) ^ American(x) ^ Weapon(y) ^ Sells(x, y, z) ^ Hostile(z) => Criminal(x)
– add Magic(West) to the knowledge base

Backward Chaining
Start with the premises of the goal:
– each premise must be supported by the KB
– start with the first premise and look for support from the KB: look for clauses with a head that matches the premise; the head's premises must then in turn be supported by the KB
A recursive, depth-first algorithm:
– suffers from repetition and incompleteness
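The recursive, depth-first search can be sketched on the same crime KB. This is my own minimal encoding, not the textbook's FOL-BC-ASK; note that each rule must be standardized apart (given fresh variables) before use.

```python
# Atoms are tuples; variables are lowercase strings, constants capitalized.
def is_var(t):
    return isinstance(t, str) and t[0].islower()

def walk(t, subst):
    """Follow variable bindings to their current value."""
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst):
    if len(a) != len(b):
        return None
    subst = dict(subst)
    for x, y in zip(a, b):
        x, y = walk(x, subst), walk(y, subst)
        if x == y:
            continue
        if is_var(x):
            subst[x] = y
        elif is_var(y):
            subst[y] = x
        else:
            return None
    return subst

counter = [0]
def rename(rule):
    """Standardize apart: give the rule fresh variable names."""
    counter[0] += 1
    n = counter[0]
    def r(atom):
        return tuple(f"{t}{n}" if is_var(t) else t for t in atom)
    premises, conclusion = rule
    return [r(p) for p in premises], r(conclusion)

def backward_chain(goal, subst, facts, rules, depth=0):
    """Depth-first: yield each substitution that proves goal."""
    if depth > 20:                        # crude guard: plain BC can loop forever
        return
    for fact in facts:                    # a ground fact may support the goal...
        s = unify(goal, fact, subst)
        if s is not None:
            yield s
    for rule in rules:                    # ...or a rule whose head matches it
        premises, conclusion = rename(rule)
        s = unify(conclusion, goal, subst)
        if s is None:
            continue
        def prove_all(prems, s):          # prove the premises left to right
            if not prems:
                yield s
            else:
                g = tuple(walk(t, s) for t in prems[0])
                for s2 in backward_chain(g, s, facts, rules, depth + 1):
                    yield from prove_all(prems[1:], s2)
        yield from prove_all(premises, s)

rules = [
    ([("American","x"), ("Weapon","y"), ("Sells","x","y","z"), ("Hostile","z")],
     ("Criminal","x")),
    ([("Missile","x"), ("Owns","Nono","x")], ("Sells","West","x","Nono")),
    ([("Missile","x")], ("Weapon","x")),
    ([("Enemy","x","America")], ("Hostile","x")),
]
facts = [("Owns","Nono","M1"), ("Missile","M1"),
         ("American","West"), ("Enemy","Nono","America")]

print(any(backward_chain(("Criminal", "West"), {}, facts, rules)))  # True
```

The depth cutoff is the crude fix for the incompleteness mentioned above: without it, a recursive rule could send the depth-first search down an infinite branch.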

Resolution
We saw earlier that resolution is a complete algorithm for refuting statements.
First-order sentences must first be put into conjunctive normal form:
– a conjunction of clauses, each of which is a disjunction of literals
– literals can contain variables, which are assumed to be universally quantified

First-order CNF
For all x, American(x) ^ Weapon(y) ^ Sells(x, y, z) ^ Hostile(z) => Criminal(x)
becomes
~American(x) V ~Weapon(y) V ~Sells(x, y, z) V ~Hostile(z) V Criminal(x)
Every sentence of first-order logic can be converted into an inferentially equivalent CNF sentence (the two are unsatisfiable under the same conditions).

Example Everyone who loves all animals is loved by someone

Example

F and G are Skolem functions
– the arguments of the function are the universally quantified variables in whose scope the existential quantifier appears

Example
Two clauses:
– F(x) refers to the animal potentially unloved by x
– G(x) refers to someone who might love x
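The derivation these slides walk through (the originals showed it as images) is the standard textbook conversion of "Everyone who loves all animals is loved by someone" to CNF:

```latex
% Original sentence:
\forall x\; [\forall y\; Animal(y) \Rightarrow Loves(x,y)] \Rightarrow [\exists y\; Loves(y,x)]

% 1. Eliminate implications:
\forall x\; [\neg \forall y\; (\neg Animal(y) \lor Loves(x,y))] \lor [\exists y\; Loves(y,x)]

% 2. Move negation inwards:
\forall x\; [\exists y\; (Animal(y) \land \neg Loves(x,y))] \lor [\exists y\; Loves(y,x)]

% 3. Skolemize (F and G are Skolem functions of x) and drop universals:
[Animal(F(x)) \land \neg Loves(x,F(x))] \lor Loves(G(x),x)

% 4. Distribute \lor over \land, giving the two clauses:
[Animal(F(x)) \lor Loves(G(x),x)] \land [\neg Loves(x,F(x)) \lor Loves(G(x),x)]
```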

Resolution inference rule
A lifted version of the propositional resolution rule:
– the two clauses must be standardized apart (no variables shared)
– they can be resolved if their literals are complementary: one unifies with the negation of the other
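A single lifted resolution step can be sketched in code. This is an illustration under my own encoding (literals as sign/atom pairs): resolving the crime rule's CNF clause against the negated query ~Criminal(West) yields the remaining premises instantiated with {x/West}.

```python
# Literals are (sign, atom) pairs; atoms are tuples; variables lowercase.
def is_var(t):
    return isinstance(t, str) and t[0].islower()

def unify(a, b, subst):
    if len(a) != len(b):
        return None
    subst = dict(subst)
    for x, y in zip(a, b):
        x, y = subst.get(x, x), subst.get(y, y)
        if x == y:
            continue
        if is_var(x):
            subst[x] = y
        elif is_var(y):
            subst[y] = x
        else:
            return None
    return subst

def substitute(literal, s):
    sign, atom = literal
    return (sign, tuple(s.get(t, t) for t in atom))

def resolve(c1, c2):
    """Yield every resolvent of clauses c1, c2 (assumed standardized apart)."""
    for l1 in c1:
        for l2 in c2:
            if l1[0] != l2[0]:                    # complementary signs...
                s = unify(l1[1], l2[1], {})       # ...and unifiable atoms
                if s is not None:
                    rest = [l for l in c1 if l != l1] + [l for l in c2 if l != l2]
                    yield frozenset(substitute(l, s) for l in rest)

# ~American(x) V ~Weapon(y) V ~Sells(x,y,z) V ~Hostile(z) V Criminal(x)
crime = frozenset({(False, ("American","x")), (False, ("Weapon","y")),
                   (False, ("Sells","x","y","z")), (False, ("Hostile","z")),
                   (True,  ("Criminal","x"))})
neg_goal = frozenset({(False, ("Criminal","West"))})   # the negated query

for resolvent in resolve(crime, neg_goal):
    print(sorted(resolvent))
```

Refutation would continue by resolving the resolvent's literals away against the rest of the KB until the empty clause is derived.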

Resolution