Artificial Intelligence. Chapter 9: Inference in First-Order Logic. Michael Scherger, Department of Computer Science, Kent State University. (March 14, 2006)

Contents: Reducing first-order inference to propositional inference; Unification; Generalized Modus Ponens; Forward and Backward Chaining; Logic Programming; Resolution.

Necessary Algorithms. We already know enough to implement TELL (although maybe not efficiently). But how do we implement ASK? Recall the 3 cases: – direct matching – finding a proof (inference) – finding a set of bindings (unification)

Inference with Quantifiers. Universal Instantiation: – Given ∀x person(x) ⇒ likes(x, McDonalds) – Infer person(John) ⇒ likes(John, McDonalds). Existential Instantiation: – Given ∃x likes(x, McDonalds) – Infer likes(S1, McDonalds) – S1 is a “Skolem constant” that is not found anywhere else in the KB and refers to (one of) the individuals that like McDonalds.

Universal Instantiation. Every instantiation of a universally quantified sentence is entailed by it: for any variable v and ground term g (a ground term is a term without variables), from ∀v α we can infer Subst({v/g}, α). Example: ∀x King(x) ∧ Greedy(x) ⇒ Evil(x) yields King(John) ∧ Greedy(John) ⇒ Evil(John), King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard), King(Father(John)) ∧ Greedy(Father(John)) ⇒ Evil(Father(John)), …
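To make the substitution step concrete, here is a minimal sketch added for this writeup (not from the original slides); it assumes sentences are represented as nested tuples, with lowercase strings standing for variables:

def subst(theta, term):
    # Apply the binding dictionary theta throughout a term / sentence tree.
    if isinstance(term, tuple):
        return tuple(subst(theta, t) for t in term)
    return theta.get(term, term)

rule = ('=>', ('and', ('King', 'x'), ('Greedy', 'x')), ('Evil', 'x'))
# Universal Instantiation with the ground term Father(John):
print(subst({'x': ('Father', 'John')}, rule))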

Existential Instantiation. For any sentence α, variable v, and constant symbol k that does not appear elsewhere in the KB, from ∃v α we can infer Subst({v/k}, α). Example: ∃x Crown(x) ∧ OnHead(x, John) yields Crown(C1) ∧ OnHead(C1, John), provided C1 is a new (Skolem) constant.

Existential Instantiation. UI can be applied several times to add new sentences – the new KB is logically equivalent to the old one. EI can be applied once to replace the existential sentence – the new KB is not equivalent to the old one, but it is satisfiable iff the old KB was satisfiable.

Reduction to Propositional Inference. Use the instantiation rules to create relevant propositional facts in the KB, then use propositional reasoning. Problems: – it may run forever, generating ever more sentences, when the query is not entailed – it will generate many irrelevant sentences along the way!

Reduction to Propositional Inference. Suppose the KB contains the following sentences: ∀x King(x) ∧ Greedy(x) ⇒ Evil(x); King(John); Greedy(John); Brother(Richard, John).

Reduction to Propositional Inference. Instantiating the universal sentence in all possible ways gives: King(John) ∧ Greedy(John) ⇒ Evil(John); King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard); King(John); Greedy(John); Brother(Richard, John). The new KB is propositionalized: the propositional symbols are King(John), Greedy(John), Evil(John), King(Richard), etc.

Problems with Propositionalization. Propositionalization tends to generate lots of irrelevant sentences. Example: ∀x King(x) ∧ Greedy(x) ⇒ Evil(x); King(John); ∀y Greedy(y); Brother(Richard, John). It is obvious that Evil(John) is true, but the fact Greedy(Richard) is irrelevant. With p k-ary predicates and n constants, there are p · n^k instantiations (for example, 10 binary predicates and 100 constants already give 10 · 100² = 100,000).

Unification. Unification: the process of finding substitutions that make different logical expressions look identical. This is a recursive algorithm – see the text and the online source code for details.

Unification. We can get the inference immediately if we can find a substitution θ such that King(x) and Greedy(x) match King(John) and Greedy(y). θ = {x/John, y/John} works. Unify(α, β) = θ if αθ = βθ.

Unification (the table of example unifications is omitted from the transcript).

Generalized Modus Ponens. This is a general inference rule for FOL that does not require instantiation. Given: p1′, p2′, …, pn′ and (p1 ∧ … ∧ pn) ⇒ q, where Subst(θ, pi′) = Subst(θ, pi) for all i. Conclude: Subst(θ, q).

GMP in “CS terms”. Given a rule containing variables: if there is a consistent set of bindings for all of the variables on the left side of the rule (before the arrow), then you can derive the result of substituting those same bindings into the right side of the rule.

GMP Example. ∀x,y,z Parent(x,y) ∧ Parent(y,z) ⇒ GrandParent(x,z). Parent(James, John), Parent(James, Richard), Parent(Harry, James). We can derive: – GrandParent(Harry, John), bindings: ((x Harry) (y James) (z John)) – GrandParent(Harry, Richard), bindings: ((x Harry) (y James) (z Richard))
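As a quick illustrative sketch of that last step (not code from the lecture), applying a binding set to the rule's conclusion produces the derived fact:

theta = {'x': 'Harry', 'y': 'James', 'z': 'John'}
conclusion = ('GrandParent', 'x', 'z')
# Substitute the bindings into the conclusion GrandParent(x, z):
derived = tuple(theta.get(t, t) for t in conclusion)
print(derived)   # ('GrandParent', 'Harry', 'John')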

Base Cases for Unification. If the two expressions are identical, the result is (NIL) (success, with an empty unifier set). If the two expressions are different constants, the result is NIL (failure). If one expression is a variable x and x is not contained in the other expression, the result is ((x other-exp)).

Recursive Case. If both expressions are lists, combine the result of unifying the CAR with the result of unifying the CDR. In Lisp: (cons (unify (car list1) (car list2)) (unify (cdr list1) (cdr list2)))

A few more details… Don’t reuse variable names – before actually unifying, give each rule a separate set of variables – the Lisp function gentemp creates uniquely numbered variables. Keep track of bindings – if a variable is already bound to something, it must retain the same value throughout the computation – this requires substituting each successful binding in the remainder of the expression.
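Putting the base and recursive cases together, here is a compact sketch in Python rather than Lisp (a presentation choice for this writeup, not the lecture's code); terms are nested tuples and lowercase strings are variables, as in the substitution sketch above, and the occurs check is omitted for brevity:

def is_var(t):
    # Convention assumed here: lowercase strings are variables.
    return isinstance(t, str) and t[:1].islower()

def unify(x, y, theta):
    # Return a binding dictionary that makes x and y identical, or None.
    if theta is None:
        return None
    if x == y:
        return theta
    if is_var(x):
        return unify_var(x, y, theta)
    if is_var(y):
        return unify_var(y, x, theta)
    if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y) and x:
        return unify(x[1:], y[1:], unify(x[0], y[0], theta))
    return None

def unify_var(v, x, theta):
    if v in theta:
        return unify(theta[v], x, theta)
    if is_var(x) and x in theta:
        return unify(v, theta[x], theta)
    return {**theta, v: x}   # extend the bindings (no occurs check here)

print(unify(('Knows', 'John', 'x'), ('Knows', 'John', 'Jane'), {}))  # {'x': 'Jane'}
print(unify(('Knows', 'John', 'x'), ('Knows', 'y', 'Bill'), {}))     # {'y': 'John', 'x': 'Bill'}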

Storage and retrieval. Most systems don’t use variables on predicates. Therefore, hash statements by predicate for quick retrieval (predicate indexing). A subsumption lattice can be used for further efficiency (see p. 279).

Forward Chaining. Start with the atomic sentences in the KB and apply Modus Ponens in the forward direction, adding new atomic sentences, until no further inferences can be made.

Forward Chaining. Given a new fact, generate all consequences. Assumes all rules are of the form C1 and C2 and C3 and … --> Result. Each rule and binding generates a new fact, and this new fact will “trigger” other rules. Keep going until the desired fact is generated. (Semi-decidable, as is FOL inference in general.)

FC: Example Knowledge Base. The law says that it is a crime for an American to sell weapons to hostile nations. The country Nono, an enemy of America, has some missiles, and all of its missiles were sold to it by Col. West, who is an American. Prove that Col. West is a criminal.

FC: Example Knowledge Base. …it is a crime for an American to sell weapons to hostile nations: American(x) ∧ Weapon(y) ∧ Sells(x,y,z) ∧ Hostile(z) ⇒ Criminal(x). Nono…has some missiles: ∃x Owns(Nono, x) ∧ Missile(x), so Owns(Nono, M1) and Missile(M1). …all of its missiles were sold to it by Col. West: Missile(x) ∧ Owns(Nono, x) ⇒ Sells(West, x, Nono). Missiles are weapons: Missile(x) ⇒ Weapon(x).

FC: Example Knowledge Base. An enemy of America counts as “hostile”: Enemy(x, America) ⇒ Hostile(x). Col. West, who is an American: American(West). The country Nono, an enemy of America: Enemy(Nono, America).
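A minimal forward-chaining sketch over this KB, written here with the rules already instantiated to ground facts to keep the code short (an assumption of this illustration; real first-order forward chaining unifies rule premises against facts, as in the unification sketch above):

def forward_chain(facts, rules, goal):
    # rules: list of (premises, conclusion) pairs over ground atoms.
    facts = set(facts)
    while True:
        new = {concl for prem, concl in rules
               if prem <= facts and concl not in facts}
        if goal in facts:
            return True
        if not new:
            return False
        facts |= new

rules = [({'American(West)', 'Weapon(M1)', 'Sells(West,M1,Nono)', 'Hostile(Nono)'},
          'Criminal(West)'),
         ({'Missile(M1)', 'Owns(Nono,M1)'}, 'Sells(West,M1,Nono)'),
         ({'Missile(M1)'}, 'Weapon(M1)'),
         ({'Enemy(Nono,America)'}, 'Hostile(Nono)')]
facts = {'American(West)', 'Missile(M1)', 'Owns(Nono,M1)', 'Enemy(Nono,America)'}
print(forward_chain(facts, rules, 'Criminal(West)'))   # True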

FC: Example Knowledge Base (the forward-chaining inference graph for this example is built up across three figure slides, omitted from the transcript).

Efficient Forward Chaining. Order conjuncts appropriately – e.g. most constrained variable first. Don’t generate redundant facts; each new fact should depend on at least one newly generated fact. – Production systems – RETE matching – CLIPS

Forward Chaining Algorithm (pseudocode figure omitted from the transcript).

OPS. Facts – type, attributes, and values – e.g. (goal put-on yellow-block red-block). Rules – if conditions, then action – variables (e.g. <x>, <y>) can be bound – If (goal put-on <x> <y>) AND (clear <y>) THEN add (goal clear <x>)

RETE Network. Based only on the left sides (conditions) of rules. Each condition (test) appears once in the network. Tests combined with “AND” are connected by “JOIN” nodes – a join means all tests must succeed with the same bindings.

Example Rules. If (goal put-on <x> <y>) AND (clear <x>) AND (clear <y>) THEN add (on <x> <y>), delete (clear <y>). If (goal clear <x>) AND (on <y> <x>) AND (clear <y>) THEN add (clear <x>), add (on <y> table), delete (on <y> <x>). If (goal clear <x>) AND (on <y> <x>) THEN add (goal clear <y>). If (goal put-on <x> <y>) AND (clear <y>) THEN add (goal clear <x>). If (goal put-on <x> <y>) AND (clear <x>) THEN add (goal clear <y>).

RETE Network (network diagram omitted from the transcript).

Using the RETE Network. Each time a fact comes in: – update the bindings for the relevant node(s) – update the join(s) below those bindings – note any newly satisfied rules. Each processing cycle: – choose one satisfied rule to fire.

Example (Facts). 1. (goal put-on yellow-block red-block) 2. (on blue-block yellow-block) 3. (on yellow-block table) 4. (on red-block table) 5. (clear red-block) 6. (clear blue-block)

Why RETE is Efficient. Rules are “pre-compiled”. Facts are dealt with as they come in – only rules connected to a matching node are considered – once a test fails, no nodes below it are considered – similar rules share structure. In a typical system, when rules “fire”, new facts are created / deleted incrementally – this incrementally adds rules (with bindings) to, or deletes them from, the conflict set.

CLIPS. CLIPS is another forward-chaining production system. Important commands: – (assert fact) and (deffacts fact1 fact2 …) – (defrule rule-name rule) – (reset), which eliminates all facts except “initial-fact” – (load file) and (load-facts file) – (run) – (watch all) – (exit)

CLIPS Rule Example
(defrule putting-on
  ?g <- (goal put-on ?x ?y)
  (clear ?x)
  ?bottomclear <- (clear ?y)
  =>
  (assert (on ?x ?y))
  (retract ?g)
  (retract ?bottomclear))

Backward Chaining. Consider the item to be proven as a goal. Find a rule whose head matches the goal (recording the bindings). Apply the bindings to the body, and prove these subgoals in turn. If you prove all the subgoals, extending the binding set as you go, you have proven the item. This is the basis of logic programming (e.g. cprolog, available on the CS machines).
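A minimal ground backward-chaining sketch of this procedure (again an illustrative assumption rather than the lecture's code; a real Prolog-style prover would unify goals with rule heads and thread bindings through the subgoals):

def backward_chain(goal, rules, facts):
    # rules: list of (premises, conclusion) over ground atoms.
    # (No loop checking here; see the "Properties of Backward Chaining" slide.)
    if goal in facts:
        return True
    return any(conclusion == goal and
               all(backward_chain(p, rules, facts) for p in premises)
               for premises, conclusion in rules)

# Reusing the Col. West knowledge base from the forward-chaining sketch:
rules = [({'American(West)', 'Weapon(M1)', 'Sells(West,M1,Nono)', 'Hostile(Nono)'},
          'Criminal(West)'),
         ({'Missile(M1)', 'Owns(Nono,M1)'}, 'Sells(West,M1,Nono)'),
         ({'Missile(M1)'}, 'Weapon(M1)'),
         ({'Enemy(Nono,America)'}, 'Hostile(Nono)')]
facts = {'American(West)', 'Missile(M1)', 'Owns(Nono,M1)', 'Enemy(Nono,America)'}
print(backward_chain('Criminal(West)', rules, facts))   # True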

Backward Chaining Example (the backward-chaining proof tree is developed across several figure slides, omitted from the transcript).

Backward Chaining Algorithm (pseudocode figure omitted from the transcript).

Properties of Backward Chaining. Depth-first recursive proof search: space is linear in the size of the proof. Incomplete due to infinite loops – fix by checking the current goal against every subgoal on the stack. Inefficient due to repeated subgoals (both successes and failures) – fix by caching previous results (extra space). Widely used without these improvements for logic programming.

Logic Programming: – identify problem – assemble information – tea break – encode information in KB – encode problem instance as facts – ask queries – find false facts. Ordinary Programming: – identify problem – assemble information – figure out solution – program solution – encode problem instance as data – apply program to data – debug procedural errors.

Logic Programming. Basis: backward chaining with Horn clauses + lots of bells and whistles – widely used in Europe and Japan; basis of the 5th Generation languages and projects – compilation techniques -> 60 million LIPS. Programming = a set of clauses: head :- literal1, …, literaln. Example: criminal(X) :- american(X), weapon(Y), sells(X, Y, Z), hostile(Z).

Logic Programming Rule Example. puton(X,Y) :- cleartop(X), cleartop(Y), takeoff(X,Y). Capital letters are variables. Three parts to the rule: – the head (the thing to prove) – the neck (:-) – the body (subgoals, separated by commas). Rules end with a period.

Logic Programming. Efficient unification by open coding. Efficient retrieval of matching clauses by direct linking. Depth-first, left-to-right backward chaining. Built-in predicates for arithmetic, e.g. X is Y*Z+2. Closed-world assumption (“negation as failure”) – e.g. given alive(X) :- not dead(X), the query alive(joe) succeeds if dead(joe) fails.

Logic Programming. These notes are for gprolog (available on the departmental servers). To read a file: consult('file'). To enter data directly: consult(user), then type Ctrl-D when done. Every statement must end in a period; if you forget, put it on the next line. To prove a fact, enter the fact directly at the command line; gprolog will respond Yes, No, or give you a binding set. If you want another answer, type ';', otherwise press return. trace(predicate) or trace(all) will let you watch the backward-chaining process.

Logic Programming. Depth-first search from a start state X: – dfs(X) :- goal(X). – dfs(X) :- successor(X,S), dfs(S). There is no need to loop over S: on backtracking, successor(X,S) succeeds once for each successor.

Logic Programming. Example: appending two lists to produce a third: – append([], Y, Y). – append([X|L], Y, [X|Z]) :- append(L, Y, Z). Query: append(A, B, [1,2]). Answers: A=[], B=[1,2]; A=[1], B=[2]; A=[1,2], B=[].

Inference Methods. Unification (prerequisite). Forward chaining – production systems – RETE method (OPS). Backward chaining – logic programming (Prolog). Resolution – transform to CNF – a generalization of propositional-logic resolution.

Resolution. Convert everything to CNF. Resolve, with unification. If resolution is successful, the proof succeeds. If there was a variable in the item to prove, return the variable's value from the unification bindings.

Resolution (Review). Resolution allows a complete inference mechanism (search-based) using only one rule of inference. Resolution rule: – Given: P1 ∨ P2 ∨ P3 ∨ … ∨ Pn, and ¬P1 ∨ Q1 ∨ … ∨ Qm – Conclude: P2 ∨ P3 ∨ … ∨ Pn ∨ Q1 ∨ … ∨ Qm. The complementary literals P1 and ¬P1 “cancel out”. To prove a proposition F by resolution: – start with ¬F – resolve with a clause from the knowledge base that contains F – repeat until all propositions have been eliminated – if this can be done, a contradiction has been derived and the original proposition F must be true.

Propositional Resolution Example. Rules: – cold and precipitation -> snow: ¬cold ∨ ¬precipitation ∨ snow – January -> cold: ¬January ∨ cold – clouds -> precipitation: ¬clouds ∨ precipitation. Facts: January, clouds. Prove: snow.

Propositional Resolution Example. Start with the negated goal ¬snow. Resolving ¬snow with ¬cold ∨ ¬precipitation ∨ snow gives ¬cold ∨ ¬precipitation. Resolving with ¬January ∨ cold gives ¬January ∨ ¬precipitation. Resolving with ¬clouds ∨ precipitation gives ¬January ∨ ¬clouds. Resolving with January gives ¬clouds. Resolving with clouds gives the empty clause, a contradiction, so snow is entailed.
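The same refutation can be run mechanically; here is a small propositional resolution sketch (an illustration added for this writeup, not the lecture's code), with clauses as frozensets of literals and '~' marking negation:

from itertools import combinations

def negate(lit):
    return lit[1:] if lit.startswith('~') else '~' + lit

def resolve(c1, c2):
    # All resolvents of two clauses.
    return [(c1 - {lit}) | (c2 - {negate(lit)}) for lit in c1 if negate(lit) in c2]

def refute(clauses):
    # True if the clause set is unsatisfiable (the empty clause is derivable).
    clauses = {frozenset(c) for c in clauses}
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:
                    return True          # empty clause: contradiction found
                new.add(frozenset(r))
        if new <= clauses:
            return False
        clauses |= new

kb = [{'~cold', '~precipitation', 'snow'},
      {'~January', 'cold'},
      {'~clouds', 'precipitation'},
      {'January'}, {'clouds'},
      {'~snow'}]                         # negated goal
print(refute(kb))                        # True, so snow follows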

Resolution Theorem Proving (FOL). Convert everything to CNF. Resolve, with unification – save the bindings as you go! If resolution is successful, the proof succeeds. If there was a variable in the item to prove, return the variable's value from the unification bindings.

Converting to CNF. 1. Replace implications: (A ⇒ B) becomes ¬A ∨ B. 2. Move ¬ “inwards”: ¬∀x P(x) is equivalent to ∃x ¬P(x), and vice versa. 3. Standardize variables: ∀x P(x) ∨ ∃x Q(x) becomes ∀x P(x) ∨ ∃y Q(y). 4. Skolemize: ∃x P(x) becomes P(A) for a new constant A. 5. Drop universal quantifiers: since all remaining quantifiers are universal, we don’t need to write them. 6. Apply the distributive law to obtain a conjunction of disjunctions (clauses).

Convert to FOPL, then CNF. 1. John likes all kinds of food. 2. Apples are food. 3. Chicken is food. 4. Anything that anyone eats and isn’t killed by is food. 5. Bill eats peanuts and is still alive. 6. Sue eats everything Bill eats.
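One possible formalization of these sentences (an illustrative reading added here, not necessarily the one worked out in the original lecture): 1. ∀x Food(x) ⇒ Likes(John, x). 2. Food(Apples). 3. Food(Chicken). 4. ∀x ∀y Eats(y, x) ∧ ¬Kills(x, y) ⇒ Food(x). 5. Eats(Bill, Peanuts) ∧ Alive(Bill). 6. ∀x Eats(Bill, x) ⇒ Eats(Sue, x). Applying the CNF steps above, sentence 4, for example, becomes the clause ¬Eats(y, x) ∨ Kills(x, y) ∨ Food(x).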

Prove Using Resolution. 1. John likes peanuts. 2. Sue eats peanuts. 3. Sue eats apples. 4. What does Sue eat? – Translate to “Sue eats X” – the result is a valid binding for X in the proof.

Another Example. Steve only likes easy courses. Science courses are hard. All the courses in the basket weaving department are easy. BK301 is a basket weaving course. What course would Steve like?

Another Resolution Example (worked proof figure omitted from the transcript).

Final Thoughts on Resolution. Resolution is complete; if you don’t want to take this on faith, study the completeness argument in the text. Strategies (heuristics) for efficient resolution include: – Unit preference: if a clause has only one literal, use it first. – Set of support: identify “useful” rules and ignore the rest (p. 305). – Input resolution: intermediately generated sentences can only be combined with original inputs or original rules (we used this strategy in our examples). – Subsumption: prune unnecessary facts from the database.