Mundhenk and Itti – Derived from S. Russell and P. Norvig.

CS 561, Session
Nintendo example. Nintendo says it is Criminal for a programmer to provide emulators to people. My friends don't have a Nintendo 64, but they use software that runs N64 games on their PC, which was written by Reality Man, who is a programmer. Note that this example uses Horn clauses and is polynomial in run time; more complex examples require far more sophisticated solvers.

The knowledge base initially contains:
Programmer(x) ∧ Emulator(y) ∧ People(z) ∧ Provide(x,z,y) ⇒ Criminal(x)
Use(friends, x) ∧ Runs(x, N64 games) ⇒ Provide(Reality Man, friends, x)
Software(x) ∧ Runs(x, N64 games) ⇒ Emulator(x)

Programmer(x) ∧ Emulator(y) ∧ People(z) ∧ Provide(x,z,y) ⇒ Criminal(x)   (1)
Use(friends, x) ∧ Runs(x, N64 games) ⇒ Provide(Reality Man, friends, x)   (2)
Software(x) ∧ Runs(x, N64 games) ⇒ Emulator(x)   (3)

Now we add atomic sentences to the KB sequentially, and call on the forward-chaining procedure:
FORWARD-CHAIN(KB, Programmer(Reality Man))
Forward chaining is agnostic with regard to the question you want to ask it. Instead it tries, in essence, to fill in the KB, and thus answers your query in the process. It is polynomial in part because once we forward-chain a sentence, we cannot unchain it.

Programmer(x) ∧ Emulator(y) ∧ People(z) ∧ Provide(x,z,y) ⇒ Criminal(x)   (1)
Use(friends, x) ∧ Runs(x, N64 games) ⇒ Provide(Reality Man, friends, x)   (2)
Software(x) ∧ Runs(x, N64 games) ⇒ Emulator(x)   (3)
Programmer(Reality Man)   (4)

This new premise unifies with (1) with subst({x/Reality Man}, Programmer(x)), but not all the premises of (1) are yet known, so nothing further happens.
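The unification step used here can be sketched in Python. This is a simplified unifier for flat, function-free literals (no occurs check, single-step variable dereference), under an encoding of our own choosing: literals are tuples and, by a hypothetical convention, variable names start with "?".

```python
def is_var(t):
    # Hypothetical convention: variables are strings starting with "?".
    return isinstance(t, str) and t.startswith("?")

def unify(a, b, s=None):
    """Return a substitution unifying two flat literals, or None."""
    s = dict(s or {})
    if a[0] != b[0] or len(a) != len(b):   # predicate and arity must match
        return None
    for x, y in zip(a[1:], b[1:]):
        x, y = s.get(x, x), s.get(y, y)    # apply bindings made so far
        if x == y:
            continue
        if is_var(x):
            s[x] = y
        elif is_var(y):
            s[y] = x
        else:                              # two distinct constants: clash
            return None
    return s

# subst({x/Reality Man}, Programmer(x)) from this slide:
print(unify(("Programmer", "?x"), ("Programmer", "Reality Man")))
# → {'?x': 'Reality Man'}
```

A failed unification, e.g. `unify(("Programmer", "?x"), ("People", "friends"))`, returns `None` rather than a substitution.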

Programmer(x) ∧ Emulator(y) ∧ People(z) ∧ Provide(x,z,y) ⇒ Criminal(x)   (1)
Use(friends, x) ∧ Runs(x, N64 games) ⇒ Provide(Reality Man, friends, x)   (2)
Software(x) ∧ Runs(x, N64 games) ⇒ Emulator(x)   (3)
Programmer(Reality Man)   (4)

Continue adding atomic sentences: FORWARD-CHAIN(KB, People(friends))

Programmer(x) ∧ Emulator(y) ∧ People(z) ∧ Provide(x,z,y) ⇒ Criminal(x)   (1)
Use(friends, x) ∧ Runs(x, N64 games) ⇒ Provide(Reality Man, friends, x)   (2)
Software(x) ∧ Runs(x, N64 games) ⇒ Emulator(x)   (3)
Programmer(Reality Man)   (4)
People(friends)   (5)

This also unifies with (1), with subst({z/friends}, People(z)), but other premises are still missing.

Programmer(x) ∧ Emulator(y) ∧ People(z) ∧ Provide(x,z,y) ⇒ Criminal(x)   (1)
Use(friends, x) ∧ Runs(x, N64 games) ⇒ Provide(Reality Man, friends, x)   (2)
Software(x) ∧ Runs(x, N64 games) ⇒ Emulator(x)   (3)
Programmer(Reality Man)   (4)
People(friends)   (5)

Add: FORWARD-CHAIN(KB, Software(U64))

Programmer(x) ∧ Emulator(y) ∧ People(z) ∧ Provide(x,z,y) ⇒ Criminal(x)   (1)
Use(friends, x) ∧ Runs(x, N64 games) ⇒ Provide(Reality Man, friends, x)   (2)
Software(x) ∧ Runs(x, N64 games) ⇒ Emulator(x)   (3)
Programmer(Reality Man)   (4)
People(friends)   (5)
Software(U64)   (6)

This new premise unifies with (3), but the other premise is not yet known.

Programmer(x) ∧ Emulator(y) ∧ People(z) ∧ Provide(x,z,y) ⇒ Criminal(x)   (1)
Use(friends, x) ∧ Runs(x, N64 games) ⇒ Provide(Reality Man, friends, x)   (2)
Software(x) ∧ Runs(x, N64 games) ⇒ Emulator(x)   (3)
Programmer(Reality Man)   (4)
People(friends)   (5)
Software(U64)   (6)

Add: FORWARD-CHAIN(KB, Use(friends, U64))

Programmer(x) ∧ Emulator(y) ∧ People(z) ∧ Provide(x,z,y) ⇒ Criminal(x)   (1)
Use(friends, x) ∧ Runs(x, N64 games) ⇒ Provide(Reality Man, friends, x)   (2)
Software(x) ∧ Runs(x, N64 games) ⇒ Emulator(x)   (3)
Programmer(Reality Man)   (4)
People(friends)   (5)
Software(U64)   (6)
Use(friends, U64)   (7)

This premise unifies with (2), but one premise is still missing.

Programmer(x) ∧ Emulator(y) ∧ People(z) ∧ Provide(x,z,y) ⇒ Criminal(x)   (1)
Use(friends, x) ∧ Runs(x, N64 games) ⇒ Provide(Reality Man, friends, x)   (2)
Software(x) ∧ Runs(x, N64 games) ⇒ Emulator(x)   (3)
Programmer(Reality Man)   (4)
People(friends)   (5)
Software(U64)   (6)
Use(friends, U64)   (7)

Add: FORWARD-CHAIN(KB, Runs(U64, N64 games))

Programmer(x) ∧ Emulator(y) ∧ People(z) ∧ Provide(x,z,y) ⇒ Criminal(x)   (1)
Use(friends, x) ∧ Runs(x, N64 games) ⇒ Provide(Reality Man, friends, x)   (2)
Software(x) ∧ Runs(x, N64 games) ⇒ Emulator(x)   (3)
Programmer(Reality Man)   (4)
People(friends)   (5)
Software(U64)   (6)
Use(friends, U64)   (7)
Runs(U64, N64 games)   (8)

This new premise unifies with both (2) and (3).

Programmer(x) ∧ Emulator(y) ∧ People(z) ∧ Provide(x,z,y) ⇒ Criminal(x)   (1)
Use(friends, x) ∧ Runs(x, N64 games) ⇒ Provide(Reality Man, friends, x)   (2)
Software(x) ∧ Runs(x, N64 games) ⇒ Emulator(x)   (3)
Programmer(Reality Man)   (4)
People(friends)   (5)
Software(U64)   (6)
Use(friends, U64)   (7)
Runs(U64, N64 games)   (8)

Premises (6), (7) and (8) now fully satisfy implications (2) and (3).

Programmer(x) ∧ Emulator(y) ∧ People(z) ∧ Provide(x,z,y) ⇒ Criminal(x)   (1)
Use(friends, x) ∧ Runs(x, N64 games) ⇒ Provide(Reality Man, friends, x)   (2)
Software(x) ∧ Runs(x, N64 games) ⇒ Emulator(x)   (3)
Programmer(Reality Man)   (4)
People(friends)   (5)
Software(U64)   (6)
Use(friends, U64)   (7)
Runs(U64, N64 games)   (8)

So we can infer the consequents, which are now added to the knowledge base (this is done in two separate steps).

Programmer(x) ∧ Emulator(y) ∧ People(z) ∧ Provide(x,z,y) ⇒ Criminal(x)   (1)
Use(friends, x) ∧ Runs(x, N64 games) ⇒ Provide(Reality Man, friends, x)   (2)
Software(x) ∧ Runs(x, N64 games) ⇒ Emulator(x)   (3)
Programmer(Reality Man)   (4)
People(friends)   (5)
Software(U64)   (6)
Use(friends, U64)   (7)
Runs(U64, N64 games)   (8)
Provide(Reality Man, friends, U64)   (9)
Emulator(U64)   (10)

The addition of these new facts triggers further forward chaining.

Programmer(x) ∧ Emulator(y) ∧ People(z) ∧ Provide(x,z,y) ⇒ Criminal(x)   (1)
Use(friends, x) ∧ Runs(x, N64 games) ⇒ Provide(Reality Man, friends, x)   (2)
Software(x) ∧ Runs(x, N64 games) ⇒ Emulator(x)   (3)
Programmer(Reality Man)   (4)
People(friends)   (5)
Software(U64)   (6)
Use(friends, U64)   (7)
Runs(U64, N64 games)   (8)
Provide(Reality Man, friends, U64)   (9)
Emulator(U64)   (10)
Criminal(Reality Man)   (11)

Which results in the final conclusion: Criminal(Reality Man)

Forward chaining acts like a breadth-first search at the top level, with depth-first sub-searches. Since the search space spans the entire KB, a large KB must be organized in an intelligent manner in order to enable efficient searches in reasonable time. This only works for simple KBs such as those comprised of Horn clauses. More complex KBs require more sophisticated solvers: for instance, entailment in 3-CNF is co-NP-complete, being the complement of satisfiability in 3-CNF, which is known to be NP-complete (Stephen Cook, 1971) and is famously known as 3-SAT.
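The whole forward-chaining walkthrough above can be sketched in Python. This is a naive fixpoint loop, not the textbook's FORWARD-CHAIN pseudocode: the tuple encoding and the "?"-prefix convention for variables are our own, and facts are assumed ground.

```python
from itertools import product

def is_var(t):
    # Hypothetical convention: variables are strings starting with "?".
    return isinstance(t, str) and t.startswith("?")

def unify(pattern, fact, s):
    """Extend substitution s so that pattern matches the ground fact."""
    if pattern[0] != fact[0] or len(pattern) != len(fact):
        return None
    s = dict(s)
    for p, f in zip(pattern[1:], fact[1:]):
        p = s.get(p, p)                  # apply bindings made so far
        if p == f:
            continue
        if is_var(p):
            s[p] = f
        else:                            # constant clash
            return None
    return s

def subst(s, literal):
    return (literal[0],) + tuple(s.get(t, t) for t in literal[1:])

def forward_chain(rules, facts):
    """Apply every rule to every combination of facts until fixpoint."""
    facts = set(facts)
    while True:
        new = set()
        for premises, conclusion in rules:
            for combo in product(facts, repeat=len(premises)):
                s = {}
                for p, f in zip(premises, combo):
                    s = unify(p, f, s)
                    if s is None:
                        break
                if s is not None:
                    inferred = subst(s, conclusion)
                    if inferred not in facts:
                        new.add(inferred)
        if not new:                      # nothing further to infer
            return facts
        facts |= new

# The Nintendo KB: rules (1)-(3) and facts (4)-(8).
rules = [
    ([("Programmer", "?x"), ("Emulator", "?y"), ("People", "?z"),
      ("Provide", "?x", "?z", "?y")], ("Criminal", "?x")),
    ([("Use", "friends", "?x"), ("Runs", "?x", "N64 games")],
     ("Provide", "Reality Man", "friends", "?x")),
    ([("Software", "?x"), ("Runs", "?x", "N64 games")], ("Emulator", "?x")),
]
facts = [("Programmer", "Reality Man"), ("People", "friends"),
         ("Software", "U64"), ("Use", "friends", "U64"),
         ("Runs", "U64", "N64 games")]

assert ("Criminal", "Reality Man") in forward_chain(rules, facts)
```

Trying all premise combinations at every pass is exponential in rule arity; real systems index facts by predicate (as the slide suggests, the KB must be organized intelligently), but the loop reproduces the derivation of (9), (10) and (11) above.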

The backward-chaining algorithm (available in detail in the textbook) takes:
- a knowledge base KB
- a desired conclusion c or question q
and finds all sentences that are answers to q in KB, or proves c:
- if q is directly provable by premises in KB, infer q and remember how q was inferred (building a list of answers);
- find all implications that have q as a consequent;
- for each of these implications, find out whether all of its premises are now in the KB, in which case infer the consequent and add it to the KB, remembering how it was inferred; if necessary, attempt to prove the premises themselves via backward chaining;
- premises that are conjunctions are processed one conjunct at a time.
This is a form of goal-directed reasoning.
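The steps above can be sketched as a Prolog-style depth-first prover. This is a simplified sketch under an encoding of our own (literals as tuples, variables as strings starting with "?"), not the textbook's FOL-BC-ASK; it standardizes rules apart on each use and omits the occurs check.

```python
import itertools

_fresh = itertools.count()

def is_var(t):
    # Hypothetical convention: variables are strings starting with "?".
    return isinstance(t, str) and t.startswith("?")

def walk(t, s):
    """Follow variable bindings in substitution s to the end of the chain."""
    while is_var(t) and t in s:
        t = s[t]
    return t

def unify(a, b, s):
    """Return an extension of s unifying literals a and b, or None."""
    if a[0] != b[0] or len(a) != len(b):
        return None
    s = dict(s)
    for x, y in zip(a[1:], b[1:]):
        x, y = walk(x, s), walk(y, s)
        if x == y:
            continue
        if is_var(x):
            s[x] = y
        elif is_var(y):
            s[y] = x
        else:
            return None
    return s

def rename(rule):
    """Standardize apart: give the rule's variables fresh names."""
    n = str(next(_fresh))
    ren = lambda t: t + n if is_var(t) else t
    premises, conclusion = rule
    return ([(p[0],) + tuple(map(ren, p[1:])) for p in premises],
            (conclusion[0],) + tuple(map(ren, conclusion[1:])))

def back_chain(rules, facts, goals, s):
    """Yield substitutions proving all goals, depth-first like Prolog."""
    if not goals:
        yield s
        return
    first, rest = goals[0], goals[1:]
    for fact in facts:                   # try known facts first
        s2 = unify(first, fact, s)
        if s2 is not None:
            yield from back_chain(rules, facts, rest, s2)
    for rule in rules:                   # then rules whose consequent matches
        premises, conclusion = rename(rule)
        s2 = unify(first, conclusion, s)
        if s2 is not None:
            yield from back_chain(rules, facts, premises + rest, s2)

# The same Nintendo KB as in the forward-chaining example:
rules = [
    ([("Programmer", "?x"), ("Emulator", "?y"), ("People", "?z"),
      ("Provide", "?x", "?z", "?y")], ("Criminal", "?x")),
    ([("Use", "friends", "?x"), ("Runs", "?x", "N64 games")],
     ("Provide", "Reality Man", "friends", "?x")),
    ([("Software", "?x"), ("Runs", "?x", "N64 games")], ("Emulator", "?x")),
]
facts = [("Programmer", "Reality Man"), ("People", "friends"),
         ("Software", "U64"), ("Use", "friends", "U64"),
         ("Runs", "U64", "N64 games")]

solution = next(back_chain(rules, facts, [("Criminal", "?who")], {}))
print(walk("?who", solution))  # → Reality Man
```

Unlike the forward chainer, nothing is added to the KB: inferences are made only in service of the query, which is the goal-directed behavior the slides describe.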

Question: Has Reality Man done anything criminal? Criminal(Reality Man)
Possible answers:
Steal(x, y) ⇒ Criminal(x)
Kill(x, y) ⇒ Criminal(x)
Grow(x, y) ∧ Illegal(y) ⇒ Criminal(x)
HaveSillyName(x) ⇒ Criminal(x)
Programmer(x) ∧ Emulator(y) ∧ People(z) ∧ Provide(x,z,y) ⇒ Criminal(x)

Question: Has Reality Man done anything criminal?

The backward chainer starts from the goal and tries each implication with Criminal(x) as its consequent, depth-first:

Criminal(x)
 ├─ Steal(x,y)               FAIL
 ├─ Kill(x,y)                FAIL
 └─ Grow(x,y) ∧ Illegal(y)   FAIL

Backward chaining is a depth-first search: in any knowledge base of realistic size, many search paths will result in failure.

Question: Has Reality Man done anything criminal? We will use the same knowledge as in our forward-chaining version of this example:
Programmer(x) ∧ Emulator(y) ∧ People(z) ∧ Provide(x,z,y) ⇒ Criminal(x)
Use(friends, x) ∧ Runs(x, N64 games) ⇒ Provide(Reality Man, friends, x)
Software(x) ∧ Runs(x, N64 games) ⇒ Emulator(x)
Programmer(Reality Man)
People(friends)
Software(U64)
Use(friends, U64)
Runs(U64, N64 games)

Question: Has Reality Man done anything criminal?

This time the last implication succeeds, with each premise proved in turn and the substitutions composed:

Criminal(x)
 ├─ Programmer(x)                         Yes, {x/Reality Man}
 ├─ People(z)                             Yes, {z/friends}
 ├─ Emulator(y)
 │   ├─ Software(y)                       Yes, {y/U64}
 │   └─ Runs(U64, N64 games)              Yes, {}
 └─ Provide(Reality Man, friends, U64)
     ├─ Use(friends, U64)                 Yes, {}
     └─ Runs(U64, N64 games)              Yes, {}

Backward Chaining benefits from the fact that it is directed toward proving one statement or answering one question. In a focused, specific knowledge base, this greatly decreases the amount of superfluous work that needs to be done in searches. However, in broad knowledge bases with extensive information and numerous implications, many search paths may be irrelevant to the desired conclusion. Unlike forward chaining, where all possible inferences are made, a strictly backward chaining system makes inferences only when called upon to answer a query.

As explained earlier, Generalized Modus Ponens requires sentences to be in Horn form: atomic, or an implication with a conjunction of atomic sentences as the antecedent and an atom as the consequent. However, some sentences cannot be expressed in Horn form, e.g.:

∀x ¬bored_of_this_lecture(x)

This cannot be expressed in Horn form due to the presence of negation.

This is a significant problem, since Modus Ponens cannot operate on such a sentence and thus cannot use it in inference. Knowledge exists but cannot be used. Thus inference using Modus Ponens is incomplete.

However, Kurt Gödel developed the completeness theorem, which shows that it is possible to find complete inference rules. The theorem states: any sentence entailed by a set of sentences can be proven from that set. This leads to the resolution algorithm, which is a complete inference method.

The completeness theorem says that a sentence can be proved if it is entailed by a set of sentences. This is a big deal, since arbitrarily deeply nested functions combined with universal quantification make a potentially infinite search space. But entailment in first-order logic is only semi-decidable: if a sentence is not entailed by a set of sentences, the proof procedure is not guaranteed to terminate and tell us so.


KB:
(1) father(art, jon)
(2) father(bob, kim)
(3) father(X, Y) ⇒ parent(X, Y)

Goal: parent(art, jon)?

To prove the goal by resolution, negate it and derive the empty clause:

¬parent(art, jon)   resolved with   ¬father(X, Y) ∨ parent(X, Y)   under {X/art, Y/jon}
  gives ¬father(art, jon)
¬father(art, jon)   resolved with   father(art, jon)
  gives [] (the empty clause), so parent(art, jon) is entailed.
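This refutation can be sketched in Python at the propositional level, with the clauses grounded at {X/art, Y/jon} so that unification is not needed; writing "~" for ¬ and representing clauses as frozensets of literal strings is an encoding of our own, chosen for brevity.

```python
def resolve(c1, c2):
    """All resolvents of two propositional clauses (frozensets of literals)."""
    resolvents = []
    for lit in c1:
        comp = lit[1:] if lit.startswith("~") else "~" + lit
        if comp in c2:
            # Drop the complementary pair, merge the remainders.
            resolvents.append(frozenset((c1 - {lit}) | (c2 - {comp})))
    return resolvents

def refutes(clauses):
    """Saturate under resolution; True iff the empty clause is derived."""
    clauses = set(clauses)
    while True:
        new = set()
        for c1 in clauses:
            for c2 in clauses:
                for r in resolve(c1, c2):
                    if not r:            # empty clause: contradiction found
                        return True
                    new.add(r)
        if new <= clauses:               # fixpoint reached, no contradiction
            return False
        clauses |= new

# The proof above, grounded with {X/art, Y/jon}:
kb = [
    frozenset({"father(art,jon)"}),                      # (1)
    frozenset({"~father(art,jon)", "parent(art,jon)"}),  # (3), grounded
    frozenset({"~parent(art,jon)"}),                     # negated goal
]
print(refutes(kb))  # → True: parent(art,jon) is entailed
```

The full first-order algorithm resolves clauses under a unifier instead of requiring identical complementary literals; the grounded version suffices to mirror this two-step derivation of [].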
