CSM6120 Introduction to Intelligent Systems Propositional and Predicate Logic.


Logic and AI  Would like our AI to have knowledge about the world, and logically draw conclusions from it  Search algorithms generate successors and evaluate them, but do not “understand” much about the setting  Example question: is it possible for a chess player to have 8 pawns and 2 queens?  Search algorithm could search through tons of states to see if this ever happens…

Types of knowledge  There are four basic sorts of knowledge (different sources may vary as to number and definitions)  Declarative  Facts  E.g. Joe has a car  Procedural  Knowledge exercised in the performance of some task  E.g. What to check if car won't start  First check a, then b, then c, until car starts

Types of knowledge  Meta  Knowledge about knowledge  E.g. knowing how an expert system came to its conclusion  Heuristic  Rules of thumb, empirical  Knowledge gained from experience, not proven  i.e. could be wrong!

Methods of representation  Object-attribute-value  Say you have an oak tree. You want to encode the fact that it is a tree, and that its species is oak  There are three pieces of information in there:  It's a tree  It has (belongs to) a species  The species is oak  The fact that all trees have a species might be obvious to you, but it's not obvious to a computer  We have to encode it

How to encode?  Maybe something like this:  (species tree oak)  species(tree, oak)  tree(species, oak)  Or some other similar method that your program can parse and understand

What if you’re not sure?  You could include an uncertainty factor  The uncertainty factor is a number which can be taken into account by your program when making its decisions  The final conclusion of any program where uncertainty was used in the input is likely to also have an uncertainty factor  If you're not sure of the facts, how can the program be sure of the result?

Encoding uncertainty  To encode uncertainty, we may do something like this:  species(tree, oak, 0.8)  This would mean we were only 80% sure that the tree concerned was an oak tree
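As a concrete (purely illustrative) sketch, object-attribute-value facts with certainty factors could be stored like this; the function names and tuple layout are assumptions of the sketch, not a standard API:

```python
facts = {}

def assert_fact(obj, attribute, value, certainty=1.0):
    """Store an object-attribute-value fact with a certainty factor."""
    facts[(obj, attribute)] = (value, certainty)

def query(obj, attribute):
    """Return (value, certainty), or None if the fact is unknown."""
    return facts.get((obj, attribute))

assert_fact("tree", "species", "oak", 0.8)   # only 80% sure it is an oak
print(query("tree", "species"))              # ('oak', 0.8)
```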

Rules  A knowledge base (KB) may have rules  IF premises THEN conclusion  May be more than one premise  May contain logical functions  AND, OR, NOT  If premises evaluate to TRUE, rule fires  IF tree (species, oak) THEN tree (type, deciduous)

More complex rules
IF tree is a beech AND the season is summer
THEN tree is green
IF tree is a beech AND the season is summer AND tree-type is red beech
THEN tree is red
 This example illustrates a problem whereby two rules could both fire, but lead to different conclusions  Strategies to resolve would include firing the most specific rule first  The second rule is more specific
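The "most specific rule first" strategy can be sketched by matching premise sets against working memory and preferring the applicable rule with the most premises (the set-based encoding and rule format here are illustrative assumptions):

```python
# Rules as (set of premises, conclusion); premises match against working memory.
rules = [
    ({"tree is a beech", "season is summer"}, "tree is green"),
    ({"tree is a beech", "season is summer", "tree-type is red beech"},
     "tree is red"),
]

def fire(working_memory):
    """Return the conclusion of the most specific applicable rule, or None."""
    applicable = [(premises, conclusion)
                  for premises, conclusion in rules
                  if premises <= working_memory]
    if not applicable:
        return None
    # Conflict resolution: prefer the rule with the most premises.
    premises, conclusion = max(applicable, key=lambda rule: len(rule[0]))
    return conclusion

print(fire({"tree is a beech", "season is summer", "tree-type is red beech"}))
# both rules match, but the more specific one fires
```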

What is a Logic?  A language with concrete rules  No ambiguity in representation (may be other errors!)  Allows unambiguous communication and processing  Very unlike natural languages e.g. English  Many ways to translate between languages  A statement can be represented in different logics  And perhaps differently in same logic  Expressiveness of a logic  How much can we say in this language?  Not to be confused with logical reasoning  Logics are languages, reasoning is a process (may use logic)

Syntax and semantics  Syntax  Rules for constructing legal sentences in the logic  Which symbols we can use (English: letters, punctuation)  How we are allowed to combine symbols  Semantics  How we interpret (read) sentences in the logic  Assigns a meaning to each sentence  Example: “All lecturers are seven foot tall”  A valid sentence (syntax)  And we can understand the meaning (semantics)  This sentence happens to be false (there is a counterexample)

Propositional logic  Syntax  Propositions, e.g. “it is wet”  Connectives: and, or, not, implies, iff (equivalent)  Brackets (), T (true) and F (false)  Semantics (Classical/Boolean)  Define how connectives affect truth  “P and Q” is true if and only if P is true and Q is true  Use truth tables to work out the truth of statements

A story  Your Friend comes home; he is completely wet  You know the following things:  Your Friend is wet  If your Friend is wet, it is because of rain, sprinklers, or both  If your Friend is wet because of sprinklers, the sprinklers must be on  If your Friend is wet because of rain, your Friend must not be carrying the umbrella  The umbrella is not in the umbrella holder  If the umbrella is not in the umbrella holder, either you must be carrying the umbrella, or your Friend must be carrying the umbrella  You are not carrying the umbrella  Can you conclude that the sprinklers are on?  Can AI conclude that the sprinklers are on?

Knowledge base for the story
 FriendWet
 FriendWet → (FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers)
 FriendWetBecauseOfSprinklers → SprinklersOn
 FriendWetBecauseOfRain → NOT(FriendCarryingUmbrella)
 UmbrellaGone
 UmbrellaGone → (YouCarryingUmbrella OR FriendCarryingUmbrella)
 NOT(YouCarryingUmbrella)

Syntax  What do well-formed sentences in the knowledge base look like?  A BNF grammar:  Symbol := P, Q, R, …, FriendWet, …  Sentence := True | False | Symbol | NOT(Sentence) | (Sentence AND Sentence) | (Sentence OR Sentence) | (Sentence → Sentence)  We will drop parentheses sometimes, but formally they really should always be there

Semantics  A model specifies which of the propositional symbols are true and which are false  Given a model, I should be able to tell you whether a sentence is true or false  Truth table defines semantics of operators. Given a model, can compute the truth of a sentence recursively with these:

a      b      NOT(a)  a AND b  a OR b  a → b
false  false  true    false    false   true
false  true   true    false    true    true
true   false  false   false    true    false
true   true   false   true     true    true
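The recursive computation can be sketched as follows, using the truth-table semantics above; the nested-tuple encoding of sentences is an assumption of this sketch:

```python
def evaluate(sentence, model):
    """Truth of a sentence under a model; sentences are nested tuples."""
    if isinstance(sentence, str):              # a propositional symbol
        return model[sentence]
    op = sentence[0]
    if op == "NOT":
        return not evaluate(sentence[1], model)
    a = evaluate(sentence[1], model)
    b = evaluate(sentence[2], model)
    if op == "AND":
        return a and b
    if op == "OR":
        return a or b
    if op == "IMPLIES":
        return (not a) or b                    # a -> b
    raise ValueError("unknown operator: " + op)

print(evaluate(("IMPLIES", "P", "Q"), {"P": True, "Q": False}))   # False
```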

Tautologies  A sentence is a tautology if it is true for any setting of its propositional symbols  (P OR Q) OR (NOT(P) AND NOT(Q)) is a tautology:

P      Q      P OR Q  NOT(P) AND NOT(Q)  (P OR Q) OR (NOT(P) AND NOT(Q))
false  false  false   true                true
false  true   true    false               true
true   false  true    false               true
true   true   true    false               true

Is this a tautology?  (P → Q) OR (Q → P)
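One way to check: enumerate all four models and test the sentence in each. A minimal sketch (the lambda encoding of the sentence is just a convenience):

```python
from itertools import product

def implies(a, b):
    return (not a) or b

def is_tautology(sentence, n_symbols):
    """True iff the sentence holds in every model of its n symbols."""
    return all(sentence(*values)
               for values in product([False, True], repeat=n_symbols))

print(is_tautology(lambda p, q: implies(p, q) or implies(q, p), 2))   # True
```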

Logical equivalences  Two sentences are logically equivalent if they have the same truth value for every setting of their propositional variables  P OR Q and NOT(NOT(P) AND NOT(Q)) are logically equivalent  Tautology = logically equivalent to True

P      Q      P OR Q  NOT(NOT(P) AND NOT(Q))
false  false  false   false
false  true   true    true
true   false  true    true
true   true   true    true

Famous logical equivalences
 (a OR b) ≡ (b OR a)  commutativity
 (a AND b) ≡ (b AND a)  commutativity
 ((a AND b) AND c) ≡ (a AND (b AND c))  associativity
 ((a OR b) OR c) ≡ (a OR (b OR c))  associativity
 NOT(NOT(a)) ≡ a  double-negation elimination
 (a → b) ≡ (NOT(b) → NOT(a))  contraposition
 (a → b) ≡ (NOT(a) OR b)  implication elimination
 NOT(a AND b) ≡ (NOT(a) OR NOT(b))  De Morgan
 NOT(a OR b) ≡ (NOT(a) AND NOT(b))  De Morgan
 (a AND (b OR c)) ≡ ((a AND b) OR (a AND c))  distributivity
 (a OR (b AND c)) ≡ ((a OR b) AND (a OR c))  distributivity

Entailment  A set of premises A logically entails a conclusion B (written as A |= B) if and only if every model/interpretation that satisfies the premises also satisfies the conclusion, i.e. A → B  i.e. B is a valid consequence of A  {p} |= (p ∨ q)  {p} |# (p ∧ q)  (e.g. p=T, q=F)  {p, q} |= (p ∧ q)

Entailment  We have a knowledge base (KB) of things that we know are true  FriendWetBecauseOfSprinklers  FriendWetBecauseOfSprinklers → SprinklersOn  Can we conclude that SprinklersOn?  We say SprinklersOn is entailed by the KB if, for every setting of the propositional variables for which the KB is true, SprinklersOn is also true SprinklersOn is entailed! (KB |= SprinklersOn) FWBOSSprinklersOnKnowledge base false truefalse truefalse true

Inconsistent knowledge bases  Suppose we were careless in how we specified our knowledge base: PetOfFriendIsABird → PetOfFriendCanFly PetOfFriendIsAPenguin → PetOfFriendIsABird PetOfFriendIsAPenguin → NOT(PetOfFriendCanFly) PetOfFriendIsAPenguin  No setting of the propositional variables makes all of these true  Therefore, technically, this knowledge base implies/entails anything…  TheMoonIsMadeOfCheese

Satisfiability  A sentence is satisfiable if it is true in some model  If a sentence α is true in model m, then m satisfies α, or we say m is a model of α  A KB is satisfiable if a model m satisfies all clauses in it  How do you verify if a sentence is satisfiable?  Enumerate the possible models, until one is found to satisfy it  Can order the models so as to evaluate the most promising ones first  Many problems in Computer Science can be reduced to a satisfiability problem  E.g. CSP is all about verifying whether the constraints are satisfiable by some assignment
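The enumerate-until-satisfied check can be sketched as follows, here applied to the inconsistent penguin knowledge base from the earlier slide (symbol names abbreviated, lambda encoding assumed for this sketch):

```python
from itertools import product

def satisfiable(clauses, symbols):
    """Return a satisfying model, or None if the clauses are unsatisfiable."""
    for values in product([False, True], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if all(clause(model) for clause in clauses):
            return model
    return None

# Penguin KB: bird -> fly, penguin -> bird, penguin -> NOT fly, penguin
kb = [
    lambda m: (not m["bird"]) or m["fly"],
    lambda m: (not m["penguin"]) or m["bird"],
    lambda m: (not m["penguin"]) or not m["fly"],
    lambda m: m["penguin"],
]
print(satisfiable(kb, ["bird", "penguin", "fly"]))   # None: the KB is inconsistent
```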

Inference  Inference is the process of deriving a specific sentence from a KB (where the sentence must be entailed by the KB)  KB |- i a = sentence a can be derived from KB by procedure i  “KBs are a haystack”  Entailment = needle in haystack  Inference = finding it  If KB is true in the real world, then any sentence derived from KB by a sound inference procedure is also true in the real world

Simple algorithm for inference  Want to find out if sentence a is entailed by the knowledge base…  For every possible setting of the propositional variables:  If the knowledge base is true and a is false, return false  Return true  Not very efficient: 2^n settings for n propositional variables  There can be many propositional variables among the premises that are irrelevant to the conclusion. Much wasted work!
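A direct sketch of this enumeration algorithm, run on the sprinkler fragment from the entailment slide (lambda encoding of sentences is an assumption of the sketch):

```python
from itertools import product

def entails(kb, query, symbols):
    """KB |= query iff no model makes every KB sentence true and query false."""
    for values in product([False, True], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if all(sentence(model) for sentence in kb) and not query(model):
            return False   # counterexample model found
    return True

# FWBOS abbreviates FriendWetBecauseOfSprinklers
kb = [
    lambda m: m["FWBOS"],
    lambda m: (not m["FWBOS"]) or m["SprinklersOn"],   # FWBOS -> SprinklersOn
]
print(entails(kb, lambda m: m["SprinklersOn"], ["FWBOS", "SprinklersOn"]))   # True
```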

Inference by enumeration

Proof methods  Proof methods divide into (roughly) two kinds:  Model checking (inference by enumeration)  Truth table enumeration (always exponential in n)  Improved backtracking, e.g., Davis-Putnam-Logemann-Loveland (DPLL)  Heuristic search in model space (sound but incomplete)  Application of inference rules/reasoning patterns  Legitimate (sound) generation of new sentences from old  Proof = a sequence of inference rule applications Can use inference rules as operators in a standard search algorithm  Typically require transformation of sentences into a normal form

Reasoning patterns  Obtain new sentences directly from some other sentences in knowledge base according to reasoning patterns  E.g., if we have sentences a and a → b, we can correctly conclude the new sentence b  This is called modus ponens  E.g., if we have a AND b, we can correctly conclude a  All of the logical equivalences from before also give reasoning patterns

Formal proof that the sprinklers are on
 Knowledge base:
1. FriendWet
2. FriendWet → (FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers)
3. FriendWetBecauseOfSprinklers → SprinklersOn
4. FriendWetBecauseOfRain → NOT(FriendCarryingUmbrella)
5. UmbrellaGone
6. UmbrellaGone → (YouCarryingUmbrella OR FriendCarryingUmbrella)
7. NOT(YouCarryingUmbrella)

Formal proof that the sprinklers are on
8. YouCarryingUmbrella OR FriendCarryingUmbrella (modus ponens on 5 and 6)
9. NOT(YouCarryingUmbrella) → FriendCarryingUmbrella (equivalent to 8)
10. FriendCarryingUmbrella (modus ponens on 7 and 9)
11. NOT(NOT(FriendCarryingUmbrella)) (equivalent to 10)
12. NOT(NOT(FriendCarryingUmbrella)) → NOT(FriendWetBecauseOfRain) (equivalent to 4 by contraposition)
13. NOT(FriendWetBecauseOfRain) (modus ponens on 11 and 12)
14. FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers (modus ponens on 1 and 2)
15. NOT(FriendWetBecauseOfRain) → FriendWetBecauseOfSprinklers (equivalent to 14)
16. FriendWetBecauseOfSprinklers (modus ponens on 13 and 15)
17. SprinklersOn (modus ponens on 16 and 3)

Reasoning about penguins
 Knowledge base:
1. PetOfFriendIsABird → PetOfFriendCanFly
2. PetOfFriendIsAPenguin → PetOfFriendIsABird
3. PetOfFriendIsAPenguin → NOT(PetOfFriendCanFly)
4. PetOfFriendIsAPenguin
5. PetOfFriendIsABird (modus ponens on 4 and 2)
6. PetOfFriendCanFly (modus ponens on 5 and 1)
7. NOT(PetOfFriendCanFly) (modus ponens on 4 and 3)
8. NOT(PetOfFriendCanFly) → FALSE (equivalent to 6)
9. FALSE (modus ponens on 7 and 8)
10. FALSE → TheMoonIsMadeOfCheese (tautology)
11. TheMoonIsMadeOfCheese (modus ponens on 9 and 10)

Evaluation of deductive inference  Sound  Yes, because the inference rules themselves are sound  (This can be proven using a truth table argument)  Complete  If we allow all possible inference rules, we’re searching in an infinite space, hence not complete  If we limit inference rules, we run the risk of leaving out the necessary one…  Monotonic  If we have a proof, adding information to the KB will not invalidate the proof

Getting more systematic  Any knowledge base can be written as a single formula in conjunctive normal form (CNF)  CNF formula: (… OR … OR …) AND (… OR …) AND …  … can be a symbol x, or NOT(x) (these are called literals)  Multiple facts in knowledge base are effectively ANDed together  FriendWet → (FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers) becomes (NOT(FriendWet) OR FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers)

Converting story to CNF
 FriendWet
 FriendWet → (FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers)
    NOT(FriendWet) OR FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers
 FriendWetBecauseOfSprinklers → SprinklersOn
    NOT(FriendWetBecauseOfSprinklers) OR SprinklersOn
 FriendWetBecauseOfRain → NOT(FriendCarryingUmbrella)
    NOT(FriendWetBecauseOfRain) OR NOT(FriendCarryingUmbrella)
 UmbrellaGone
 UmbrellaGone → (YouCarryingUmbrella OR FriendCarryingUmbrella)
    NOT(UmbrellaGone) OR YouCarryingUmbrella OR FriendCarryingUmbrella
 NOT(YouCarryingUmbrella)

Unit resolution  Unit resolution: if we have l1 OR l2 OR … OR li OR … OR lk and NOT(li), we can conclude l1 OR l2 OR … OR li-1 OR li+1 OR … OR lk  Basically modus ponens

Applying resolution to story problem
1. FriendWet
2. NOT(FriendWet) OR FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers
3. NOT(FriendWetBecauseOfSprinklers) OR SprinklersOn
4. NOT(FriendWetBecauseOfRain) OR NOT(FriendCarryingUmbrella)
5. UmbrellaGone
6. NOT(UmbrellaGone) OR YouCarryingUmbrella OR FriendCarryingUmbrella
7. NOT(YouCarryingUmbrella)
8. NOT(UmbrellaGone) OR FriendCarryingUmbrella (6,7)
9. FriendCarryingUmbrella (5,8)
10. NOT(FriendWetBecauseOfRain) (4,9)
11. NOT(FriendWet) OR FriendWetBecauseOfSprinklers (2,10)
12. FriendWetBecauseOfSprinklers (1,11)
13. SprinklersOn (3,12)

Resolution (general)  General resolution allows a complete inference mechanism (search-based) using only one rule of inference  Resolution rule:  Given: P1 ∨ P2 ∨ P3 ∨ … ∨ Pn and ¬P1 ∨ Q1 ∨ … ∨ Qm  Conclude: P2 ∨ P3 ∨ … ∨ Pn ∨ Q1 ∨ … ∨ Qm  Complementary literals P1 and ¬P1 “cancel out”  Why it works:  Consider 2 cases: P1 is true, and P1 is false
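The resolution rule can be sketched over clauses represented as frozensets of literal strings, with a "NOT " prefix marking negated literals (this encoding is a convenience chosen for the sketch, not the only option):

```python
def negate(lit):
    """Map 'P' to 'NOT P' and back."""
    return lit[4:] if lit.startswith("NOT ") else "NOT " + lit

def resolve(c1, c2):
    """All resolvents of two clauses, one per complementary literal pair."""
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:
            resolvents.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return resolvents

c1 = frozenset({"NOT FriendWet", "FriendWetBecauseOfRain",
                "FriendWetBecauseOfSprinklers"})
c2 = frozenset({"FriendWet"})
# Resolving on FriendWet leaves the other two literals.
print(resolve(c1, c2))
```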

Applying resolution
1. FriendWet
2. NOT(FriendWet) OR FriendWetBecauseOfRain OR FriendWetBecauseOfSprinklers
3. NOT(FriendWetBecauseOfSprinklers) OR SprinklersOn
4. NOT(FriendWetBecauseOfRain) OR NOT(FriendCarryingUmbrella)
5. UmbrellaGone
6. NOT(UmbrellaGone) OR YouCarryingUmbrella OR FriendCarryingUmbrella
7. NOT(YouCarryingUmbrella)
8. NOT(FriendWet) OR FriendWetBecauseOfRain OR SprinklersOn (2,3)
9. NOT(FriendCarryingUmbrella) OR NOT(FriendWet) OR SprinklersOn (4,8)
10. NOT(UmbrellaGone) OR YouCarryingUmbrella OR NOT(FriendWet) OR SprinklersOn (6,9)
11. YouCarryingUmbrella OR NOT(FriendWet) OR SprinklersOn (5,10)
12. NOT(FriendWet) OR SprinklersOn (7,11)
13. SprinklersOn (1,12)

Systematic inference…  General strategy: if we want to see if sentence a is entailed, add NOT(a) to the knowledge base and see if it becomes inconsistent (we can derive a contradiction)  = proof by contradiction  CNF formula for modified knowledge base is satisfiable if and only if sentence a is not entailed  Satisfiable = there exists a model that makes the modified knowledge base true = modified knowledge base is consistent

Resolution algorithm  Given formula in conjunctive normal form, repeat:  Find two clauses with complementary literals,  Apply resolution,  Add resulting clause (if not already there)  If the empty clause results, formula is not satisfiable  Must have been obtained from P and NOT(P)  Otherwise, if we get stuck (and we will eventually), the formula is guaranteed to be satisfiable
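The loop above can be sketched end-to-end as a refutation procedure (clauses as frozensets of literal strings, negation marked by a "NOT " prefix; an encoding chosen for this sketch):

```python
from itertools import combinations

def negate(lit):
    return lit[4:] if lit.startswith("NOT ") else "NOT " + lit

def resolve(c1, c2):
    """All resolvents of two clauses, one per complementary literal pair."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return out

def entails(cnf_kb, query_symbol):
    """Resolution refutation: add NOT(query) and search for the empty clause."""
    clauses = set(cnf_kb) | {frozenset({negate(query_symbol)})}
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for resolvent in resolve(c1, c2):
                if not resolvent:          # empty clause: contradiction found
                    return True
                new.add(resolvent)
        if new <= clauses:                 # stuck: no new clauses, satisfiable
            return False
        clauses |= new

kb = [
    frozenset({"FriendWetBecauseOfSprinklers"}),
    frozenset({"NOT FriendWetBecauseOfSprinklers", "SprinklersOn"}),
]
print(entails(kb, "SprinklersOn"))   # True
```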

Example  Our knowledge base: 1) FriendWetBecauseOfSprinklers 2) NOT(FriendWetBecauseOfSprinklers) OR SprinklersOn  Can we infer SprinklersOn? We add: 3) NOT(SprinklersOn)  From 2) and 3), get 4) NOT(FriendWetBecauseOfSprinklers)  From 4) and 1), get empty clause = SpinklersOn entailed

Exercises  In twos/threes, please!