Artificial Intelligence


Artificial Intelligence 7. Knowledge and Reasoning

Knowledge Base

Representation A good knowledge representation combines the expressiveness of natural language with the precision of a formal language. In this chapter we concentrate on first-order logic (FOL), which forms the basis of most representation schemes in AI.

General Definition Logic: a formal language for representing information such that conclusions can easily be drawn. Syntax: defines the sentences in the language. Semantics: defines the meaning of the sentences.

Types of Logic Each logic is characterized by its ontological commitment (what exists in the world) and its epistemological commitment (what an agent believes):
Propositional logic: facts; true/false/unknown
First-order logic: facts, objects, relations; true/false/unknown
Temporal logic: facts, objects, relations, times; true/false/unknown
Probability theory: facts; degree of belief in [0, 1]

Propositional Logic Syntax

Propositional Logic Semantic

Inference The process by which conclusions are reached. Logical inference is a truth-preserving process that implements the entailment relation between sentences.

Propositional Inference: Enumeration Method Enumerate all models and check that the query α is true in every model in which KB is true.
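The enumeration method can be sketched in a few lines of Python: generate every truth assignment over the symbols and test the query only in the models where KB is true. The KB and query below are hypothetical sentences encoded as Python functions over a model (a dict from symbol to truth value).

```python
from itertools import product

def tt_entails(kb, alpha, symbols):
    """Check KB |= alpha by enumerating every truth assignment and
    testing alpha only in the models where KB is true."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not alpha(model):
            return False  # found a model of KB in which alpha is false
    return True           # alpha holds in every model of KB

# Hypothetical KB {P, P => Q} and query Q, encoded as Python functions
kb = lambda m: m["P"] and ((not m["P"]) or m["Q"])
alpha = lambda m: m["Q"]
print(tt_entails(kb, alpha, ["P", "Q"]))  # True
```

The check is exponential in the number of symbols, which is exactly why the inference rules and resolution on the following slides matter.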

Normal Forms *

Validity & Satisfiability

Standard Logical Equivalences
A ∧ A ≡ A
A ∨ A ≡ A
A ∧ B ≡ B ∧ A
A ∧ (B ∧ C) ≡ (A ∧ B) ∧ C [∧ is associative]
A ∨ (B ∨ C) ≡ (A ∨ B) ∨ C [∨ is associative]
A ∧ (B ∨ C) ≡ (A ∧ B) ∨ (A ∧ C) [∧ is distributive over ∨]
A ∨ (A ∧ B) ≡ A
A ∧ (A ∨ B) ≡ A
A ∧ true ≡ A
A ∧ false ≡ false
A ∨ true ≡ true
A ∨ false ≡ A
A ⇒ B ≡ ¬A ∨ B [implication elimination]
¬(A ∧ B) ≡ ¬A ∨ ¬B [De Morgan]
¬(A ∨ B) ≡ ¬A ∧ ¬B [De Morgan]
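Each of these equivalences can be checked mechanically by enumerating all truth assignments. A small sketch for three of them:

```python
from itertools import product

implies = lambda a, b: b if a else True  # truth table of A => B

for a, b in product([True, False], repeat=2):
    assert (not (a and b)) == ((not a) or (not b))  # De Morgan for "and"
    assert (not (a or b)) == ((not a) and (not b))  # De Morgan for "or"
    assert implies(a, b) == ((not a) or b)          # implication elimination
print("checked on all four truth assignments")
```

Since each equivalence involves only A and B, four assignments suffice to verify it completely.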

Seven Inference Rules for Propositional Logic
Modus Ponens, or Implication-Elimination (from an implication and the premise of the implication, you can infer the conclusion):
α ⇒ β, α ⊢ β
And-Elimination (from a conjunction, you can infer any of the conjuncts):
α₁ ∧ α₂ ∧ … ∧ αₙ ⊢ αᵢ
And-Introduction (from a list of sentences, you can infer their conjunction):
α₁, α₂, …, αₙ ⊢ α₁ ∧ α₂ ∧ … ∧ αₙ
Or-Introduction (from a sentence, you can infer any disjunction containing it):
αᵢ ⊢ α₁ ∨ α₂ ∨ … ∨ αₙ

Seven Inference Rules for Propositional Logic (continued)
Double-Negation Elimination:
¬¬α ⊢ α
Unit Resolution (from a disjunction, if one disjunct is false, you can infer the other is true):
α ∨ β, ¬β ⊢ α
Resolution (because β cannot be true and false at the same time):
α ∨ β, ¬β ∨ γ ⊢ α ∨ γ
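The resolution rule can be applied mechanically to clauses represented as sets of literals. A sketch, assuming a string encoding in which '~' marks a negated literal:

```python
def neg(lit):
    """Negate a string literal: 'P' <-> '~P'."""
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """Apply the resolution rule to two clauses given as sets of
    string literals; return the list of all resolvents."""
    resolvents = []
    for lit in c1:
        if neg(lit) in c2:
            resolvents.append((c1 - {lit}) | (c2 - {neg(lit)}))
    return resolvents

# (A or B) and (~B or C) resolve on B, giving (A or C)
print(resolve({"A", "B"}, {"~B", "C"}))
```

Unit resolution is the special case where one input clause has a single literal, so the resolvent is just the other clause minus the complementary literal.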

Extra Rules *
⇒-Introduction: if we start from A true and after many steps reach C, then A implies C.
A … C ⊢ A ⇒ C
Reductio ad Absurdum: if we start from A false and reach a contradiction, then A is true.
¬A … ⊥ ⊢ A

Example (1) {A, ¬A ∨ B} ⊢ B (prove?)
¬A ∨ B ≡ A ⇒ B (using the truth table)
A ⇒ B (replacing ¬A ∨ B)
A, A ⇒ B (adding A from the KB)
B (⇒-Elimination)

Example (2) {A ∧ B} ⊢ B ∧ A (prove?)
A ∧ B (assumption)
A, B (by ∧-Elimination)
B ∧ A (by ∧-Introduction)

Example (3) (A ⇒ B) ⊢ (¬B ⇒ ¬A)
A (assumption)
A ⇒ B (premise)
B (by Modus Ponens)
B, ¬B (introduce ¬B by assumption)
¬A (Reductio ad Absurdum)
¬B ⇒ ¬A (⇒-Introduction)

Example (4) * (A ⇒ B) ⇒ ((B ⇒ C) ⇒ ((C ⇒ D) ⇒ (A ⇒ D))) ? (left as an exercise)

Inference Using Rules To prove KB ⊨ A: write KB ∧ ¬A in CNF form, then apply inference rules (resolution) to find a contradiction.
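This refutation procedure can be sketched as a loop that saturates the clause set with resolution until the empty clause (a contradiction) appears or nothing new can be derived. Clauses are again sets of string literals with '~' for negation, an encoding assumed for this example:

```python
def pl_resolution(clauses):
    """Saturate a CNF clause set with resolution; return True iff the
    empty clause, i.e. a contradiction, is derived."""
    neg = lambda l: l[1:] if l.startswith("~") else "~" + l
    clauses = {frozenset(c) for c in clauses}
    while True:
        new = set()
        for ci in clauses:
            for cj in clauses:
                if ci == cj:
                    continue
                for lit in ci:
                    if neg(lit) in cj:
                        r = frozenset((ci - {lit}) | (cj - {neg(lit)}))
                        if not r:
                            return True   # empty clause: contradiction found
                        new.add(r)
        if new <= clauses:
            return False                  # no new clauses: no contradiction
        clauses |= new

# Prove {A, A => B} |= B: add ~B and search for a contradiction.
# In CNF the clause set is {A}, {~A, B}, {~B}.
print(pl_resolution([{"A"}, {"~A", "B"}, {"~B"}]))  # True
```

The loop always terminates for propositional logic because only finitely many clauses can be built from a finite set of symbols.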

Artificial Intelligence First Order Logic Chapter 8 part (I)

Definition A general-purpose representation language based on an ontological commitment to the existence of objects and relations in the world. The world consists of:
Objects: people, houses, numbers, colors, wars
Relations: brother of, bigger than, inside, part of
Properties: red, round, long, short, …
Functions: father of, best friend, one more than, …

Example “One Plus Two Equals Three “ Objects: One, Two, Three, One Plus Two Relations: Equals Functions: Plus “Congratulation Letter written with Blue Pen“ Objects: Letter, Pen Relations: written with Properties: Blue, Congratulation

Syntax & Semantics FOL = sentences + terms (which represent objects). Sentences are built using quantifiers and predicate symbols. Terms are built using constants, variables, and function symbols.

Sentence → AtomicSentence | Sentence Connective Sentence | Quantifier Var, … Sentence | ¬Sentence | (Sentence)
AtomicSentence → Predicate(Term, …) | Term = Term
Term → Function(Term, …) | Constant | Variable
Connective → ⇒ | ∧ | ∨ | ⇔
Quantifier → ∀ | ∃
Constant → A | 1 | 3 | John | Riad, …
Variable → a | b | c | x | y | z
Predicate → Before | HasColor | After
Function → Mother | LeftLegOf | Equal

Syntax and Semantics Predicate Symbol: denotes a particular relation in the model between tuples of objects: Predicate(Term, …). Examples: <(1, 2), >(3, 4), Brother(Mohamed, Mostefa). Function Symbol: a given object is related to exactly one other object by the relation: Function(Term, …). Examples: FatherOf(Ahmad), Equal(Plus(1, 2))

Syntax and Semantics Terms: a term is an expression that refers to an object: Function(Term, …) | Variable | Constant. Examples: FatherOf(Khalid), x, y, 2, Riyadh, Ahmad. Atomic Sentence: formed from a predicate symbol followed by a parenthesized list of terms: Predicate(Term, …) or Term = Term. Examples: Older(Youssef, 30), 1 = 1

Syntax and Semantics Complex Sentences: we can use logical connectives to construct more complex sentences:
¬S1, S1 ∧ S2, S1 ∨ S2, S1 ⇒ S2, S1 ⇔ S2
>(1, 2) ∨ ≤(1, 2)
>(1, 2) ∧ ¬>(1, 2)

Model in FOL

Syntax and Semantics Universal Quantifier: ∀ (variables) (sentence)
"Everyone at PSU is smart": ∀x At(x, PSU) ⇒ Smart(x)
∀x P is the conjunction of the instantiations of P:
(At(Mohamed, PSU) ⇒ Smart(Mohamed)) ∧ (At(Khalid, PSU) ⇒ Smart(Khalid)) ∧ …
►! Implication (⇒) is the main connective to use with ∀. ∀x At(x, PSU) ∧ Smart(x) has a different meaning: "everyone is at PSU and everyone is smart".

Syntax and Semantics Existential Quantifier: ∃ (variables) (sentence)
"Someone at PSU is smart": ∃x At(x, PSU) ∧ Smart(x)
∃x P is the disjunction of the instantiations of P:
(At(Mohamed, PSU) ∧ Smart(Mohamed)) ∨ (At(Khalid, PSU) ∧ Smart(Khalid)) ∨ …
►! Conjunction (∧) is the main connective to use with ∃. ∃x At(x, PSU) ⇒ Smart(x) has a different meaning: by the rule (A ⇒ B) ≡ (¬A ∨ B), it is true for anyone who is not at PSU.
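Over a finite domain the two quantifiers reduce exactly to these conjunctions and disjunctions, which Python's all/any mirror directly. The domain and predicate values below are made up for illustration:

```python
# Made-up finite domain and predicate values for illustration
people = ["Mohamed", "Khalid", "Sara"]
at_psu = {"Mohamed": True, "Khalid": True, "Sara": False}
smart  = {"Mohamed": True, "Khalid": True, "Sara": True}

implies = lambda a, b: b if a else True  # truth table of A => B

# forall x: At(x, PSU) => Smart(x)  -- a conjunction over the domain
everyone_at_psu_smart = all(implies(at_psu[x], smart[x]) for x in people)

# exists x: At(x, PSU) and Smart(x) -- a disjunction over the domain
someone_at_psu_smart = any(at_psu[x] and smart[x] for x in people)

print(everyone_at_psu_smart, someone_at_psu_smart)  # True True
```

Note how the wrong-connective pitfalls from the slides show up here too: using `and` under `all` or `implies` under `any` would compute a different, usually unintended, sentence.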

Properties of Quantifiers

Sentences in FOL

Sentences in FOL

Equality in FOL

Exercises Using FOL
Exercise #1: Represent the sentence "There are only two smart students in KSU":
∃x ∃y ∀z Student(x) ∧ Student(y) ∧ Smart(x) ∧ Smart(y) ∧ Different(x, y) ∧ ((Student(z) ∧ Smart(z)) ⇒ (Equal(x, z) ∨ Equal(y, z)))
Exercise #2 (8.11, page 269): Write axioms describing the predicates GrandChild, Brother, Sister, Daughter, Son.

Problem Tariq, Saeed and Yussef belong to the Computer Club. Every member of the club is either a programmer or an analyst, or both. No analyst likes design, and all programmers like C++. Yussef dislikes whatever Tariq likes and likes whatever Tariq dislikes. Tariq likes C++ and design.

Solution S(x) means x is a programmer, M(x) means x is an analyst, L(x, y) means x likes y. Is there any member of the club who is an analyst but not a programmer?
∀x S(x) ∨ M(x)
¬∃x M(x) ∧ L(x, Design)
∀x S(x) ⇒ L(x, C++)
∀y L(Yussef, y) ⇔ ¬L(Tariq, y)
L(Tariq, C++)
L(Tariq, Design)
Query: ∃x M(x) ∧ ¬S(x)

Asking and Getting Answers To add a sentence to a knowledge base KB, we call TELL(KB, ∀m, c Mother(c) = m ⇔ Female(m) ∧ Parent(m, c)). To ask the KB: ASK(KB, Grandparent(Ahmad, Khalid))
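A minimal sketch of the TELL/ASK interface, assuming sentences are stored as opaque strings and ASK only checks membership; a real knowledge base would answer ASK by running an inference procedure over the stored sentences:

```python
class KnowledgeBase:
    """Toy TELL/ASK interface: TELL stores a sentence, ASK checks
    whether the sentence is already in the store. A real KB would
    answer ASK by inference, not by simple lookup."""
    def __init__(self):
        self.sentences = set()

    def tell(self, sentence):
        self.sentences.add(sentence)

    def ask(self, query):
        return query in self.sentences

kb = KnowledgeBase()
kb.tell("Parent(Mona, Ahmad)")        # hypothetical ground fact
print(kb.ask("Parent(Mona, Ahmad)"))  # True
print(kb.ask("Smart(Ahmad)"))         # False
```

The chaining algorithms on the following slides are two ways of implementing the inference behind a real ASK.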

Chaining Simple methods used by most inference engines to produce a line of reasoning Forward chaining: the engine begins with the initial content of the workspace and proceeds toward a final conclusion Backward chaining: the engine starts with a goal and finds knowledge to support that goal

Forward Chaining Data-driven, bottom-up reasoning: search from facts to valid conclusions. Given a database of true facts:
Apply all rules that match facts in the database
Add their conclusions to the database
Repeat until a goal is reached, OR until no new facts can be added

Forward Chaining Example Suppose we have three rules: R1: If A and B then D R2: If B then C R3: If C and D then E If facts A and B are present, we infer D from R1 and infer C from R2. With D and C inferred, we now infer E from R3.

Example
Rules:
R1: IF hot AND smoky THEN fire
R2: IF alarm-beeps THEN smoky
R3: IF fire THEN switch-sprinkler
Facts: alarm-beeps, hot
First cycle: R2 holds, adding smoky
Second cycle: R1 holds, adding fire
Third cycle: R3 holds, giving the action switch-sprinkler

Forward Chaining Algorithm
Read the initial facts
Begin
Filter phase: find the fired (applicable) rules
While the set of fired rules is not empty AND the goal is not reached DO
Choice phase: solve the conflicts (choose one rule)
Apply the chosen rule
Modify (if needed) the set of rules
End do
End
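The loop above, minus the conflict-resolution step (this sketch simply fires every applicable rule), can be written compactly. Rules are (premises, conclusion) pairs, an encoding assumed for this example:

```python
def forward_chain(rules, facts, goal):
    """Fire every rule whose premises are all in the fact base, add
    its conclusion, and repeat until the goal is reached or no new
    fact can be added. Rules are (premise-set, conclusion) pairs."""
    facts = set(facts)
    changed = True
    while changed:
        if goal in facts:
            return True, facts
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return goal in facts, facts

# The slides' rules: R1: A and B => D, R2: B => C, R3: C and D => E
rules = [({"A", "B"}, "D"), ({"B"}, "C"), ({"C", "D"}, "E")]
found, derived = forward_chain(rules, {"A", "B"}, "E")
print(found, sorted(derived))  # True ['A', 'B', 'C', 'D', 'E']
```

Starting from facts A and B, the sketch reproduces the earlier trace: D and C are added first, then E.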

Backward Chaining Goal-driven, top-down reasoning: search from a hypothesis to supporting facts. To prove goal G:
If G is in the initial facts, it is proven.
Otherwise, find a rule which can be used to conclude G, and try to prove each of that rule's conditions.

Backward Chaining Example The same three rules:
R1: If A and B then D
R2: If B then C
R3: If C and D then E
To prove goal E, R3 requires C and D. R2 reduces subgoal C to B, and R1 reduces subgoal D to A and B. If A and B are known facts, E is proven.

Example
Rules:
R1: IF hot AND smoky THEN fire
R2: IF alarm-beeps THEN smoky
R3: IF fire THEN switch-sprinkler
Facts: alarm-beeps, hot
Hypothesis: Should I switch the sprinklers on?
Use R3: need fire
Use R1: need hot (a fact) and smoky
Use R2: need alarm-beeps (a fact)
The evidence supports the hypothesis, so the sprinklers should be switched on.

Backward Chaining Algorithm
Filter phase: select the rules whose conclusion matches the goal
IF the set of selected rules is empty THEN
Ask the user
ELSE
WHILE the goal is not solved AND selected rules remain DO
Choice phase: choose one rule
Add the conditions of the rule
IF a condition is not solved THEN put the condition as a new goal to solve
END WHILE
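A recursive sketch of backward chaining over the same (premises, conclusion) rule encoding, with a `seen` set standing in for the goal stack to avoid looping on repeated subgoals:

```python
def backward_chain(rules, facts, goal, seen=frozenset()):
    """A goal holds if it is a known fact, or some rule concludes it
    and all of that rule's premises can themselves be proven.
    'seen' plays the role of the goal stack: it blocks loops."""
    if goal in facts:
        return True
    if goal in seen:
        return False
    seen = seen | {goal}
    for premises, conclusion in rules:
        if conclusion == goal and all(
                backward_chain(rules, facts, p, seen) for p in premises):
            return True
    return False

rules = [({"hot", "smoky"}, "fire"),
         ({"alarm-beeps"}, "smoky"),
         ({"fire"}, "switch-sprinkler")]
print(backward_chain(rules, {"alarm-beeps", "hot"}, "switch-sprinkler"))  # True
```

On the sprinkler example it proves switch-sprinkler via fire, which needs hot (a fact) and smoky, which R2 reduces to the fact alarm-beeps.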

Application Wide use in expert systems Backward chaining: Diagnosis systems start with set of hypotheses and try to prove each one, asking additional questions of user when fact is unknown. Forward chaining: design/configuration systems see what can be done with available components.

Comparison
Backward chaining: from hypotheses to relevant facts. Good when:
- there is a limited number of (clear) hypotheses
- determining the truth of facts is expensive
- there is a large number of possible facts, mostly irrelevant
Forward chaining: from facts to valid conclusions. Good when:
- the hypothesis is less clear
- there is a very large number of possible conclusions
- the true facts are known at the start

Forward chaining Idea: fire any rule whose premises are satisfied in the KB, add its conclusion to the KB, until query is found

Forward chaining algorithm Forward chaining is sound and complete for Horn knowledge bases

Forward chaining example

Backward chaining Idea: work backwards from the query q: to prove q by BC, check if q is known already, or prove by BC all premises of some rule concluding q Avoid loops: check if new subgoal is already on the goal stack Avoid repeated work: check if new subgoal has already been proved true, or has already failed

Backward chaining example

Forward vs. backward chaining FC is data-driven, automatic, unconscious processing, e.g., object recognition, routine decisions May do lots of work that is irrelevant to the goal BC is goal-driven, appropriate for problem-solving, e.g., Where are my keys? How do I get into a PhD program? Complexity of BC can be much less than linear in size of KB

Exercises Page 237: 7.4, 7.8