Logical Agents (Chapter 7): How do we reason about reasonable decisions?

Fall 2009, CS 471/598 by H. Liu.

A knowledge-based agent
- Accepts new tasks in the form of explicit goals
- Knows about its world: the current state of the world, unseen properties inferred from percepts, and how the world evolves; this helps it deal with partially observable environments and understand sentences like "John threw the brick through the window and broke it." (natural language understanding)
- Reasons about its possible courses of action
- Achieves competence quickly, by being told or by learning new knowledge
- Adapts to changes by updating the relevant knowledge

Knowledge base
A knowledge base (KB) is a set of representations (sentences) of facts about the world.
- TELL and ASK: the two basic operations, used to add new knowledge to the KB and to query what the KB knows
- INFER: determine what should follow after the KB has been TELLed
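
Below is a minimal sketch of the TELL/ASK interface, assuming a hypothetical KnowledgeBase class whose ask simply tests membership; a real KB would answer ASK by running an inference procedure over everything it has been told.

```python
class KnowledgeBase:
    """A toy knowledge base: sentences are stored as opaque strings.
    A real KB would parse sentences and answer ASK by inference,
    not by literal membership."""

    def __init__(self):
        self.sentences = set()

    def tell(self, sentence):
        # TELL: add a new sentence to the KB.
        self.sentences.add(sentence)

    def ask(self, query):
        # ASK (toy version): is the query literally stored?
        return query in self.sentences


kb = KnowledgeBase()
kb.tell("B1,1 <=> (P1,2 v P2,1)")
print(kb.ask("B1,1 <=> (P1,2 v P2,1)"))  # True
```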

A generic KB agent (Fig 7.1): on each cycle it TELLs the KB what it perceives, ASKs the KB what action to take, and TELLs the KB that the action was performed.
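
A sketch of that percept-ask-act cycle follows; the make_percept_sentence, make_action_query, and make_action_sentence helpers are hypothetical stand-ins for the sentence-building functions in Fig 7.1, and kb is any object with tell/ask methods.

```python
def kb_agent_step(kb, percept, t,
                  make_percept_sentence, make_action_query, make_action_sentence):
    """One step of a generic knowledge-based agent (in the spirit of Fig 7.1)."""
    kb.tell(make_percept_sentence(percept, t))  # report what was perceived at time t
    action = kb.ask(make_action_query(t))       # ask the KB which action to take
    kb.tell(make_action_sentence(action, t))    # record that the action was taken
    return action
```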

Three levels of a KB agent
- Knowledge level (the most abstract)
- Logical level (knowledge is encoded as sentences)
- Implementation level
Building a knowledge base
- A declarative approach: telling the KB agent what it needs to know
- A procedural approach: encoding desired behaviors directly as program code
- A learning approach: making the agent autonomous

Specifying the environment
The Wumpus world (Fig 7.2) in PEAS terms:
- Performance: +1000 for getting the gold, -1000 for being dead, -1 for each action taken, -10 for using up the arrow; the goal is to bring back the gold as quickly as possible
- Environment: a 4x4 grid, starting at (1,1) ...
- Actions: Turn, Grab, Shoot, Climb, Die
- Sensors: (Stench, Breeze, Glitter, Bump, Scream)
It is possible that the gold is in a pit or surrounded by pits; in that case, do not risk your life, just go home empty-handed.
Variants of the Wumpus world can be very difficult: multiple agents, a mobile wumpus, multiple wumpuses.

Wumpus world PEAS description
- Performance measure: gold +1000, death -1000, -1 per step, -10 for using the arrow
- Environment: squares adjacent to the wumpus are smelly; squares adjacent to a pit are breezy; glitter iff the gold is in the same square; shooting kills the wumpus if you are facing it; shooting uses up the only arrow; grabbing picks up the gold if it is in the same square; releasing drops the gold in the same square
- Sensors: Stench, Breeze, Glitter, Bump, Scream
- Actuators: Left turn, Right turn, Forward, Grab, Release, Shoot

Acting and reasoning
Let's play the wumpus game! (Figs 7.3 and 7.4)
One conclusion: "what a fun game!"
Another conclusion: if the available information is correct, the conclusions drawn from it are guaranteed to be correct.

Logic
The primary vehicle for representing knowledge: simple, concise, precise, and manipulable by following rules.
Logic cannot represent uncertain knowledge well (which is where much new research lies).
We will study logic first and other techniques later.

Logics
A logic consists of the following:
- A formal system for describing states of affairs, consisting of a syntax (how to make sentences) and a semantics (relating sentences to states of affairs)
- A proof theory: a set of rules for deducing the entailments of a set of sentences
Some examples of logics follow ...

Propositional logic
In this logic, symbols represent whole propositions (facts), e.g.:
- D means "the wumpus is dead"
- W1,1 means "the wumpus is in square (1,1)"
- S1,1 means "there is a stench in square (1,1)"
Propositions can be combined with Boolean connectives to form sentences with more complex meanings, but the logic does not specify how objects are represented.

Other logics
- First-order logic represents worlds using objects and predicates on objects, together with connectives and quantifiers.
- Temporal logic assumes that the world is ordered by a set of time points or intervals and includes mechanisms for reasoning about time.

Other logics (continued)
- Probability theory allows the specification of any degree of belief.
- Fuzzy logic allows degrees of belief in a sentence as well as degrees of truth.

Propositional logic: syntax and semantics
Syntax
- A set of rules for constructing sentences from literals and the connectives and, or, implies, iff (equivalent), and not; atomic and complex sentences
- BNF grammar (Fig 7.7, p. 205)
Semantics
- Specifies how to compute the truth value of any sentence
- Truth table for the 5 logical connectives (Fig 7.8)
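
As a small illustration of the semantics, the sketch below prints the Fig 7.8 truth table for the five connectives, using ordinary Python booleans and the usual definition of material implication (!P v Q).

```python
from itertools import product

def implies(p, q):
    # Material implication: P => Q is false only when P is true and Q is false.
    return (not p) or q

print("P      Q      !P     P^Q    PvQ    P=>Q   P<=>Q")
for p, q in product([False, True], repeat=2):
    values = [p, q, not p, p and q, p or q, implies(p, q), p == q]
    print("  ".join(f"{str(v):<5}" for v in values))
```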

Knowledge representation
- Syntax: the possible configurations that can constitute sentences
- Semantics: the meaning of the sentences; e.g., x > y and x + y = 4 are sentences about numbers
- A sentence can be true or false: the semantics defines the truth of each sentence with respect to each possible world (model). What are the possible worlds for x + y = 4? One model is x = 2 and y = 2; the sentence is true in some models and false in others
- Entailment: one sentence logically follows from another. α |= β iff in every model in which α is true, β is also true. In Fig 7.6, entailment between sentences (the representation) mirrors the "follows" relation between the corresponding aspects of the real world

Reasoning
- KB entails sentence α: whenever KB is true, α is true as well
- Model checking (Fig 7.5): given the KB, ask whether KB entails α (sketched below)
  - α1 = "There is no pit in [1,2]" -> yes or no?
  - α2 = "There is no pit in [2,2]" -> yes or no?
- With the three unknown squares P[1,2], P[2,2], and P[3,1] there are 2^3 = 8 models; in Fig 7.5 the models of the KB are drawn in red, together with those of the sentences α1 and α2
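
A sketch of model checking for exactly this example: enumerate the 8 assignments to P1,2, P2,2, P3,1 and test entailment. The encoding of the KB as "no pit in [1,2], and a pit in [2,2] or [3,1]" is my own summary of what the wumpus rules plus the percepts of Fig 7.5 imply about these three squares, not a quote from the slides.

```python
from itertools import product

def kb(p12, p22, p31):
    # No breeze in [1,1] => no pit in [1,2];
    # breeze in [2,1]    => a pit in [2,2] or [3,1] (the visited squares are pit-free).
    return (not p12) and (p22 or p31)

def entails(kb_fn, alpha_fn):
    # KB |= alpha iff alpha holds in every model in which the KB holds.
    return all(alpha_fn(*m) for m in product([False, True], repeat=3) if kb_fn(*m))

alpha1 = lambda p12, p22, p31: not p12   # "there is no pit in [1,2]"
alpha2 = lambda p12, p22, p31: not p22   # "there is no pit in [2,2]"

print(entails(kb, alpha1))  # True:  alpha1 is entailed
print(entails(kb, alpha2))  # False: alpha2 is not entailed
```

The KB is true in exactly 3 of the 8 models; alpha1 holds in all of them, while alpha2 fails in some.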

An inference procedure can generate new sentences entailed by the KB, or verify whether a given sentence is entailed by the KB.
- An inference procedure is sound if it generates only entailed sentences; a proof is the record of operation of a sound inference procedure
- An inference procedure is complete if it can find a proof for any sentence that is entailed
- Sound reasoning is called logical inference or deduction
- A reasoning system should be able to draw conclusions that follow from the premises, regardless of the world to which the sentences are intended to refer

Equivalence, validity, and satisfiability
- Logical equivalence: α and β are equivalent iff α |= β and β |= α
- Validity: a sentence α is valid if it is true in all models; valid sentences are tautologies, e.g. P v !P
- Deduction theorem: for any α and β, α |= β iff the sentence (α => β) is valid
- Satisfiability: a sentence α is satisfiable if it is true in some model, e.g. A v B, P
- α |= β iff the sentence (α ^ !β) is unsatisfiable, i.e. iff !(α ^ !β) is valid
- Connecting validity and satisfiability: α is valid iff !α is unsatisfiable; contrapositively, α is satisfiable iff !α is not valid
- Recall α |= β iff in every model in which α is true, β is also true; note that !(α => β) is !(!α v β), i.e. α ^ !β
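
These relationships are easy to check mechanically; the sketch below assumes the sympy library is installed and uses its satisfiable function, which returns a satisfying model or False.

```python
from sympy import symbols
from sympy.logic.inference import satisfiable

A, B, P = symbols("A B P")

# alpha |= beta iff (alpha ^ !beta) is unsatisfiable:
alpha, beta = A & B, A | B
print(satisfiable(alpha & ~beta))   # False, so (A ^ B) |= (A v B)

# alpha is valid iff !alpha is unsatisfiable, e.g. the tautology P v !P:
print(satisfiable(~(P | ~P)))       # False, so P v !P is valid
```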

Inference
- Truth tables can be used not only to define the connectives but also to test for validity: if a sentence is true in every row, it is valid. What does the truth table for "premises imply conclusion" look like?
- A simple knowledge base for the wumpus world: a KB with five rules (p. 208). What if we wrote R2 as B1,1 => (P1,2 v P2,1)? Think about the definition of =>
- Does KB |= α? Check the validity of the corresponding implication with a truth table (Fig 7.9); in Fig 7.9 there are three models that make the 5-rule KB true
- A truth-table enumeration algorithm (Fig 7.10, sketched below): there are only finitely many models to examine, but their number is exponential in the size of the input (n). Can we prove this? The proof: what is the size of the truth table?
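
A sketch of the truth-table enumeration idea (Fig 7.10): sentences are represented here as Python functions of a model dictionary, which is my simplification rather than the book's parsed sentences.

```python
from itertools import product

def tt_entails(kb, alpha, symbols):
    """Return True iff KB entails alpha, by enumerating all 2^n models.
    kb and alpha are functions model -> bool; symbols is a list of names."""
    for values in product([False, True], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if kb(model) and not alpha(model):
            return False             # a model of the KB in which alpha fails
    return True

# Example: R2 as a biconditional, B1,1 <=> (P1,2 v P2,1), plus the percept !B1,1.
kb = lambda m: (m["B11"] == (m["P12"] or m["P21"])) and not m["B11"]
alpha = lambda m: not m["P12"]       # "there is no pit in [1,2]"
print(tt_entails(kb, alpha, ["B11", "P12", "P21"]))  # True
```

The loop examines 2^n rows, which is exactly the exponential cost mentioned above.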

Reasoning patterns in propositional logic
- α |= β iff the sentence (α ^ !β) is unsatisfiable; here α stands for the known axioms, which are true (T)
- Proof by refutation (or contradiction): assume β is false, so !β is true; we then need to show that !(α ^ !β) is valid, i.e. derive a contradiction from α together with !β, ...
- Inference rules: Modus Ponens, And-elimination, biconditional elimination, and all the logical equivalences in Fig 7.11
- A proof is a sequence of applications of inference rules; as an example, start with R2 and conclude that neither [1,2] nor [2,1] contains a pit (see the sketch below)
- Monotonicity (consistency): the set of entailed sentences can only increase as information is added to the KB; for any α and β, if KB |= α then KB ^ β |= α. Propositional logic and first-order logic are monotonic
(The first bullet follows from the deduction theorem.)
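
A toy sketch of two of these rules acting on a set of sentences; the tuple encoding ('=>' and '^' tags) is my own, chosen only to show that a proof is a sequence of rule applications.

```python
def modus_ponens(known):
    """From alpha and ('=>', alpha, beta), infer beta."""
    derived = set()
    for s in known:
        if isinstance(s, tuple) and s[0] == "=>" and s[1] in known:
            derived.add(s[2])
    return derived

def and_elimination(known):
    """From ('^', alpha, beta), infer alpha and beta separately."""
    derived = set()
    for s in known:
        if isinstance(s, tuple) and s[0] == "^":
            derived.update([s[1], s[2]])
    return derived

# !B1,1 (R4) and a consequence of R2: !B1,1 => (!P1,2 ^ !P2,1)
kb = {"!B11", ("=>", "!B11", ("^", "!P12", "!P21"))}
kb |= modus_ponens(kb)        # adds ('^', '!P12', '!P21')
kb |= and_elimination(kb)     # adds '!P12' and '!P21'
print("!P12" in kb and "!P21" in kb)  # True: neither [1,2] nor [2,1] has a pit
```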

Resolution: an inference rule
- An example of resolution in the wumpus world: R11, R12 (new facts added), R13, R14 (derived from R11 and R12), R15 from R3 and R5, R16, R17 = P3,1 (there is a pit in [3,1]) (p. 213)
- Unit resolution: from l1 v l2 v ... v lk and m, where m = !li, infer the clause with li removed (we have seen examples earlier)
- Full resolution: from l1 v ... v lk and m1 v ... v mn, where li and mj are complementary literals, infer the disjunction of all the remaining literals
- An example: from P1,1 v P3,1 and !P1,1 v !P2,2, infer P3,1 v !P2,2 (sketched in code below)
- Soundness of resolution: consider the literal li. If li is true, then mj is false, so one of the other m's must be true to satisfy the second clause; if li is false, one of the other l's must be true to satisfy the first clause. Either way the resolvent is true
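
A sketch of the full resolution rule on clauses represented as frozensets of literal strings; the complement of a literal is obtained by adding or stripping a leading '!' (my encoding, not the book's).

```python
def complement(literal):
    # '!P11' <-> 'P11'
    return literal[1:] if literal.startswith("!") else "!" + literal

def resolve(c1, c2):
    """Return all resolvents of two clauses (frozensets of literals)."""
    return [frozenset((c1 - {lit}) | (c2 - {complement(lit)}))
            for lit in c1 if complement(lit) in c2]

c1 = frozenset({"P11", "P31"})    # P1,1 v P3,1
c2 = frozenset({"!P11", "!P22"})  # !P1,1 v !P2,2
print(resolve(c1, c2))            # [frozenset({'P31', '!P22'})], i.e. P3,1 v !P2,2
```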

Refutation completeness
- Resolution can always be used to either confirm or refute a sentence
- Conjunctive normal form (CNF): a conjunction of disjunctions of literals, (l1,1 v ... v l1,k) ^ ... ^ (ln,1 v ... v ln,k); a sentence in k-CNF has exactly k literals per clause
- A simple conversion procedure (turning R2 into CNF) is on the next slide, or see p. 215

Conversion to CNF
B1,1 <=> (P1,2 v P2,1)
1. Eliminate <=>, replacing α <=> β with (α => β) ^ (β => α):
   (B1,1 => (P1,2 v P2,1)) ^ ((P1,2 v P2,1) => B1,1)
2. Eliminate =>, replacing α => β with !α v β:
   (!B1,1 v P1,2 v P2,1) ^ (!(P1,2 v P2,1) v B1,1)
3. Move ! inwards using de Morgan's rules and double negation:
   (!B1,1 v P1,2 v P2,1) ^ ((!P1,2 ^ !P2,1) v B1,1)
4. Apply the distributivity law (v over ^) and flatten:
   (!B1,1 v P1,2 v P2,1) ^ (!P1,2 v B1,1) ^ (!P2,1 v B1,1)
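
The same conversion can be checked mechanically, assuming sympy is available; its to_cnf function performs this kind of rewriting (the exact clause ordering in the output may differ).

```python
from sympy import symbols, Equivalent
from sympy.logic.boolalg import to_cnf

B11, P12, P21 = symbols("B11 P12 P21")

r2 = Equivalent(B11, P12 | P21)       # B1,1 <=> (P1,2 v P2,1)
print(to_cnf(r2))
# e.g. (B11 | ~P12) & (B11 | ~P21) & (P12 | P21 | ~B11)
```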

A resolution algorithm (Fig 7.12)
- An example: KB = R2 ^ R4, proving !P1,2 (Fig 7.13); a sketch in code follows below
- Completeness of resolution: the ground resolution theorem
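
A sketch of the refutation loop of Fig 7.12 on the same clause encoding as above; the clause list below is my own transcription of CNF(R2 ^ R4 ^ P1,2), i.e. the KB conjoined with the negated query.

```python
def complement(lit):
    return lit[1:] if lit.startswith("!") else "!" + lit

def resolve(c1, c2):
    # All resolvents of two clauses (frozensets of literals).
    return [frozenset((c1 - {l}) | (c2 - {complement(l)}))
            for l in c1 if complement(l) in c2]

def pl_resolution(clauses):
    """Return True iff the clause set is unsatisfiable; with
    clauses = CNF(KB ^ !alpha), True means KB |= alpha."""
    clauses = set(clauses)
    while True:
        new = set()
        for c1 in clauses:
            for c2 in clauses:
                for r in resolve(c1, c2):
                    if not r:          # empty clause: contradiction found
                        return True
                    new.add(r)
        if new <= clauses:             # nothing new can be derived
            return False
        clauses |= new

clauses = [
    frozenset({"!B11", "P12", "P21"}),  # from R2: B1,1 => (P1,2 v P2,1)
    frozenset({"!P12", "B11"}),         # from R2: P1,2 => B1,1
    frozenset({"!P21", "B11"}),         # from R2: P2,1 => B1,1
    frozenset({"!B11"}),                # R4: there is no breeze in [1,1]
    frozenset({"P12"}),                 # negated query !alpha, where alpha = !P1,2
]
print(pl_resolution(clauses))           # True: KB |= !P1,2
```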

Horn clauses
- A Horn clause is a disjunction of literals of which at most one is positive, e.g. (!L1,1 v !Breeze v B1,1)
- A Horn clause can be written in the form P1 ^ P2 ^ ... ^ Pn => Q, where the Pi and Q are non-negated atoms
- Deciding entailment with Horn clauses can be done in time linear in the size of the KB
- Inference with Horn clauses can be done through forward and backward chaining: forward chaining is data driven, while backward chaining works backwards from the query (goal-directed reasoning); a forward-chaining sketch follows below
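
A sketch of the forward-chaining idea on Horn (definite) clauses written as (premises, conclusion) pairs; the rules below are illustrative, not the book's numbered KB, and this simple loop is not the linear-time agenda version, just the data-driven firing of rules.

```python
def forward_chaining(rules, facts, query):
    """Repeatedly fire any rule whose premises are all known,
    until nothing new is inferred; then check the query."""
    inferred = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= inferred and conclusion not in inferred:
                inferred.add(conclusion)   # fire the rule
                changed = True
    return query in inferred

rules = [
    (frozenset({"L11", "Breeze"}), "B11"),   # L1,1 ^ Breeze => B1,1
    (frozenset({"B11"}), "PitNearby"),       # B1,1 => PitNearby (hypothetical)
]
print(forward_chaining(rules, {"L11", "Breeze"}, "PitNearby"))  # True
```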

An agent for the wumpus world
- The knowledge base (an example on p. 208): Bx,y <=> (a pit is in a neighboring square), Sx,y <=> (the wumpus is in a neighboring square), ...
- There is exactly one wumpus: (1) there is at least one wumpus, and (2) there is at most one wumpus
- Finding pits and the wumpus using logical inference
- Keeping track of location and orientation
- Translating knowledge into action, e.g. A1,1 ^ EastA ^ W2,1 => !Forward
- Problems with the propositional agent: too many propositions to handle ("Don't go forward if ...") and difficulty dealing with change (time-dependent propositions)
(p. 225 has the details for the first of these problems)

Summary
- Knowledge is important for intelligent agents: sentences and knowledge bases
- Propositional logic and other logics
- Inference: soundness, completeness, valid sentences
- Propositional logic is impractical for even very small worlds
- Therefore, we need to continue our AI class ...