‘In which we introduce a logic that is sufficient for building knowledge-based agents!’

 Introduction
 Knowledge-Based Agents
 Syntax and Semantics
 Entailment
 Logical Agents for the Wumpus World
 Inference
 Propositional Logic
 Wumpus World Sentences
 Logical Equivalence
 Important Equivalences
 Validity and Satisfiability
 Resolution
 Normal Forms
 Forward and Backward Chaining
 Conclusion

 The topic of this chapter is the representation of knowledge and the reasoning processes that bring knowledge to life.  Humans, it seems, know things and reason. Knowledge and reasoning are also important for artificial agents because they enable successful behaviors that would otherwise be very hard to achieve.  The knowledge of problem-solving agents is very specific and inflexible.  Logic will be the primary vehicle for representing knowledge. The knowledge of logical agents is always definite: each proposition is either true or false in the world.

 Humans can know “things” and “reason” › Representation: How are the things stored? › Reasoning: How is the knowledge used?  To solve a problem…  To generate more knowledge…  Knowledge and reasoning are important to artificial agents because they enable successful behaviors difficult to achieve otherwise › Useful in partially observable environments  Can benefit from knowledge in very general forms, combining and recombining information

 Central component of a Knowledge-Based Agent is a Knowledge-Base › A set of sentences in a formal language  Sentences are expressed using a knowledge representation language  Two generic functions: › TELL - add new sentences (facts) to the KB  “Tell it what it needs to know” › ASK - query what is known from the KB  “Ask what to do next”
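The TELL/ASK interface above can be sketched in a few lines. This is a minimal illustration, not a real inference system: the class and method names are invented for the example, and ASK is reduced to a membership check where a real agent would run an inference procedure.

```python
# Minimal sketch of a knowledge-based agent's TELL/ASK interface.
# Class and method names are illustrative, not from any library.

class KnowledgeBase:
    def __init__(self):
        self.sentences = set()   # sentences stored as strings for simplicity

    def tell(self, sentence):
        """TELL: add a new sentence (fact) to the KB."""
        self.sentences.add(sentence)

    def ask(self, query):
        """ASK: query the KB. Here a trivial membership check;
        a real agent would run an inference procedure instead."""
        return query in self.sentences

kb = KnowledgeBase()
kb.tell("the shirt is green")
print(kb.ask("the shirt is green"))    # True
print(kb.ask("the shirt is striped"))  # False
```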

 The agent must be able to: › Represent states and actions › Incorporate new percepts › Update internal representations of the world › Deduce hidden properties of the world › Deduce appropriate actions
[Diagram: an Inference Engine (domain-independent algorithms) on top of a Knowledge Base (domain-specific content)]

 Declarative › You can build a knowledge-based agent simply by “TELLing” it what it needs to know  Procedural › Encode desired behaviors directly as program code  Minimizing the role of explicit representation and reasoning can result in a much more efficient system

 Logics are formal languages for representing information such that conclusions can be drawn  Syntax defines the sentences in the language  Semantics defines the "meaning" of sentences  A term is a logical expression that refers to an object  An atomic sentence is formed from a predicate symbol followed by a parenthesized list of terms.

Example:
-Syntax:
 x+2 ≥ y is a sentence; x2+y > {} is not a sentence
-Semantics:
 x+2 ≥ y is true iff the number x+2 is no less than the number y
 x+2 ≥ y is true in a world where x = 7, y = 1
 x+2 ≥ y is false in a world where x = 0, y = 6

 Definition: a knowledge base (KB) entails sentence α (alpha) if and only if α is true in all worlds where KB is true  Notation: KB ╞ α ‘Entailment is a relationship between sentences that is based on semantics’

 Example: the KB containing ‘the shirt is green’ and ‘the shirt is striped’ entails ‘the shirt is green or the shirt is striped’.  Example: x+y = 4 entails 4 = x+y  Models: models are formally structured worlds with respect to which truth can be evaluated.  m is a model of a sentence α if α is true in m  M(α) is the set of all models of α  KB ╞ α if and only if M(KB) ⊆ M(α)  Example: KB = the shirt is green and striped; α = the shirt is green
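Entailment can be checked directly from this definition by enumerating every model. Below is a minimal brute-force sketch (sentences are represented as Python predicates over a model dictionary; the symbol names "Green" and "Striped" are illustrative):

```python
from itertools import product

def entails(kb, alpha, symbols):
    """Check KB |= alpha by enumerating every truth assignment (model)
    over the given proposition symbols: in every model where all KB
    sentences are true, alpha must be true as well."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if all(s(model) for s in kb) and not alpha(model):
            return False   # found a model of KB that falsifies alpha
    return True

# KB: "the shirt is green" and "the shirt is striped"
kb = [lambda m: m["Green"], lambda m: m["Striped"]]
alpha = lambda m: m["Green"] or m["Striped"]   # "green or striped"
print(entails(kb, alpha, ["Green", "Striped"]))  # True
```

Truth-table enumeration like this is always exponential in the number of symbols, which motivates the smarter proof methods later in the chapter.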

 Performance Measure › Gold +1000, Death –1000 › Step -1, Use arrow -10  Environment › Squares adjacent to the Wumpus are smelly › Squares adjacent to the pit are breezy › Glitter iff gold is in the same square › Shooting kills the Wumpus if you are facing it › Shooting uses up the only arrow › Grabbing picks up the gold if in the same square › Releasing drops the gold in the same square  Actuators › Left turn, right turn, forward, grab, release, shoot  Sensors › Breeze, glitter, and smell

 Characterization of Wumpus World › Observable  partial, only local perception › Deterministic  Yes, outcomes are specified › Episodic  No, sequential at the level of actions › Static  Yes, Wumpus and pits do not move › Discrete  Yes › Single Agent  Yes

 KB = wumpus-world rules + observations

 KB ├ᵢ α = sentence α can be derived from KB by procedure i (i is an algorithm that derives α from KB)  Soundness: i is sound if whenever KB ├ᵢ α, it is also true that KB ╞ α  Completeness: i is complete if whenever KB ╞ α, it is also true that KB ├ᵢ α

 Propositional symbols: A, B, P₁, P₂, ShirtIsGreen are atomic sentences.  If S, S₁, S₂ are sentences, then ¬S, S₁ ∧ S₂, S₁ ∨ S₂, S₁ ⇒ S₂ and S₁ ⇔ S₂ are also sentences.  Propositional models: each model specifies true/false for each proposition symbol.

P      Q      ¬P     P∧Q    P∨Q    P⇒Q    P⇔Q
False  False  True   False  False  True   True
False  True   True   False  True   True   False
True   False  False  False  True   False  False
True   True   False  True   True   True   True
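The truth table above can be regenerated mechanically, which is a useful sanity check (a small sketch; P⇒Q is computed as ¬P ∨ Q, and P⇔Q as equality of truth values):

```python
from itertools import product

# Regenerate the truth table for P, Q, ¬P, P∧Q, P∨Q, P⇒Q, P⇔Q.
header = ["P", "Q", "¬P", "P∧Q", "P∨Q", "P⇒Q", "P⇔Q"]
rows = []
for p, q in product([False, True], repeat=2):
    rows.append([p, q, not p, p and q, p or q, (not p) or q, p == q])

print("  ".join(f"{h:<5}" for h in header))
for row in rows:
    print("  ".join(f"{str(v):<5}" for v in row))
```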

 Propositional symbols: P_i,j means ‘there is a pit in [i,j]’; B_i,j means ‘there is a breeze in [i,j]’  Sentences: ‘pits cause breezes in adjacent squares’, i.e. a square is breezy if and only if there is an adjacent pit, e.g. B₁,₁ ⇔ (P₁,₂ ∨ P₂,₁)
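As a quick illustration, the breeze rule for square [1,1] can be encoded and evaluated in particular models (names like B11, P12, P21 are just flattened spellings of B₁,₁, P₁,₂, P₂,₁ chosen for this sketch):

```python
# Breeze rule for square [1,1]: B11 <=> (P12 or P21).
# A biconditional is just equality of the two truth values.
def breeze_rule(m):
    return m["B11"] == (m["P12"] or m["P21"])

# A model that satisfies the rule (pit at [1,2], breeze felt at [1,1]):
print(breeze_rule({"B11": True,  "P12": True, "P21": False}))  # True
# A model that violates it (adjacent pit but no breeze):
print(breeze_rule({"B11": False, "P12": True, "P21": False}))  # False
```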

 Two sentences α and β are logically equivalent, denoted α ≡ β, if they are true in the same models: α ≡ β if and only if α ╞ β and β ╞ α

 A sentence is valid if it is true in all models  A sentence is satisfiable if it is true in some model  A sentence is unsatisfiable if it is true in no models

Connecting validity and unsatisfiability: α is valid if and only if ¬α is unsatisfiable. Connecting inference and unsatisfiability: KB ╞ α if and only if (KB ∧ ¬α) is unsatisfiable.
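This connection is easy to demonstrate with a brute-force satisfiability check (a minimal sketch; sentences are predicates over a model dictionary, and "P" is an illustrative symbol name):

```python
from itertools import product

def satisfiable(sentence, symbols):
    """True if some truth assignment over the symbols makes the sentence true."""
    return any(sentence(dict(zip(symbols, values)))
               for values in product([True, False], repeat=len(symbols)))

def valid(sentence, symbols):
    """A sentence is valid iff its negation is unsatisfiable."""
    return not satisfiable(lambda m: not sentence(m), symbols)

# P ∨ ¬P is valid; P ∧ ¬P is unsatisfiable.
print(valid(lambda m: m["P"] or not m["P"], ["P"]))         # True
print(satisfiable(lambda m: m["P"] and not m["P"], ["P"]))  # False
```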

 There are two kinds of proof methods: application of inference rules and model checking.
 Application of inference rules: legitimate (sound) generation of new sentences from old. A proof is a sequence of inference rule applications; inference rules can be used as operators in a standard search algorithm. Typically these algorithms require transformation of sentences into a normal form.
-Model checking (to decide KB ╞ α):
 truth table enumeration (always exponential in n)
 backtracking & improved backtracking
 heuristic search in model space (sound but incomplete)

 A literal is an atomic sentence (propositional symbol) or the negation of an atomic sentence  A clause is a disjunction of literals  Conjunctive Normal Form (CNF): a conjunction of disjunctions of literals

 In mathematical logic, resolution is a rule of inference leading to a refutation theorem-proving technique for sentences in propositional logic.  In other words, iteratively applying the resolution rule in a suitable way allows for telling whether a propositional formula is satisfiable, and for proving that a first-order formula is unsatisfiable.  For first-order logic the method can prove unsatisfiability but may not terminate on satisfiable formulas, as is the case for all proof methods for first-order logic.
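A propositional refutation prover along these lines can be sketched compactly. Clauses are sets of literals (a literal is a symbol name, or the same name prefixed with "-" for its negation); the names Cloudy and Rain are illustrative:

```python
def resolve(ci, cj):
    """All resolvents of two clauses (sets of string literals, e.g. 'P', '-P')."""
    resolvents = []
    for lit in ci:
        comp = lit[1:] if lit.startswith("-") else "-" + lit
        if comp in cj:
            resolvents.append((ci - {lit}) | (cj - {comp}))
    return resolvents

def resolution_entails(kb_clauses, neg_alpha_clauses):
    """KB |= alpha iff resolution derives the empty clause from
    KB ∧ ¬alpha (proof by refutation)."""
    clauses = {frozenset(c) for c in kb_clauses + neg_alpha_clauses}
    while True:
        new = set()
        for ci in clauses:
            for cj in clauses:
                if ci == cj:
                    continue
                for r in resolve(ci, cj):
                    if not r:
                        return True        # empty clause: contradiction
                    new.add(frozenset(r))
        if new <= clauses:
            return False                   # no new clauses: not entailed
        clauses |= new

# KB: Cloudy, and Cloudy ⇒ Rain (as the clause ¬Cloudy ∨ Rain). Query: Rain.
kb = [["Cloudy"], ["-Cloudy", "Rain"]]
print(resolution_entails(kb, [["-Rain"]]))  # True
```

The query is negated before being added because of the refutation principle on the previous slide: KB ╞ α iff KB ∧ ¬α is unsatisfiable.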

 Example:
 Wetness is high and weather is cloudy.
 If weather is cloudy, it will rain.
 If wetness is high, weather is hot.
 Weather is not hot.
In CNF: Wet ∧ Cloudy ∧ (¬Cloudy ∨ Rain) ∧ (¬Wet ∨ Hot) ∧ ¬Hot

 Forward chaining is one of the two main methods of reasoning when using inference rules in artificial intelligence, and can be described logically. Forward chaining is a popular implementation strategy for expert systems and for business and production rule systems. The opposite of forward chaining is backward chaining.  Forward chaining starts with the available data and uses inference rules to extract more data until a goal is reached. An inference engine using forward chaining searches the inference rules until it finds one whose ‘if clause’ is known to be true. When such a rule is found, the engine can conclude, or infer, its ‘then clause’, resulting in the addition of new information to its data.  Inference engines iterate through this process until a goal is reached.

 Suppose that the goal is to conclude the color of a pet named Fritz,  given that he croaks and eats flies, and that the rule base contains the following four rules:  If X croaks and eats flies - Then X is a frog  If X chirps and sings - Then X is a canary  If X is a frog - Then X is green  If X is a canary - Then X is yellow

 Let us illustrate forward chaining by following the pattern of a computer as it evaluates the rules. Assume the following facts:  Fritz croaks  Fritz eats flies  Tweety eats flies  Tweety chirps  Tweety is yellow

With forward reasoning, the computer can derive that Fritz is green in four steps: 1. Fritz croaks and Fritz eats flies Based on logic, the computer can derive: 2. Fritz croaks and eats flies Based on rule 1, the computer can derive: 3. Fritz is a frog Based on rule 3, the computer can derive: 4. Fritz is green.
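The derivation above can be reproduced with a tiny forward-chaining loop. This is a sketch: since plain propositional code has no variables, the "If X …" rules are written out (propositionalized) for Fritz:

```python
def forward_chain(facts, rules):
    """Repeatedly fire any rule whose 'if' premises are all known facts,
    adding its 'then' conclusion, until nothing new can be inferred."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if set(premises) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Rules 1-4, instantiated for Fritz:
rules = [
    (["Fritz croaks", "Fritz eats flies"], "Fritz is a frog"),
    (["Fritz chirps", "Fritz sings"], "Fritz is a canary"),
    (["Fritz is a frog"], "Fritz is green"),
    (["Fritz is a canary"], "Fritz is yellow"),
]
facts = ["Fritz croaks", "Fritz eats flies",
         "Tweety eats flies", "Tweety chirps", "Tweety is yellow"]
print("Fritz is green" in forward_chain(facts, rules))  # True
```

Note that the loop derives every consequence of the data, not just the goal, which is exactly the "data-driven" behavior described next.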

 The name "forward chaining" comes from the fact that the computer starts with the data and reasons its way to the answer, as opposed to backward chaining, which works the other way around.  In the derivation, the rules are used in the opposite order compared to backward chaining.  Because the data determine which rules are selected and used, this method is called data-driven, in contrast to goal-driven backward chaining.  One advantage of forward chaining over backward chaining is that the reception of new data can trigger new inferences, which makes the engine better suited to dynamic situations in which conditions are likely to change.

Example: suppose that the goal is to conclude whether Tweety or Fritz is a frog, given information about each of them, and that the rule base contains the following four rules:  If X croaks and eats flies – Then X is a frog  If X chirps and sings – Then X is a canary  If X is a frog – Then X is green  If X is a canary – Then X is yellow Let us illustrate backward chaining by following the pattern of a computer as it evaluates the rules. Assume the following facts:  Fritz croaks  Fritz eats flies  Tweety eats flies  Tweety chirps  Tweety is yellow

 With backward reasoning, the computer can answer the question "Who is a frog?" in four steps. In its reasoning, the computer uses a placeholder: 1. ? is a frog Based on rule 1, the computer can derive: 2. ? croaks and eats flies Based on logic, the computer can derive: 3. ? croaks and ? eats flies Based on the facts, the computer can derive: 4. Fritz croaks and Fritz eats flies  This derivation causes the computer to produce Fritz as the answer to the question "Who is a frog?".  The computer has not used any knowledge about Tweety to conclude that Fritz is a frog.
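Backward chaining can be sketched as a recursive proof search. This simplification works on ground (variable-free) goals, so the "?" placeholder matching of the slide is replaced by rules written out for Fritz:

```python
def backward_chain(goal, facts, rules):
    """Prove a goal by working backwards: a goal holds if it is a known
    fact, or if some rule concludes it and all of that rule's premises
    can themselves be proved."""
    if goal in facts:
        return True
    return any(all(backward_chain(p, facts, rules) for p in premises)
               for premises, conclusion in rules if conclusion == goal)

# Rules instantiated for Fritz:
rules = [
    (["Fritz croaks", "Fritz eats flies"], "Fritz is a frog"),
    (["Fritz is a frog"], "Fritz is green"),
]
facts = {"Fritz croaks", "Fritz eats flies"}
print(backward_chain("Fritz is green", facts, rules))   # True
print(backward_chain("Tweety is green", facts, rules))  # False
```

Notice that the search is driven entirely by the goal: facts about Tweety are never touched unless a Tweety goal is posed, matching the goal-driven behavior described below.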

 FC is data-driven, automatic, unconscious processing  May do lots of work that is irrelevant to the goal  BC is goal-driven, appropriate for problem-solving  Complexity of BC can be much less than linear in size of KB

 Logical agents apply inference to a knowledge base to derive new information and make decisions  Basic concepts of logic are syntax, semantics, entailment, inference, soundness and completeness  The Wumpus world requires the ability to represent partial and negated information and to reason by cases  Resolution is sound and complete for propositional logic  Propositional logic lacks expressive power