CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 17, 2012.

1 CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes Jan 17, 2012

2 Outline
– Finish Python examples
– Finish reflex agent discussion
– Knowledge-based agent: requirements
– KB agent as a computer program
– Example of a KB agent: the wumpus world
– Entailment and inference
– Propositional logic (review)
– Propositional KB for the wumpus world

3 Vacuum-cleaner world (review)
Percepts: location and contents, e.g., [A, Dirty]
Actions: Left, Right, Suck, NoOp
Production rules:
– [A, Dirty] → Suck
– [B, Dirty] → Suck
– [A, Clean] → Right
– [B, Clean] → Left
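The production rules above can be sketched as a short Python program (the function name and percept encoding are illustrative; the slides give no code):

```python
# Minimal sketch of the reflex vacuum agent's production rules.
# A percept is a pair [location, status], as on the slide.

def reflex_vacuum_agent(percept):
    """Map a percept directly to an action, with no memory or world model."""
    location, status = percept
    if status == "Dirty":
        return "Suck"          # [A, Dirty] -> Suck, [B, Dirty] -> Suck
    if location == "A":
        return "Right"         # [A, Clean] -> Right
    return "Left"              # [B, Clean] -> Left

print(reflex_vacuum_agent(["A", "Dirty"]))  # Suck
print(reflex_vacuum_agent(["B", "Clean"]))  # Left
```

Note that the agent's condition-action mapping is the entire program: there is nowhere to store history or knowledge about the environment.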

4 Reflex Agent’s Abilities
Consider the “reflex” vacuum agent:
– Lacks explicit knowledge of the environment
– Lacks memory of its own past behavior
– Lacks knowledge of the effects of its actions
The environment may behave reasonably, but the agent has no machinery for understanding that concept and making predictions.
Note: what do we mean by an environment that behaves reasonably??? – According to OUR COMMON SENSE !!!!

5 “Reasonable” behavior of the environment?
1. If a square is dirty and the action ≠ Suck, that square will be dirty the next time it is perceived
2. If the action is Suck or NoOp, then the next percept is the same location
3. If the action is Left or Right, then the next percept is the expected location (note: what if we are at B and go Right??)
What kind of agent could include this kind of common-sense knowledge and take advantage of it?

6 Consider extensions of the vacuum agent
Consider an environment with three squares A, B, C, with the same percepts and actions
– Specify the production rules
What about a grid-shaped environment (say, 2 by 2)?
– Would need additional actions: Up, Down, Left, Right
Define a declarative representation of:
– The world
– The agent’s history
Consider how that knowledge would be used

7 Evaluating an Agent’s Performance
Consider a vacuum agent trying to be rational under a performance measure that assigns a cost to actions and a reward for keeping the room clean
– How do we represent this knowledge? How would the agent use it?
– Would it need knowledge of its own prior actions?
Consider a vacuum agent with parameters:
– Ci = cost of each action i
– P = penalty for each dirty square at each time step
Strategy: minimize total cost + total penalties
NOTE: what else would we want to know?????
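A minimal sketch of the cost-plus-penalty bookkeeping described above, assuming a uniform action cost and a per-step penalty (the function and parameter names are hypothetical, not from the slides):

```python
# Sketch: score a run of the vacuum agent under the slide's performance
# measure. `action_cost` plays the role of Ci (uniform here for simplicity)
# and `penalty` plays the role of P.

def run_score(actions, dirty_counts, action_cost=1, penalty=2):
    """Total action cost plus penalty per dirty square per time step.

    dirty_counts[t] is the number of dirty squares at time step t.
    """
    cost = action_cost * len(actions)
    penalties = penalty * sum(dirty_counts)
    return cost + penalties

# Three actions while one square stays dirty for two steps:
print(run_score(["Suck", "Right", "Suck"], [1, 1, 0]))  # 3*1 + 2*2 = 7
```

A rational agent would choose its strategy to minimize exactly this quantity, which is why it needs knowledge of action effects, not just the current percept.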

8 Elements of a Knowledge-based agent (the knowledge maintenance part)
World Knowledge Base + Percept → Updated World Knowledge Base
We need 3 things:
1. A formal language for expressing knowledge declaratively (“knowledge representation”)
2. A knowledge base designed to express what is known (“knowledge engineering” or “ontology”)
3. Algorithms to use and update the KB (“automated inference”)

9 Ontology Design (“Knowledge Engineering”)
General world knowledge: how someone becomes POTUS
Facts that happen to be true in the current world state (dynamic): Obama is POTUS
New percepts are added to the lower box

10 Example World Knowledge Base
General knowledge includes family relations such as:
– A female with the same parents is a sister
– A male with the same parents is a brother
– Parents of your parents are your grandparents
– A male child of your brother or sister is your nephew
Facts include: Sam and Mary have a male child named Max
New percept: Sam and Mary have a female child named Sarah
Updated knowledge:
– Sam and Mary have a female child named Sarah
– Max has a sister named Sarah
– Sarah has a brother named Max
Percept: Mary has a father named Tom -- ??
Percept: Max has a son named Simon -- ??
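The sibling inference above can be sketched in Python (the data structures and function names are illustrative assumptions, not from the slides):

```python
# Sketch of the slide's KB update: adding a child percept triggers the
# general sibling rules and returns the newly entailed facts.

children = {}  # (parent1, parent2) -> list of (name, sex) already known

def add_child(parents, name, sex):
    """Record a new child percept; return the sibling facts it entails."""
    siblings = children.setdefault(parents, [])
    inferred = []
    for other, other_sex in siblings:
        # "a female/male with the same parents is a sister/brother"
        rel = "sister" if sex == "female" else "brother"
        inferred.append(f"{other} has a {rel} named {name}")
        rel2 = "sister" if other_sex == "female" else "brother"
        inferred.append(f"{name} has a {rel2} named {other}")
    siblings.append((name, sex))
    return inferred

add_child(("Sam", "Mary"), "Max", "male")
inferred = add_child(("Sam", "Mary"), "Sarah", "female")
print(inferred)
# ['Max has a sister named Sarah', 'Sarah has a brother named Max']
```

The open questions on the slide (Tom, Simon) would need further rules, for grandparents and nephews, wired into the same update step.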

11 KB agent as a computer program
The declarative approach: beliefs are represented as a set of sentences in a formal logic.
The sentences are stored in a database (called a knowledge base) and are believed to be true by the agent. Other beliefs of the agent can be inferred from them.
Trade-off in choosing a representation language:
– Expressiveness of the language vs. tractability of inference
– e.g., Horn clauses, a restricted subset of logic sentences, trade expressiveness for tractable inference

12 Formal logic basis of automated reasoning
If beliefs include both p and p → q, then the agent can infer q. (This logical inference rule is called modus ponens.) A generalized version of it is the Resolution Rule.
Example:
– T1. raining → ground is wet
– T2. ground is wet → ground is slippery
– P1. raining
Infer: ground is wet
Infer: ground is slippery
This is called “chaining,” and by chaining we can produce complex reasoning sequences.
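The chaining described above can be sketched as a small forward-chaining loop in Python (the rule representation is an illustrative assumption):

```python
# Forward chaining with modus ponens: repeatedly fire rules whose
# premises are already believed, until no new facts appear.

rules = [
    ("raining", "ground is wet"),            # T1: raining -> ground is wet
    ("ground is wet", "ground is slippery"), # T2: wet -> slippery
]
facts = {"raining"}                          # P1

changed = True
while changed:
    changed = False
    for premise, conclusion in rules:
        if premise in facts and conclusion not in facts:
            facts.add(conclusion)            # modus ponens
            changed = True

print(sorted(facts))
# ['ground is slippery', 'ground is wet', 'raining']
```

Each pass through the loop is one application of modus ponens; the outer loop is what the slide calls chaining.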

13 Wumpus World PEAS description
Performance measure:
– gold +1000, death -1000
– -1 per step, -10 for using the arrow
Environment:
– Squares adjacent to the wumpus are smelly
– Squares adjacent to a pit are breezy
– Glitter iff gold is in the same square
– Shooting kills the wumpus if you are facing it
– Shooting uses up the only arrow
– Grabbing picks up the gold if in the same square
– Releasing drops the gold in the same square
Actions: Left turn, Right turn, Forward, Grab, Release, Shoot
Sensors: Stench, Breeze, Glitter, Bump, Scream

14 Exploring a wumpus world 1

15 Exploring a wumpus world 2

16 Exploring a wumpus world 2a

17 Exploring a wumpus world 3,4

18 Exploring a wumpus world 4a

19 Exploring a wumpus world 5

20 Exploring a wumpus world 5a

21 Exploring a wumpus world 6

22 What did we just do?
We analyzed and described graphically what knowledge the agent needs to represent and how it evolves. This is the first step in building a formal framework for this domain.

23 Logic in general
Logics are formal languages for representing information such that conclusions can be drawn.
Syntax defines the acceptable sentences in the language (called wffs, or well-formed formulas).
Semantics defines the “meaning” of sentences, i.e., how to decide the truth of a sentence in a world model.
E.g., the language of arithmetic:
– “x+2 > y” is a sentence; “x2+y >” is not a sentence (syntax)
– “x+2 > y” is true iff the number x+2 is greater than the number y
– “x+2 > y” is true in a world where x = 7, y = 1
– “x+2 > y” is false in a world where x = 1, y = 7

24 Entailment and Inference Entailment means that one thing follows from another: KB ╞ α Knowledge base KB entails sentence α if and only if α is true in all world models where KB is true – E.g., the KB containing “the Giants won” and “the Reds won” entails “The Giants won and the Reds won” – E.g., x+y = 4 entails 4 = x+y – Entailment is a relationship between sentences that is based on their meaning (semantics) Remember: KB is a database of what the agent knows

25 Inference KB ├ i α = sentence α can be derived from KB by an algorithmic procedure i. It is a relationship between sentences based on their syntax. Soundness: i is sound if whenever KB ├ i α, it is also true that KB ╞ α Completeness: i is complete if whenever KB ╞ α, it is also true that KB ├ i α Preview: later we will define a logic (first-order logic) which is expressive enough to say many things of interest, and for which there exists a sound and complete inference procedure. That is, the procedure will answer any question whose answer follows from what is known by the KB.

26 Proof Theory and Model Theory for Logic
(Diagram: Sentences — symbolic expressions — denote Facts in the World (Model); Inference operates on sentences, Entailment on the facts they denote.)

29 Propositional logic: Syntax
Propositional logic is the simplest logic; it illustrates automated reasoning algorithms.
– The proposition symbols P1, P2, etc. are (atomic) sentences
Compound sentences:
– If S is a sentence, ¬S is a sentence (negation)
– If S1 and S2 are sentences, S1 ∧ S2 is a sentence (conjunction)
– If S1 and S2 are sentences, S1 ∨ S2 is a sentence (disjunction)
– If S1 and S2 are sentences, S1 → S2 is a sentence (implication)
– If S1 and S2 are sentences, S1 ↔ S2 is a sentence (biconditional)

30 Examples
Propositions: Red, Yellow, Round, Oblong, Apple, Banana
Red ∧ Round
Yellow ∧ ¬Round
Oblong ∨ Apple

31 Propositional logic: Semantics
A model assigns true/false to each proposition symbol, e.g.:
P1,2 = false, P2,2 = true, P3,1 = false
With 3 symbols, 8 possible models can be enumerated.
Rules for evaluating the truth of compound sentences in model m:
– ¬S is true iff S is false
– S1 ∧ S2 is true iff S1 is true and S2 is true
– S1 ∨ S2 is true iff S1 is true or S2 is true
– S1 → S2 is true iff S1 is false or S2 is true (i.e., false iff S1 is true and S2 is false)
– S1 ↔ S2 is true iff S1 → S2 is true and S2 → S1 is true
A simple recursive process evaluates an arbitrary sentence, e.g.:
¬P1,2 ∧ (P2,2 ∨ P3,1) = true ∧ (true ∨ false) = true ∧ true = true
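The recursive evaluation above can be sketched in Python, encoding compound sentences as nested tuples (this encoding is an assumption; the slides give no concrete data structure):

```python
# Recursive truth evaluation of a propositional sentence in a model.
# A sentence is either a proposition-symbol string or a tuple
# (op, arg) / (op, arg1, arg2).

def truth(sentence, model):
    if isinstance(sentence, str):           # atomic proposition
        return model[sentence]
    op = sentence[0]
    if op == "not":
        return not truth(sentence[1], model)
    a = truth(sentence[1], model)
    b = truth(sentence[2], model)
    if op == "and":
        return a and b
    if op == "or":
        return a or b
    if op == "implies":
        return (not a) or b                 # false only when a true, b false
    if op == "iff":
        return a == b
    raise ValueError(f"unknown connective: {op}")

model = {"P12": False, "P22": True, "P31": False}
# The slide's example: not P1,2 and (P2,2 or P3,1)
print(truth(("and", ("not", "P12"), ("or", "P22", "P31")), model))  # True
```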

32 Semantic Reasoning: Truth Tables
Discuss “material implication” (the → connective).
For a given set of proposition symbols P1 ... PN, the maximum number of models is 2^N.
The number of models of a given set of sentences is at most 2^(number of proposition symbols). (However, it may be less – why???)
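Entailment can be checked by enumerating all 2^N models, in the spirit of the truth-table discussion above. A self-contained sketch (the sentence encoding and all names are illustrative assumptions):

```python
# KB |= alpha iff alpha is true in every model that makes the KB true.
# Sentences are proposition-symbol strings or tuples like ("implies", p, q).

from itertools import product

def truth(s, model):
    if isinstance(s, str):
        return model[s]
    op = s[0]
    if op == "not":
        return not truth(s[1], model)
    a, b = truth(s[1], model), truth(s[2], model)
    return {"and": a and b, "or": a or b,
            "implies": (not a) or b, "iff": a == b}[op]

def entails(kb, alpha, symbols):
    """Enumerate all 2^N truth assignments; fail on any KB-model where alpha is false."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if all(truth(s, model) for s in kb) and not truth(alpha, model):
            return False
    return True

# {P, P -> Q} entails Q (modus ponens, checked semantically):
print(entails(["P", ("implies", "P", "Q")], "Q", ["P", "Q"]))  # True
```

This is sound and complete for propositional logic but exponential in N, which is exactly the tractability concern raised earlier.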

33 Logical equivalence (can be proved using truth tables)
Two sentences are logically equivalent iff they are true in the same models: α ≡ β iff α ╞ β and β ╞ α

