
KR Using Rules: IF...THEN (Event-Condition-Action, ECA) Rules. Application examples: 1. If a flammable liquid was spilled, call the fire department. 2. If the pH of the spill is less than 6, the spill material is an acid. 3. If the spill material is an acid, and the spill smells like vinegar, the spill material is acetic acid. (Such IF...THEN statements are used to represent rules.)

Fig. 1 The rule interpreter cycles through a MATCH-EXECUTE sequence: rule conditions are MATCHED against the FACTS, and matching rules are EXECUTED.

FACTS: A flammable liquid was spilled. The pH of the spill is < 6. The spill smells like vinegar.
MATCHED RULE: If the pH of the spill is less than 6, the spill material is an acid.
EXECUTE: the new fact "The spill material is an acid" is added to the KB.
Fig. 2 Rule execution can modify the facts in the knowledge base.

FACTS: A flammable liquid was spilled. The pH of the spill is < 6. The spill smells like vinegar. The spill material is an acid.
MATCHED RULE: If the spill material is an acid and the spill smells like vinegar, the spill material is acetic acid.
EXECUTE: the new fact "The spill material is acetic acid" is added.
Fig. 3 Facts added by rules can match rules.

FACTS: A flammable liquid was spilled. The pH of the spill is < 6. The spill smells like vinegar.
MATCHED RULE: If a flammable liquid was spilled, call the fire department.
EXECUTE: the fire department is called.
Fig. 4 Rule execution can affect the real world.

Fig. 5 Inference chain for inferring the spill material: the pH of the spill is < 6, so the spill material is an acid; the spill material is an acid and the spill smells like vinegar, so the spill material is acetic acid.
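The match-execute cycle of Figs. 1-5 can be sketched in a few lines of Python (an illustrative sketch, not from the slides; the fact and rule strings are shortened paraphrases):

```python
# Forward chaining: repeatedly MATCH rule conditions against the fact
# base and EXECUTE (assert the conclusion) until nothing new fires.

RULES = [
    ({"flammable liquid spilled"}, "call fire department"),
    ({"pH of spill < 6"}, "spill is an acid"),
    ({"spill is an acid", "spill smells like vinegar"}, "spill is acetic acid"),
]

def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            # MATCH: all conditions present and conclusion not yet known
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)  # EXECUTE: add the new fact to the KB
                changed = True
    return facts

facts = forward_chain(
    {"flammable liquid spilled", "pH of spill < 6", "spill smells like vinegar"},
    RULES,
)
```

Running this reproduces the inference chain of Fig. 5: the acid fact is derived first, which in turn lets the acetic-acid rule match.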

RULES: A → D; C & D → F; F & B → Z. Starting FACTS: A, B, C, E, G, H. Each MATCH-EXECUTE cycle adds one new fact: first D (from A → D), then F (from C & D → F), then Z (from F & B → Z).
Fig. 6 An example of forward chaining.

A → D; C & D → F; F & B → Z. Fig. 7 Inference chain produced by Fig. 6.

RULES: A → D; C & D → F; F & B → Z. FACTS: A, B, C, E, G, H. Want Z; Z is not here, so find a rule that concludes Z: F & B → Z. B is here but F is not, so want F; need to get C and D (rule C & D → F). C is here but D is not, so want D; need to get A (rule A → D), and A is here. Executing the rules in reverse order: have A, so D; have C and D, so F; have F and B, so Z.
Fig. 8 An example of backward chaining.
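The goal-directed search of Fig. 8 can be sketched as a short recursive procedure (an illustrative sketch, not from the slides):

```python
# Backward chaining: to prove a goal, either find it among the facts or
# find a rule concluding it and recursively prove that rule's conditions.

RULES = [
    ({"F", "B"}, "Z"),
    ({"C", "D"}, "F"),
    ({"A"}, "D"),
]

def backward_chain(goal, facts, rules):
    if goal in facts:
        return True
    for conditions, conclusion in rules:
        if conclusion == goal and all(
            backward_chain(c, facts, rules) for c in conditions
        ):
            facts.add(goal)  # remember the subgoal once it is established
            return True
    return False

facts = {"A", "B", "C", "E", "G", "H"}
proved = backward_chain("Z", facts, RULES)
```

Asking for Z triggers the subgoals F, then D, then A, exactly the "want Z / want F / want D" sequence of Fig. 8; the subgoals D, F, and Z end up in the fact base.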

Figure 1: A rule Rn has antecedents (if1, if2, ...) and consequents (then1, then2, ...).
Z1: If ?x has hair, then ?x is a mammal.
Z2: If ?x gives milk, then ?x is a mammal.
Z3: If ?x has feathers, then ?x is a bird.
Z4: If ?x flies and ?x lays eggs, then ?x is a bird.

Z5: If ?x is a mammal and ?x eats meat, then ?x is a carnivore.
Z6: If ?x is a mammal, ?x has pointed teeth, ?x has claws, and ?x has forward-pointing eyes, then ?x is a carnivore.
Z7: If ?x is a mammal and ?x has hoofs, then ?x is an ungulate.
Z8: If ?x is a mammal and ?x chews cud, then ?x is an ungulate.

Z9: If ?x is a carnivore, ?x has tawny color, and ?x has dark spots, then ?x is a cheetah.
Z10: If ?x is a carnivore, ?x has tawny color, and ?x has black stripes, then ?x is a tiger.
Z11: If ?x is an ungulate, ?x has long legs, ?x has a long neck, ?x has tawny color, and ?x has dark spots, then ?x is a giraffe.

Z12: If ?x is an ungulate, ?x has white color, and ?x has black stripes, then ?x is a zebra.
Z13: If ?x is a bird, ?x does not fly, ?x has long legs, ?x has a long neck, and ?x is black and white, then ?x is an ostrich.
Z14: If ?x is a bird, ?x does not fly, ?x swims, and ?x is black and white, then ?x is a penguin.

Z15: If ?x is a bird and ?x is a good flyer, then ?x is an albatross.

Facts: Stretch has hair. Stretch chews cud. Stretch has long legs. Stretch has a long neck. Stretch has tawny color. Stretch has dark spots.

Figure 2: Forward chaining on the Stretch facts. Z1 fires first (has hair, so is a mammal); Z8 fires second (is a mammal and chews cud, so is an ungulate); Z11 fires third (is an ungulate, has long legs, long neck, tawny color, and dark spots, so is a giraffe).
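The ?x notation above is a pattern variable; a condition such as "?x has hair" matches any individual. A minimal sketch of matching with a single variable, using a propositional subset of rules Z1, Z8, and Z11 (illustrative, not from the slides):

```python
# Facts are (subject, attribute) pairs; "?x" in a pattern is replaced by
# a candidate subject before the pattern is looked up in the fact base.

RULES = [  # (conditions, conclusion)
    ([("?x", "has hair")], ("?x", "is a mammal")),                            # Z1
    ([("?x", "is a mammal"), ("?x", "chews cud")], ("?x", "is an ungulate")),  # Z8
    ([("?x", "is an ungulate"), ("?x", "has long legs"),
      ("?x", "has long neck"), ("?x", "has tawny color"),
      ("?x", "has dark spots")], ("?x", "is a giraffe")),                      # Z11
]

def substitute(pattern, subject):
    return tuple(subject if p == "?x" else p for p in pattern)

def forward_chain(facts, rules):
    facts = set(facts)
    subjects = {f[0] for f in facts}
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            for subj in subjects:
                if all(substitute(c, subj) in facts for c in conditions):
                    new = substitute(conclusion, subj)
                    if new not in facts:
                        facts.add(new)
                        changed = True
    return facts

facts = forward_chain(
    {("Stretch", a) for a in ["has hair", "chews cud", "has long legs",
                              "has long neck", "has tawny color",
                              "has dark spots"]},
    RULES,
)
```

The rules fire in the order shown in Figure 2: Z1, then Z8, then Z11, ending with Stretch identified as a giraffe.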

Figure 3: Inferring a cheetah. Z1 is used (has hair, so is a mammal); then Z5 (is a mammal and eats meat, so is a carnivore) or Z6 (is a mammal, has pointed teeth, claws, and forward-pointing eyes, so is a carnivore); and finally Z9 (is a carnivore, has tawny color, and has dark spots, so is a cheetah).

Fig. 1 A typical semantic network (after Ross Quillian). ANIMAL: has skin, can move around, eats, breathes. BIRD (a kind of ANIMAL): has wings, can fly, has feathers. FISH (a kind of ANIMAL). CANARY (a kind of BIRD): can sing, is yellow. OSTRICH (a kind of BIRD): has long thin legs, is tall, can't fly. SHARK (a kind of FISH): can bite, is dangerous. SALMON (a kind of FISH): is pink, is edible, swims upstream to lay eggs.

VICTOR -INST- PENGUIN; CHARLEY -INST- POODLE; POODLE -SUBC- DOG; TERRIER -SUBC- DOG; VICTOR -LIKES- CHARLEY. Fig. 5 Semantic network with property relations.

VICTOR -INST- PENGUIN. Fig. 2 Simple semantic network.

1. Victor is a penguin. 2. All penguins are birds. 3. All birds are animals. 4. All mammals are animals. 5. Charley is a poodle. 6. All dogs are mammals. 7. All poodles are dogs. 8. All terriers are dogs. Fig. 3 Facts about the animal kingdom.

VICTOR -INST- PENGUIN -SUBC- BIRD -SUBC- ANIMAL; CHARLEY -INST- POODLE -SUBC- DOG -SUBC- MAMMAL -SUBC- ANIMAL; TERRIER -SUBC- DOG. Fig. 4 A larger semantic network built from the facts in Fig. 3.
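The INST/SUBC links of Fig. 4 can be held in a dictionary, and category membership answered by walking the links upward (an illustrative sketch, not from the slides):

```python
# Each node has at most one upward link: (link type, parent).
LINKS = {
    "Victor": ("INST", "Penguin"),
    "Charley": ("INST", "Poodle"),
    "Penguin": ("SUBC", "Bird"),
    "Poodle": ("SUBC", "Dog"),
    "Terrier": ("SUBC", "Dog"),
    "Bird": ("SUBC", "Animal"),
    "Dog": ("SUBC", "Mammal"),
    "Mammal": ("SUBC", "Animal"),
}

def is_a(node, category):
    """Follow INST/SUBC links upward until the category is reached or the chain ends."""
    if node == category:
        return True
    while node in LINKS:
        _, node = LINKS[node]
        if node == category:
            return True
    return False
```

"Is Victor an animal?" is answered by traversal (Victor, Penguin, Bird, Animal), which is exactly what makes the network more than a list of isolated facts.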

Fig. 6 Complex semantic network with properties: the INST/SUBC links of Fig. 4 extended with CAN links (e.g., can run, can bark, can fly), PROP links (e.g., friendly, black, hostile), and a LIKES relation between instances.

DOG has subclasses (SUBC) POODLE and LABRADOR RETRIEVER; CHARLEY -INST- POODLE; SUSIE -INST- LABRADOR RETRIEVER; a PROP link records the color BLACK. Fig. 7 What can we do with this network?

Fig. 8 The structure of a frame system: each frame has a NAME and a set of slot/filler pairs (SLOT-1: filler, SLOT-2: filler, ..., SLOT-N: filler); frames are connected to one another by INST and SUBC inheritance links.

LOGIC OF FRAMES. The ANIMAL frame has the slot HOME: EARTH; DOG -SUBC- ANIMAL; CHARLEY -INST- DOG, with the slots COLOR: BLACK and OWNER: CHRIS. Charley inherits HOME = EARTH through the DOG and ANIMAL frames. Fig. 9 Inheritance in a simple frame system.
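Frame inheritance as in Fig. 9 can be sketched with plain dictionaries (illustrative, not from the slides; the SOUND slot is an added example, not in the figure):

```python
# Frames as dicts of slot/filler pairs; INST and SUBC are themselves slots
# naming the parent frame, so a lookup climbs the inheritance chain.

FRAMES = {
    "ANIMAL":  {"HOME": "EARTH"},
    "DOG":     {"SUBC": "ANIMAL", "SOUND": "BARK"},   # SOUND: assumed example
    "CHARLEY": {"INST": "DOG", "COLOR": "BLACK", "OWNER": "CHRIS"},
}

def get_slot(frame, slot):
    """Return the filler for a slot, inheriting through INST/SUBC links."""
    while frame is not None:
        fillers = FRAMES[frame]
        if slot in fillers:
            return fillers[slot]
        frame = fillers.get("INST") or fillers.get("SUBC")
    return None
```

Asking for Charley's HOME finds nothing in the CHARLEY or DOG frames and so returns EARTH from ANIMAL, while COLOR is answered locally; that distinction between local and inherited fillers is the core of frame logic.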

Conceptual Dependency Knowledge representation of natural-language sentences. The goal is to represent the knowledge in a way that: –facilitates drawing inferences from the sentences –is independent of the language in which the sentences were originally stated. Example: "I gave the man a book" is represented as I <=>(p) ATRANS <-O- book <-R- (to: man, from: I).

Symbols An arrow indicates the direction of dependency. A double arrow indicates a two-way link between actor and action. p indicates past tense. ATRANS is one of the primitive acts used by the theory; it indicates transfer of possession. O indicates the object case relation. R indicates the recipient case relation. Fig. 1 A sample conceptual dependency representation.

PRIMITIVES ATRANS - Transfer of an abstract relationship (e.g., give) PTRANS - Transfer of the physical location of an object (e.g., go) PROPEL - Application of physical force to an object (e.g., push) MOVE - Movement of a body part by its owner (e.g., kick) GRASP - Grasping of an object by an actor (e.g., clutch) INGEST - Ingestion of an object by an animal (e.g., eat) EXPEL - Expulsion of something from the body of an animal (e.g., spit) MTRANS - Transfer of mental information (e.g., tell) SPEAK - Production of sounds (e.g., say) ATTEND - Focusing of a sense organ toward a stimulus (e.g., listen) Dependencies among the conceptualizations There are four primitive conceptual categories from which dependency structures can be built. They are:
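A single conceptualization built from these primitives and case roles can be carried by a small record type; a minimal sketch (illustrative, not from the slides), encoding the rule-7 example "John took the book from Mary":

```python
from dataclasses import dataclass

@dataclass
class Conceptualization:
    actor: str
    act: str          # one of the primitive acts, e.g. ATRANS
    obj: str          # O - the object case relation
    source: str       # R - the recipient case relation: from ...
    recipient: str    # ... to
    tense: str = "p"  # p marks past tense

# "John took the book from Mary": possession of the book is
# transferred (ATRANS) from Mary to John, with John as actor.
took = Conceptualization(
    actor="John", act="ATRANS", obj="book",
    source="Mary", recipient="John",
)
```

Because the record stores the primitive act rather than the English verb, "took", "received", and "got" can all map to the same ATRANS structure, which is exactly the language-independence CD aims for.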

ACTs - actions. PPs - objects (picture producers). AAs - modifiers of actions (action aiders). PAs - modifiers of PPs (picture aiders). Rules, with examples of their use:
1. PP <=> ACT. Example: John <=>(p) PTRANS ("John ran"). Rule 1 describes the relationship between an actor and the event he or she causes: a two-way dependency, with p marking past tense.
2. PP <=> PA. Example: John <=> height (> average) ("John is tall"). Rule 2 describes the relationship between a PP and a PA that is being asserted to describe it. Many state descriptions, such as height, are represented in CD as numeric scales.
3. PP <=> PP. Example: John <=> doctor ("John is a doctor"). Rule 3 describes the relationship between two PPs, one of which belongs to the set defined by the other.

4. PP <- PA. Example: boy <- nice ("a nice boy"). Rule 4 describes the relationship between a PP and an attribute that has already been predicated of it; the arrow points toward the PP.
5. PP <- PP. Example: dog <-POSS-BY- John ("John's dog"). Rule 5 describes the relationship between two PPs, one of which provides a particular kind of information about the other. The three types of information are: possession (POSS-BY), location (LOC), and physical containment (CONT). The arrow points toward the concept being described.

6. ACT <-O- PP. Example: John <=>(p) PROPEL <-O- cart ("John pushed the cart"). Rule 6 describes the relationship between an ACT and the PP that is the object of that ACT. The arrow points toward the ACT, since the context of the specific ACT determines the meaning of the object relation.
7. ACT <-R- (to/from). Example: John <=>(p) ATRANS <-O- book <-R- (to: John, from: Mary) ("John took the book from Mary"). Rule 7 describes the relationship between an ACT and the source and the recipient of the ACT.

8. ACT <-I- conceptualization. Example: John <=>(p) INGEST <-O- ice cream, with instrument (I): John <=> do <-O- spoon ("John ate ice cream with a spoon"). Rule 8 describes the relationship between an ACT and the instrument with which it is performed. The instrument must always be a full conceptualization (i.e., it must contain an ACT), not just a single physical object.

9. ACT <-D- (to/from). Example: John <=>(p) PTRANS <-O- fertilizer <-D- (to: field, from: bag) ("John fertilized the field"). Rule 9 describes the relationship between an ACT and its physical source and destination.
10. PP <=> state change. Example: plants <=> (size > x <- size = x) ("the plants grew"). Rule 10 represents the relationship between a PP and a state in which it started and another in which it ended.

11. Causation, forms (a) and (b). Example: Bill <=>(p) PROPEL <-O- bullet <-R- (to: Bob, from: gun), causing Bob <=> health(-10) ("Bill shot Bob"). Rule 11 describes the relationship between one conceptualization and another that causes it. Notice that the causation arrows indicate the dependency of one conceptualization on another, and so point in the opposite direction of the implication arrows. The two forms of the rule describe (a) the cause of an action and (b) the cause of a state change.

12. Conceptualization <- time. Example: John <=>(p) PTRANS, time: yesterday ("John ran yesterday"). Rule 12 describes the relationship between a conceptualization and the time at which the event it describes occurred.
13. Conceptualization <- conceptualization (as time). Example: I <=>(p) MTRANS <-O- frog <-R- (from: eyes, to: CP), while I <=> PTRANS <-D- (to: home) ("While going home, I saw a frog"). Rule 13 describes the relationship between one conceptualization and another that is the time of the first. The example for this rule also shows how CD exploits a model of the human information-processing system: "see" is represented as the transfer of information between the eyes and the conscious processor (CP).

14. Conceptualization <- place. Example: I <=>(p) MTRANS <-O- frog <-R- (from: ears, to: CP), place: woods ("I heard a frog in the woods"). Rule 14 describes the relationship between a conceptualization and the place at which it occurred.

Scripts
Script name: food market
Track: super market
Roles: shopper, deli attendant, seafood attendant, checkout clerk, sacking clerk, other shoppers
Entry conditions: shopper needs groceries; food market open
Props: shopping cart, display aisles, market items, checkout stands, cashier, money

Scene 1: Enter Market. shopper PTRANS shopper into market; shopper PTRANS shopping cart to shopper.
Scene 2: Shop for Items. shopper MOVE shopper through aisles; shopper ATTEND eyes to display items; shopper PTRANS items to shopping cart.
Scene 3: Check Out. shopper MOVE shopper to checkout stand; shopper WAIT shopper's turn; shopper ATTEND eyes to charges; shopper ATRANS money to cashier; sacker ATRANS bags to shopper.
Scene 4: Exit Market. shopper PTRANS shopper to exit market.
Results: shopper has less money; shopper has grocery items; market has less grocery items; market has more money.
Fig. 1 A supermarket script structure.
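The script above is just structured data: header slots plus an ordered list of scenes, each an event sequence over CD primitives. A minimal sketch (illustrative, not from the slides; the scene lists are abridged):

```python
# A script as a plain data structure; events are (actor, ACT, description).
SCRIPT = {
    "name": "food market",
    "track": "super market",
    "roles": ["shopper", "checkout clerk", "sacking clerk", "other shoppers"],
    "entry_conditions": ["shopper needs groceries", "food market open"],
    "scenes": [
        ("Enter Market", [("shopper", "PTRANS", "shopper into market"),
                          ("shopper", "PTRANS", "shopping cart to shopper")]),
        ("Shop for Items", [("shopper", "MOVE", "shopper through aisles"),
                            ("shopper", "ATTEND", "eyes to display items"),
                            ("shopper", "PTRANS", "items to shopping cart")]),
        ("Check Out", [("shopper", "MOVE", "shopper to checkout stand"),
                       ("shopper", "ATRANS", "money to cashier"),
                       ("sacker", "ATRANS", "bags to shopper")]),
        ("Exit Market", [("shopper", "PTRANS", "shopper to exit")]),
    ],
    "results": ["shopper has less money", "shopper has grocery items"],
}

def acts_in(script, scene_name):
    """Return the primitive ACTs used in a named scene, in order."""
    for name, events in script["scenes"]:
        if name == scene_name:
            return [act for _, act, _ in events]
    return []
```

Because the scenes are ordered, a system can use the script to fill in unstated events: if a story says the shopper entered and then paid, the shopping scene is assumed to have happened in between.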

KNOWLEDGE ACQUISITION Fig. 1 Typical knowledge acquisition process: the Domain Expert supplies knowledge, concepts, and solutions to the Knowledge Engineer, who formalizes them into the Knowledge Base.

Sources  TEXTBOOKS  REPORTS  DATABASES  CASE STUDIES  EMPERICAL DATA  PERSONAL EXPERIENCE  DOMAINS EXPERTS ASSUME BASIC KNOWLEDGE - Competent (more) – less desirable. KE Paradox  Don’t be your own expert  Don’t believe everything experts say.

Types of Expert Problem Solving a) Problem solving by an expert in a familiar situation: the current problem is matched against past experience (situations A, D, E, G) and the stored solution is reused.

b) Problem solving by an expert in a novel situation: with no matching past experience, the expert falls back on general principles, asking "what next?" at each step (situations 1-4).

Techniques for Extracting Knowledge from a Domain Expert
On-site observation (watch the expert at work)
Problem discussion (explore the kinds of data, knowledge, and procedures needed)
Problem description (the expert describes prototypical problems)
Problem analysis (the KE gives the expert sample problems to solve)
System refinement (rules)
System examination (the expert critiques the system)
System validation (by an outside expert)