CSM6120 Introduction to Intelligent Systems Knowledge Representation 2.


1 rkj@aber.ac.uk CSM6120 Introduction to Intelligent Systems Knowledge Representation 2

2 What we’ve ignored  Objects in the world tend to be related to each other  Classes, superclasses & subclasses  Part / whole hierarchies  Properties are inherited across relationships  The state of the world can change over time  Explicit representation of time  Frame axiom  Non-monotonic reasoning  We must reason without complete knowledge  Closed world assumption  Not all knowledge is “black & white” (later modules)  Uncertainty  Statistics, fuzzy logic

3 Classes  “I want to buy a basketball”  I want to buy BB27341 (NO)  I want to buy an object that is a member of the basketball class (YES)  Objects organized into a hierarchy by class  BB27341  basketballs  Basketballs  balls  Facts (objects) & rules (classes)  All balls are round  All basketballs are a standard size in diameter  BB27341 is red, white and blue  BB27341 is a basketball  (Therefore BB27341 is round and a standard size in diameter)

4 Inheritance  If a property is true of a class, it is true of all subclasses of that class  If a property is true of a class, it is true of all objects that are members of that class  (If a property is true of a class, it is true of all objects that are members of subclasses of that class)  There are exceptions (to be dealt with later…)

5 Part / Whole Inheritance  A cow has 4 legs  Each leg is part of the cow  The cow is in the field  All of the cow’s parts are also in the field  The cow is (entirely) brown  All of the cow’s parts are brown  The cow is happy  All of the cow’s parts are happy (?)  Note: some properties are inherited by parts, others are not. This is generally made explicit by rules, such as  part-of(x,y) and location(y,z) -> location(x,z)
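The part/whole rule on this slide can be sketched directly in code. A minimal Python sketch (all names and facts are illustrative, not the lecture's own notation): location propagates through part-of links because there is an explicit rule for it, while a property like happiness does not, because no such rule exists.

```python
# Minimal sketch: part-of(x,y) and location(y,z) -> location(x,z).
# Location propagates to parts; "happy" has no such rule, so it does not.
part_of  = {"leg1": "cow", "leg2": "cow", "leg3": "cow", "leg4": "cow"}
location = {"cow": "field"}
happy    = {"cow": True}

def location_of(x):
    if x in location:
        return location[x]
    if x in part_of:                      # apply the rule through the part-of link
        return location_of(part_of[x])
    return None

print(location_of("leg1"))   # 'field'  (inherited from the whole cow)
print(happy.get("leg1"))     # None     (no propagation rule for happiness)
```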

6 Stuff vs. things  Some objects are “stuff”  When you divide the object, its parts are still the same: e.g. butter, snow  “stuff” cannot be referred to with “a” or “an”  “Please pass a butter” (?)  Other objects are “things”  When you divide them, you destroy them, e.g. aardvark, snowflake  “A snowflake landed on my nose” vs. “A snow landed on my nose”

7 Situations and events  A state of the world is a “situation”  Divide predicates into “eternal” and “fluent”  At(x,y,S) is fluent - S is the situation where it holds  Gold(g) is eternal - it is true in all situations  Each action causes a result  Result(move-to(x,y), S_i) = At(x,y, S_i+1)

8 The frame problem  What about everything else?  Result(move-to(x,y), S_i) = At(x,y, S_i+1) and ∀z, z≠x: At(z,w, S_i) → At(z,w, S_i+1)  This is a frame axiom  Representing all frame axioms explicitly is a pain  Instead: assume it’s the same if we’re not told it’s different  A fluent is true in the new situation iff the last action made it true, or it was true in the previous situation and the action did not make it false  Real world example: what colour is John’s hair?  We assume it hasn’t changed, but maybe he coloured it today!
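The "assume it's the same unless told otherwise" idea can be sketched as follows; this is an illustrative Python fragment, not the situation-calculus formalism itself. The new situation is a copy of the old one, and only the fluent the action explicitly changes is updated.

```python
# Hedged sketch of default persistence: copy the old situation, change only
# the fluent the action affects; every other fluent persists by default.
def result(action, situation):
    new_situation = dict(situation)            # everything stays the same...
    if action[0] == "move-to":
        _, thing, place = action
        new_situation[("at", thing)] = place   # ...except what the action changes
    return new_situation

s0 = {("at", "agent"): "home", ("at", "gold"): "cave"}
s1 = result(("move-to", "agent", "cave"), s0)
print(s1[("at", "agent")])   # 'cave' - changed by the action
print(s1[("at", "gold")])    # 'cave' - unchanged, persisted from s0
```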

9 Semantic networks  Semantic networks are essentially a generalization of inheritance hierarchies  Each node is an object or class  Each link is a relationship  is-a (the usual subclass or element relationship)  has-part or part-of  Any other relationship that makes sense in context  Note: semantic networks predated OOP  Inheritance: follow one member-of link, as many subclass or other links as necessary

10 Graphical representation  Graphs easy to store in a computer  To be of any use must impose a formalism  Jason is 15, Bryan is 40, Arthur is 70, Jim is 74  How old is Julia?

11 Semantic networks  Because the syntax is the same  We can guess that Julia’s age is similar to Bryan’s  Formalism imposes restricted syntax

12 Semantic networks  Knowledge represented as a network or graph  [Diagram: a semantic network in which Mammal and Reptile are subclasses of Animal; Elephant is a subclass of Mammal; Nellie is an instance of Elephant; Elephant has links haspart: head, size: large, livesin: Africa; Nellie has the link likes: apples]
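For what follows it helps to have a concrete encoding of this network. One possible (illustrative) Python representation stores each node's outgoing labelled links in a dictionary; the inheritance algorithm of slide 14 is sketched against this same structure.

```python
# The slide-12 network as a dictionary of labelled links (one possible encoding).
network = {
    "Mammal":   {"subclass": "Animal"},
    "Reptile":  {"subclass": "Animal"},
    "Elephant": {"subclass": "Mammal", "haspart": "head",
                 "size": "large", "livesin": "Africa"},
    "Nellie":   {"instance": "Elephant", "likes": "apples"},
}
```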

13  By traversing the network we can find:  That Nellie has a head (by inheritance)  That certain concepts are related in certain ways (e.g., apples and elephants)  But: the meaning of semantic networks is not always well defined  Are all Elephants big, or just typical elephants?  Do all Elephants live in the “same” Africa?  Do all animals have the same head?  For machine processing these things must be defined

14 Algorithm for inheritance  If the current object has a value for the property, return that value  Otherwise, if the current object is a member of a class, return the value of the property of the class (recursively)  Otherwise, if the current object is a subclass, return the value of the property of the superclass (recursively)  This is depth-first search; stop at the first one found
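A minimal Python sketch of this algorithm, using the dictionary encoding shown after slide 12 (the function name and encoding are assumptions for illustration): check the object itself, otherwise follow the instance or subclass link and recurse, stopping at the first value found.

```python
def get_property(node, prop, network):
    """Depth-first inheritance lookup: stop at the first value found."""
    links = network.get(node, {})
    if prop in links:                       # the object/class itself has a value
        return links[prop]
    for link in ("instance", "subclass"):   # otherwise climb the hierarchy
        if link in links:
            value = get_property(links[link], prop, network)
            if value is not None:
                return value
    return None

print(get_property("Nellie", "haspart", network))  # 'head'   (inherited from Elephant)
print(get_property("Nellie", "likes", network))    # 'apples' (stored on Nellie itself)
```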

15 Defaults and Exceptions  Exception for a single object:  Set a property of the object to the (exception) value  Default for a class  Set a property of the class to the (default) value  If there are multiple default values, the one “closest” to the object wins

16 Example  All birds can fly  Birds with broken wings are birds, but cannot fly  Penguins are birds, but they cannot fly  Magical penguins are penguins that can fly  Who can fly?  Tweety is a bird  Peter is a penguin  Penelope is a magical penguin  Note that beliefs can be changed as new information comes in that changes the classification of an object
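Because the lookup stops at the first value found, the default "closest" to the object wins, which handles the penguin example directly. An illustrative sketch reusing the get_property function from the slide-14 sketch (class and object names as on the slide):

```python
# "Closest default wins": the depth-first lookup returns the most specific
# can_fly value on the path from the object up through its classes.
birds = {
    "Bird":           {"can_fly": True},
    "Penguin":        {"subclass": "Bird", "can_fly": False},
    "MagicalPenguin": {"subclass": "Penguin", "can_fly": True},
    "Tweety":         {"instance": "Bird"},
    "Peter":          {"instance": "Penguin"},
    "Penelope":       {"instance": "MagicalPenguin"},
}

for name in ("Tweety", "Peter", "Penelope"):
    print(name, get_property(name, "can_fly", birds))
# Tweety True, Peter False, Penelope True
# Reclassifying an object (e.g. learning Penelope is only an ordinary penguin)
# changes the conclusion - the non-monotonic behaviour discussed later.
```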

17 Two useful assumptions  Closed World Assumption  Anything that I don’t know to be true, I will assume to be false  (Negation as failure in Prolog)  Unique Names Assumption  Distinct names refer to different objects  “Chris and the programmer …” implies Chris isn’t the programmer  Again, Prolog implements this assumption.
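A tiny illustrative sketch of the closed world assumption as negation-as-failure: anything not found in (or derivable from) the knowledge base is simply treated as false.

```python
# Closed world assumption: what cannot be shown true is assumed false.
known_facts = {("bird", "tweety"), ("penguin", "peter")}

def holds(fact):
    return fact in known_facts        # failure to find a proof counts as "no"

print(holds(("bird", "tweety")))      # True  - stated in the KB
print(holds(("bird", "polly")))       # False - unknown, so assumed false
```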

18 Advantages of semantic networks  Easy to visualize  Formal definitions of semantic networks have been developed  Related knowledge is easily clustered  Efficient in space requirements  Objects represented only once  Relationships handled by pointers

19 Disadvantages of semantic networks  Inheritance (particularly from multiple sources and when exceptions in inheritance are wanted) can cause problems  Facts placed inappropriately cause problems  No standards about node and arc values

20 Conceptual graphs  Semantic network where each graph represents a single proposition  Concept nodes can be  Concrete (visualisable) such as restaurant, my dog Spot  Abstract (not easily visualisable) such as anger  Edges do not have labels  Instead, conceptual relation nodes  Easy to represent relations between multiple objects

21 Causal, temporal, inheritance networks  Causal networks  nodes represent concepts, objects, events  links represent causal relationship between these concepts, objects, events  Temporal networks  nodes represent events  links represent temporal relationship between events, like ‘before’, ‘after’,...  Inheritance networks (terminologies, taxonomies)  nodes represent concepts  links represent class-/subclass-relationship is-a: superclass – subclass or super-concept / sub-concept

22 Causal network – example

23 Classification hierarchy

24 Inheritance networks  nodes represent concepts (events, objects, actions,...)  links represent  super-concept / sub-concept-relationships is-a: specialization / subsumption of concepts  concept-instance-relationships instance-of  relationships between concepts role (slot)  attributes/features/properties of concept  constraints attached to roles, e.g. number of fillers  Closely related to First-Order Predicate Logic

25 Terminological network – example  Example:  Concepts: bird, robin, flying-animal, “Speedy”  Feature: colour  is-a(robin, bird), is-a(bird, flying-animal) (superclass links)  instance-of(“Speedy”, robin) (instance link)  has-colour(robin) = “grey” (feature)  Task: Express that a typical elephant has legs, usually 4 of them, has a certain colour, and that there is a specific elephant named “Clyde”  (elephant, colour, legs, has (usually) 4 legs, “Clyde”)

26 Terminological network - solution  Solution:  Concepts: elephant, legs, “Clyde” (instance) or Clyde (individual concept), (colour and grey)  Roles: has-legs, (has-colour)  Feature: colour  Specific representation of Clyde:  has-legs(elephant, 4)  has-colour(elephant, grey)  instance-of(“Clyde”, elephant) (specific object “Clyde”)  or is-a(Clyde, elephant) (individual concept ‘Clyde’)

27 Semantics of KR Languages  Formal semantics, e.g. Predicate Logic: an interpretation derives the meaning of complex expressions from the meaning of atomic expressions plus a construction mechanism  Use / reasoning, e.g. spreading activation: positive or negative association between concepts; see Neural Networks

28 Frames  Devised by Marvin Minsky, 1974  Incorporates certain valuable human thinking characteristics:  Expectations, assumptions, stereotypes. Exceptions. Fuzzy boundaries between classes  The essence of this form of knowledge representation is typicality, with exceptions, rather than definition

29 Frames  Frames were the next development, allowing more convenient “packaging” of facts about an object  We use the terms “slots” and “slot values”  mammal: subclass: animal  elephant: subclass: mammal, size: large, haspart: trunk  Nellie: instance: elephant, likes: apples

30 Frames  Frames often allowed you to say which things were just typical of a class, and which were definitional, so couldn’t be overridden  Frames also allow multiple inheritance (Nellie is an Elephant and is a circus animal)  Elephant: subclass: mammal, haspart: trunk, * colour: grey, * size: large  (here * marks the typical, overridable values)

31 Frame representations  Semantic networks where nodes have structure  Frame with a number of slots (age, height,...)  Each slot stores specific item of information  When agent faces a new situation  Slots can be filled in (value may be another frame)  Filling in may trigger actions  May trigger retrieval of other frames  Inheritance of properties between frames  Very similar to objects in OOP

32 Example: Frame Representation

33 Flexibility in frames  Slots in a frame can contain  Information for choosing a frame in a situation  Relationships between this and other frames  Procedures to carry out after various slots filled  Default information to use where input is missing  Blank slots: left blank unless required for a task  Other frames, which gives a hierarchy  Can also be expressed in first order logic

34 How frames are organised  A frame system is a hierarchy of frames  Each frame has:  a name  slots: these are the properties of the entity that has the name, and they have values  A particular value may be:  a default value  an inherited value from a higher frame  a procedure, called a daemon, to find a value  a specific value, which might represent an exception

35 How frames are organised  In the higher levels of the frame hierarchy, typical knowledge about the class is stored  The value in a slot may be a range or a condition  In the lower levels, the value in a slot may be a specific value, to overwrite the value which would otherwise be inherited from a higher frame

36 How frames are organised  An instance of an object is joined to its class by an 'instance_of' relationship  A class is joined to its superclass by a 'subclass_of' relationship  Frames may contain both procedural and declarative knowledge  Slot values normally amount to declarative knowledge, but a daemon is in effect a small program  So a slot with a daemon in it amounts to procedural knowledge

37 How frames are organised  Note that a frames system may allow multiple inheritance but, if it does so, it must make provision for cases when inherited values conflict

38 Frames, schemas, prototypes  Frames  Concepts as record-like structures  Slots – relationships to other concepts, attributes  Fillers – values for slots (other concept or value)  Schema-theory / Prototypes  Some objects are more typical for a certain class of objects  Precise definition for concepts sometimes not possible, then reference to prototypes

39 Frames, schemas, prototypes  A typical bird is a robin – take all “robin features” as the description for the class ‘bird’; this forms a prototype. Take a typical chair as a prototype: other chairs are more or less similar to this prototypical chair. The class of all chairs is “fuzzy”, since there are no precise or exact boundaries for the class ‘chair’, i.e. no exact way to decide when something is a chair or not.

40 Defaults  Defaults represent standard values for some attributes of a concept  e.g. the standard number of legs of an elephant is 4  (Inherited) defaults may be overwritten at lower-level concepts, or for individual concepts  “Clyde” - the famous 3-legged AI-elephant  Problem: If roles, attributes etc. in a concept description can be changed or cancelled, what is the definition of a concept? How can we classify? And reason?

41 Multiple inheritance and views  Multiple inheritance  Sub-concept inherits descriptions from several superconcepts  Possibly conflicting information ( = ambiguity)  skeptical reasoners: “don’t know” (no conclusion)  credulous reasoners: “whatever” (several conclusions)  Views  Description of concept from different viewpoints  Inheritance of multiple, complementing descriptions  e.g. view computer as machine or as equipment

42 Multiple inheritance - views

43 Multiple inheritance - ambiguity  [Diagram: the “Nixon diamond”: Quaker and Republican are subclasses of Person; Nixon is an instance of both; Quakers are pacifist, Republicans are non-pacifist]

44 Frames and procedures  Frames often allow slots to contain procedures  So... size slot could contain code to calculate the size of an animal from other data  Sometimes divided into “if-needed” procedures, run when value needed, and “if-added” procedures, run when a value is added (to update rest of data, or inform user)  So... similar, but not quite like OO languages
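A small illustrative sketch of the two kinds of slot procedure (the class and slot names are invented for the example, not taken from any particular frame system): an "if-needed" procedure runs when the value is requested, an "if-added" procedure runs when a value is stored.

```python
# Illustrative "if-needed" / "if-added" slot procedures.
class Slot:
    def __init__(self, if_needed=None, if_added=None):
        self.value, self.if_needed, self.if_added = None, if_needed, if_added

    def get(self):
        if self.value is None and self.if_needed:
            self.value = self.if_needed()     # compute only when asked for
        return self.value

    def set(self, value):
        self.value = value
        if self.if_added:
            self.if_added(value)              # e.g. update other data, inform the user

animal = {"weight": Slot()}
animal["size"] = Slot(if_needed=lambda: "large" if (animal["weight"].get() or 0) > 500 else "small")
animal["weight"].if_added = lambda kg: print(f"weight recorded: {kg} kg")

animal["weight"].set(4000)    # if-added procedure fires: "weight recorded: 4000 kg"
print(animal["size"].get())   # 'large' - if-needed procedure computes it from other data
```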

45 Frames: some examples  The following is an exploration of Minsky's frames, using simple examples  Note that this is not exactly the way Minsky described frames in his original paper, but it is the way the idea has come to be used in practice

46 Frames: some examples  First a verbal description of some concept is provided  Then a diagram is shown representing the resulting frame  Next to this is a bit of code which implements this frame in a suitable KR programming language

47 Frames: some examples  We will start with a simple piece of information: there is a category of things called cars  Given this information, we can start to build a frame:

48 Name: car Subclass of: thing

49  More information: a car has 4 wheels, is moved by an engine, and runs on petrol or diesel  We can now add three slots to the frame  The last of these has a restriction rather than a specific value

50  “a car has 4 wheels, is moved by an engine, and runs on petrol or diesel”  Frame: Name: car; Subclass of: thing; Slots: wheels = 4, moved_by = engine, fuel = ? (restricted to petrol or diesel)  In the frame language: car subclass_of thing with wheels: 4, moved_by: engine, fuel: [value: unknown, type: [petrol,diesel]]

51  More information: there is a particular type of car called a VW, manufactured in Germany  We can add a second frame to our system, with one slot  We don’t need to repeat the slots and values in the previous frame: they will be inherited

52  “there is a particular type of car called a VW, manufactured in Germany”  Frame: Name: VW; Subclass of: car; Slots: made_in = ‘Germany’  In the frame language: ‘VW’ subclass_of car with made_in: ‘Germany’

53  More information: there is a particular type of VW called a Golf, which has a sunroof  We can add a third frame to our system, with one slot  Once again, we don’t repeat the slots in the previous frames, because they will be inherited

54  “there is a particular type of VW called a Golf, which has a sunroof”  Frame: Name: Golf; Subclass of: VW; Slots: top = sunroof  In the frame language: ‘Golf’ subclass_of VW with top: sunroof

55  More information: there is a particular type of Golf called a TDi, which runs on diesel. A TDi has 4 cylinders, and an engine capacity of 1.8 litres  We can add a fourth frame to our system, with three slots. One of the slots (fuel) was already in the system, but appears here because it now has a specific value rather than a restriction

56  “there is a particular type of Golf called a TDi, which runs on diesel, has 4 cylinders, and has a 1.8 litre engine”  Frame: Name: TDi; Subclass of: Golf; Slots: fuel = diesel, engine_capacity = 1.8 litres, cylinders = 4  In the frame language: ‘TDi’ subclass_of ‘Golf’ with fuel: diesel, engine_capacity: 1.8, cylinders: 4

57  More information: my car, called C637SRK, is a Golf TDi. It hasn’t got a sunroof.  We can add a fifth frame to our system, with two slots. Unlike all the previous frames, this one is an instance frame. One of the slots (top) was already in the system, but appears here because the value contradicts (overwrites) the value which would otherwise be inherited

58  “There is a car called C637SRK which is an instance of a Golf TDi etc”  Frame: Name: C637SRK; Instance of: TDi; Slots: owner = me, top = no_sunroof  In the frame language: ‘C637SRK’ instance_of ‘TDi’ with owner: me, top: no_sun_roof

59  More information: to calculate a car’s cylinder size, you divide the total engine capacity by the number of cylinders  We can now add another slot to the car frame:  a procedure which discovers the information required, if the user asks what the value in the slot is

60  “to calculate a car’s cylinder size, you divide the total engine capacity by the number of cylinders”  Frame: Name: car; Subclass of: thing; Slots: wheels = 4, moved_by = engine, fuel = ? (petrol or diesel), cylinder_size = ? (daemon: find the total engine capacity, find the number of cylinders, divide the first by the second)  In the frame language: car subclass_of thing with wheels: 4, moved_by: engine, fuel: [value: unknown, type: [petrol,diesel]], cylinder_size: [value: unknown, access_rule: (if the engine_capacity of ?self is E & the cylinders of ?self is C & Ans := E/C then make_value Ans)]
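The whole car hierarchy from slides 47-60 can be sketched in Python (this is an illustration of the idea, not the frame language used on the slides): class frames, one instance frame, slot inheritance up the parent chain, an exception that overrides an inherited value, and an if-needed daemon that computes cylinder_size on demand for the frame originally asked about.

```python
# Hedged sketch of the car frame system: inheritance, an exception, and an
# if-needed daemon for cylinder_size (evaluated for the original frame, ?self).
frames = {
    "car":  {"parent": "thing",
             "slots": {"wheels": 4, "moved_by": "engine",
                       "cylinder_size": lambda self: get_slot(self, "engine_capacity")
                                                     / get_slot(self, "cylinders")}},
    "VW":   {"parent": "car",  "slots": {"made_in": "Germany"}},
    "Golf": {"parent": "VW",   "slots": {"top": "sunroof"}},
    "TDi":  {"parent": "Golf", "slots": {"fuel": "diesel",
                                         "engine_capacity": 1.8, "cylinders": 4}},
    "C637SRK": {"parent": "TDi",                                  # instance frame
                "slots": {"owner": "me", "top": "no_sunroof"}},   # exception overrides Golf
}

def get_slot(frame, slot, self=None):
    """Look in the frame itself, then up the parent chain; run any daemon found."""
    self = self or frame
    while frame in frames:
        value = frames[frame]["slots"].get(slot)
        if callable(value):            # if-needed daemon: compute the value for ?self
            return value(self)
        if value is not None:
            return value
        frame = frames[frame]["parent"]
    return None

print(get_slot("C637SRK", "wheels"))         # 4           (inherited from car)
print(get_slot("C637SRK", "top"))            # no_sunroof  (exception beats Golf's sunroof)
print(get_slot("C637SRK", "cylinder_size"))  # 0.45        (daemon: 1.8 / 4)
```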

61 Benefits of Frames  Makes programming easier by grouping related knowledge  Easily understood by non-developers  Expressive power  Easy to set up slots for new properties and relations  Easy to include default information and detect missing values

62 Drawbacks of Frames  No standards (slot-filler values)  More of a general methodology than a specific representation:  Frame for a classroom will be different for a professor and for a maintenance worker  No associated reasoning/inference mechanisms

63 Semantic networks (etc) and logic  How do we precisely define the semantics of a frame system or semantic network?  Modern trend is to have special knowledge representation languages which look a bit like frames to users, but which:  Use logic to define what relations mean  Don’t provide the full power of predicate logic, but a subset that allows efficient inference (May not want more than inheritance)

64 Other varieties of structured object  Knowledge representation researchers - particularly Roger Schank and his associates - devised some interesting variations on the theme of structured objects  In particular, they invented the idea of scripts (1973)  A script is a description of a class of events in terms of contexts, participants, and sub-events

65 Scripts  Rather similar to frames: uses inheritance and slots; describes stereotypical knowledge  (i.e. if the system isn't told some detail of what's going on, it assumes the "default" information is true)  but concerned with events

66 Scripts  Why represent knowledge in this way?  Because real-world events follow stereotyped patterns  Human beings use previous experience to understand verbal accounts; computers can use scripts instead  Because people, when relating events, leave large amounts of assumed detail out of their accounts  People don't find it easy to converse with a system that can't fill in missing conversational detail

67 Scripts  Scripts predict unobserved events  Scripts can build a coherent account from disjointed observations

68 Scripts  Commercial applications of script-like structured objects: work on the basis that a conversation between two people on a pre-defined subject will follow a predictable course  Certain items of information need to be exchanged.  Others can be left unsaid (because both people know what the usual answer would be, or can deduce it from what's been said already), unless (on this occasion) it's an unusual answer

69 Scripts  This sort of knowledge representation has been used in intelligent front-ends, for systems whose users are not computer specialists  It has been employed in story-understanding and news-report-understanding systems

70 Non-monotonic logic  Once true (or false) does not mean always true (or false)  As information arrives, truth values can change (Penelope is a bird, penguin, magic penguin)  Implementations  Circumscription  Bird(x) and not abnormal(x) -> flies(x)  We can assume not abnormal(x) unless we know abnormal(x)  Default logic  “x is true given x does not conflict with anything we already know”
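An illustrative sketch of the circumscription rule on this slide: flies(x) is concluded from bird(x) as long as abnormal(x) is not known; adding an abnormality fact later withdraws the conclusion, which is exactly the non-monotonic behaviour.

```python
# Circumscription sketch: assume not abnormal(x) unless abnormal(x) is known.
birds, abnormal = {"Tweety", "Peter"}, set()

def flies(x):
    return x in birds and x not in abnormal   # Bird(x) and not abnormal(x) -> flies(x)

print(flies("Peter"))     # True  - no abnormality known yet
abnormal.add("Peter")     # new information: Peter is a penguin, hence abnormal
print(flies("Peter"))     # False - the earlier conclusion is withdrawn
```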

71 Truth maintenance systems  These systems allow truth values to be changed during reasoning (belief revision)  When we retract a fact, we must also retract any other fact that was derived from it  Penelope is a bird(can fly)  Penelope is a penguin(cannot fly)  Penelope is magical(can fly)  Retract (Penelope is magical)(cannot fly)  Retract (Penelope is a penguin)(can fly)

72 Types of TMS  Justification based TMS  For each fact, track its justification (facts and rules from which it was derived)  When a fact is retracted, retract all facts that have justifications leading back to that fact, unless they have independent justifications  Assumption based TMS  Represent all possible states simultaneously  When a fact is retracted, change state sets  For each fact, use list of assumptions that make that fact true; each world state is a set of assumptions
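A minimal sketch of the justification-based idea (simplified and illustrative: each belief records the sets of beliefs that support it, premises have an empty justification, and a belief survives retraction only if some justification still holds):

```python
# Minimal JTMS-style sketch: beliefs keep their justifications; retracting a
# premise retracts everything whose only support leads back to it.
justifications = {
    "penguin(Penelope)":         [set()],                          # premise
    "magical_penguin(Penelope)": [set()],                          # premise
    "bird(Penelope)":            [{"penguin(Penelope)"}],          # derived
    "can_fly(Penelope)":         [{"magical_penguin(Penelope)"}],  # flies only because magical
}

def believed(fact, retracted=frozenset()):
    """A fact is believed if some justification has all of its supports believed."""
    if fact in retracted or fact not in justifications:
        return False
    return any(all(believed(s, retracted) for s in support)
               for support in justifications[fact])

print(believed("can_fly(Penelope)"))              # True
retracted = {"magical_penguin(Penelope)"}         # retract the premise
print(believed("can_fly(Penelope)", retracted))   # False - its only support is gone
print(believed("bird(Penelope)", retracted))      # True  - independent justification survives
```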

73 Structured objects: final comments  A KR system with property inheritance should make a distinction between essential properties ('universal truths') and accidental properties  Some don't: some allow unrestrained overwriting of inherited properties  It's possible to generate incoherent structures in any of these systems  It's particularly easy when multiple inheritance is involved

74 Structured objects: final comments  At best, structured objects provide data & control structures which are more flexible than those in conventional languages, and so more suited to simulating human reasoning

75 Conclusion  Issues in KR  Logical representations  FOPL  Will focus on this next week  “Non-logical”  Semantic networks  Frames  Scripts ...

76 KBSs using structured objects  Two examples of KBSs using structured objects for knowledge representation

77 Explorer  Explorer is a natural-language interface for a database relating to oil-wells:  Contains geological knowledge, organised as structured objects  Interfaced to a graphics package that can draw oil-exploration maps  The geologist can converse with the program in unconstrained natural language, and in geological jargon

78 Bank customer advisory system  Bank customer advisory system:  Advises customer on opening a bank account, based on customer's needs/characteristics. This is a task that can consume a lot of the bank staff's time  Deduces customer's intended meaning  Does not annoy customer by asking for every detail; uses its knowledge about people to deduce what the answers must be

79 The CYC Project  A very large scale commercial research project designed to represent a large body of common-sense knowledge  Headed by Lenat and Guha, and based in Austin, Texas  Has been making steady progress since 1984

80 The CYC Project  It now ‘knows’ a huge collection of fragments of real-world knowledge such as:  Mothers are older than their children  You have to be awake to eat  You can usually see people’s noses, but not their hearts  If you cut a lump of peanut butter in half, each half is a lump of peanut butter, but if you cut a table in half, neither half is a table

81 The CYC Project  The ultimate objective is to give it enough knowledge to understand ordinary books, so that it can read them and expand its own knowledge  So far, it’s got to a stage where, when asked to find photos of “risky activities”, it located photos of people climbing mountains and doing white-water rafting

