Knowledge Representation


1 Knowledge Representation

2 Outline
- General ontology
- Categories and objects
- Events and processes
- Reasoning systems
- Internet shopping world
- Summary

3 Ontologies
An ontology is a "vocabulary" and a "theory" of a certain "part of reality".
Special-purpose ontologies apply to restricted domains (e.g. electronic circuits).
General-purpose ontologies have wider applicability across domains, i.e. they
- must include concepts that cover many subdomains
- cannot use special "short-cuts" (such as ignoring time)
- must allow unification of different types of knowledge
GP ontologies are useful in widening the applicability of reasoning systems, e.g. by including time.

4 Ontological engineering
Representing a general-purpose ontology is a difficult task called ontological engineering.
Existing GP ontologies have been created in different ways:
- by a team of trained ontologists
- by importing concepts from database(s)
- by extracting information from text documents
- by inviting anybody to enter commonsense knowledge
Ontological engineering has been only partially successful, and few large AI systems are based on GP ontologies; most use special-purpose ontologies.

5 Elements of a general ontology
- Categories of objects
- Measures of quantities
- Composite objects
- Time, space, and change
- Events and processes
- Physical objects
- Substances
- Mental objects and beliefs

6 Top-level ontology of the world
Anything
  AbstractObjects
    Sets
    Numbers
    RepresentationObjects
      Categories
      Sentences
      Measurements
        Times
        Weights
  Events
    Intervals
      Moments
    Places
    PhysicalObjects
      Things
        Animals
          Humans
        Agents
          Humans
      Stuff
        Solids
        Liquid
        Gas
    Processes

7 Upper Ontology
The general framework of concepts is called an upper ontology, because of the convention of drawing graphs with the general concepts at the top and the more specific concepts below them.
Of what use is an upper ontology? Consider the ontology for circuits that we studied. It makes many simplifying assumptions: time is omitted completely; signals are fixed and do not propagate; the structure of the circuit remains constant.
A more general ontology would consider signals at particular times and would include the wire lengths and propagation delays. This would allow us to simulate the timing properties of the circuit, and indeed such simulations are often carried out by circuit designers.

8 Categories and objects
Categories are used to classify objects according to common properties or definitions:
  ∀x x ∈ Tomatoes ⇒ Red(x) ∧ Round(x)
Categories can be represented by
- predicates: Tomato(x)
- objects: the constant Tomatoes represents the set of tomatoes (reification)
Roles of category representations:
- instance relations (is-a): x1 ∈ Tomatoes
- taxonomic hierarchies (subset): Tomatoes ⊆ Fruit
- inheritance of properties
- (exhaustive) decompositions

9 Categories using FOL
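The FOL machinery of the previous two slides can be mirrored procedurally. Below is a minimal Python sketch, assuming categories are modelled as sets plus an explicit taxonomy; the category names and properties are illustrative, not taken from the slides.

# A minimal sketch of category reasoning: membership (is-a), subset
# (taxonomy), and inheritance of properties. All data here is illustrative.

tomatoes = {"tomato1", "tomato2"}
fruit = tomatoes | {"apple1"}

print("tomato1" in tomatoes)        # instance relation: x1 ∈ Tomatoes
print(tomatoes <= fruit)            # taxonomic hierarchy: Tomatoes ⊆ Fruit

# Explicit taxonomy: subcategory -> supercategory
taxonomy = {"Tomatoes": "Fruit", "Fruit": "Food"}

def supercategories(cat):
    """Yield all categories that cat is (transitively) a subcategory of."""
    while cat in taxonomy:
        cat = taxonomy[cat]
        yield cat

# Properties asserted at the category level
properties = {"Food": {"edible"}, "Tomatoes": {"red", "round"}}

def inherited_properties(cat):
    """Inheritance: a property of a supercategory also holds for the subcategory."""
    props = set(properties.get(cat, set()))
    for sup in supercategories(cat):
        props |= properties.get(sup, set())
    return props

print(inherited_properties("Tomatoes"))   # {'red', 'round', 'edible'}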

10 Properties of categories
We say that two or more categories are disjoint if they have no members in common. exhaustive decomposition A disjoint exhaustive decomposition is known as a partition. The following examples illustrate these three concepts: The predicates us to define these concepts are For example, a bachelor is an unmarried adult male:
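These three set-theoretic conditions are easy to check directly. A small Python sketch follows, with made-up example data; the function names are mine, not predicates from the slides.

from itertools import combinations

def disjoint(categories):
    """Two or more categories are disjoint if no two share a member."""
    return all(not (a & b) for a, b in combinations(categories, 2))

def exhaustive(categories, whole):
    """An exhaustive decomposition: the union covers the whole category."""
    return set().union(*categories) == whole

def partition(categories, whole):
    """A partition is a disjoint exhaustive decomposition."""
    return disjoint(categories) and exhaustive(categories, whole)

animals = {"cat", "cow", "hen"}
males, females = {"cat"}, {"cow", "hen"}        # illustrative data only
print(partition([males, females], animals))      # True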

11 Objects and substance
We need to distinguish between substances and discrete objects.
Substance ("stuff"):
- mass nouns, not countable
- intrinsic properties
- part of a substance is (still) the same substance
Discrete objects ("things"):
- count nouns, countable
- extrinsic properties
- parts are (generally) not of the same category

12 Composite objects
A composite object is an object that has other objects as parts.
The PartOf relation defines object containment and is transitive and reflexive:
  PartOf(x, y) ∧ PartOf(y, z) ⇒ PartOf(x, z)
  PartOf(x, x)
Objects can be grouped in PartOf hierarchies, similar to Subset hierarchies.
The structure of the composite object describes how the parts are related.
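A brief Python sketch of the two PartOf axioms, computing the reflexive-transitive closure of an explicit part-whole relation; the example objects are invented for illustration.

part_of = {("flange", "wheel"), ("wheel", "bike"), ("frame", "bike")}

def part_of_closure(pairs):
    """Reflexive-transitive closure of the PartOf relation."""
    closure = set(pairs)
    objects = {o for pair in pairs for o in pair}
    closure |= {(o, o) for o in objects}          # PartOf(x, x)
    changed = True
    while changed:
        # PartOf(x, y) ∧ PartOf(y, z) ⇒ PartOf(x, z)
        new = {(x, z) for (x, y1) in closure for (y2, z) in closure if y1 == y2}
        changed = not new <= closure
        closure |= new
    return closure

print(("flange", "bike") in part_of_closure(part_of))   # True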

13 Composite objects
For example, a biped has two legs attached to a body.
Sometimes we want to treat a collection of objects as a single composite object without giving it any particular structure. For example, we might want to say "The apples in this bag weigh two pounds." For this we need a new concept, which we will call a bunch. If the apples are Apple1, Apple2, and Apple3, then BunchOf({Apple1, Apple2, Apple3}) denotes the composite object with the three apples as parts:
  ∀x x ∈ s ⇒ PartOf(x, BunchOf(s))
  ∀y [∀x x ∈ s ⇒ PartOf(x, y)] ⇒ PartOf(BunchOf(s), y)
This is an example of logical minimization, which means defining an object as the smallest one satisfying certain conditions.

14 Measurements
We need to be able to represent properties like height, mass, cost, etc. Values for such properties are measures.
Unit functions represent and convert measures:
  Length(L1) = Inches(1.5) = Centimeters(3.81)
  ∀l Centimeters(2.54 × l) = Inches(l)
Measures can be used to describe objects:
  Mass(Tomato1) = Kilograms(0.16)
  ∀d d ∈ Days ⇒ Duration(d) = Hours(24)
Non-numerical measures can also be represented, but normally there is an order (e.g. >). Used in qualitative physics.
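One way to read the unit functions is as constructors that normalise to a base unit, so that Inches(1.5) and Centimeters(3.81) denote the same measure. A minimal Python sketch, assuming centimetres as the base length unit:

import math

def Centimeters(x):
    return ("Length", x)                 # base unit for length

def Inches(x):
    return Centimeters(2.54 * x)         # ∀l Centimeters(2.54 × l) = Inches(l)

def Kilograms(x):
    return ("Mass", x)

dim1, v1 = Inches(1.5)
dim2, v2 = Centimeters(3.81)
print(dim1 == dim2 and math.isclose(v1, v2))   # True: the same length measure

mass_tomato1 = Kilograms(0.16)                 # Mass(Tomato1) = Kilograms(0.16)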

15 Measurements
Comparative difficulty can be stated with an ordering rather than numerical values:
  e1 ∈ Exercises ∧ e2 ∈ Exercises ∧ Wrote(Norvig, e1) ∧ Wrote(Russell, e2) ⇒ Difficulty(e1) > Difficulty(e2)
  e1 ∈ Exercises ∧ e2 ∈ Exercises ∧ Difficulty(e1) > Difficulty(e2) ⇒ ExpectedScore(e1) < ExpectedScore(e2)

16 Objects: Things and stuff
The real world can be seen as consisting of primitive objects (e.g., atomic particles) and composite objects built from them. There is, however, a significant portion of reality that seems to defy any obvious individuation, i.e. division into distinct objects. We give this portion the generic name stuff.
Language reflects the distinction: count nouns, such as aardvarks, holes, and theorems, versus mass nouns, such as butter, water, and energy.
To represent stuff properly, we begin with the obvious: we need to have as objects in our ontology at least the gross "lumps" of stuff we interact with.
  b ∈ Butter ∧ PartOf(p, b) ⇒ p ∈ Butter
  b ∈ Butter ⇒ MeltingPoint(b, Centigrade(30))
Intrinsic properties (such as melting point or density) are retained by every part of the stuff; extrinsic properties (such as weight or shape) are not.

17 Event calculus
The event calculus deals with change by representing points of time.
It reifies fluents and events:
- a fluent: At(Bilal, Berkeley)
- the fluent is true at time t: T(At(Bilal, IQRA), t)
Events are instances of event categories:
  E1 ∈ Flyings ∧ Flyer(E1, Bilal) ∧ Origin(E1, SF) ∧ Destination(E1, KHI)
Event E1 took place over interval i: Happens(E1, i).
Time intervals are represented by (start, end) pairs: i = (t1, t2).

18 Event calculus predicates
  T(f, t)               Fluent f is true at time t
  Happens(e, i)         Event e happens over time interval i
  Initiates(e, f, t)    Event e causes fluent f to start to hold at time t
  Terminates(e, f, t)   Event e causes fluent f to cease to hold at time t
  Clipped(f, i)         Fluent f ceases to be true at some point in interval i
  Restored(f, i)        Fluent f becomes true at some point in interval i

19 Events
We assume a distinguished event, Start, that describes the initial state by saying which fluents are initiated or terminated at the start time. We define T by saying that a fluent holds at a point in time if the fluent was initiated by an event at some time in the past and was not made false (clipped) by an intervening event. A fluent does not hold if it was terminated by an event and not made true (restored) by another event:
  Happens(e, (t1, t2)) ∧ Initiates(e, f, t1) ∧ ¬Clipped(f, (t1, t)) ∧ t1 < t ⇒ T(f, t)
  Happens(e, (t1, t2)) ∧ Terminates(e, f, t1) ∧ ¬Restored(f, (t1, t)) ∧ t1 < t ⇒ ¬T(f, t)
where Clipped and Restored are defined by
  Clipped(f, (t1, t2)) ⇔ ∃ e, t, t3 Happens(e, (t, t3)) ∧ t1 ≤ t < t2 ∧ Terminates(e, f, t)
  Restored(f, (t1, t2)) ⇔ ∃ e, t, t3 Happens(e, (t, t3)) ∧ t1 ≤ t < t2 ∧ Initiates(e, f, t)
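The T axiom above can be executed directly over a finite record of events. The following Python sketch uses invented example events based on the flight from slide 17, and simplifies by treating an event's start time as the time at which its effects take hold:

from dataclasses import dataclass, field

@dataclass
class Event:
    start: float
    end: float
    initiates: set = field(default_factory=set)   # fluents made true at start
    terminates: set = field(default_factory=set)  # fluents made false at start

events = [
    Event(0, 0, initiates={"At(Bilal, IQRA)"}),                 # distinguished Start event
    Event(5, 8, initiates={"At(Bilal, KHI)"},
                terminates={"At(Bilal, IQRA)"}),                # E1 ∈ Flyings
]

def clipped(f, t1, t2):
    """Clipped(f, (t1, t2)): some event terminates f at a time t with t1 <= t < t2."""
    return any(f in e.terminates and t1 <= e.start < t2 for e in events)

def holds(f, t):
    """T(f, t): f was initiated at some earlier time and not clipped since."""
    return any(f in e.initiates and e.start < t and not clipped(f, e.start, t)
               for e in events)

print(holds("At(Bilal, IQRA)", 3))   # True
print(holds("At(Bilal, IQRA)", 7))   # False: clipped by the flight event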

20 Processes
The events we have seen so far are what we call discrete events: they have a definite structure, and a sub-interval of such an event is not, in general, an event of the same kind.
Categories of events all of whose sub-intervals are also members of the category are called process categories or liquid event categories.

21 Time intervals
Time intervals are partitioned into moments (zero duration) and extended intervals:
  Partition({Moments, ExtendedIntervals}, Intervals)
  ∀i i ∈ Intervals ⇒ (i ∈ Moments ⇔ Duration(i) = 0)
The functions Start and End delimit an interval:
  ∀i Interval(i) ⇒ Duration(i) = Time(End(i)) - Time(Start(i))
We may use, e.g., January 1, 1900 as an arbitrary time 0:
  Time(Start(AD1900)) = Seconds(0)

22 Relations between time intervals
Pairs of intervals can be related by predicates such as Before(i, j) / After(j, i), Meet(i, j), During(i, j), and Overlap(i, j) / Overlap(j, i) (illustrated graphically on the original slide).
These relations can be expressed logically, e.g.
  ∀i, j Meet(i, j) ⇔ Time(End(i)) = Time(Start(j))
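With intervals as (start, end) pairs, as introduced on slide 17, the relations become one-line tests. A Python sketch; the exact strict/non-strict boundary conditions are my reading, not given on the slide:

def before(i, j):   return i[1] < j[0]
def after(j, i):    return before(i, j)
def meet(i, j):     return i[1] == j[0]          # Meet(i, j) ⇔ End(i) = Start(j)
def during(i, j):   return j[0] <= i[0] and i[1] <= j[1]
def overlap(i, j):  return i[0] < j[0] < i[1] < j[1]

i, j = (1, 3), (3, 6)
print(meet(i, j), before((0, 1), j), during((4, 5), j), overlap((2, 4), j))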

23 Mental events and mental objects
We need to represent beliefs held by the agent itself and by other agents, e.g. for controlling reasoning, or for planning actions that involve others.
How are beliefs represented?
- Beliefs are reified as mental objects.
- Mental objects are represented as strings in a language.
- Inference rules for this language can be defined.
Rules for reasoning about how logical agents use their beliefs:
  ∀a, p, q LogicalAgent(a) ∧ Believes(a, p) ∧ Believes(a, "p ⇒ q") ⇒ Believes(a, q)
  ∀a, p LogicalAgent(a) ∧ Believes(a, p) ⇒ Believes(a, "Believes(Name(a), p)")

24 Knows(Lois, CanFly(Superman))
Mental events involve propositional attitudes that an agent can have toward mental objects: attitudes such as Believes, Knows, Wants, Intends, and Informs.
For example, suppose we try to assert that Lois knows that Superman can fly:
  Knows(Lois, CanFly(Superman))
If it is true that Superman is Clark Kent, then we must conclude that Lois knows that Clark can fly:
  (Superman = Clark) ∧ Knows(Lois, CanFly(Superman)) |= Knows(Lois, CanFly(Clark))
This property is called referential transparency; for propositional attitudes we want the opposite, referential opacity, since Lois may not know that Clark is Superman.

25 Modal Logic
Modal logic is designed to address this problem. Regular logic is concerned with a single modality, the modality of truth, allowing us to express "P is true." Modal logic includes special modal operators that take sentences (rather than terms) as arguments.
For example, "A knows P" is represented with the notation K_A P, where K is the modal operator for knowledge. It takes two arguments, an agent (written as the subscript) and a sentence.

26 Semantic networks
Graph representation of categories, objects, relations, etc. (essentially FOL in graphical form).
Natural representation of inheritance and default values:
  ∀x x ∈ Persons ⇒ [∀y HasMother(x, y) ⇒ y ∈ FemalePersons]
  ∀x x ∈ Persons ⇒ Legs(x, 2)
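Inheritance with defaults is the operation semantic networks make cheap: follow is-a links upward until a value is found, letting more specific nodes override. A small Python sketch; the network contents, including the one-legged John, are illustrative:

network = {
    "Persons":       {"is_a": None,            "props": {"Legs": 2}},   # default
    "FemalePersons": {"is_a": "Persons",       "props": {}},
    "Mary":          {"is_a": "FemalePersons", "props": {}},
    "John":          {"is_a": "Persons",       "props": {"Legs": 1}},   # override
}

def lookup(node, prop):
    """Walk up the is-a links until the property (or an inherited default) is found."""
    while node is not None:
        entry = network[node]
        if prop in entry["props"]:
            return entry["props"][prop]
        node = entry["is_a"]
    return None

print(lookup("Mary", "Legs"))   # 2, inherited default from Persons
print(lookup("John", "Legs"))   # 1, overridden at the individual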

27 Semantic Network
Example network (drawn as a graph on the original slide) with nodes Being, Human, Boy, Woman, Joe, Kay, School, and Food, connected by links such as "is a", "needs", "goes to", and "has a child".

28 Other reasoning systems for categories
Description logics
- derived from semantic networks, but more formal
- support subsumption, classification, and consistency checking
Circumscription and default logic
- formalize reasoning about default values
- assume the default in the absence of other input; must be able to retract the assumption if new evidence occurs
Truth maintenance systems
- support belief revision in systems where retracting beliefs is permitted

29 Internet shopping world
An agent that understands and acts in an internet shopping environment.
The task is to shop for a product on the Web, given the user's product description.
The product description may be precise, in which case the agent should find the best price.
In other cases the description is only partial, and the agent has to compare products.
The shopping agent depends on having product knowledge, including category hierarchies.

30 PEAS specification of shopping agent
Performance measure
- recommend product(s) that match the user's description
Environment
- all of the Web
Actions
- following links
- retrieving page contents
Sensors
- Web pages: HTML, XML

31 Outline of agent behavior
Start at the home page of known web store(s).
- Must have knowledge of relevant web store addresses.
Spread out from the home page, following links to relevant pages containing product offers.
- Must be able to identify page relevance, using product category ontologies, as well as parse page contents to detect product offers.
Having located one or more product offers, the agent must compare them and recommend a product.
- Comparison ranges from simple price ranking to complex tradeoffs in several dimensions.

32 Following links
The agent will have knowledge of a number of stores, for example:
  Amazon ∈ OnlineStores ∧ Homepage(Amazon, "amazon.com")
  Ebay ∈ OnlineStores ∧ Homepage(Ebay, "ebay.com")
  ExampleStore ∈ OnlineStores ∧ Homepage(ExampleStore, "example.com")
A page is relevant to the query if it can be reached by a chain of zero or more relevant category links from a store's home page, followed by one more link to the product offer:
  Relevant(page, query) ⇔ ∃ store, home store ∈ OnlineStores ∧ Homepage(store, home) ∧ ∃ url, url2 RelevantChain(home, url2, query) ∧ Link(url2, url) ∧ page = Contents(url)
  RelevantChain(start, end, query) ⇔ (start = end) ∨ (∃ u, text LinkText(start, u, text) ∧ RelevantCategoryName(query, text) ∧ RelevantChain(u, end, query))
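Read procedurally, the two definitions describe a crawl that follows only relevantly named category links and then looks one link further for offer pages. A Python sketch; the helper functions get_links, relevant_category_name, and fetch are hypothetical stand-ins for link extraction, the RelevantCategoryName test, and Contents:

def relevant_pages(query, store_homepages, get_links, relevant_category_name, fetch):
    """Collect pages reachable from a store home page by a chain of zero or
    more relevant category links, plus one final link (cf. Relevant above)."""
    candidates, frontier, seen = [], list(store_homepages), set()
    while frontier:
        url = frontier.pop()
        if url in seen:
            continue
        seen.add(url)
        for target, anchor_text in get_links(url):
            candidates.append(fetch(target))              # one link beyond the chain
            if relevant_category_name(query, anchor_text):
                frontier.append(target)                   # extend the relevant chain
    return candidates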

33 Following Links

34 Comparing offers
An extracted offer can be represented as a set of assertions about the product and the offer, for example:
  ∃ c, offer c ∈ LaptopComputers ∧ offer ∈ ProductOffers ∧
    Manufacturer(c, IBM) ∧ Model(c, ThinkBook970) ∧
    ScreenSize(c, Inches(14)) ∧ ScreenType(c, ColorLCD) ∧
    MemorySize(c, Gigabytes(2)) ∧ CPUSpeed(c, GHz(1.2)) ∧
    OfferedProduct(offer, c) ∧ Store(offer, GenStore) ∧
    URL(offer, "example.com/computers/34356.html") ∧
    Price(offer, $(399)) ∧ Date(offer, Today)
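Once offers are extracted into a structured form, the simplest comparison is to keep the ones matching the user's (possibly partial) description and rank them by price. A Python sketch using attribute dictionaries that loosely mirror the assertions above; the field names are mine:

offers = [
    {"model": "ThinkBook970", "screen_in": 14, "memory_gb": 2, "cpu_ghz": 1.2,
     "price": 399, "store": "GenStore",
     "url": "example.com/computers/34356.html"},
]

def matches(offer, query):
    """True if every attribute in the user's (partial) description matches."""
    return all(offer.get(attr) == value for attr, value in query.items())

def recommend(offers, query):
    """Simplest comparison: among matching offers, rank by price."""
    return sorted((o for o in offers if matches(o, query)),
                  key=lambda o: o["price"])

print(recommend(offers, {"model": "ThinkBook970", "memory_gb": 2}))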

35 Summary
An ontology is an encoding of vocabulary and relationships.
Special-purpose ontologies can be effective within limited domains.
A general-purpose ontology needs to cover a wide variety of knowledge, and is based on categories and an event calculus.
It covers structured objects, time and space, change, processes, substances, and beliefs.
The general ontology can support agent reasoning in a wide variety of domains, including the Internet shopping world.

