Knowledge Representation COMP 151 March 28, 2007. Based on slides by B.J. Dorr as modified by Tom Lenaerts.


Chapter 10 Outline
10.1 Ontological engineering
10.2 Categories and objects
10.3 Actions, situations and events
10.4 Mental events and mental objects
10.5 The Internet shopping world
10.6 Reasoning systems for categories
10.7 Reasoning with default information
10.8 Truth maintenance systems
most important for where we're going next

An ontology is …
1. A systematic account of Existence.
2. (From philosophy) An explicit formal specification of how to represent the objects, concepts and other entities that are assumed to exist in some area of interest and the relationships that hold among them.
3. The hierarchical structuring of knowledge about things by subcategorizing them according to their essential (or at least relevant and/or cognitive) qualities. This is an extension of the previous senses of "ontology" (above), which has become common in discussions about the difficulty of maintaining subject indices.
(The Free On-line Dictionary of Computing)
Section 10.1

Ontologies in AI
For AI systems, what "exists" is that which can be represented.
- When the knowledge about a domain is represented in a declarative language, the set of objects that can be represented is called the universe of discourse (UoD).
- Definitions associate the names of entities in the UoD (classes, relations, functions or other objects) with human-readable text describing what the names mean, and with formal axioms that constrain the interpretation and well-formed use of these terms.
- Formally, an ontology is the statement of a logical theory.
Ontological commitment:
- A set of agents that share the same ontology will be able to communicate about a domain of discourse without necessarily operating on a globally shared theory.
- We say that an agent commits to an ontology if its observable actions are consistent with the definitions in the ontology.
(The Free On-line Dictionary of Computing)
Section 10.1

Ontological engineering
How to create more general and flexible representations:
- Concepts like actions, time, physical objects and beliefs
- Operates on a bigger scale than knowledge engineering
Define a general framework of concepts:
- Upper ontology
Limitations of logic representation:
- Red, green and yellow tomatoes: exceptions and uncertainty
Section 10.1

The upper ontology of the world Section 10.1

Domain-specific vs. general-purpose ontologies
A general-purpose ontology should be applicable in more or less any special-purpose domain.
- Add domain-specific axioms
In any sufficiently demanding domain, different areas of knowledge need to be unified.
- Reasoning and problem solving could involve several areas simultaneously
What do we need to express? Categories, measures, composite objects, time, space, change, events, processes, physical objects, substances, mental objects, beliefs.
Section 10.1

Categories and objects
KR requires the organization of objects into categories:
- Interaction happens at the level of individual objects
- Reasoning happens at the level of categories
Categories play a role in predictions about objects:
- Based on perceived properties
Categories can be represented in two ways in FOL (see the sketch below):
- Predicates: Apple(x)
- Reification of categories into objects: Apples
A category = the set of its members.
Section 10.2
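To make the two encodings concrete, here is a minimal Python sketch (added for illustration, not part of the original slides); the individuals apple_1, apple_2 and banana_1 are hypothetical.

```python
# Contrasting the two FOL encodings of categories, with Python sets and
# predicates as stand-ins.

# 1. Category as a predicate: Apple(x)
def is_apple(x):
    return x in {"apple_1", "apple_2"}   # hypothetical individuals

# 2. Reified category: the category itself is an object (here, a set of members)
Apples = {"apple_1", "apple_2"}
Fruit = Apples | {"banana_1"}

print(is_apple("apple_1"))        # True
print("apple_1" in Apples)        # True  (MemberOf)
print(Apples <= Fruit)            # True  (SubsetOf)
```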

Category Organization
Relation = inheritance: if all instances of Food are edible, Fruit is a subclass of Food, and Apples is a subclass of Fruit, then an apple is edible.
This defines a taxonomy.
Section 10.2

FOL and categories
An object is a member of a category:
- MemberOf(BB12, Basketballs)
A category is a subclass of another category:
- SubsetOf(Basketballs, Balls)
All members of a category have some properties:
- ∀x (MemberOf(x, Basketballs) ⇒ Round(x))
All members of a category can be recognized by some properties:
- ∀x (Orange(x) ∧ Round(x) ∧ Diameter(x) = 9.5in ∧ MemberOf(x, Balls) ⇒ MemberOf(x, Basketballs))
A category as a whole has some properties:
- MemberOf(Dogs, DomesticatedSpecies)
Section 10.2

Relations between categories
Two or more categories are disjoint if they have no members in common:
- Disjoint(s) ⇔ (∀c1,c2  c1 ∈ s ∧ c2 ∈ s ∧ c1 ≠ c2 ⇒ Intersection(c1,c2) = {})
- Example: Disjoint({Animals, Vegetables})
A set of categories s constitutes an exhaustive decomposition of a category c if all members of the category c are covered by categories in s:
- ExhaustiveDecomposition(s,c) ⇔ (∀i  i ∈ c ⇒ ∃c2  c2 ∈ s ∧ i ∈ c2)
- Example: ExhaustiveDecomposition({Americans, Canadians, Mexicans}, NorthAmericans)
Section 10.2

Relations between categories
A partition is a disjoint exhaustive decomposition:
- Partition(s,c) ⇔ Disjoint(s) ∧ ExhaustiveDecomposition(s,c)
- Example: Partition({Males, Females}, Persons)
Categories can be defined by providing necessary and sufficient conditions for membership:
- ∀x Bachelor(x) ⇔ Male(x) ∧ Adult(x) ∧ Unmarried(x)
Section 10.2
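A small illustrative sketch (added here, not from the slides): the three relations above checked over finite Python sets standing in for categories; the adam/bob/carol individuals are hypothetical.

```python
from itertools import combinations

def disjoint(s):
    """Disjoint(s): no two distinct categories in s share a member."""
    return all(c1.isdisjoint(c2) for c1, c2 in combinations(s, 2))

def exhaustive_decomposition(s, c):
    """ExhaustiveDecomposition(s, c): every member of c is in some category of s."""
    return all(any(i in c2 for c2 in s) for i in c)

def partition(s, c):
    """Partition(s, c): a disjoint exhaustive decomposition."""
    return disjoint(s) and exhaustive_decomposition(s, c)

males, females = {"adam", "bob"}, {"carol"}
persons = males | females
print(partition([males, females], persons))   # True
```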

Natural kinds
Many categories have no clear-cut definitions (chair, bush, book).
Tomatoes: sometimes green, red, yellow, black; mostly round.
One solution: the category Typical(Tomatoes).
- ∀x  x ∈ Typical(Tomatoes) ⇒ Red(x) ∧ Spherical(x)
- We can write down useful facts about categories without providing exact definitions.
Section 10.2

Physical composition
One object may be part of another:
- PartOf(Bucharest, Romania)
- PartOf(Romania, EasternEurope)
- PartOf(EasternEurope, Europe)
The PartOf predicate is transitive (and reflexive), so we can infer that PartOf(Bucharest, Europe).
More generally:
- ∀x PartOf(x,x)
- ∀x,y,z PartOf(x,y) ∧ PartOf(y,z) ⇒ PartOf(x,z)
Composite objects are often characterized by structural relations among parts.
- E.g. Biped(a) ⇒ …
Section 10.2
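An illustrative sketch (added, not from the slides): computing the transitive closure of the PartOf facts above so that a query like PartOf(Bucharest, Europe) succeeds.

```python
part_of = {
    ("Bucharest", "Romania"),
    ("Romania", "EasternEurope"),
    ("EasternEurope", "Europe"),
}

def transitive_closure(pairs):
    """Repeatedly add (x, z) whenever (x, y) and (y, z) are present."""
    closure = set(pairs)
    changed = True
    while changed:
        changed = False
        for (x, y) in list(closure):
            for (y2, z) in list(closure):
                if y == y2 and (x, z) not in closure:
                    closure.add((x, z))
                    changed = True
    return closure

print(("Bucharest", "Europe") in transitive_closure(part_of))  # True
```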

Measurements
Objects have height, mass, cost, …. The values that we assign to these are measures.
Combine unit functions with a number: Length(L1) = Inches(1.5) = Centimeters(3.81).
Conversion between units: ∀i Centimeters(2.54 × i) = Inches(i).
Some measures have no scale: beauty, difficulty, etc.
- The most important aspect of such measures is that they are orderable.
- We don't care about the actual numbers. (An apple can have deliciousness 0.9 or 0.1.)
Section 10.2
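A minimal sketch (added for illustration, not from the slides): measures as unit functions that normalize to a common base unit, encoding the conversion axiom ∀i Centimeters(2.54 × i) = Inches(i).

```python
def inches(i):
    return 2.54 * i          # value expressed in centimeters

def centimeters(c):
    return c                 # already in centimeters

length_L1 = inches(1.5)
# Inches(1.5) = Centimeters(3.81), compared with a tolerance for float rounding
print(abs(length_L1 - centimeters(3.81)) < 1e-9)   # True
```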

Mental events and objects
So far, KB agents can have beliefs and deduce new beliefs.
What about knowledge about beliefs? What about knowledge about the inference process?
- Requires a model of the mental objects in someone's head and of the processes that manipulate those objects.
Relationships between agents and mental objects: believes, knows, wants, …
- Believes(Lois, Flies(Clark)), with Flies(Clark) being a function … a candidate for a mental object (reification).
- An agent can now reason about the beliefs of other agents.
Section 10.4

The Internet shopping world
A knowledge engineering example: an agent that helps a buyer find product offers on the internet.
- IN = a product description (precise or ¬precise)
- OUT = a list of web pages that offer the product for sale
Environment = the WWW
Percepts = web pages (character strings)
- Extracting useful information is required.
Section 10.5

The Internet shopping world
Find relevant product offers:
- RelevantOffer(page, url, query) ⇔ Relevant(page, url, query) ∧ Offer(page)
- Write axioms to define Offer(x)
Find relevant pages: Relevant(x,y,z)?
- Start from an initial set of stores. What is a relevant category? What are relevant connected pages?
- Requires a rich category vocabulary: synonymy and ambiguity
How to retrieve pages: GetPage(url)?
- Procedural attachment
Compare offers (information extraction).
Section 10.5

Reasoning systems for categories
How to organise and reason with categories?
- Semantic networks
  - Visualize the knowledge base
  - Efficient algorithms for category membership inference
- Description logics
  - Formal language for constructing and combining category definitions
  - Efficient algorithms to decide subset and superset relationships between categories
Section 10.6

Semantic Networks
Logic vs. semantic networks: many variations.
- All represent individual objects, categories of objects, and relationships among objects.
Allow for inheritance reasoning (see the sketch below):
- Female persons inherit all properties from Persons.
- Cf. OO programming.
Inference of inverse links:
- SisterOf vs. HasSister
Section 10.6
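An illustrative sketch (added, not from the slides): inheritance reasoning over a tiny semantic network; the taxonomy and property values here are hypothetical.

```python
# Properties attached to categories, looked up by following SubsetOf links upward.
subset_of = {"FemalePersons": "Persons", "Persons": "Animals"}
properties = {"Persons": {"legs": 2}, "Animals": {"alive": True}}

def inherited_properties(category):
    """Collect properties from the category and all of its ancestors."""
    props = {}
    while category is not None:
        for name, value in properties.get(category, {}).items():
            props.setdefault(name, value)       # the most specific value wins
        category = subset_of.get(category)
    return props

print(inherited_properties("FemalePersons"))    # {'legs': 2, 'alive': True}
```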

Semantic network example Section 10.6

Semantic networks
Drawbacks:
- Links can only assert binary relations
  - Can be resolved by reifying the proposition as an event
- Representation of default values
  - Enforced by the inheritance mechanism
Section 10.6

Description logics
Designed to describe definitions of and properties about categories.
- A formalization of semantic networks
Principal inference tasks:
- Subsumption: checking whether one category is a subset of another by comparing their definitions
- Classification: checking whether an object belongs to a category
- Consistency: checking whether the category membership criteria are logically satisfiable
Section 10.6
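A toy sketch (added for illustration; real description logics are far richer): categories defined as conjunctions of required attributes, with subsumption and classification checked structurally. The category definitions are hypothetical.

```python
# C1 subsumes C2 if every requirement of C1 also appears in C2's definition.
definitions = {
    "Ball":       {"Round"},
    "Basketball": {"Round", "Orange", "Diameter9.5in"},
}

def subsumes(c1, c2):
    """Subsumption: is c2 a subcategory of c1 by definition?"""
    return definitions[c1] <= definitions[c2]

def classify(obj_attributes, category):
    """Classification: does an object with these attributes belong to the category?"""
    return definitions[category] <= obj_attributes

print(subsumes("Ball", "Basketball"))                         # True
print(classify({"Round", "Orange", "Diameter9.5in"}, "Ball")) # True
```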

Reasoning with Default Information
"The following courses are offered: CS101, CS102, CS106, EE101." How many courses are there?
- Four (the database view):
  - Assume that this information is complete (ground atomic sentences that are not asserted are false) = CLOSED WORLD ASSUMPTION
  - Assume that distinct names refer to distinct objects = UNIQUE NAMES ASSUMPTION
- Between one and infinity (the pure logic view):
  - FOL does not make these assumptions
  - Requires explicit completion axioms
Section 10.7
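A minimal sketch (added, not from the slides) of the closed-world assumption on the course example: a ground atom is taken to be false unless it is asserted in the KB.

```python
kb = {("Offered", "CS101"), ("Offered", "CS102"),
      ("Offered", "CS106"), ("Offered", "EE101")}

def holds(atom):
    """Closed-world query: anything not in the KB is assumed false."""
    return atom in kb

print(holds(("Offered", "CS101")))   # True
print(holds(("Offered", "CS999")))   # False under the CWA (unknown -> false)
# With the unique names assumption, the four distinct names give four courses.
print(sum(1 for fact in kb if fact[0] == "Offered"))  # 4
```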

Truth maintenance systems
Many inferences have default status rather than being absolutely certain.
- Inferred facts can be wrong and need to be retracted = BELIEF REVISION.
- Assume the KB contains a sentence P and we want to execute TELL(KB, ¬P):
  - To avoid contradiction: RETRACT(KB, P)
  - But what about sentences inferred from P?
Truth maintenance systems are designed to handle these complications.
Section 10.8
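A toy sketch (added for illustration; real truth maintenance systems are more sophisticated) of justification-based retraction: each derived belief records which beliefs justified it, so retracting P also retracts everything that depended on P.

```python
justifications = {}          # belief -> set of beliefs it was derived from
beliefs = set()

def tell(belief, derived_from=()):
    beliefs.add(belief)
    justifications[belief] = set(derived_from)

def retract(belief):
    beliefs.discard(belief)
    # Retract every belief whose justification used the retracted belief.
    for b in list(beliefs):
        if belief in justifications.get(b, set()):
            retract(b)

tell("P")
tell("Q", derived_from=["P"])     # Q was inferred from P
tell("R", derived_from=["Q"])     # R was inferred from Q
retract("P")
print(beliefs)                    # set(): Q and R are retracted along with P
```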

Actions, events and situations
Reasoning about the outcomes of actions is central to a KB agent.
How can we keep track of location in FOL? Remember the multiple copies of propositions in PL.
Represent time by situations (states resulting from the execution of actions): situation calculus.
Section 10.3

Ontology of Situation Calculus
Actions are logical terms
- such as Forward, Turn(Right)
Situations are logical terms consisting of the initial situation (S0) and all situations generated by applying an action to a situation
- The function Result(a,s) names the situation that results when action a is applied in situation s
Fluents are functions and predicates that vary from one situation to the next
- Example: ¬Holding(G1, S0)
Atemporal or eternal predicates and functions do not change as the result of any action
- Examples: predicate Gold(G1), function LeftLegOf(Wumpus)

Actions, events and situations

Sequences of Actions and Planning
Results of action sequences are determined by the individual actions:
- Result([], s) = s
- Result([a|seq], s) = Result(seq, Result(a, s))
Projection task: an SC agent should be able to deduce the outcome of a sequence of actions.
Planning task: find a sequence that achieves a desirable effect.
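A minimal sketch (added, not from the slides) of the two recursive Result axioms over an action sequence; here a situation is just a hypothetical tuple recording the history of actions applied.

```python
def result_action(action, situation):
    """Apply a single action: Result(a, s)."""
    return situation + (action,)

def result_sequence(seq, situation):
    if not seq:                                   # Result([], s) = s
        return situation
    first, rest = seq[0], seq[1:]
    # Result([a|seq], s) = Result(seq, Result(a, s))
    return result_sequence(rest, result_action(first, situation))

s0 = ("S0",)
print(result_sequence(["Go", "Grab", "Go"], s0))  # ('S0', 'Go', 'Grab', 'Go')
```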

Situation Calculus Example
Goal: get the gold to square [1,1].
Initial state:
- At(o, x, S0) ⇔ [(o = Agent ∧ x = [1,1]) ∨ (o = G1 ∧ x = [1,2])]
- ¬Holding(G1, S0). Gold(G1). Adjacent([1,1],[1,2]) ∧ Adjacent([1,2],[1,1]).
Projection: prove that we achieve the goal by
- At(G1, [1,1], Result([Go([1,1],[1,2]), Grab(G1), Go([1,2],[1,1])], S0))
Planning: ask
- ∃seq At(G1, [1,1], Result(seq, S0))

Describing Change
Situation calculus requires two axioms to describe change:
- Possibility axiom: when is it possible to do the action?
  At(Agent,x,s) ∧ Adjacent(x,y) ⇒ Poss(Go(x,y),s)
- Effect axiom: describes changes due to an action
  Poss(Go(x,y),s) ⇒ At(Agent,y,Result(Go(x,y),s))
Frame problem:
- What stays the same? How do we represent all the things that stay the same?
- Frame axiom: describes non-changes due to actions
  At(o,x,s) ∧ (o ≠ Agent) ∧ ¬Holding(o,s) ⇒ At(o,x,Result(Go(y,z),s))

Representational Frame Problem
If there are F fluents and A actions, then we need AF frame axioms to indicate that other objects are stationary unless they are held.
- We only want to write down the effect of each action.
Solution: describe how each fluent changes over time.
- Successor-state axiom: Action is possible ⇒ (Fluent is true in result state ⇔ action's effect made it true ∨ it was true before and the action didn't change it)
  Poss(a,s) ⇒ (At(Agent, y, Result(a,s)) ⇔ (a = Go(x,y)) ∨ (At(Agent, y, s) ∧ a ≠ Go(y,z)))
- Note that the next state is completely specified by the current state.
- Each action effect is mentioned only once.
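An illustrative sketch (added, not from the slides) of the successor-state axiom above as a function: the agent's location in the result state is determined entirely by the current state and the action, with each effect stated once. The state and action encodings here are hypothetical.

```python
def at_agent(y, state, action):
    """At(Agent, y, Result(a, s)) <=> a = Go(_, y)  or  (At(Agent, y, s) and a != Go(y, _))."""
    made_true = action[0] == "Go" and action[2] == y          # the action moved the agent to y
    unchanged = state["agent_at"] == y and not (action[0] == "Go" and action[1] == y)
    return made_true or unchanged

s = {"agent_at": (1, 1)}
print(at_agent((1, 2), s, ("Go", (1, 1), (1, 2))))   # True: the Go action made it true
print(at_agent((1, 1), s, ("Go", (1, 1), (1, 2))))   # False: the agent left (1, 1)
```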

Other problems
Ramification problem:
- How to deal with secondary (implicit) effects?
- Example: if the agent is carrying the gold and the agent moves, then the gold moves too.
Inferential frame problem:
- How to decide EFFICIENTLY whether fluents hold in the future?
Extensions:
- Event calculus (when actions have a duration)
- Process categories

Summary
A general-purpose ontology organizes and connects various domain-specific knowledge bases.
Mental states of agents can be represented by sentences that denote beliefs.
Truth maintenance systems handle knowledge updates and revisions efficiently:
- Retraction of incorrect beliefs
Semantic networks and description logics are special-purpose representation systems used for organizing hierarchies of categories:
- Inheritance allows properties of objects to be deduced from category membership.

Summary
Closed-world assumption:
- Anything not explicitly true is false
- Avoids having to specify lots of negative information
- Interpret it as default information that can be overridden by additional information
Actions, events and time can be represented in situation calculus (or more expressive representations):
- This enables agents to construct plans by logical inference (Chapter 11)