Ontology.


Overview
- Knowledge bases, ontologies, and axioms
- Collections and structural relationships
- Basics of ontology design

Examples of Ontologies
- Domains in databases
- Classes in OO programming
- Types in AI and logic

Some working definitions
- Ontology = a theory of the kinds of things there are and can be (WHAT CAN EXIST). It manifests in a representation as the vocabulary of predicates and certain relationships between them (often called structural relations).
- Knowledge base = ontology + axioms (WHAT DOES EXIST). Axioms are essential for constraining meaning towards the intended models.
- Domain theory = KB + control knowledge (HOW TO USE THAT KNOWLEDGE). Control knowledge makes the KB usable for various tasks.

Allegory of the Cave
- The ontology spans the cave and everything that can be experienced outside it.
- The knowledge base includes only what has actually been experienced.
- The domain theory does not allow direct knowledge transfer; hence communication is limited.

Human Ontologies
- What we create and use
- What we live

Microworlds
- Limited domain
- Closed set
- Example: B2B integration

Some specifics

Continuants and Occurrents
- Continuants: have stable properties/differentia.
- Occurrents: in a state of flux. Attributes may be analogous to those of prior versions, but they are not the same.

Individuals vs. Collections
- Individuals are things that aren't sets or collections: TheEiffelTower, NeilArmstrong, Dog32.
- Collections are natural kinds or classes whose elements share important natural properties, i.e. some common attributes or relational commonalities: Tower, Astronaut, Dog.
- Sets (in the mathematical sense) may have elements that share nothing in common.

Some distinct types of individuals
- Discrete objects: cut them up and you get something different (cars, people).
- Substances: cut them up and you get more of the same (water, sand).
- Mobs: like substances, but worth reifying particular elements (mountains in a range, feathers on a bird).
- Events: something happening over time, with substructure.
- Processes: something happening over time that is internally uniform.

Collections
- Collections approximate categories: Dog is the collection of all dogs.
- The following are equivalent: (Dog Dog32), (member Dog32 Dog), (isa Dog32 Dog).
- Comparison with the psychological notion of category: typically no compact definition; organized via taxonomic relationships; but no similarity effects, recognition criteria, or exemplar-driven effects.

Inheritance from collections
- Collection membership supports inference:
  (forall ?x
    (=> (elephant ?x)
        (exists ?t (and (elephant-trunk ?t)
                        (physical-part ?x ?t)))))
- Inheritance is generally treated as monotonic:
  (forall ?x (=> (elephant ?x) (grey ?x)))
  (and (elephant Clyde) (pink Clyde))
  What's this? A monotonic treatment cannot retract "grey" for an exceptional individual like the pink elephant Clyde.
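The trunk axiom can be sketched as a forward-chaining rule over a tiny fact store. This is a hypothetical mini-KB in Python, not any particular reasoner; the predicate and constant names mirror the formula, and the invented trunk name stands in for the existential's skolem constant.

```python
# Minimal forward-chaining sketch (hypothetical mini-KB, not a real reasoner).
# Rule: every elephant has an elephant-trunk as a physical part.
facts = {("elephant", "Clyde")}

def infer_trunks(facts):
    """Apply the elephant => has-trunk rule, inventing one trunk per elephant."""
    derived = set(facts)
    for pred, subj in facts:
        if pred == "elephant":
            trunk = f"trunk-of-{subj}"  # skolem constant for (exists ?t ...)
            derived.add(("elephant-trunk", trunk))
            derived.add(("physical-part", subj, trunk))
    return derived

kb = infer_trunks(facts)
print(("physical-part", "Clyde", "trunk-of-Clyde") in kb)  # True
```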

Structural relations express common representation patterns
- Genls
- Disjointness and partitions
- Type constraints on arguments

Genls
- (genls <sub> <super>) means (forall ?x (=> (<sub> ?x) (<super> ?x))), i.e. (subset <sub> <super>).
- genls is transitive.
- Attribution/collection membership distributes across genls:
  (=> (and (elephant Clyde)
           (genls elephant mammal)
           (genls mammal animal))
      (animal Clyde))
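Transitivity of genls, and the way membership distributes across it, can be sketched as a closure computation. The edge-list encoding below is an assumption for illustration, not a fragment of any real KB system.

```python
# Sketch: transitive closure over genls links, encoded as (sub, super) pairs.
genls = [("elephant", "mammal"), ("mammal", "animal")]

def all_supers(collection, genls):
    """All collections reachable from `collection` via genls links."""
    supers, frontier = set(), {collection}
    while frontier:
        frontier = {sup for sub, sup in genls if sub in frontier} - supers
        supers |= frontier
    return supers

# Membership distributes: (elephant Clyde) entails (animal Clyde).
print("animal" in all_supers("elephant", genls))  # True
```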

Disjointness
- Taxonomic relationships support inference via exclusion.
- Example: (elephant Clyde) and (disjointWith animal plant) → (not (plant Clyde))
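Inference by exclusion can be sketched in a few lines. The single-parent chain and the frozenset encoding of disjointness are assumptions chosen for brevity; they mirror the Clyde example.

```python
# Sketch: inference by exclusion from disjointness (hypothetical encoding).
genls = {"elephant": "mammal", "mammal": "animal"}   # single-parent chain
disjoint = {frozenset({"animal", "plant"})}

def supers_of(coll):
    """The collection plus all its genls ancestors."""
    out = {coll}
    while coll in genls:
        coll = genls[coll]
        out.add(coll)
    return out

def cannot_be(known_colls, candidate):
    """True if candidate is disjoint with any collection the individual is in."""
    return any(frozenset({known, candidate}) in disjoint for known in known_colls)

# (elephant Clyde) plus (disjointWith animal plant) yields (not (plant Clyde)).
print(cannot_be(supers_of("elephant"), "plant"))  # True
```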

Type constraints on arguments
- Restrictions on the types of a predicate's arguments are extremely common.
- Example:
  (forall (?x ?y ?z)
    (=> (fluid-connection ?x ?y ?z)
        (and (fluid-path ?x) (container ?y) (container ?z))))
- This can be expressed compactly by statements about reified collections, which make the intent clearer:
  (arg1-isa fluid-connection fluid-path)
  (arg2-isa fluid-connection container)
  (arg3-isa fluid-connection container)
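The argN-isa declarations can drive a simple well-formedness check. The signature and type tables below are hypothetical, with names mirroring the fluid-connection example.

```python
# Sketch: checking argument types against argN-isa declarations
# (hypothetical signature and type tables mirroring the example above).
arg_isa = {"fluid-connection": ("fluid-path", "container", "container")}
type_of = {"pipe1": "fluid-path", "tankA": "container", "tankB": "container"}

def well_typed(pred, args):
    """True if every argument matches the predicate's declared argN-isa type."""
    sig = arg_isa.get(pred)
    return (sig is not None and len(sig) == len(args)
            and all(type_of.get(a) == t for a, t in zip(args, sig)))

print(well_typed("fluid-connection", ["pipe1", "tankA", "tankB"]))  # True
print(well_typed("fluid-connection", ["tankA", "pipe1", "tankB"]))  # False
```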

Building an Ontology

Top-Down vs. Bottom-Up
- Philosophers tend toward top-down design.
- Programmers tend toward bottom-up design (sort of).

Ontology and KB Design
Motivations for designing and using an ontology:
- Sharing information about the structure of information
- Reuse of domain knowledge
- Making domain assumptions explicit and changeable
- Separating domain knowledge from operational knowledge
- Analysis of domain knowledge

Designing a knowledge base
Before you start:
- What is the domain you are trying to model?
- How will the knowledge base be used, and by whom?
- Is there an existing underlying ontology, or do we start from scratch?

Designing a knowledge base
Concepts and structure:
- What are the important concepts of your domain?
- How are they related?
- Are they individuals or collections?
- What are the sub- and superclasses for collections?

Designing a knowledge base
Axioms:
- What is important about a particular concept?
- What makes it what it is (and not something else)?
- What consequences arise from it?

The Cyc Upper Ontology

Example: Part-whole relationships

Example: Intangible Things and Individuals

Common Mistakes
- Don't confuse individuals with collections: Car is a collection, Car54 is an individual.
- Depending on the level of detail in your knowledge base, Car might have subclasses: Car -> PassengerCar -> Sedan.
- Avoid cycles in collection hierarchies. Subclassing is transitive: Sedan is a subclass of Car, so don't make Car a subclass of Sedan!
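The no-cycles rule can be checked mechanically. Below is a sketch of cycle detection over a subclass table; the dict-of-lists encoding is an assumption for illustration.

```python
# Sketch: detect cycles in a subclass (genls) hierarchy via DFS coloring.
def has_cycle(genls):
    """genls: dict mapping each subclass to a list of its superclasses."""
    WHITE, GREY, BLACK = 0, 1, 2
    color = {c: WHITE for c in genls}

    def visit(c):
        color[c] = GREY                      # on the current DFS path
        for sup in genls.get(c, []):
            state = color.setdefault(sup, WHITE)
            if state == GREY or (state == WHITE and visit(sup)):
                return True                  # back edge => cycle
        color[c] = BLACK                     # fully explored
        return False

    return any(color[c] == WHITE and visit(c) for c in list(genls))

acyclic = {"Sedan": ["PassengerCar"], "PassengerCar": ["Car"]}
cyclic = {"Sedan": ["Car"], "Car": ["Sedan"]}   # the mistake warned against above
print(has_cycle(acyclic), has_cycle(cyclic))    # False True
```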

Common Mistakes
- Don't assign too much meaning to concept names: TouristAttractionsInChicagoThatDoNotChargeAdmissionOnTuesdays is a bad concept name.
- Chair as an isolated concept, without being a sub- or superclass of another concept and without any axioms, says nothing about chairs.
- Concept names and their denotations are not necessarily the same.

Common Mistakes
- Too many or too few subclasses: don't squeeze too many subclasses under one concept, and don't stretch the hierarchy unnecessarily.
- More than a dozen direct subclasses may indicate the need for additional intermediate concepts.
- A single subclass is a sign of a modeling problem (or is simply unnecessary).

Common Mistakes
- Disjoint concepts: partitioning the ontology via disjoint concepts is useful for reasoning. But be careful! (disjointWith Dog Thing) is unwise.

Ontology = “What is there?” Answer = “Everything” Willard Van Orman Quine