OWL abstract syntax and reasoning examples


OWL abstract syntax and reasoning examples Adapted from slides by Raphael Volz, AIFB

OWL Abstract Syntax Introduced in the OWL Web Ontology Language Semantics and Abstract Syntax document. A useful notation that uses a kind of functional syntax, e.g. Class(pp:duck partial pp:animal) ObjectProperty(pp:has_pet domain(pp:person) range(pp:animal)) Individual(pp:Walt value(pp:has_pet pp:Huey) value(pp:has_pet pp:Louie) value(pp:has_pet pp:Dewey))

Namespaces Namespace(pp = <http://cohse.semanticweb.org/ontologies/people#>)

Partial and complete definitions Description logics reason with definitions, and they prefer complete descriptions. A complete definition includes both necessary conditions and sufficient conditions. This is often impractical or impossible, e.g. for natural kinds. A primitive definition is partial or incomplete, which limits the classification that can be done automatically. Example: Primitive: a Person. Defined: Parent = a Person with at least one child.

Partial and complete definitions in OWL Partial definitions are typically made using one or more rdfs:subClassOf relations: :Parent rdfs:subClassOf :Person. Knowing that john is a parent, it is necessary that he is a person. Complete definitions are made with owl:equivalentClass: :Parent owl:equivalentClass [a owl:intersectionOf (:Person [owl:restriction …])]. Knowing that john is a person and has a child is sufficient to conclude he is a parent.
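The two directions of inference on the slide can be sketched in plain Python. This is a hypothetical toy, not a real OWL reasoner; the rule functions and names are invented for illustration: subClassOf licenses only the necessary direction (Parent ⇒ Person), while the equivalentClass axiom also licenses the sufficient one (Person with a child ⇒ Parent).

```python
def infer(types, has_child):
    """Apply both rules to an individual's type set until fixpoint."""
    changed = True
    while changed:
        changed = False
        # Necessary direction (rdfs:subClassOf): every Parent is a Person.
        if "Parent" in types and "Person" not in types:
            types.add("Person")
            changed = True
        # Sufficient direction (from owl:equivalentClass): a Person
        # with at least one child is a Parent.
        if "Person" in types and has_child and "Parent" not in types:
            types.add("Parent")
            changed = True
    return types

# john is asserted only to be a Person who has a child:
print(sorted(infer({"Person"}, has_child=True)))  # ['Parent', 'Person']
```

With only the subClassOf axiom, the second rule would be absent and john would never be classified as a Parent; the equivalence is what makes the definition complete.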

Definition vs. Assertion A definition is used to describe intrinsic properties of an object. The parts of a description have meaning as parts of a composite description of an object. An assertion is used to describe an incidental property of an object. Asserted facts have meaning on their own. Example: “a black telephone” could be either a description or an assertion, depending on the meaning and import of “blackness” for the concept telephone.

Definition versus Assertion In English, “a black telephone” is ambiguous (1) A black telephone is a common sight in an office (2) A black telephone is on the corner of my desk KR languages should not be ambiguous so typically distinguish between descriptions of classes and descriptions of individuals KR languages often also allow additional assertions to be made that are not part of the definition (In OWL called annotation properties)

Classification is very useful Classification is a powerful kind of reasoning that is very useful Many expert systems can be usefully thought of as doing “heuristic classification” Logical classification over structured descriptions and individuals is also quite useful. But… can classification ever deduce something about an individual other than what classes it belongs to? And what does that tell us?

Incidental properties If we allow incidental properties (i.e., ones that don't participate in the description mechanism) then these can be attached without affecting classification. This is the purpose of OWL's AnnotationProperty. An AnnotationProperty can be associated with a definition (partial or complete) but is not checked when reasoning about subsumption or instance checking.

Declaring classes in OWL Naming a new class “plant”: Class(pp:plant partial) Naming some “special plants”: Class(pp:grass partial pp:plant) Class(pp:tree partial pp:plant) Alternative Declaration: Class(pp:grass partial) Class(pp:tree partial) SubClassOf(pp:grass pp:plant) SubClassOf(pp:tree pp:plant)
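The alternative SubClassOf declarations above induce a small class hierarchy. A minimal sketch (invented for illustration, not part of the slides) of how a reasoner would compute all superclasses of a class by following subClassOf links transitively:

```python
# The axioms from the slide, as a map from class to its declared superclasses.
SUBCLASS_OF = {
    "pp:grass": {"pp:plant"},
    "pp:tree": {"pp:plant"},
    "pp:plant": set(),  # pp:plant has no declared superclasses
}

def superclasses(cls, axioms=SUBCLASS_OF):
    """Return every direct or indirect superclass of cls."""
    result = set()
    stack = list(axioms.get(cls, ()))
    while stack:
        parent = stack.pop()
        if parent not in result:
            result.add(parent)
            stack.extend(axioms.get(parent, ()))
    return result

print(superclasses("pp:grass"))  # {'pp:plant'}
```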

Declaring Properties in OWL: I A simple property: ObjectProperty(pp:eaten_by) Properties may be inverse to each other: ObjectProperty(pp:eats inverseOf(pp:eaten_by)) Domain and Ranges: ObjectProperty(pp:has_pet domain(pp:person) range(pp:animal))

Declaring Properties in OWL: II Datatype Properties: DataProperty(pp:service_number range(xsd:integer)) Property Hierarchy: SubPropertyOf(pp:has_pet pp:likes) Algebraic properties: ObjectProperty(pp:married_to Symmetric) ObjectProperty(pp:ancestor_of Transitive) ObjectProperty(pp:passport_nr Functional)
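What the algebraic declarations mean operationally can be sketched as closure rules over a set of asserted pairs. This is an assumed toy model, not OWL semantics in full (a real reasoner works over models, not just asserted facts), and the individuals are invented:

```python
def expand(pairs, symmetric=False, transitive=False):
    """Close a relation under declared symmetry and/or transitivity."""
    facts = set(pairs)
    changed = True
    while changed:
        changed = False
        new = set()
        if symmetric:
            # Symmetric: R(a, b) entails R(b, a).
            new |= {(b, a) for (a, b) in facts}
        if transitive:
            # Transitive: R(a, b) and R(b, c) entail R(a, c).
            new |= {(a, c) for (a, b) in facts
                           for (b2, c) in facts if b == b2}
        if not new <= facts:
            facts |= new
            changed = True
    return facts

# pp:married_to is Symmetric:
print(expand({("Jo", "Flo")}, symmetric=True))
# pp:ancestor_of is Transitive:
print(expand({("a", "b"), ("b", "c")}, transitive=True))
```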

Individuals in OWL Individual(pp:Tom type(owl:Person)) Individual(pp:Dewey type(pp:duck)) Individual(pp:Rex type(pp:dog) value(pp:is_pet_of pp:Mick)) Individual(pp:Mick type(pp:male) value(pp:reads pp:NYPost) value(pp:drives pp:Fiat_500) value(pp:name "Mick"^^xsd:string))

Entailment Quiz What follows from these descriptions?

Quiz # 1 Class(pp:old+lady complete intersectionOf(pp:elderly pp:female pp:person)) Class(pp:old+lady partial intersectionOf( restriction(pp:has_pet allValuesFrom(pp:cat)) restriction(pp:has_pet someValuesFrom(pp:animal))))

Quiz #1 - Solution Every old lady must have a pet cat. (Because she must have some pet and all her pets must be cats.)

Quiz #2 Class(pp:cow partial pp:vegetarian) Class(pp:mad+cow complete intersectionOf(pp:cow restriction(pp:eats someValuesFrom(intersectionOf(pp:brain restriction(pp:part_of someValuesFrom(pp:sheep))))))) What can be said about mad cows?

Quiz #2 - Solution There can be no mad cows: the class pp:mad+cow is unsatisfiable. (Because cows, as vegetarians, don't eat anything that is part of an animal, while a mad cow would eat sheep brains.)

Quiz #3 ObjectProperty(pp:has_pet domain(pp:person) range(pp:animal)) Class(pp:old+lady complete intersectionOf(pp:elderly pp:female pp:person)) Class(pp:old+lady partial intersectionOf(restriction(pp:has_pet allValuesFrom(pp:cat)) restriction(pp:has_pet someValuesFrom(pp:animal)))) Individual(pp:Minnie type(pp:elderly) type(pp:female) value(pp:has_pet pp:Tom)) What are Minnie and Tom?

Quiz #3 - Solution Minnie must be a person (the domain of pp:has_pet is pp:person), and since she is also elderly and female she is an old lady. Tom must then be a cat (because all pets of old ladies are cats).
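The first step of this solution, typing individuals from domain and range axioms, can be sketched as a simple rule over triples. A toy version, assumed for illustration (a real reasoner would go on to classify Minnie as an old lady and Tom as a cat):

```python
# Domain and range axioms for pp:has_pet, as on the slide.
DOMAIN = {"pp:has_pet": "pp:person"}
RANGE = {"pp:has_pet": "pp:animal"}

def infer_types(triples):
    """Map each individual to the types implied by domain/range axioms."""
    types = {}
    for subj, prop, obj in triples:
        if prop in DOMAIN:
            # Whoever has a pet must be a person.
            types.setdefault(subj, set()).add(DOMAIN[prop])
        if prop in RANGE:
            # Whatever is had as a pet must be an animal.
            types.setdefault(obj, set()).add(RANGE[prop])
    return types

types = infer_types([("pp:Minnie", "pp:has_pet", "pp:Tom")])
print(types)  # {'pp:Minnie': {'pp:person'}, 'pp:Tom': {'pp:animal'}}
```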

Quiz #4 Class(pp:animal+lover complete intersectionOf(pp:person restriction(pp:has_pet minCardinality(3)))) Individual(pp:Walt type(pp:person) value(pp:has_pet pp:Huey) value(pp:has_pet pp:Louie) value(pp:has_pet pp:Dewey)) DifferentIndividuals(pp:Huey pp:Louie pp:Dewey) What is Walt ?

Quiz #4 - Solution Walt must be an animal lover: he has three pets that are asserted to be mutually distinct, so he has at least three pets. Note that stating that Walt is a person is redundant, given the domain of pp:has_pet.
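The role of the DifferentIndividuals axiom here can be sketched in a few lines. OWL makes no unique-name assumption, so distinct names may denote the same individual; a minCardinality(3) restriction is only entailed once three pets are known to be pairwise different. This is a hypothetical simplification (it treats "known distinct" as membership in one declared set), not a full model-theoretic check:

```python
def satisfies_min_cardinality(values, distinct, n):
    """True if at least n of the values are known pairwise distinct."""
    known_distinct = [v for v in values if v in distinct]
    return len(set(known_distinct)) >= n

pets = ["pp:Huey", "pp:Louie", "pp:Dewey"]

# Without DifferentIndividuals, the three names might all denote one
# duck, so minCardinality(3) is not entailed:
print(satisfies_min_cardinality(pets, set(), 3))      # False

# With DifferentIndividuals(pp:Huey pp:Louie pp:Dewey):
print(satisfies_min_cardinality(pets, set(pets), 3))  # True
```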

Quiz #5 Class(pp:van partial pp:vehicle) Class(pp:driver partial pp:adult) Class(pp:driver complete intersectionOf(restriction(pp:drives someValuesFrom(pp:vehicle)) pp:person)) Class(pp:white+van+man complete intersectionOf(pp:man restriction(pp:drives someValuesFrom(intersectionOf(pp:white+thing pp:van))))) Class(pp:white+van+man partial restriction(pp:reads allValuesFrom(pp:tabloid))) Individual(pp:Q123+ABC type(pp:white+thing) type(pp:van)) Individual(pp:Mick type(pp:male) value(pp:reads pp:National_Enquirer) value(pp:drives pp:Q123+ABC)) What are Mick and the National_Enquirer?

Quiz #5 - Solution Mick drives Q123+ABC, which is a white van, so he is a driver and therefore an adult (all drivers are adults) and a person. Being male, he is a man, and since he drives a white van he is a white van man. White van men read only tabloids, so the National_Enquirer must be a tabloid.