n-ary relations: an OWL modeling problem when n ≥ 3


n-ary relations: an OWL modeling problem when n ≥ 3
- e.g. complex events
- a known solution: reification with binary projections
- the situation content pattern and its specializations (time-indexed situations, various event models, etc.) are a good approximation, modulo argument ordering and the identity constraint
- two new patterns: ordered description and ordered situation
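As an intuition for the reification-with-binary-projections solution, here is a plain-Python sketch (illustrative only, not part of the pattern's formal specification): the 5-ary fact buys(x, y, z, t, p) becomes one "relation individual" carrying five binary projections. The example individuals are taken from the sale example later in the deck.

```python
from dataclasses import dataclass

# Reification of the 5-ary predicate buys(x, y, z, t, p):
# one relation individual with five binary projections buys1..buys5.
@dataclass(frozen=True)
class Buys:
    buyer: str    # buys1(r, x)
    seller: str   # buys2(r, y)
    product: str  # buys3(r, z)
    time: str     # buys4(r, t)
    place: str    # buys5(r, p)

sale = Buys(buyer="Mustafa", seller="Apple", product="iPad",
            time="Nov2010", place="Shanghai")
print(sale.buyer)
```

Each field plays the role of one binary projection property; the dataclass instance itself plays the role of the reified relation individual.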

The ordered description pattern enables representing, e.g.:
- buys(x, y, z, t, p)
- Buyer(x), Seller(y), Product(z), Time(t), Place(p)
- Person(x), Person(y): only contextual distinctions in the basic n-ary pattern
- Time(t*): no distinction at all in the basic n-ary pattern!

The ordered situation pattern
- uses OWL2 punning here, for a version directed at situations directly

The still-missing identity constraint
- the pattern still misses the identity constraint; e.g., given the predicate:
    buys(x, y, z, t, p) [P1]
  the identity-constraint axiom A1 holds:
    ∀(x,y,z,t,p) (buys(x,y,z,t,p) = buys(x,y,z,t,p)) [A1]
- A1 is trivial in FOL
- now, if P1 is reified into the following predicates:
    buys(r), buys1(r,x), buys2(r,y), buys3(r,z), buys4(r,t), buys5(r,p) [P2]
  axiom A1 must be morphed as follows (when no list variables are present):
    buys(r) ⟷ ∃!(x,y,z,t,p) (buys1(r,x) ∧ buys2(r,y) ∧ buys3(r,z) ∧ buys4(r,t) ∧ buys5(r,p)) [A2]
- axiom A2 cannot be expressed in OWL; we need either a rule or a programmatic mechanism (e.g. a wizard, or a manual procedure) that ensures that every buys_n is functional
- since the open-world assumption (OWA) holds for OWL, no reasoner will infer that two individuals A and B that both occur in the place of, e.g., x in buys1(r,x) are the same individual; it will only detect an inconsistency if A owl:differentFrom B holds
- still, this pattern does not ensure (model-theoretically) that two instances B1 and B2 of buys(r) with the same relations to the same individuals are the same individual: for this additional inference we need either a SPARQL or a RIF formula (a RIF example is given on the next slide)
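The functionality requirement behind A2 is easy to check closed-world, which is exactly what OWL under the OWA will not do. A plain-Python sketch (triples and names are illustrative): flag any reified relation individual that has two distinct fillers for the same projection.

```python
from collections import defaultdict

# Hypothetical reified triples: (relation_individual, projection, filler).
triples = [
    ("r1", "buys1", "A"),
    ("r1", "buys1", "B"),      # a second filler for the same projection
    ("r1", "buys2", "Apple"),
]

# Closed-world functionality check required by axiom A2:
# each projection buys_n of a given r must have exactly one filler.
def functionality_violations(triples):
    fillers = defaultdict(set)
    for r, proj, value in triples:
        fillers[(r, proj)].add(value)
    return {key: vals for key, vals in fillers.items() if len(vals) > 1}

print(functionality_violations(triples))
```

Contrast with OWL: declaring buys1 functional would make a reasoner *infer* A owl:sameAs B instead of flagging a violation, unless A owl:differentFrom B is asserted.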

Forall ?e ?f ?b1 ?b2 (
  ?b1 [owl:sameAs ?b2] ← And (
    ?e [:hasSetting ?b1]
    ?e [:hasSetting ?b2]
    Not ( And ( ?f [:hasSetting ?b1]
                Not ( ?f [:hasSetting ?b2] ) ) ) ) )

- in case of list variables, a more complex pattern should be devised
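The effect of this identity rule can be simulated closed-world in plain Python (data and names illustrative): two situation individuals whose settings contain exactly the same entities are identified, i.e. would be linked by owl:sameAs.

```python
# hasSetting pairs: (entity, situation) — illustrative data.
has_setting = [
    ("Apple", "Sale#1"), ("Mustafa", "Sale#1"),
    ("Apple", "Sale#2"), ("Mustafa", "Sale#2"),
    ("Apple", "Sale#3"),                        # a different setting
]

# Closed-world version of the identity rule: situations with
# identical settings are the same individual.
def same_situations(has_setting):
    setting = {}
    for entity, situation in has_setting:
        setting.setdefault(situation, set()).add(entity)
    pairs = set()
    sits = sorted(setting)
    for i, s1 in enumerate(sits):
        for s2 in sits[i + 1:]:
            if setting[s1] == setting[s2]:
                pairs.add((s1, s2))
    return pairs

print(same_situations(has_setting))
```

Here Sale#1 and Sale#2 share the setting {Apple, Mustafa} and are identified, while Sale#3 is not.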

... however ...
- many problem-solving tasks require reasoning over alternative conceptualizations of the same situation or set of facts
- a classical problem: views, schema evolution, multiple ontologies over the same data, ontology matching, ...
- traditionally, the universe of facts is not the same as the universe of concepts

... however/2 ...
- the Semantic Web and its punning capabilities show that it is important to have a clear, shared universe of discourse
- traditionally "hard" tasks also need it:
  - legal knowledge (facts vs. cases vs. norms vs. meta-norms)
  - services, planning and control (actual vs. expected facts)
  - diagnoses and situation awareness (ground vs. (un)wanted facts)
  - hypothesis testing and CBR (building/testing interpretations from facts)
  - ...

Situation classification
- a common abstract task: "classifying a situation" according to possibly incomplete, alternative, or loose constraints and concepts
- constraints must be explicit, and explicitly linked to the representation of a situation
- the description-and-situation (D&S) pattern enables the representation of situations and descriptions, and the links between them (entities of a situation that play roles from a description, values of a situation that fit parameters from a description, etc.)
- but it is very hard to represent in general how to:
  - make a situation emerge out of scattered facts
  - decide whether a situation can be (partly or fully) classified under a description
  - evaluate which description best fits a certain set of facts

Example
- classification can be obtained from the following (also via a rule):
    RegularSale equivalentClass
      (isSettingFor some (isClassifiedBy some (isConceptUsedIn value SaleModel)))
- but one cannot state that each thing in the setting (or at least n things) must be classified by a concept that is used in SaleModel
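The gap between the two readings can be made concrete with a closed-world Python sketch (data illustrative, drawn from the sale example): the "some" reading, which the OWL axiom above captures, versus the "every" reading, which OWL cannot express under the OWA but which is trivial to check over a closed fact base.

```python
# Illustrative D&S data for one situation.
setting_for = {"Sale#1": {"Apple", "Mustafa", "iPad"}}
classified_by = {"Apple": {"Seller"}, "Mustafa": {"Buyer"}, "iPad": set()}
concepts_in_sale_model = {"Seller", "Buyer", "Product", "Time", "Place"}

# An entity is suitably classified if one of its concepts is used in SaleModel.
def ok(entity):
    return bool(classified_by.get(entity, set()) & concepts_in_sale_model)

# "some": at least one entity in the setting is classified -> OWL-expressible.
some_reading = any(ok(e) for e in setting_for["Sale#1"])
# "every": all entities in the setting are classified -> not OWL-expressible.
every_reading = all(ok(e) for e in setting_for["Sale#1"])
print(some_reading, every_reading)
```

Here the "some" reading succeeds (Apple is classified as Seller) while the "every" reading fails, because iPad has no classifying concept asserted.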

D&S example (diagram): the situation Sale#1 (rdf:type RegularSale) isSettingFor the entities Apple, Mustafa, iPad, Nov2010, and Shanghai; each entity isClassifiedBy a concept (Seller, Buyer, Product, Time, Place, respectively) that isConceptUsedIn the description SaleModel, which Sale#1 satisfies.

[1] OWL: minimal restriction or property chain

RegularSale equivalentClass
  (isSettingFor some (isClassifiedBy some (isConceptUsedIn value SaleModel)))

(isSettingFor o isClassifiedBy o isConceptUsedIn) subPropertyOf satisfies
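The property chain is just relational composition, which can be sketched in plain Python (data illustrative, from the D&S example): composing isSettingFor, isClassifiedBy, and isConceptUsedIn yields the satisfies relation.

```python
# Binary relations as sets of pairs — illustrative data.
is_setting_for = {("Sale#1", "Apple")}
is_classified_by = {("Apple", "Seller")}
is_concept_used_in = {("Seller", "SaleModel")}

# Relational composition: (a, c) holds in r1 o r2 iff some b links a to c.
def compose(r1, r2):
    return {(a, c) for (a, b1) in r1 for (b2, c) in r2 if b1 == b2}

# isSettingFor o isClassifiedBy o isConceptUsedIn subPropertyOf satisfies
satisfies = compose(compose(is_setting_for, is_classified_by),
                    is_concept_used_in)
print(satisfies)
```

The chain derives (Sale#1, SaleModel), i.e. Sale#1 satisfies SaleModel, matching the subPropertyOf axiom above.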

[1b] OWL: maximal restriction (does not work because of OWA)

RegularSale equivalentClass
  (isSettingFor only (isClassifiedBy some (isConceptUsedIn value SaleModel)))

[2] OWL: detailed list of restrictions or property chains

RegularSale equivalentClass (
  (isSettingFor some (isClassifiedBy some (Seller and (isConceptUsedIn value SaleModel)))) and
  (isSettingFor some (isClassifiedBy some (Buyer and (isConceptUsedIn value SaleModel)))) and
  (isSettingFor some (isClassifiedBy some (Product and (isConceptUsedIn value SaleModel)))) and
  ... )

- a property chain here would need special properties to chain, one for each concept

[3] SPARQL: basic, with SELECT or ASK (CONSTRUCT also possible)
- mimics a rule, but requires both positive and negative conditions:

SELECT ?s
WHERE {
  ?s rdf:type :Situation .
  ?s :isSettingFor ?e .
  OPTIONAL { ?e :isClassifiedBy ?c .
             ?c :isConceptUsedIn :SaleModel } .
  FILTER (!bound(?c))
}

- OWL2 version: replace the OPTIONAL block with
  OPTIONAL { ?e rdf:type ?c .
             ?c :isConceptUsedIn :SaleModel } .
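The OPTIONAL + !bound idiom is negation as failure: select the situations that have at least one entity with no classifying concept used in SaleModel. A closed-world Python emulation (illustrative data):

```python
# Situations and their settings — illustrative data.
situations = {"Sale#1": {"Apple", "iPad"}}
classified_by = {"Apple": {"Seller"}}
concepts_in_sale_model = {"Seller", "Buyer", "Product", "Time", "Place"}

# Emulates the query's OPTIONAL + FILTER(!bound(?c)) pattern:
# a situation is selected if some entity in its setting matched
# nothing in the OPTIONAL block (negation as failure).
def unclassified_situations(situations):
    result = set()
    for s, entities in situations.items():
        for e in entities:
            if not (classified_by.get(e, set()) & concepts_in_sale_model):
                result.add(s)
    return result

print(unclassified_situations(situations))
```

Sale#1 is selected because iPad has no classifying concept asserted; this is precisely the inference an OWL reasoner will not make under the OWA.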

[4] RIF

OWL1 version (the head can equivalently be ?s [rdf:type :RegularSale]):

Forall ?s ?e ?c (
  ?s [:satisfies :SaleModel] ← And (
    ?e [:hasSetting ?s]
    ?e [:isClassifiedBy ?c]
    ?c [:isConceptUsedIn :SaleModel] ) )

OWL2 version: replace ?e [:isClassifiedBy ?c] with ?e [rdf:type ?c]