Experimentation Phase 2
Raúl García-Castro
October 11th, 2005
Interoperability Working Days, October 10th-11th, 2005
© Raúl García-Castro

Presentation transcript:


Experimentation phase 2

Once the RDF(S) importers and exporters have been evaluated, the second phase will cover the evaluation of the ontology exchange between ontology development tools.

Goal: to obtain the elements of the knowledge model of an ontology development tool that can be interchanged with another ontology development tool, using RDF(S) for exchanging ontologies.

Interchange: Tool X -> (export) -> RDF(S) -> (import) -> Tool Y
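The export/import interchange above can be sketched in a few lines, assuming an ontology is reduced to a set of (subject, predicate, object) triples. The function names, the predicate list, and the `ex:` example triples are all hypothetical stand-ins, not the actual behavior of any of the tools:

```python
def export_to_rdfs(ontology):
    """Stand-in for Tool X's export: identity over a triple set."""
    return set(ontology)

def import_from_rdfs(triples):
    """Stand-in for Tool Y's import: keeps only triples it can
    represent (here, anything using an RDF(S) predicate)."""
    rdfs_predicates = {
        "rdf:type", "rdfs:subClassOf", "rdfs:subPropertyOf",
        "rdfs:domain", "rdfs:range", "rdfs:label", "rdfs:comment",
    }
    return {t for t in triples if t[1] in rdfs_predicates}

# A toy ontology in Tool X; the third triple uses a predicate
# outside RDF(S) and so cannot survive the interchange.
ontology_x = {
    ("ex:Person", "rdf:type", "rdfs:Class"),
    ("ex:Student", "rdfs:subClassOf", "ex:Person"),
    ("ex:Student", "ex:nonRdfsAnnotation", "some value"),
}

interchanged = import_from_rdfs(export_to_rdfs(ontology_x))
print(len(interchanged))  # prints 2: the non-RDF(S) triple is lost
```

The interesting output of the experiment is exactly this gap: which knowledge-model elements survive the round trip through RDF(S) and which do not.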

Interoperability Benchmark Suite

The interoperability benchmark suite is identical to the RDF(S) Export Benchmark Suite. Each tool (Tool X) imports the RDF(S) files exported by the other tools: WebODE, Protégé, KAON, OntoStudio, DOE, and Corese. The export files are available from the previous experimentation phase.
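The resulting execution matrix is every importer paired with every exporter's files, for every benchmark in the suite. A sketch of how the tasks enumerate (the tool list is taken from the slide; the two benchmark ids are an illustrative subset, not the full suite):

```python
# Tools participating in the interoperability experiments (from the slide).
tools = ["WebODE", "Protégé", "KAON", "OntoStudio", "DOE", "Corese"]

# Illustrative subset of export benchmark ids; the real suite is larger.
benchmarks = ["E41", "E44"]

# One task per (importing tool, exporting tool, benchmark) combination;
# a tool importing its own exports is also a valid (circular) test.
import_tasks = [
    (importer, exporter, bench)
    for importer in tools
    for exporter in tools
    for bench in benchmarks
]
print(len(import_tasks))  # prints 72: 6 importers x 6 exporters x 2 benchmarks
```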

Steps to follow

The expected ontologies that should be obtained when importing the RDF(S) files that the other tool exported are already defined; they are each tool's instantiation of the corresponding export benchmarks.
- Import into our tool the RDF(S) files with the ontologies.
- Compare the ontologies imported into our tool with the expected ontologies.

Example (benchmark E44):
- Description: Export one instance that has an object property with another instance of the same class.
- WebODE's instantiation: Export one concept that has a relation with itself with a cardinality of "N", and one instance of the concept that has a relation instance with another instance of the same concept.
- Protégé's instantiation (expected result): Export just one class with a template slot with a cardinality of 1 and of type Instance of itself, and two instances of the class related by the slot.
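The comparison step reduces to set differences once both ontologies are expressed as triple sets: triples present only in the import are knowledge added, triples present only in the expected ontology are knowledge lost. A minimal sketch, with hypothetical `ex:` triples:

```python
def compare(imported, expected):
    """Compare the ontology imported into our tool against the
    expected ontology; both are sets of (s, p, o) triples."""
    added = imported - expected   # knowledge added in the exchange
    lost = expected - imported    # knowledge lost in the exchange
    passed = not added and not lost
    return passed, added, lost

# Hypothetical expected ontology: a class and one of its instances.
expected = {
    ("ex:A", "rdf:type", "rdfs:Class"),
    ("ex:a1", "rdf:type", "ex:A"),
}
# Hypothetical import result: the instance triple did not survive.
imported = {
    ("ex:A", "rdf:type", "rdfs:Class"),
}

passed, added, lost = compare(imported, expected)
print(passed)  # prints False: one expected triple was lost
```

Note that a benchmark passes only when nothing is added *and* nothing is lost; an import that invents extra triples fails just as an import that drops them does.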

Document with the results

- The process followed for executing the benchmark suite (including any modifications performed in the tool).
- The results obtained in each benchmark:
  – The expected result of the benchmark.
  – Knowledge added/lost in the exchange.
  – Whether the tool passes the benchmark or not.
  – If not, the reason for not passing the benchmark.
  – If the tool does not pass the benchmark and is corrected to pass it, the changes performed.
- Comments on the results.
- Comments on the benchmark suites.
- Comments on improving the interoperability between the tools.
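The per-benchmark fields listed above map naturally onto a small record type. A sketch of what one entry of the results document could look like; the field and class names are illustrative, not a format prescribed by the benchmark suite:

```python
from dataclasses import dataclass, field

@dataclass
class BenchmarkResult:
    """One per-benchmark entry of the results document."""
    benchmark_id: str
    expected_result: str
    knowledge_added: set = field(default_factory=set)
    knowledge_lost: set = field(default_factory=set)
    passed: bool = False
    failure_reason: str = ""
    changes_performed: str = ""  # filled only if the tool was corrected

# Hypothetical entry for benchmark E44, where a relation instance
# between two individuals did not survive the import.
result = BenchmarkResult(
    benchmark_id="E44",
    expected_result="One class with a slot to itself and two related instances",
    knowledge_lost={("ex:a1", "ex:rel", "ex:a2")},
    failure_reason="Relation instances between individuals are not imported",
)
print(result.passed)  # prints False
```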

Timeline

October 24th, 2005: Benchmark suite and RDF(S) files on the web
January 2nd, 2006: Interoperability results
