Benchmarking Ontology Technology
Raúl García-Castro, Asunción Gómez-Pérez
May 13th, 2004
© R. García-Castro, A. Gómez-Pérez



Table of contents
- Benchmarking
- Experimental Software Engineering
- Measurement
- Ontology Technology Evaluation
- Conclusions and Future Work

Benchmark vs Benchmarking

Benchmark:
- Is a: test
- Purpose: measure, evaluate
- Target: method, system, product, service

Benchmarking:
- Is a: continuous process
- Purpose: search for best practices (measure, evaluate), improve
- Target: process

Benchmarking classification, by the participants involved:
- Internal benchmarking: one organization
- Competitive benchmarking: a direct competitor
- Functional/industry benchmarking: competitors in the same industry
- Generic benchmarking: competitors in any industry

Experimental Software Engineering

Experiment:
- Is a: test
- Purpose: discover, demonstrate, clarify
- Target: software process, software product

Experimentation:
- Is a: process
- Purpose: evaluate, predict, understand, improve
- Target: software process, software product

Experiment classification, by number of teams per project and number of projects:
- One team, one project: single project
- One team, more than one project: multi-project variation
- More than one team, one project: replicated project
- More than one team, more than one project: blocked subject-project

Measurement

Measurable entities: resource, product, process.

Attributes:
- Internal attributes: measured in terms of the entity itself.
- External attributes: measured with respect to how the entity relates to its environment.
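As a minimal illustration of the distinction (all names below are hypothetical stand-ins, not from the slides): the number of classes of an ontology is an internal attribute, computable from the entity itself, while the time a tool needs to load it is an external one, depending on the entity's environment:

```python
import time

# Hypothetical "ontology" product to measure (a toy stand-in data structure)
ontology = {
    "classes": ["Person", "Student", "Professor"],
    "properties": ["enrolledIn", "teaches"],
}

def class_count(onto):
    """Internal attribute: measured in terms of the entity itself."""
    return len(onto["classes"])

def load_time(loader, *args):
    """External attribute: how the entity relates to its environment,
    here the wall-clock time some tool needs to process it."""
    start = time.perf_counter()
    loader(*args)
    return time.perf_counter() - start

print(class_count(ontology))                        # 3
print(load_time(lambda onto: None, ontology) >= 0)  # True
```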

Methodologies

Benchmarking:
- Plan, Design, Implement, Execute, Analyze, Inform, Change
- Planning, Analysis, Integration, Action
- Plan, Collect, Analyze, Adapt

Experimentation:
- Definition (experiment context), Planning (experiment design), Operation (conducting the experiment and data collection), Interpretation (analysis, presentation of results, interpretation of results)

Measurement:
- Define objectives and assign responsibilities; do research; define initial metrics; get tools for collection and analysis; create a metrics database; publicize the collection of the metrics; establish a training class in software metrics; establish a mechanism for changing
- Initialization, Requirements definition, Component design, Component build, Implementation

Table of contents
- Benchmarking
- Experimental Software Engineering
- Measurement
- Ontology Technology Evaluation
- Conclusions and Future Work

General framework for ontology tool evaluation

OntoWeb deliverable 1.3: tool comparison according to different criteria:
- Ontology building tools
- Ontology merge and integration tools
- Ontology evaluation tools
- Ontology-based annotation tools
- Ontology storage and querying tools

Ontology building tool evaluation

- Duineveld et al., 1999. Tools: Ontolingua, WebOnto, ProtégéWin, Ontosaurus, ODE. Criteria: general properties that can also be found in other types of programs; ontology properties found in the tools; cooperation properties when constructing an ontology.
- Stojanovic and Motik, 2002. Tools: OilEd, OntoEdit, Protégé-2000. Criteria: ontology evolution requirements fulfilled by the tool.
- Sofia Pinto et al., 2002. Tool: Protégé-2000. Criteria: support provided in ontology reuse processes; time and effort for developing an ontology; usability.
- EON 2002. Tools: KAON, Loom, OilEd, OntoEdit, OpenKnoME, Protégé-2000, SemTalk, Terminae, WebODE. Criteria: expressiveness of the knowledge model attached to the tool; usability; reasoning mechanisms; scalability.
- Gómez-Pérez and Suárez-Figueroa, 2003. Tools: OilEd, OntoEdit, Protégé-2000, WebODE. Criteria: ability to detect taxonomic anomalies.
- Lambrix et al., 2003. Tools: Chimaera, DAG-Edit, OilEd, Protégé-2000. Criteria: general criteria (availability, functionality, multiple instances, data model, reasoning, sample ontologies, reuse, formats, visualization, help, shortcuts, stability, customization, extendibility, multiple users); user interface criteria (relevance, efficiency, attitude and learnability).
- EON 2003. Tools: DOE, OilEd, Protégé-2000, SemTalk, WebODE. Criteria: interoperability; amount of knowledge lost during exports and imports.
- Corcho et al., 2004. Tool: WebODE. Criteria: temporal efficiency; stability.
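One of the criteria above, the ability to detect taxonomic anomalies, is concrete enough to sketch. The dict-based taxonomy below is a hypothetical toy, not the data model of any of the listed tools; the function flags circularity errors, one of the anomaly types checked in that line of work:

```python
def find_circularities(subclass_of):
    """Detect circularity errors in a class taxonomy.

    `subclass_of` maps each class name to the list of its direct
    superclasses. A class is anomalous if it can reach itself by
    following subclass-of links upward.
    """
    anomalies = []
    for start in subclass_of:
        stack, seen = [start], set()
        while stack:
            cls = stack.pop()
            for parent in subclass_of.get(cls, []):
                if parent == start:
                    anomalies.append(start)  # start reaches itself: a cycle
                    stack = []
                    break
                if parent not in seen:
                    seen.add(parent)
                    stack.append(parent)
    return anomalies

# A, B and C form a cycle; D is a well-formed subclass of A
taxonomy = {"A": ["B"], "B": ["C"], "C": ["A"], "D": ["A"]}
print(find_circularities(taxonomy))  # ['A', 'B', 'C']
```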

Other ontology tool evaluations

- Giboin et al., 2002. Ontology-based tools (CoMMA). Criteria: usability; utility.
- Sure and Iosif, 2002. Ontology-based search tools (QuizRDF and Spectacle). Criteria: information finding time; mistakes during a search.
- Noy and Musen, 2002. Ontology merging tools (PROMPT). Criteria: precision and recall of the tool's suggestions; difference between the resulting ontologies.
- Lambrix and Edberg, 2003. Ontology merging tools (PROMPT and Chimaera). Criteria: general criteria (availability, stability); merging criteria (functionality, assistance, precision and recall of suggestions, time to merge); user interface criteria (relevance, efficiency, attitude and learnability).
- Guo et al., 2003. Ontology repositories (DLDB). Criteria: load time; repository size; query response time; completeness.
- Euzenat, 2003. Ontology alignment methods. Criteria: distance between alignments; amount of resources consumed (time, memory, user input, etc.).
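The precision/recall criterion for merging suggestions can be computed directly once a human gold standard exists. A sketch with hypothetical concept pairs (not the actual data of any study above):

```python
def precision_recall(suggested, gold):
    """Precision and recall of a merging tool's suggestions against a
    gold standard; both arguments are sets of (concept_a, concept_b)
    pairs the tool proposes (or experts confirm) should be merged."""
    suggested, gold = set(suggested), set(gold)
    correct = suggested & gold
    precision = len(correct) / len(suggested) if suggested else 0.0
    recall = len(correct) / len(gold) if gold else 0.0
    return precision, recall

# Hypothetical run: the tool suggests 4 merges, 3 of which appear in a
# gold standard of 5 merges
p, r = precision_recall(
    {("Car", "Automobile"), ("Person", "Human"),
     ("Road", "Street"), ("Cat", "Dog")},
    {("Car", "Automobile"), ("Person", "Human"), ("Road", "Street"),
     ("City", "Town"), ("River", "Stream")},
)
print(p, r)  # 0.75 0.6
```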

Workload generation for ontology tools

- Tempich and Volz, 2003: ontology classification (DAML ontology library): taxonomic nature; description logic style; database schema-like.
- OntoWeb D1.3, 2002: OntoGenerator.
- Guo et al., 2003: Univ-Bench artificial data generator.
- Corcho et al., 2004: workload generated by test definition.
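The idea behind such workload generators can be illustrated with a toy synthetic taxonomy builder. This sketch is an assumption-laden stand-in, not the algorithm of OntoGenerator or the Univ-Bench generator:

```python
import random

def generate_taxonomy(n_classes, max_children=4, seed=42):
    """Generate a random class taxonomy as (child, parent) pairs.

    Each new class is attached to one of the recently created classes,
    which tends to produce a deep, tree-shaped taxonomy of the kind
    used as a synthetic workload for ontology tools."""
    rng = random.Random(seed)  # fixed seed: reproducible workloads
    edges = []
    parents = ["C0"]  # the root class
    for i in range(1, n_classes):
        child = f"C{i}"
        parent = rng.choice(parents[-max_children * 2:])
        edges.append((child, parent))
        parents.append(child)
    return edges

edges = generate_taxonomy(10)
print(len(edges))  # 9 subclass links for 10 classes
```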

RDF and OWL test suites

- W3C RDF Core Working Group: RDF test suite.
- W3C Web Ontology Working Group: OWL test suite.

These test suites:
- Check the correct usage of the tools that implement RDF and OWL knowledge bases.
- Illustrate the resolution of the different issues considered by the Working Groups.
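A harness for these test suites essentially compares the graph a tool produces against the expected graph. A minimal stdlib-only sketch, assuming only the simplest N-Triples lines (real harnesses must also handle blank-node isomorphism and literals containing spaces):

```python
def parse_ntriples(text):
    """Very small N-Triples reader: one 'subject predicate object .'
    statement per line, no blank nodes, no literals with spaces."""
    triples = set()
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        s, p, o = line.rstrip(" .").split(None, 2)
        triples.add((s, p, o))
    return triples

def passes_positive_test(actual_nt, expected_nt):
    """A W3C-style positive test passes when the tool's output graph
    equals the expected graph, regardless of statement order."""
    return parse_ntriples(actual_nt) == parse_ntriples(expected_nt)

expected = "<a> <b> <c> .\n<a> <b> <d> ."
actual = "<a> <b> <d> .\n<a> <b> <c> ."  # same graph, different order
print(passes_positive_test(actual, expected))  # True
```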

Description Logics systems comparison

- DL'98: measured DL systems' performance.
- Haarslev and Möller, 1999: evaluated optimisation strategies for ABox reasoning.
- DL'99: evaluated generic DL systems' features: logic implemented, availability, future plans, etc.
- Elhaik et al., 1998: random generation of TBoxes and ABoxes according to probability distributions.
- Ycart and Rousset, 2000: defined a natural probability distribution over the ABoxes associated with a given TBox.
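The random-generation approaches in the last two rows can be illustrated with a toy ABox generator that draws each individual's type from a probability distribution over the TBox classes; the distributions those works define are more structured than this sketch assumes:

```python
import random

def random_abox(tbox_classes, class_probs, n_individuals, seed=0):
    """Generate a random ABox for a given list of TBox classes.

    Each individual's type assertion is drawn from the supplied
    probability distribution, yielding a synthetic knowledge base
    for exercising a DL reasoner."""
    rng = random.Random(seed)  # fixed seed: reproducible benchmarks
    assertions = []
    for i in range(n_individuals):
        cls = rng.choices(tbox_classes, weights=class_probs, k=1)[0]
        assertions.append((f"ind{i}", "rdf:type", cls))
    return assertions

abox = random_abox(["Student", "Professor"], [0.8, 0.2], 5)
print(len(abox))  # 5 type assertions
```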

Table of contents
- Benchmarking
- Experimental Software Engineering
- Measurement
- Ontology Technology Evaluation
- Conclusions and Future Work

Conclusions

- We have presented an overview of the main research areas involved in benchmarking.
- There is no common benchmarking methodology, although the existing ones are general and similar to each other.
- Evaluation studies concerning ontology technology have been scarce, but in recent years the effort devoted to evaluating ontology technology has grown significantly.
- Most of the evaluation studies are qualitative in nature; few involve quantitative data and controlled environments.

Future Work

Goals (18 months):
1. State of the art (month 6)
2. First draft of a methodology (month 12)
3. Identification of criteria (month 12)
4. Identification of metrics (month 12)
5. Definition of test beds for benchmarking (month 12)
6. Development of first versions of prototypes of tools (month 18)
7. Benchmarking of ontology development tools according to the criteria and test beds produced (month 18)
