Evaluating Ontology-Mapping Tools: Requirements and Experience
Natalya F. Noy, Mark A. Musen
Stanford Medical Informatics, Stanford University

Types of Ontology Tools
There is not just one class of ontology tools:
- Development tools: Protégé-2000, OntoEdit, OilEd, WebODE, Ontolingua
- Mapping tools: PROMPT, ONION, OBSERVER, Chimaera, FCA-Merge, GLUE

Evaluation Parameters for Ontology-Development Tools
- Interoperability with other tools
  - Ability to import ontologies from other languages
  - Ability to export ontologies to other languages
- Expressiveness of the knowledge model
- Scalability
- Extensibility
- Availability and capabilities of inference services
- Usability of tools

Evaluation Parameters for Ontology-Mapping Tools
We can try to reuse the evaluation parameters for development tools, but:
- Development tools and mapping tools have different tasks, inputs, and outputs.
- Mapping tools, among themselves, have similar tasks, inputs, and outputs.

Development Tools
- Input: domain knowledge, ontologies to reuse, requirements
- Task: create an ontology
- Output: a domain ontology

Mapping Tools: Tasks
- Merging, C = Merge(A, B): iPROMPT, Chimaera
- Mapping, Map(A, B): Anchor-PROMPT, GLUE, FCA-Merge
- Articulation ontology relating A and B: ONION
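The three task shapes above differ mainly in their output type, which can be made concrete with a toy sketch. The dict-based "ontologies", function names, and the case-insensitive matcher below are illustrative assumptions, not the actual APIs or algorithms of the tools named on the slide:

```python
# Toy sketch of the three mapping-task shapes. The dict representation and
# the matching heuristic are invented for illustration only.

def merge(a: dict, b: dict) -> dict:
    """Merging (iPROMPT, Chimaera style): produce one ontology covering both."""
    return {**a, **b}  # real tools resolve naming conflicts interactively

def map_terms(a: dict, b: dict) -> list[tuple[str, str]]:
    """Mapping (Anchor-PROMPT, GLUE style): sources stay intact; output is pairs."""
    return [(ta, tb) for ta in a for tb in b if ta.lower() == tb.lower()]

def articulate(a: dict, b: dict) -> list[str]:
    """Articulation (ONION style): rules relating terms, kept in a third ontology."""
    return [f"{ta} equivalent-to {tb}" for ta, tb in map_terms(a, b)]

cmu = {"Person": {}, "Publication": {}}
umd = {"person": {}, "Department": {}}
print(len(merge(cmu, umd)))   # 4 -- one ontology with all four classes
print(map_terms(cmu, umd))    # [('Person', 'person')]
print(articulate(cmu, umd))   # ['Person equivalent-to person']
```

The point of the sketch is that a merged ontology, a list of term pairs, and a set of articulation rules are structurally different outputs, which is why the tools cannot all be evaluated against a single result format.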

Mapping Tools: Inputs
- iPROMPT: classes, slots and facets
- Chimaera: classes, slots and facets
- GLUE: instance data
- FCA-Merge: shared instances
- OBSERVER: DL definitions

Mapping Tools: Outputs and User Interaction
- GUI for interactive merging: iPROMPT, Chimaera
- Lists of pairs of related terms: Anchor-PROMPT, GLUE, FCA-Merge
- List of articulation rules: ONION

Can We Compare Mapping Tools?
Yes, we can! We can compare tools within the same group. How do we define a group?

Architectural Comparison Criteria
- Input requirements
  - Ontology elements: used for analysis; required for analysis
  - Modeling paradigm: frame-based or description logic
- Level of user interaction: batch mode or interactive
- User feedback: required? used?

Architectural Criteria (cont'd)
- Type of output: set of rules, ontology of mappings, list of suggestions, set of pairs of related terms
- Content of output: matching classes, matching instances, matching slots

From a Large Pool to Small Groups
- Architectural criteria partition the space of mapping tools into groups.
- Performance criteria are then applied within a single group.
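This two-stage narrowing can be illustrated as a grouping step over tool descriptors. The descriptors below are hypothetical simplifications of the architectural criteria, not definitive characterizations of the tools:

```python
# Hypothetical descriptors: each tool reduced to a tuple of architectural
# criteria (input paradigm, interaction mode, output type). Values are
# illustrative only.
tools = {
    "iPROMPT":       ("frames", "interactive", "merged ontology"),
    "Chimaera":      ("frames", "interactive", "merged ontology"),
    "Anchor-PROMPT": ("frames", "batch", "term pairs"),
    "GLUE":          ("instances", "batch", "term pairs"),
    "ONION":         ("frames", "interactive", "articulation rules"),
}

# Stage 1: architectural criteria partition the pool into groups.
groups: dict[tuple, list[str]] = {}
for name, criteria in tools.items():
    groups.setdefault(criteria, []).append(name)

# Stage 2: performance comparison is only meaningful within one group.
comparable = [members for members in groups.values() if len(members) > 1]
print(comparable)  # [['iPROMPT', 'Chimaera']]
```

Under these (invented) descriptors, only iPROMPT and Chimaera land in the same group, so only that pair would proceed to a head-to-head performance experiment.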

Resources Required for Comparison Experiments
- Source ontologies
  - Pairs of ontologies covering similar domains
  - Ontologies of different size, complexity, and level of overlap
- "Gold standard" results
  - Human-generated correspondences between terms
  - Pairs of terms, rules, explicit mappings

Resources Required (cont'd)
- Metrics for comparing performance
  - Precision (how many of the tool's suggestions are correct)
  - Recall (how many of the correct matches the tool found)
  - Distance between ontologies
  - Use of inference techniques
  - Analysis of taxonomic relationships (à la OntoClean)
- Experiment controls
  - Design and protocol
  - Suggestions that the tool produced
  - Operations that the user performed
  - Suggestions that the user followed

Where Will the Resources Come From?
- Ideally, from researchers who do not belong to any of the evaluated projects
- Realistically, as a side product of stand-alone evaluation experiments

Evaluation Experiment: iPROMPT
- iPROMPT is: a plug-in to Protégé-2000; an interactive ontology-merging tool
- iPROMPT uses for analysis: the class hierarchy; slots and facet values
- iPROMPT matches: classes, slots, instances

Evaluation Experiment
- Four users merged the same two source ontologies.
- We measured: acceptability of iPROMPT's suggestions; differences in the resulting ontologies.

Sources
Input: two ontologies from the DAML ontology library
- CMU ontology: employees of academic organizations, publications, relationships among research groups
- UMD ontology: individuals, CS departments, activities

Experimental Design
- Users' expertise: familiar with Protégé-2000; not familiar with PROMPT
- Experiment materials: the iPROMPT software, a detailed tutorial, a tutorial example, evaluation files
- Users performed the experiment on their own, with no questions or interaction with the developers.

Experiment Results
- Quality of iPROMPT suggestions: recall 96.9%; precision 88.6%
- Resulting ontologies
  - Difference measure: fraction of frames that have a different name and type
  - Ontologies differ by ~30%
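The difference measure can be sketched as a set computation over (name, type) frame pairs. The representation below is a guess at the slide's intent, not the actual implementation, and the example ontologies are invented:

```python
def ontology_difference(a: dict, b: dict) -> float:
    """Fraction of frames, taken as (name, type) pairs, that are not shared
    by both resulting ontologies, relative to their union."""
    frames_a, frames_b = set(a.items()), set(b.items())
    union = frames_a | frames_b
    if not union:
        return 0.0
    return 1 - len(frames_a & frames_b) / len(union)

# Invented example: two users' merged results share 2 of 4 distinct frames.
user1 = {"Person": "class", "Student": "class", "name": "slot"}
user2 = {"Person": "class", "Student": "class", "Employee": "class"}
print(ontology_difference(user1, user2))  # 0.5
```

A measure like this only counts frame-level divergence; two merged ontologies could score identically while organizing the shared frames quite differently, which is one reason the talk calls for better distance metrics.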

Limitations of the Experiment
- Only four participants
- Variability in Protégé expertise
- Recall and precision figures are not very meaningful without comparison to other tools
- Need better distance metrics

Research Questions
- Which pragmatic criteria are most helpful in finding the best tool for a task?
- How do we develop a "gold standard" merged ontology? Does such an ontology exist?
- How do we define a good distance metric to compare results to the gold standard?
- Can we reuse tools and metrics developed for evaluating ontologies themselves?