
Benchmarking in WP 2.1
Raúl García-Castro, Asunción Gómez-Pérez
September 28th, 2004

Index
1. Progress
2. Deliverable 2.1.4

Progress: Benchmarking activities timeline
- D2.1.1: Benchmarking SoA (state of the art)
- D2.1.4: Benchmarking methodology, criteria, and test suites
- D2.1.6: Benchmarking ontology building tools
- Benchmarking querying, reasoning, and annotation tools
- Benchmarking Semantic Web Service technology
(Timeline legend: Finished / Started / Not started)

What has been done? (D2.1.1)
D2.1.1 covered:
- An overview of benchmarking, experimentation, and measurement
- The state of the art of ontology-based technology evaluation
- A survey of scalability techniques for reasoning with ontologies
- Recommendations
(Accompanying diagram: ontology technology and methods are subject to evaluation, which yields desired attributes, weaknesses, comparative analysis, ..., and to benchmarking, which adds continuous improvement, best practices, measurement, and experimentation.)

What are we doing? (T2.1.4): Benchmarking methodology, criteria, and test suites
GOAL: Provide a framework for the benchmarking activities in WP 2.1 (and maybe other WPs).
The benchmarking methodology ties together:
- Ontology tools: ontology building tools, annotation tools, querying and reasoning services, Semantic Web Services technology
- General evaluation criteria: interoperability, scalability, robustness
- Benchmark suites for: interoperability, scalability, robustness
- Benchmarking supporting tools: workload generators, test generators, monitoring tools, statistical packages, ...
- Benchmarking results: comparative analysis, compliance with norms, weaknesses, recommendations on tools, recommendations on practices
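As an illustration only (not part of D2.1.4), the elements above could be captured in a small data model in which each benchmark suite targets one evaluation criterion and one tool category; all identifiers and example values in this sketch are hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class BenchmarkTest:
    identifier: str           # hypothetical test id, e.g. "rdfs-import-01"
    description: str

@dataclass
class BenchmarkSuite:
    criterion: str            # "interoperability" | "scalability" | "robustness"
    tool_category: str        # e.g. "ontology building tools"
    tests: List[BenchmarkTest] = field(default_factory=list)

@dataclass
class BenchmarkResult:
    suite: BenchmarkSuite
    tool: str                                # tool under benchmark
    outcomes: Dict[str, str] = field(default_factory=dict)  # test id -> "OK" / "FAIL"

# Example: an interoperability suite for ontology building tools.
rdfs_import_suite = BenchmarkSuite(
    criterion="interoperability",
    tool_category="ontology building tools",
    tests=[BenchmarkTest("rdfs-import-01", "Import a class hierarchy from RDF(S)")],
)
```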

What will be done? (T2.1.6): Benchmarking of ontology building tools
Tools/Partners: ...
Interoperability questions:
- Do the tools import/export from/to RDF(S)/OWL?
- Are the imported/exported ontologies the same?
- Is there any knowledge loss during import/export?
- ...
Benchmark suites: RDF(S) import capability, OWL import capability, RDF(S) export capability, OWL export capability, ...
Each partner (e.g. UPM) runs the experiments and reports the result of each test (test 1, test 2, test 3, ...: OK / NO).
Benchmarking supporting tools: workload generators, test generators, monitoring tools, statistical packages, ...
Benchmarking results: comparative analysis, compliance with norms, weaknesses, recommendations on tools, recommendations on practices.
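A minimal sketch of what one such interoperability check could look like, assuming the tool under test imports an RDF(S)/OWL file and then exports it back to a file, and using rdflib to compare the two graphs; this is not the test procedure prescribed in the deliverable, and the file names are hypothetical.

```python
from rdflib import Graph
from rdflib.compare import to_isomorphic, graph_diff

def round_trip_check(original_path: str, exported_path: str) -> dict:
    """Compare an ontology before and after an import/export round trip.

    original_path is the RDF(S)/OWL file fed to the tool under test;
    exported_path is the file the tool exported after importing it.
    Reports whether triples were preserved, lost, or added.
    """
    original = Graph()
    original.parse(original_path)
    exported = Graph()
    exported.parse(exported_path)

    common, only_original, only_exported = graph_diff(
        to_isomorphic(original), to_isomorphic(exported)
    )
    return {
        "identical": len(only_original) == 0 and len(only_exported) == 0,
        "knowledge_lost": len(only_original),    # triples missing after the round trip
        "knowledge_added": len(only_exported),   # triples introduced by the tool
    }

# Example (hypothetical file names):
# print(round_trip_check("test01.rdf", "test01_exported.rdf"))
```

A per-test "OK / NO" outcome, as in the slide, could then be derived from the "identical" flag, while the lost/added counts document the kind of knowledge loss observed.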

Index
1. Progress
2. Deliverable 2.1.4

D2.1.4: Specification of a methodology, general criteria, and test suites for benchmarking ontology tools
Deliverable outline:
1. Introduction
2. Benchmarking methodology
3. Building test suites for ontology tools
4. General supporting tools for benchmarking
5. Benchmarking ontology development tools and tool suites
6. Benchmarking ontology-based annotation tools
7. Benchmarking ontology querying tools and inference engines
8. Benchmarking Semantic Web Service technology
9. Conclusion
10. Glossary

D2.1.4: Benchmarking methodology
The methodology defines a benchmarking process that is planned, collaborative, more Semantic Web oriented, and more KW oriented. Each process is described in terms of its tasks, inputs, and outputs.
Methodology processes:
Plan: 1. Goals identification; 2. Subject identification; 3. Management involvement; 4. Participant identification; 5. Planning and resource allocation; 6. Partner selection.
Experiment: 7. Experiment definition; 8. Experiment execution; 9. Experiment results analysis.
Improve: 10. Report writing; 11. Findings communication; 12. Findings implementation; 13. Recalibration.
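Purely as an illustrative sketch (not part of the methodology itself), the grouping of the thirteen tasks into the three phases could be encoded as data, for example to drive a simple progress checklist for a benchmarking study; the structure below only restates the task names listed above.

```python
# Hypothetical encoding of the methodology's phases and tasks.
METHODOLOGY = {
    "Plan": [
        "Goals identification",
        "Subject identification",
        "Management involvement",
        "Participant identification",
        "Planning and resource allocation",
        "Partner selection",
    ],
    "Experiment": [
        "Experiment definition",
        "Experiment execution",
        "Experiment results analysis",
    ],
    "Improve": [
        "Report writing",
        "Findings communication",
        "Findings implementation",
        "Recalibration",
    ],
}

def checklist() -> list:
    """Flatten the methodology into (task number, phase, task) tuples."""
    return [
        (i + 1, phase, task)
        for i, (phase, task) in enumerate(
            (phase, task) for phase, tasks in METHODOLOGY.items() for task in tasks
        )
    ]
```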

D2.1.4: Benchmarking methodology (Plan)
1.- Benchmarking goals identification
Goals depend on the organisation's vision, objectives, and strategies.
2.- Benchmarking subject identification
Analyse the current tools in the organisation. Select, understand, and document the tool whose improvement would significantly benefit the organisation, according to end-user needs and expectations, organisational goals, etc.
3.- Management involvement
Inform the organisation's management about the benefits and costs of the benchmarking study. Management support is needed to proceed and, later, to implement changes based on the benchmarking results.
4.- Participant identification
Identify and contact the members of the organisation who are involved with the selected tool. Select and train the members of the benchmarking team.
5.- Benchmarking planning and resource allocation
The planning must consider time and resources, and must be integrated into the organisation's planning.
6.- Benchmarking partner selection
Identify, collect, and analyse information about the tools that are considered the best. Select the tools to benchmark against and make contact with someone in their organisations. The partner organisations do not have to belong to KW, since not all 'best in class' tools are developed by KW partners.

D2.1.4: Benchmarking methodology (Experiment)
7.- Experiment definition
Determine the experimentation plan and method and define the experiment to be performed. The experiment must collect not just data on the performance of the tools but also the reasons for that performance. Communicate the experimentation plan and method to the partners and agree on it.
8.- Experiment execution
Perform the experiment according to the experimentation plan and method. The collected data must be documented and prepared for analysis.
9.- Experiment results analysis
Compare the results obtained from the experiments and the practices that led to these results. Document the findings in a report, including the best practices found (if any).

D2.1.4: Benchmarking methodology (Improve)
10.- Benchmarking report writing
The benchmarking report must provide an understandable summary of the benchmarking study, including an explanation of the benchmarking process followed, the results and conclusions of the experiments, and the recommendations for improving the tools.
11.- Benchmarking findings communication
Findings must be communicated to the whole organisation (including the identified participants) and to the benchmarking partners. Collect and analyse any feedback received.
12.- Benchmarking findings implementation
Define a plan for implementing the benchmarking findings. Implement the changes necessary to achieve the desired results. Periodically monitor the benchmarked tool.
13.- Recalibration
Recalibrate the benchmarking process using the lessons learnt. The benchmarking process should be repeated continuously in order to obtain continuous improvement.

D2.1.4: Building test suites for ontology tools
How to develop a test suite, and the desirable properties that a test suite should have.

D2.1.4: General supporting tools for benchmarking
A list of tools that can be useful when performing benchmarking activities, such as: test generators, workload generators, monitoring tools, statistical packages, ...
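As a hypothetical example of one such supporting tool, the sketch below generates synthetic RDF(S) class hierarchies of a given size with rdflib, the kind of workload a scalability experiment might use; neither the generator nor the namespace is referenced in the deliverable.

```python
from rdflib import Graph, Namespace, RDF, RDFS, Literal

EX = Namespace("http://example.org/bench#")  # hypothetical namespace

def generate_class_hierarchy(n_classes: int) -> Graph:
    """Generate a synthetic RDF(S) ontology with a chain of n_classes subclasses.

    Generated ontologies of increasing size can serve as workloads for
    scalability benchmarks (e.g. load time versus ontology size).
    """
    g = Graph()
    g.bind("ex", EX)
    for i in range(n_classes):
        cls = EX[f"Class{i}"]
        g.add((cls, RDF.type, RDFS.Class))
        g.add((cls, RDFS.label, Literal(f"Class {i}")))
        if i > 0:
            g.add((cls, RDFS.subClassOf, EX[f"Class{i - 1}"]))
    return g

# Example: write a 1000-class workload to disk (hypothetical file name).
# generate_class_hierarchy(1000).serialize("workload_1000.rdf", format="xml")
```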

D2.1.4: Benchmarking ontology development / annotation / querying and inference / Semantic Web Service tools
Each of these chapters follows the same structure:
1. Candidate tools: a list of candidate tools to be benchmarked, with a description of each and the reasons for its inclusion.
2. General evaluation criteria: the functionalities of the tools, together with the general evaluation criteria that can be used when evaluating or benchmarking these functionalities, related to the WP 2.1 topics (scalability, robustness, and interoperability).
3. Test suites: test suites for the tools, related to the WP 2.1 topics (scalability, robustness, and interoperability).
4. Supporting tools: supporting tools specific to each kind of tool.
5. Conclusion.

D2.1.4: Glossary
Definitions of the terms used in the deliverable: benchmark, benchmarking, benchmarking partner, best practice, interoperability, robustness, scalability, ...

D2.1.4: Tasks and responsibilities
D2.1.4 (Specification of a methodology, criteria, and test suites for benchmarking ontology tools): Raúl García-Castro (UPM)
1.- Introduction: Raúl García-Castro (UPM)
2.- Benchmarking methodology: Raúl García-Castro (UPM)
3.- Building test suites for ontology tools: Raúl García-Castro (UPM)
4.- General supporting tools for benchmarking: Raúl García-Castro (UPM)
5.- Benchmarking ontology development tools and tool suites: Raúl García-Castro (UPM)
6.- Benchmarking ontology-based annotation tools: to be confirmed (Raúl asks Sheffield)
7.- Benchmarking ontology querying tools and inference engines: Holger Wache
8.- Benchmarking semantic web service technology: to be confirmed (Holger asks the WP 2.4 leader)
9.- Conclusion: Raúl García-Castro (UPM)
10.- Glossary: all contributors

D2.1.4: Time schedule
19 Nov: Contributions of the partners, to Raúl
26 Nov: Draft v0 (compilation of the parts), before the next meeting
17 Dec: Draft v1 (complete), to the Quality Assessor (WP leader)
7 Jan: Draft v2 (reviewed by QA), to the Quality Controller (Holger asks Matteo Bonifacio or Roberta Cuel)
28 Jan: Draft v3 (reviewed by QC), to the Quality Assurance Coordinator
14 Feb: Final version, to the European Commission

Benchmarking in WP 2.1
Raúl García-Castro, Asunción Gómez-Pérez
September 28th, 2004