Benchmarking Methodology
Raúl García-Castro, Asunción Gómez-Pérez
September 27th, 2004
© R. García-Castro, A. Gómez-Pérez



What has been done? (in deliverable D)
Ontology technology and methods are the subject both of evaluation (desired attributes, weaknesses, comparative analysis, ...) and of benchmarking (continuous improvement, best practices, measurement, experimentation). The deliverable contains:
- Survey of Scalability Techniques for Reasoning with Ontologies
- Overview of benchmarking, experimentation, and measurement
- State of the Art of Ontology-based Technology Evaluation
- Recommendations

What are we doing? (T)
GOAL: provide a framework for benchmarking activities in WP 2.1. This comprises:
- A benchmarking methodology, criteria, and test suites.
- General evaluation criteria: interoperability, scalability, robustness.
- Benchmark suites for interoperability, scalability, and robustness.
- Benchmarking supporting tools: workload generators, test generators, monitoring tools, statistical packages, ... (see the sketch after this list).
- Benchmarking results: comparative analysis, compliance with norms, weaknesses, recommendations on tools, recommendations on practices.
- The ontology tools covered: ontology building tools, annotation tools, querying and reasoning services, Semantic Web Services technology.
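As an illustration of the kind of supporting tool listed above, here is a minimal sketch of a test generator that synthesises RDF(S) ontologies of increasing size, e.g. as a workload for scalability benchmarks. The use of rdflib, the namespace, and the workload sizes are illustrative assumptions, not something prescribed by the slides.

    # Hypothetical sketch of a "test generator" supporting tool: it synthesises
    # RDF(S) ontologies of increasing size for scalability benchmarks.
    from rdflib import Graph, Namespace, RDF, RDFS

    EX = Namespace("http://example.org/bench#")  # illustrative namespace

    def generate_ontology(num_classes):
        """Build a chain of `num_classes` RDFS classes as a synthetic workload."""
        g = Graph()
        g.bind("ex", EX)
        previous = None
        for i in range(num_classes):
            cls = EX[f"Class{i}"]
            g.add((cls, RDF.type, RDFS.Class))
            if previous is not None:
                g.add((cls, RDFS.subClassOf, previous))
            previous = cls
        return g

    if __name__ == "__main__":
        for size in (10, 100, 1000):  # illustrative workload sizes
            generate_ontology(size).serialize(f"bench_{size}.rdf", format="xml")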

What will be done? T 2.1.6: Benchmarking of ontology building tools (tools/partners: ...)
Interoperability questions: Do the tools import/export from/to RDF(S)/OWL? Are the imported/exported ontologies the same? Is there any knowledge loss during import/export? ...
- Benchmark suites: RDF(S) import capability, OWL import capability, RDF(S) export capability, OWL export capability, ...
- Experiment results per partner (e.g. UPM): test 1, test 2, test 3, ..., each marked OK or NO.
- Benchmarking supporting tools: workload generators, test generators, monitoring tools, statistical packages, ...
- Benchmarking results: comparative analysis, compliance with norms, weaknesses, recommendations on tools, recommendations on practices.
A sketch of one such import/export check follows.
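As an illustration of the round-trip checks above, here is a minimal sketch in Python. The use of rdflib, the file names, and the OK/NO reporting are assumptions for illustration; the actual suite would drive each ontology building tool's own import/export functions.

    # Sketch of one interoperability check from the benchmark suite: export an
    # ontology from a tool, re-import it, and test whether knowledge was lost.
    from rdflib import Graph
    from rdflib.compare import isomorphic

    def round_trip_preserved(original_path, exported_path):
        """True if the exported ontology is graph-isomorphic to the original,
        i.e. no knowledge was lost or altered during import/export."""
        original = Graph().parse(original_path, format="xml")
        exported = Graph().parse(exported_path, format="xml")
        return isomorphic(original, exported)

    # Hypothetical file names for one test of the suite:
    print("OK" if round_trip_preserved("test1.rdf", "test1_exported.rdf") else "NO")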

Benchmarking methodology
The methodology is organised into processes; each process consists of tasks (Task 1 ... Task n) with defined inputs and outputs. The benchmarking process is planned, collaborative, and, compared with generic benchmarking methodologies, more oriented to the Semantic Web and to KW.
Methodology processes (encoded as a checklist in the sketch below):
Plan:
1. Goals identification
2. Subject identification
3. Management involvement
4. Participant identification
5. Planning and resource allocation
6. Partner selection
Experiment:
7. Experiment definition
8. Experiment execution
9. Experiment results analysis
Improve:
10. Report writing
11. Findings communication
12. Findings implementation
13. Recalibration
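The process/task breakdown above is simple enough to encode directly; the following minimal sketch (in Python, as an assumption about tooling) captures it as a data structure from which a progress checklist for a benchmarking study could be printed or tracked.

    # The three processes and thirteen tasks of the methodology, encoded so a
    # benchmarking study can track its progress; the printing is illustrative.
    METHODOLOGY = {
        "Plan": [
            "Goals identification",
            "Subject identification",
            "Management involvement",
            "Participant identification",
            "Planning and resource allocation",
            "Partner selection",
        ],
        "Experiment": [
            "Experiment definition",
            "Experiment execution",
            "Experiment results analysis",
        ],
        "Improve": [
            "Report writing",
            "Findings communication",
            "Findings implementation",
            "Recalibration",
        ],
    }

    task_number = 1
    for process, tasks in METHODOLOGY.items():
        print(process)
        for task in tasks:
            print(f"  {task_number}. {task}")
            task_number += 1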

Benchmarking methodology: Plan
1. Benchmarking goals identification. Goals depend on the organisation's vision, objectives, and strategies.
2. Benchmarking subject identification. Analyse the current tools in the organisation. Select, understand, and document the tool whose improvement would significantly benefit the organisation, according to end-user needs or expectations, organisational goals, etc.
3. Management involvement. Inform the organisation's management about the benefits and costs of the benchmarking study. Management support is needed both to proceed and when implementing changes based on the benchmarking.
4. Participant identification. Identify and contact the members of the organisation who are involved with the selected tool. Select and train the members of the benchmarking team.
5. Benchmarking planning and resource allocation. The planning must consider time and resources and must be integrated into the organisation's planning.
6. Benchmarking partner selection. Identify, collect, and analyse information about the tools that are considered the best. Select the tools to benchmark against and make contact with someone in their organisations. The partner organisations need not belong to KW; not all 'best in class' tools are developed by KW partners.

Benchmarking methodology: Experiment
7. Experiment definition. Determine the experimentation plan and method and define the experiment that will be performed. The experiment must collect not just data on the performance of the tools but also the reasons for this performance. Communicate the experimentation plan and method to the partners and agree on them.
8. Experiment execution. Perform the experiment according to the experimentation plan and method. The collected data must be documented and prepared for analysis.
9. Experiment results analysis. Compare the results obtained from the experiments and the practices that lead to these results. Document the findings in a report, including the best practices found (if any).
A sketch of how such an experiment run might be scripted follows.
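As a sketch of how tasks 7 to 9 might be supported in code, the following Python fragment runs a suite of tests over a set of tools and documents every outcome, including crashes, for later comparative analysis. The tool adapters and the test are hypothetical placeholders, not part of the methodology itself.

    # Sketch of tasks 7-9: execute a defined suite of tests over each tool and
    # document each outcome so the results can be compared across tools.
    from typing import Callable, Dict

    def run_suite(tools: Dict[str, object],
                  tests: Dict[str, Callable[[object], bool]]) -> Dict[str, Dict[str, str]]:
        """Execute every test on every tool and document each outcome."""
        results: Dict[str, Dict[str, str]] = {}
        for tool_name, tool in tools.items():
            results[tool_name] = {}
            for test_name, test in tests.items():
                try:
                    results[tool_name][test_name] = "OK" if test(tool) else "NO"
                except Exception as exc:  # a crash is itself a result to document
                    results[tool_name][test_name] = f"ERROR: {exc}"
        return results

    # Hypothetical usage: two placeholder tools and one placeholder test.
    results = run_suite(
        tools={"ToolA": object(), "ToolB": object()},
        tests={"test 1": lambda tool: True},
    )
    for tool_name, outcomes in results.items():
        print(tool_name, outcomes)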

Benchmarking methodology: Improve
10. Benchmarking report writing. The benchmarking report must provide an understandable summary of the benchmarking study, including an explanation of the benchmarking process followed, the results and conclusions of the experiments, and the recommendations for improving the tools.
11. Benchmarking findings communication. The findings must be communicated to the whole organisation (including the identified participants) and to the benchmarking partners. Collect and analyse any feedback received.
12. Benchmarking findings implementation. Define a plan for implementing the benchmarking findings. Implement the changes needed to achieve the desired results. Periodically monitor the benchmarked tool.
13. Recalibration. Recalibrate the benchmarking process using the lessons learnt. The benchmarking process should be repeated indefinitely so as to obtain continuous improvement.
