Uncontrolled copy when printed I Burnell

Presentation transcript:

Brief to Quality of Service and Experience DWG – Development of a Data and Services Quality Model
I Burnell
Agenda: Introduction; Approach; Quality Models; Requirements; Assessments; Conclusions; Questions
OFFICIAL 22 September 2018 © Crown copyright 2013 Dstl

So having provided some context, I would now like to look at the AGIS project. The project was delivered by the GeoCore consortium, which brought together leaders in GEOINT understanding, technology and research from both industry and academia. It was led by Envitia (OGC Member), with Helyx (OGC Member), QinetiQ and the University of Nottingham (OGC Member) all members of the core team; other companies were brought in as required. The AGIS project ran for four years in total and completed the third phase of the research in March 2016. Copyright Envitia

Introduction
The quality of services (and, intrinsically, of the data provided by those services) needs to be described and monitored in order for the Spatial Data Infrastructure (SDI) to optimise its delivery of services. 'Quality' is an emerging subject with which Defence is relatively unfamiliar, and there is a lack of consensus on the factors contributing to spatial data and service quality, and therefore no common definition:
- Web service quality is inconsistently defined and implementation specific.
- Data quality definitions focus on the production process and have limited value in the context of use.
- There is no overarching policy specifically covering quality guidance, metrics and implementation in the context of the SDI across the range of functions that impact on quality, i.e. from the supply of data at one end through to search and discovery via web services at the other.
Aim of Research
The Data and Service Quality and Optimisation work built on previous AGIS research and focussed on:
- Identifying quality use cases and requirements covering the range of activities that impact on 'quality'.
- Defining quality factors and creating a Data Quality Model (DQM) and Web Service Quality Model (WSQM) for use within the context of the SDI.
- Investigating COTS tools and methodologies for managing data quality and web services, and assessing these against the requirements identified.

Approach
Define quality use cases and requirements:
- Create use cases and requirements based on stakeholder input.
- Investigate prioritised use cases using current examples of best practice, identifying suitable methodologies/approaches from stakeholder and academic input.
Create quality models:
- Review current definitions of web service and data quality and create a draft Data Quality Model (DQM) and Web Service Quality Model (WSQM).
Assess COTS tools and methodologies:
- Assess the planned tool implementation for service monitoring and optimisation under current work (i.e. LogRhythm).
- Identify and assess a relevant methodology for providing quality summaries to users (i.e. the Content Maturity Model).
- Identify and assess COTS data management tools against key features derived from the DQM and requirements stages.

Data Quality Model
The SDI Data Quality Model (DQM) is standards based and includes three strands: contextual, quantitative and qualitative. It includes parameters to cover users' perception of quality, based on GeoViQua.
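The three-strand structure can be sketched as a simple record type. This is an illustrative sketch only: the field names and example values below are hypothetical and are not taken from the actual SDI DQM or GeoViQua parameter lists.

```python
from dataclasses import dataclass, field

@dataclass
class DataQualityRecord:
    """Illustrative DQM-style record grouping parameters by strand."""
    # Contextual strand: circumstances of production and intended use
    contextual: dict = field(default_factory=dict)
    # Quantitative strand: measurable values (ISO 19157-style measures)
    quantitative: dict = field(default_factory=dict)
    # Qualitative strand: users' perception of quality (GeoViQua-style feedback)
    qualitative: dict = field(default_factory=dict)

# Hypothetical example entry for one dataset
record = DataQualityRecord(
    contextual={"lineage": "derived from survey, 2017"},
    quantitative={"completeness_pct": 98.5},
    qualitative={"avg_rating": 4.2},
)
```

Keeping the strands separate means each can be populated by a different process: the contextual and quantitative strands by the producer, the qualitative strand by consumer feedback.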

Web Service Quality Model
A 'core set' of parameters was created by comparing W3C, INSPIRE and MOD requirements. The 'core set' was compared with the OASIS WSQF and assessed for applicability to the Defence SDI. The WSQM is based on the OASIS WSQF.
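Deriving a 'core set' by comparing parameter lists, then checking it against the WSQF, amounts to set operations. The parameter names below are placeholders, not the actual lists from the W3C, INSPIRE, MOD or OASIS documents; the sketch only shows the shape of the comparison.

```python
# Hypothetical parameter names standing in for the real requirement lists
w3c     = {"response_time", "availability", "throughput", "security"}
inspire = {"response_time", "availability", "capacity"}
mod     = {"response_time", "availability", "security"}

# Core set: parameters common to all three sources
core_set = w3c & inspire & mod

# Compare the core set against an (equally hypothetical) OASIS WSQF list
wsqf = {"response_time", "availability", "security", "manageability"}
covered = core_set & wsqf   # core parameters the WSQF already defines
gaps    = core_set - wsqf   # core parameters the WSQF would need adding
```

With real inputs, a non-empty `gaps` set would flag where Defence SDI requirements go beyond the OASIS model.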

Requirements 1
Data and service use cases and high-level requirements were identified. The use cases were: acquire or capture data; manage and maintain data quality; manage data updates; manage and optimise web services; publish or disseminate geospatial resources; find geospatial resources; use geospatial resources; and augment business intelligence.
Example high-level requirements:
- The Data Manager is able to check for, and correct, data errors in a timely and consistent way (using automation wherever possible).
- The Data Manager should be able to populate metadata as efficiently as possible (using automation wherever possible) to ensure the completeness and quality of metadata.
- The Data Manager should be able to measure, monitor and assess the quality of data and services in order to understand issues and trends.
- The Consumer should be able to give qualitative feedback (e.g. ratings on dataset suitability) in order to augment subsequent user assessments of 'quality' during search and discovery.
- The Consumer should be able to view a summary of 'quality' in the most appropriate form for their purposes (e.g. on-the-fly messages, or a visual representation of quality), whilst ensuring that they do not suffer from 'information overload'.
- The Web Manager should be able to measure, monitor and visualise the quality of data and services in order to understand issues and trends and optimise services.
Three use cases were prioritised for further investigation as a result of stakeholder input: 'manage and maintain data quality'; 'manage and optimise web services'; and 'use geospatial resources'.
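The consumer-feedback requirement (ratings on dataset suitability that feed back into search and discovery) can be illustrated with a minimal sketch. The dataset name and the 1-5 star scale here are assumptions for illustration, not part of the stated requirement.

```python
from collections import defaultdict

# In-memory store of consumer ratings, keyed by dataset identifier
ratings: dict = defaultdict(list)

def rate(dataset_id: str, stars: int) -> None:
    """Record a consumer suitability rating (assumed 1-5 star scale)."""
    if not 1 <= stars <= 5:
        raise ValueError("stars must be between 1 and 5")
    ratings[dataset_id].append(stars)

def quality_summary(dataset_id: str) -> dict:
    """Compact summary suitable for display alongside search results."""
    r = ratings.get(dataset_id, [])
    avg = round(sum(r) / len(r), 2) if r else None
    return {"count": len(r), "avg": avg}

# Hypothetical usage
rate("coastal_bathymetry", 4)
rate("coastal_bathymetry", 5)
```

A summary of count plus average, rather than raw comment lists, is one way to meet the 'no information overload' requirement during discovery.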

Requirements 2
Case studies or approaches were identified for each prioritised use case to further refine requirements and identify gaps.
The data management case study identified the following shortfalls:
- Limited re-use of quality measures
- Limited use of corrective measures
- Lack of quality monitoring
The review of Defence web service management found:
- A lack of metrics gathered for the monitoring and optimisation of services
- Reliance on bespoke systems with no enterprise solution
- At best, effective performance metrics and usage monitoring were provided, but the tools' bespoke nature limits their re-use potential
The review of feedback mechanisms and quality summaries found:
- The aspiration is for user-based comments and ratings
- These could improve the provision of more intelligent search and discovery

Assessments
The focus was on mainstream COTS data management tools. A Request for Information (RFI) was sent to twelve vendors; it incorporated user requirements and core functionality such as: the ability to monitor quality trends; the ability to disseminate quality information enterprise-wide; the ability to predefine and reuse quality rules; and the use of techniques such as standardisation and data linking for error detection.
The tools were scored based on the RFI responses, software demos, background information and the results of industry assessment by Gartner Inc. Key findings were:
- The top three products were Talend Data Integration and Data Quality; SAP BusinessObjects Business Intelligence and BusinessObjects Data Integrator; and Oracle Enterprise Data Quality.
- Pros: a quicker, more efficient QA process; proactive issue management by monitoring trends; consistent processes producing repeatable metrics; quality summaries that were easily disseminated and interpreted.
- Cons: incompatibility with GIS formats; limited or no ability to perform spatial analysis; out-of-the-box functionality focussed on 'customer'-type data.
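Scoring tools against key features typically reduces to a weighted matrix. The weights, feature names and scores below are invented for illustration; the actual criteria were derived from the DQM and requirements stages, and the actual scoring method used in the assessment is not described here.

```python
# Hypothetical feature weights (relative importance of each criterion)
weights = {
    "monitor_trends": 3,
    "enterprise_dissemination": 2,
    "reusable_rules": 3,
    "error_detection": 2,
}

def weighted_score(scores: dict) -> float:
    """Weighted average of per-feature scores (assumed 0-5 scale)."""
    total_weight = sum(weights.values())
    weighted = sum(weights[f] * scores.get(f, 0) for f in weights)
    return round(weighted / total_weight, 2)

# Hypothetical RFI scores for one vendor's tool
tool_a = {
    "monitor_trends": 5,
    "enterprise_dissemination": 4,
    "reusable_rules": 5,
    "error_detection": 3,
}
```

Missing responses default to zero here, which penalises vendors who do not answer a criterion; an alternative design would flag gaps for follow-up instead.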

Conclusions
User feedback:
- Qualitative parameters such as 'user feedback' are seen as highly important, yet they are not included in the quality specifications published by standards bodies.
- The Content Maturity Model (CMM) describes an ideal methodology for communicating data quality to decision makers, as it uses a combination of producer- and user-centric input. However, it relies on user feedback mechanisms being in place.
Data quality management:
- ISO 19157 describes quantitative parameter metrics, but there is a requirement for a common or minimum set to be defined, which will impact on the tooling required to capture them.
- The COTS data quality management tools assessed all provide advanced techniques for quality assurance and can provide additional metrics to support the SDI DQM.
- They all support quicker, more efficient workflows for quality management and the ability to communicate and monitor results effectively; however, 'quick fixes' could be provided using existing or free tools.
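To make the ISO 19157 point concrete: quantitative measures in the completeness family, such as rates of omission and commission, are simple ratios once a reference set of expected features is agreed. The sketch below assumes feature identifiers can be compared as sets; it illustrates the style of measure, not the formal ISO 19157 measure definitions.

```python
def omission_rate(expected_ids: set, captured_ids: set) -> float:
    """Fraction of expected features missing from the dataset
    (a completeness/omission-style measure)."""
    if not expected_ids:
        return 0.0
    missing = expected_ids - captured_ids
    return len(missing) / len(expected_ids)

def commission_rate(expected_ids: set, captured_ids: set) -> float:
    """Fraction of captured features that should not be present
    (a completeness/commission-style measure)."""
    if not captured_ids:
        return 0.0
    excess = captured_ids - expected_ids
    return len(excess) / len(captured_ids)
```

Agreeing a common minimum set of such measures is what determines whether a capture tool needs only set comparisons like these or richer spatial checks.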

Conclusions
Web service optimisation:
- Definitions of quality parameters for web services are highly variable, as is the method used to capture quantitative metrics (e.g. response time). These need to be defined in common implementation guidance.
- The WSQM parameters cover the functional and non-functional requirements of a service; their definitions and example metrics should form the basis for future web service requirement documents.
- Implementation of LogRhythm software as part of the SDI supports the provision of SDI WSQM metrics, with the exception of the 'business value quality' parameter and configuration management within the 'manageability quality' parameter.
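The point about inconsistent capture methods for response time can be illustrated: even a basic wall-clock measurement leaves open choices (mean vs percentile, client-side vs server-side) that common guidance would need to pin down. This sketch assumes client-side timing and a 95th-percentile summary, purely as one possible convention.

```python
import time
from statistics import mean, quantiles

def timed_call(fn, *args, **kwargs):
    """Measure client-side wall-clock response time of one service call."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed_ms = (time.perf_counter() - start) * 1000.0
    return result, elapsed_ms

def summarise(samples_ms: list) -> dict:
    """Mean and 95th-percentile latency; which statistic counts as
    'response time' is exactly what guidance must fix."""
    p95 = quantiles(samples_ms, n=20)[-1]
    return {"mean_ms": round(mean(samples_ms), 1),
            "p95_ms": round(p95, 1)}
```

Two implementations that both report 'response time' but differ on these choices will produce incomparable metrics, which is the interoperability gap the slide identifies.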

Questions