Quality of Service and Experience – Metrics Survey

Purpose and Approach: The QoSE DWG wanted to assess the current status of OGC members' interest in, and usage of, quality of service metrics and monitoring. The survey was conducted via Google Docs in late 2017, circulated on OGC mailing lists, and promoted via various channels available to QoSE DWG members (such as Canadian agencies).

Summary of results: Only 20 responses were received, although they came from a range of organisations. The small sample size calls for caution when analysing the results.

Summary of results: There is clear recognition of the importance of measuring the ongoing quality of web services.

Summary of results: Most respondents already monitor their web services, suggesting that sufficient tooling and infrastructure exist to enable this.

Summary of results: The focus of web service monitoring is on response time and availability as the key metrics; few respondents measure response content, accuracy, or conformance to standards. This suggests that any standardisation must support response time and availability, and that there may be a gap in tooling, best practices, or standards for measuring content or accuracy.
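
To make the two headline metrics concrete, here is a minimal Python sketch of the kind of in-house probe the survey describes: it records response time and availability for a single request, plus a shallow content check of the sort few respondents perform. The endpoint URL and the "WMS_Capabilities" marker string are illustrative assumptions, not part of the survey.

```python
import time

import requests  # third-party HTTP client: pip install requests

# Hypothetical endpoint and marker string; substitute your own service.
SERVICE_URL = "https://example.org/wms?SERVICE=WMS&REQUEST=GetCapabilities"
TIMEOUT_SECONDS = 10


def probe(url: str) -> dict:
    """Run one probe, recording the survey's two headline metrics
    (response time and availability) plus a shallow content check."""
    start = time.monotonic()
    try:
        response = requests.get(url, timeout=TIMEOUT_SECONDS)
    except requests.RequestException:
        # Network failure or timeout: the service counts as unavailable.
        return {"available": False, "response_time_s": None, "content_ok": False}
    elapsed = time.monotonic() - start
    return {
        # Availability here means the HTTP request succeeded (non-error status).
        "available": response.ok,
        "response_time_s": round(elapsed, 3),
        # Most monitors stop at the status code; inspecting the body at all
        # is the kind of content/conformance check the survey found to be rare.
        "content_ok": b"WMS_Capabilities" in response.content,
    }


if __name__ == "__main__":
    print(probe(SERVICE_URL))
```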

Summary of results: Most respondents use internal tools to monitor these metrics, which could suggest a lack of COTS or FOSS implementations (the lack of standardisation even in the wider IT industry may play a role here).

Summary of results: Most respondents do not let their users know how their services perform. Could a lack of standardisation in metric definitions be playing a role here?

Summary of results: Most respondents do not include these metrics in an SLA, which potentially suggests that these services either are not critical or are not seen as such.

Summary of results: Most respondents do see potential value in standardised methods of communicating quality of service. Are there two distinct areas here: standard definitions and measurement of metrics, and standard ways of communicating them?
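
One way to picture those two areas is a shared report format built on shared metric definitions. The structure below is purely hypothetical, sketched for discussion only; its field names do not come from any existing OGC or ISO encoding, and the values are made up.

```python
import json

# Purely illustrative report structure: the field names and layout are
# assumptions for discussion, not an existing OGC or ISO encoding.
qos_report = {
    "service": "https://example.org/wms",
    "reporting_period": {"start": "2017-11-01T00:00:00Z",
                         "end": "2017-11-30T23:59:59Z"},
    # Area 1: standard definitions and measurement of metrics.
    "metrics": [
        {"name": "availability",
         "definition": "successful probes / total probes",
         "value": 0.998},
        {"name": "response_time_p95_s",
         "definition": "95th percentile response time of GetCapabilities probes",
         "value": 1.42},
    ],
}

# Area 2: a standard way of communicating the metrics to users.
print(json.dumps(qos_report, indent=2))
```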

Questions? Michael Gordon, Michael.Gordon@os.uk, @G0rd82