A Basis for Better Decision Making: Improvements to Quality Reviews and Reporting Catherine Bremner and Sarah Tucker Office for National Statistics.

Motivation
- The Quality Centre is the ONS centre of methodological expertise in statistical quality and respondent burden
- It has been working over the past 12 months to streamline current quality initiatives and drive innovation

Outline
- Introduction to quality reporting
- Quality reporting developments
- Using quality reports to drive quality improvements
- The new quality review process at ONS

Quality Reporting Overview
Why do we report on quality?
- To give users enough information to decide on suitable uses for the data
- To meet obligations under the Code of Practice
- To meet requirements from Eurostat

Quality and Methodology Information
- Static quality information
- Updated on an annual basis, or when there are major methodological changes
- Includes: the history of the output; details of the statistical process; reports against the European Statistical System quality dimensions
- Extensively quality assured

Quality and Methodology Information

Current Format
- Five years old – time to review
- Changes to the website – accessibility
- More information on user personas: Inquiring Citizen, Data Forager and Expert Analyst
- Need to change the way we present the data – the content is unlikely to change much, as it meets our obligations

Reviewing the Presentation of Quality Information
- Aims to extend the reach of quality information to the user personas, alongside current QMIs
- Investigations currently focus on the information users need so that they do not misuse the data
- First steps: consulting with data producers, methodologists and other stakeholders

Reviewing the Presentation of Quality Information
- One size does not fit all
- A pool of suggested pieces of information to choose from
- These fit together to give a picture of how not to misuse the data
- Held focus groups, workshops and individual meetings to gather views on what should be considered for inclusion

Internal Consultation (Ongoing) – Points Raised
- Four focus groups have been held: two each for output managers and quality assurers
- Individual meetings with output managers and other stakeholders
- Some themes are coming through consistently: uncertainty (e.g. is it an estimate?), periodicity, coherence, accuracy and health warnings
- As expected, the emphasis differs across types of output – administrative data vs. survey

Next Steps
- Compile a shortlist of these vital pieces of information
- Work with data producers to produce a pilot
- User test
- Review results
- Propose action

Quality Reviews at ONS
- Previously carried out Annual Quality Reviews
- These made use of the Quality, Methodology and Harmonisation Tool (QMHT)
- Responding to feedback, a new process was developed during summer 2014

Regular Quality Reviews (RQR)
- A face-to-face meeting between the output manager, Deputy Director, G6/G7 methodologist and a Quality Centre representative
- Makes use of existing information from Value Engineering and Quality and Methodology Information (QMI) reports, which are discussed at the meeting
- The Quality Assurance (DD) walk-through is incorporated into the meeting

Regular Quality Reviews (RQR)
- The G6/G7 methodologist produces bespoke recommendations, which are presented at management/project boards
- Each output is reviewed once every three years
- Supports the Code of Practice
- Piloted on three outputs during summer 2014
- Process rolled out in November 2014; 13 reviews conducted to date

Why Are RQRs Required?
- They support the requirements of the Code of Practice
- They assess the output
- They add value to the output through bespoke recommendations
- They allow recommendations to be addressed, improving the output

What Do We Expect?
- What we already require: QMI, Quality Assurance (DD) walkthrough, quality checklist
- The Quality Centre collates all the relevant documents to be reviewed, e.g. QMI, Value Engineering and assessment reports
- The output manager, DD and team members (as required) attend the meeting

Recommendations – Adding Value
- Aim to write up within two weeks of the meeting
- Each recommendation is assigned a RAG (red/amber/green) status
- Prioritised from both a methodological and an ONS view
- The timetable for when and how recommendations are dealt with depends on what they are
- They range from a quick win, such as updating the QMI, to a longer-term plan of getting work onto the business plan
- The Quality Centre has a role in monitoring progress

Relevance for the GSS
- The Regular Quality Review approach is a tool that could be used across the GSS
- It is expected that a senior statistician could carry out the review
- Alternatively, support is available from the Methodology Advisory Service
- The approach will be included in the GSS quality resources

Contacts
Quality Reporting: Sarah Tucker
Regular Quality Reviews: Catherine Bremner