Performance Measurement: Indicators
Sara Grainger, Statistical Support for Public Bodies, Office of the Chief Statistician

Indicators of outcomes
The Scottish Government wants an outcome-based approach, not an indicator-based approach. But outcomes are rarely directly measurable, so indicators give an indication of whether the outcome is being achieved.

Outline of presentation
1. Important criteria for good outcome indicators
2. What data are available and how they can be accessed
3. Where to find more information and access support and guidance on measuring outcomes

Criteria for indicators 1. Relevant and unambiguous
The indicator should be clearly and directly relevant to at least one of the high-level outcomes being sought. It need not be a direct measure of the outcome, but it should be a clear and unambiguous indicator of progress toward that outcome. The definition should allow non-experts to understand the indicator, and there should be no scope for misinterpretation.

Criteria for indicators 2. Harmonised with other frameworks and concepts
The definition of the indicator should be harmonised with any similar measures used in other frameworks, performance management systems, legislation, or national or international conventions.

Criteria for indicators 3. Timely and accessible
The data should be published regularly. The time lag between recording and reporting should be minimal. The data should be easily accessible to all.

Criteria for indicators 4. Statistically robust
For data from surveys: the data should be precise enough to measure change, i.e. the confidence intervals should be narrow enough that it can be reliably reported whether or not the target has been achieved at the relevant geography or for the given sub-group. The data should be based on a sample that is representative of the relevant population and collected using recognised best practice in surveys.
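A minimal sketch, in Python, of the precision check described above, assuming a simple random sample and the normal approximation for a proportion; the sample size, estimate and target are illustrative assumptions, not real survey values.

    import math

    n = 1200       # hypothetical sample size for the sub-group
    p_hat = 0.62   # hypothetical estimated proportion achieving the outcome
    target = 0.60  # hypothetical target level

    # Approximate 95% confidence interval for a proportion.
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    lower, upper = p_hat - 1.96 * se, p_hat + 1.96 * se
    print(f"estimate {p_hat:.3f}, 95% CI ({lower:.3f}, {upper:.3f})")

    # The indicator is only precise enough to report against the target
    # if the whole interval lies on one side of it.
    if lower > target:
        print("target reliably achieved")
    elif upper < target:
        print("target reliably missed")
    else:
        print("CI straddles the target: the data are too imprecise to say")

In practice the standard error would come from the survey's own design-based estimates (allowing for clustering and weighting) rather than this simple formula.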

Criteria for indicators 4. Statistically robust
For data from administrative systems: all bias and error in the data should be recognised and the implications assessed against the usefulness of the data. There should be minimal risk of changes in systems and recording practice over time. The data should be fully quality assured and auditable.
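As one illustration of such quality assurance (an assumed example, not an official method), the sketch below flags year-on-year changes in an administrative count that are large enough to suggest a change in systems or recording practice rather than real change.

    # Illustrative administrative series and threshold; both are assumptions.
    counts = {2005: 1040, 2006: 1065, 2007: 1052, 2008: 1410, 2009: 1432}
    threshold = 0.15  # flag changes of more than 15% for investigation

    years = sorted(counts)
    for prev, curr in zip(years, years[1:]):
        change = (counts[curr] - counts[prev]) / counts[prev]
        if abs(change) > threshold:
            print(f"{prev}->{curr}: {change:+.1%}, check for a recording change")

Flagged values would then be checked against known system changes before the series is used to report progress.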

Criteria for indicators 5. Affordable
The cost of collecting the data to a sufficient quality standard should be outweighed by the usefulness of the data.

Scottish Neighbourhood Statistics (SNS)
Contains a HUGE amount of data, mostly at data zone level. Easy to download, map, and compare across time or area. The SNS team are happy to provide demonstrations and assistance.

CONTACT US!!

More information
Performance indicators: good, bad, and ugly (a report by the RSS working party on performance monitoring in the public services)
Scottish Government Statistics Group Methodology Glossary

Where to go for support
The Analysts Network: includes details of the SOLACE-led "Improving Local Indicators Project" – all welcome to get involved!
Statistical Support for Public Bodies Branch: