The application of “Value Engineering” tools to risk assess the outputs of an NSI
Graham Sharp, Manager, Continuous Improvement Zone, ONS, UK

Problem & goal statements

Problem Statement:
- No consistent means of assessing the risks of our statistical outputs in a standardised way
- A strategic approach is needed to prioritise improvements

Goal Statement:
- Develop a risk assessment methodology and deliver a scored risk assessment of ONS statistical outputs by the end of 2012

Scope & solution requirements

In Scope:
- Evaluate all statistical outputs and the statistical system(s)/tool(s) used to produce them
- Consider the entire GSBPM (Generic Statistical Business Process Model)
- Consider a number of dimensions of risk

Solution requirements:
- Measure the risk of each ONS output
- Allow outputs to be compared against each other
- Allow drill-down to identify the root causes of scores
- Simple to apply
- Capable of self-assessment by output managers

High level project stages
1. Agree coverage/scope of the project with Directors
2. Communicate with staff
3. Design solution dimensions and weighting
4. Set up the solution template
5. Pilot with business areas
6. Refine the tool if necessary
7. Collect information from all business areas
8. Collate into the template tool and produce a RAG status per system/output
9. QA the results
10. Produce analysis outputs

Value Engineering

Wikipedia definition: “Value engineering (VE) is a systematic method to improve the ‘value’ of goods or products and services by using an examination of function.”

Applied to statistical outputs, it provides a systematic risk assessment against a number of dimensions.

Dimensions – Are they ‘fit for purpose’?

- Sources: Census data; Admin data; Survey data
- Methods: Data acquisition/Questionnaire design; Coverage of data; Processing, edit & imputation; Analysis; Disclosure
- Systems: System named and reason for red/amber provided
- Processes: Data collection & preparation; Results & analysis
- Quality: the European dimensions of Relevance; Accuracy; Timeliness & Punctuality; Accessibility & Clarity; Comparability; Coherence
- Users & Reputation: User feedback; Future user needs; Reputation
- People: Are there sufficient skilled and trained people working on the output?
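For illustration, the template's dimension structure could be represented in code. The sketch below (Python) uses the names from this slide, but the dict-of-lists layout is an assumption rather than the actual ONS template format.

```python
# Illustrative sketch only: the dimension and sub-dimension names come
# from the slide; holding them as a dict of lists is an assumption,
# not the format ONS used.
DIMENSIONS = {
    "Sources": ["Census data", "Admin data", "Survey data"],
    "Methods": ["Data acquisition/Questionnaire design", "Coverage of data",
                "Processing, edit & imputation", "Analysis", "Disclosure"],
    "Systems": ["System named and reason for red/amber provided"],
    "Processes": ["Data collection & preparation", "Results & analysis"],
    "Quality": ["Relevance", "Accuracy", "Timeliness & Punctuality",
                "Accessibility & Clarity", "Comparability", "Coherence"],
    "Users & Reputation": ["User feedback", "Future user needs", "Reputation"],
    "People": ["Sufficient skilled and trained people working on the output"],
}
```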

Process for first implementation
1. Review of the template with each Deputy Director (DD)
2. Three pilot sessions with output managers
3. Template updated
4. Self-assessment by output managers
5. Quality assurance by DDs, data collection areas and Methodologists
6. Importance weights reviewed by Directors
7. Challenges responded to by output managers/DDs
8. Results collated

Scoring process

Each sub-dimension is scored on a three-point scale:

    Assessment                    Score   Evidence
    No issues or N/A              0       N/A
    Some improvements possible    3       Comments
    In need of attention          9       Comments

Example: sub-scores and summary score per output for the Systems dimension:

    Output   Sub-scores              Summary score
    A        Sub 1, Sub 2, Sub 3     9
    B        Sub 1, Sub 2, Sub 3     3
    C        Sub 1, Sub 2, Sub 3     3

The full template records, for each output (A, B, C, ...), a score against each dimension (Sources; Methods; Systems; Processes; Quality; Users & Reputation; People), plus a summary score, a weighting and a composite score.

DD to complete: confirm the relative importance of the output to users and the impact on ONS reputation if results were erroneous. Low = 1, Med = 2, High = 3.
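The slide shows the scale, the weighting and the resulting composite score, but not the exact arithmetic. The sketch below (Python) is one plausible reading: the summary score is taken as the worst (maximum) sub-score, which is consistent with the example rows, and the composite score multiplies the summary by the DD weighting. Both rules are assumptions, not stated ONS methodology.

```python
# Hedged sketch of the scoring arithmetic. ASSUMPTIONS (not stated on
# the slide): summary = worst (maximum) sub-score; composite = summary
# * DD weighting (1-3); the 0/3/9 scale maps onto Green/Amber/Red.
SCORES = {"No issues or N/A": 0,
          "Some improvements possible": 3,
          "In need of attention": 9}

def summary_score(sub_scores: list[int]) -> int:
    """Assumed rule: an output is only as safe as its worst sub-dimension."""
    return max(sub_scores)

def composite_score(summary: int, weighting: int) -> int:
    """Assumed rule: scale the summary by the DD importance weighting."""
    return summary * weighting

def rag(score: int) -> str:
    """Assumed mapping of the 0/3/9 scale onto a RAG status."""
    return {0: "Green", 3: "Amber", 9: "Red"}[score]

# Example: sub-scores 3, 0, 9 with a hypothetical 'High' (3) weighting.
subs = [3, 0, 9]
s = summary_score(subs)                          # 9
print(rag(s), composite_score(s, weighting=3))   # Red 27
```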

Overall assessment

                      Nov 2012   Nov 2013
    % red overall     21.4%      18.7%
    % amber overall   46.8%      48.1%
    % green overall   32.0%      33.2%

- The baseline measure of ‘% red overall’ is used as a KPI in ONS business planning
- Data collection was carried out in November 2012 and again in November 2013
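A minimal sketch of how the headline percentages could be derived from per-output RAG statuses (the statuses listed are invented for illustration):

```python
from collections import Counter

# Invented per-output RAG statuses for one collection round.
statuses = ["Red", "Amber", "Green", "Amber", "Red", "Green", "Amber"]

counts = Counter(statuses)
for colour in ("Red", "Amber", "Green"):
    share = 100 * counts[colour] / len(statuses)
    print(f"% {colour.lower()} overall: {share:.1f}%")
```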

Highest scoring outputs
- Analysis of scores in November 2012 and November 2013 identified the ten highest-scoring outputs, i.e. those at highest risk
- Reasons for movements over the year were also analysed, to identify existing mitigating actions and what remains to be addressed

Analyse - Boxplots
- Compare scores after one year (see the plotting sketch below)
- Identify improvements made
- Confirm new candidate surveys for improvement
- Review the cause of outliers and the corrective action required
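A minimal plotting sketch for the year-on-year boxplot comparison, assuming the composite scores for each round are held as simple lists (the numbers are invented):

```python
import matplotlib.pyplot as plt

# Invented example data: composite risk scores per output, per round.
scores_2012 = [27, 9, 3, 18, 9, 6, 27, 3, 9, 18]
scores_2013 = [18, 9, 3, 9, 6, 6, 27, 3, 3, 9]

fig, ax = plt.subplots()
ax.boxplot([scores_2012, scores_2013])   # boxes sit at x = 1 and x = 2
ax.set_xticks([1, 2], ["Nov 2012", "Nov 2013"])
ax.set_ylabel("Composite risk score")
ax.set_title("Output risk scores, year-on-year comparison")
plt.show()
```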

Findings by Dimensions - Processes
[Charts shown: percentage of outputs scoring red/amber/green for this category; number of red outputs by Division for this category; percentage of red/amber/green outputs for each Division for this category]

Use of analysis to date (1)
- Prioritisation of National Statistics Quality Reviews
- Input to survey action plans: identifying and prioritising key improvements required
- Identifying local continuous improvement initiatives
- Prioritising developments and influencing budget allocations
- Sense-checking where we are currently investing in developments

Use of analysis to date (2)
- Deploying our skilled people to reduce risks in key areas
- Improving communications on outputs
- Highlighting where we need careful stakeholder handling

Conclusion
- The model meets its intended purpose
- It is gaining in popularity and application
- It has become a key tool in the risk assessment of ONS outputs
- Be aware that scores are based on self-assessment, though mitigating actions are in place
- It should be used as part of a wider range of risk and quality assessment tools

Questions