EU Code of Practice Peer Review 2006–08: A Peer's Perspective. Frank Nolan, Office for National Statistics, UK.

Contents
What I plan to cover:
–Peer Review Process
–Challenges
–Lessons

Code of Practice
–15 Principles
–3 sections
  –Governance
  –Processes
  –Outputs
–Many Indicators
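As an illustrative aside (not part of the original deck): the 15 principles of the 2005 Code of Practice fall into the three sections named above, roughly principles 1–6 (governance / institutional environment), 7–10 (processes) and 11–15 (outputs). A minimal Python sketch of that structure, assuming this standard grouping:

```python
# Sketch only: the 2005 European Statistics Code of Practice groups its
# 15 principles into three sections; each principle is backed by indicators.
CODE_OF_PRACTICE_SECTIONS = {
    "Governance (institutional environment)": list(range(1, 7)),   # principles 1-6
    "Processes": list(range(7, 11)),                                # principles 7-10
    "Outputs": list(range(11, 16)),                                 # principles 11-15
}

assert sum(len(p) for p in CODE_OF_PRACTICE_SECTIONS.values()) == 15
```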

The Peer Review Process

Peer Review Objectives
–Enhance accountability and build trust in the integrity of the ESS, its processes and outputs
–Transparency
–Sharing Best Practice

Peer Review Scope
–The NSI of the country
  –NOT other producers (Central Bank)
–EU statistics
  –Difficult in practice not to consider all
–Principles 1–6 and 15
  –Assessment for all 15 principles

Peer Review Outcome
–Published report
  –Agreed with team
  –Agreed with Eurostat desk
  –Agreed with NSI
–Best Practice
–Improvement Actions

Peer Review Team
–Three members
  –Two from NSIs
  –One from Eurostat
  –Not from the reviewed NSI
–Senior staff – experienced

Peer Reviewers Training
–One day in Brussels
–Interactive sessions
–Common understanding of the Principles
–Common understanding of the grading
  –Fully met / largely met / partly met / not met
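Purely for illustration (not in the original slides): the four-level grading scale agreed in training could be recorded per indicator roughly as below; the indicator reference and evidence text are hypothetical.

```python
from dataclasses import dataclass
from enum import Enum


class Grade(Enum):
    """Four-level compliance scale used in the 2006-8 peer reviews."""
    FULLY_MET = "fully met"
    LARGELY_MET = "largely met"
    PARTLY_MET = "partly met"
    NOT_MET = "not met"


@dataclass
class IndicatorAssessment:
    principle: int      # Code of Practice principle number (1-15)
    indicator: str      # indicator reference, e.g. "6.1" (hypothetical)
    grade: Grade
    evidence: str = ""  # notes gathered before and during the NSI visit


# Hypothetical example record for one indicator
example = IndicatorAssessment(
    principle=6,
    indicator="6.1",
    grade=Grade.LARGELY_MET,
    evidence="Release policy published; pre-release access list still informal.",
)
print(f"Principle {example.principle}, indicator {example.indicator}: {example.grade.value}")
```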

Peer Review Work Programme
–Preliminary reading
  –NSI documentation
  –NSI web site, etc.
–Preliminary meeting
–3-day visit to NSI
–Post-visit writing
–Post-visit agreement on report

Peer Review NSI Visit
–Interviews with:
  –Director General
  –Executive staff
  –Users – Bank, Treasury
  –Trade unions, employers' organisations
  –Media
  –Statistical Council
  –Junior staff
–On behalf of the EU

Peer Review Work Allocation
–Principles 1–6 and 15
  –2 or 3 for each reviewer
  –Read, question, write up
–Best Practice / coordination roles
–Team leader
  –Leads discussions, meetings
  –Guides and signs off the report
  –Coordinates with NSI and Eurostat
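A toy sketch of the split described above (reviewer labels are hypothetical): the seven principles in scope during the visit, divided two or three per reviewer in round-robin fashion.

```python
from itertools import cycle

# Principles assessed in depth during the visit (1-6 plus 15, per the deck)
PRINCIPLES_IN_SCOPE = [1, 2, 3, 4, 5, 6, 15]
reviewers = ["reviewer_a", "reviewer_b", "team_leader"]  # hypothetical labels

allocation = {name: [] for name in reviewers}
for principle, name in zip(PRINCIPLES_IN_SCOPE, cycle(reviewers)):
    allocation[name].append(principle)

print(allocation)
# {'reviewer_a': [1, 4, 15], 'reviewer_b': [2, 5], 'team_leader': [3, 6]}
```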

Peer Review Documentation
–A Peers' Guide
  –Version 2
–Administrative instructions
–Report template
–Guide to questions for each indicator and interviewee
–A Country Guide

Challenges

Lack of Time
–Additional to current full-time job
–Considerable material to read before the visit
–Considerable amount of material to assimilate in three days
–The NSI visit was just the beginning of the end!

Lack of Time: Example – Reading
–Country's self-assessment
–Comparison with all self-assessments
–Statistical Law
–Statistical policies
–Statistical plans and programmes
–Annual reports
–User survey results
–Statistical releases
–Web site

Lack of Time: Example – Cyprus
–Tuesday night: meeting in Nicosia
–Wednesday: meeting with Director and senior staff
–Meeting with confidentiality committee
–Meeting with Data Dissemination Officer
–Meeting with Ministry of Agriculture, Civil Registry, Inland Revenue
–Thursday: 8 hours of meetings including media, Bank, Ministry of Finance, users, junior staff
–Thursday: draft, collate and agree Improvement Actions
–Thursday: draft overview

Lack of Time: Example – Cyprus
–Friday: present overview to Director and full senior management team
  –Discuss each of six Principles and related improvements
–Friday: discuss overview and improvement actions with the Permanent Secretary in the Ministry of Finance

Languages and Culture
–Review conducted in English
–Materials translated into English
–Not how we do it at home
–Lack of common English in teams

Dynamic nature of the process
–Self-assessment taken late 2005
  –Some time before the visit
–Continuous improvement
–Improvement Actions sometimes undertaken immediately

Static nature of the process
–Law and legislation
–Staffing and finance
–Methods
–Information technology

Sorting What is Important
–Large amount of information collected
–Decisions on what is important for the Code
–Decisions on what is achievable

Lessons

Did it Add Value?
–Gave an overview
–Created some changes
–Created awareness across NSIs
–Highlighted strengths and limitations
–Added knowledge / shared best practice
–Provided a focus on quality within the NSI
–Provided a base for increased trust

The NSI Agenda
What NSIs want out of the process:
–More money
–More staff / updated IT
–International recognition
  –Doing well [league table – most points]
–Changes
  –Next slide

The NSI Agenda
What NSIs want out of the process:
–Changes
  –New Statistical Law – Iceland
  –Higher status of DG – Cyprus
–Impetus
  –The kick to get things started / finished

Other People's Agenda
What others want out of the process:
–Changes
  –Less macroeconomic forecasting – Norway Bank
  –More access to unit record data …

Other Lessons
–Significant commitment from NSI senior managers – a lot of time
–Excellent hospitality
–EU commitments were often significant for small countries

Thanks to:
–The teams that I worked with
–The Eurostat staff, Martina and Solvegia
–The staff of Statistics Iceland, Statistics Norway and Cystat

Thank You