Breakout session of the measurement subgroup, CD4.0


Breakout session of the measurement subgroup, CD4.0 TT: A review of statistical capacity assessments. François Fonteneau (PARIS21)

Outline
- Purpose and methodology
- Results
- Possible next steps

Purpose
- Check what assessments actually measure, and what they mean by capacity.
- Measure the response burden on NSOs and other respondents (ongoing).
- Propose an analytical question bank (next).

Method
Step 1: Select 14 assessment tools on statistical capacity (1,974 questions and indicators in total).
Step 2: Code each question/indicator to one or two dimensions of the CD 4.0 Framework, using the OAR, resulting in 2,353 relations.
Step 3: Quantitative and qualitative analysis of the questions and assessments.

We have included fourteen assessments of statistical capacity in our Open Assessment Repository (OAR):
- the Self-Assessment Guidance Questionnaire from UNECA (SAGQ);
- the Snapshot (Eurostat);
- the Country Assessment of Agricultural Statistical Systems in Africa from AfDB (ASSA);
- the Tool for Assessing Statistical Capacity from the US Census Bureau (TASC);
- the Light Self-Assessment Questionnaire on the implementation of the European Statistics Code of Practice from the European Commission and OECD (Light SAQ);
- the Statistical Capacity Indicators from the World Bank (SCI);
- the Generic National Quality Assessment Framework from UNSD (Generic NQAF);
- the Global Assessment of the National Statistical System from Eurostat and EFTA (GANSS);
- the Data Quality Assessment Framework for National Accounts Statistics from the IMF (DQAF for National Accounts);
- Assessing the National Health Information System: An Assessment Tool from HMN (HIS);
- the African Statistical Development Indicators from UNECA (StatDI);
- the Environment Statistics Self-Assessment Tool from UNSD (ESSAT);
- the Pan-African Statistics Programme: Peer Reviews of NSIS/NSSS in African countries from Eurostat and AUSTAT (PAS);
- the extra modules added by the IDB to the Tool for Assessing Statistical Capacity (TASC v.IDB).
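The coding in Step 2 maps each question to one or two dimensions, which is why 1,974 questions yield 2,353 relations rather than a one-to-one count. A minimal Python sketch of that step, using invented example questions and dimension names (the real coding lives in the OAR):

```python
# Hypothetical illustration of Step 2: each question/indicator is coded to
# one or two CD 4.0 dimensions, so the flattened list of question-dimension
# relations is larger than the list of questions.
codings = {
    "Does the NSO publish a quality policy?": [
        "Methods, practices and quality control",
    ],
    "Is the statistical law up to date?": [
        "Laws and reference frameworks",
        "Standards and regulations",
    ],
    "Does the NSO have a staff training plan?": [
        "Individual resources: education and work experience",
    ],
}

# Flatten to (question, dimension) pairs -- the "relations" counted in Step 2.
relations = [(q, d) for q, dims in codings.items() for d in dims]

print(f"{len(codings)} questions -> {len(relations)} relations")
```

With the three example questions above, one question coded to two dimensions produces four relations from three questions, mirroring in miniature how 1,974 questions produce 2,353 relations.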

Method
- US Census Bureau / IADB: Tool for Assessing Statistical Capacity (TASC)
- UNECA: Self-Assessment Guidance Questionnaire (SAGQ)
- UNECA: African Statistical Development Indicators (StatDI)
- Eurostat: Snapshot
- Eurostat / EFTA: Global Assessment of the National Statistical System (GANSS)
- GS (FAO/AfDB): Country Assessment of Agricultural Statistical Systems in Africa (ASSA)
- EC / OECD: Light Self-Assessment Questionnaire on the implementation of the European Statistics Code of Practice (Light SAQ)
- Eurostat / AUSTAT: Pan-African Statistics Programme: Peer Reviews of NSIS/NSSS in African countries (PAS)
- World Bank: Statistical Capacity Indicators (SCI)
- HMN: Assessing the National Health Information System (HIS)
- UNSD: Environment Statistics Self-Assessment Tool (ESSAT)
- UNSD: Generic National Quality Assessment Framework (NQAF)
- IMF: Data Quality Assessment Framework for National Accounts Statistics (DQAF)

These 13-14 assessments have different objectives (measure capacity; measure capacity needs; measure data output against standards) and different survey modes (survey; self-assessment; peer review; etc.).

Results
- Assessments focus mainly on organisational skills and knowledge.
- Power, politics and incentives are only marginally assessed.
- Although most capacity development programmes target individuals, their capacities are marginally assessed.

We looked at the 14 most commonly used assessments of statistical capacity (including the WB Statistical Capacity Indicators) and mapped their questions/indicators to our Capacity Development 4.0 framework. These are the preliminary conclusions. (Chart: categories ranked by relevance; shares of 36%, 12% and 2% shown for selected categories.)

Altogether, the top five dimensions represent more than half of the questions/indicators. 'Methods, practices and quality control' and 'Standards and regulations' make up 35% of the questions; the following three (transparency, laws and reference frameworks, and strategic planning) make up another 19%. The remaining 46 dimensions account for 46% of the sample. What does this mean? Be careful in interpretation.
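The concentration described above can be checked mechanically: rank the dimensions by their share of coded relations and sum the shares of the top five. A short sketch with made-up counts (the actual figures come from the 2,353 coded relations, not these numbers):

```python
from collections import Counter

# Made-up relation counts per dimension, for illustration only.
counts = Counter({
    "Methods, practices and quality control": 400,
    "Standards and regulations": 300,
    "Transparency": 150,
    "Laws and reference frameworks": 130,
    "Strategic planning": 100,
    "Data dissemination": 60,
    "Coordination": 40,
    "Individual resources: education and work experience": 20,
})

total = sum(counts.values())
ranked = counts.most_common()  # dimensions sorted by count, descending

# Cumulative share of the five most-covered dimensions.
top5_share = sum(n for _, n in ranked[:5]) / total
print(f"Top 5 of {len(ranked)} dimensions cover {top5_share:.0%} of relations")
```

The same ranking also surfaces the "orphan" end of the distribution: the dimensions at the bottom of `ranked` are the ones that are barely assessed.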

Crowded: 'Methods, practices and quality control'
Purpose: map these questions to GSBPM phases and to data quality sub-dimensions.
Results: uneven distribution of questions.
'Methods, practices and quality control' is the most covered dimension within the category Organisational Skills & Knowledge, and the most covered across all the assessments. The dimension broadly refers to the series of steps and processes that statistical organisations follow to decide what to produce; to design and implement the collection, processing, analysis and dissemination of statistical products; and to conduct evaluations of past implementations to improve future ones. 125 questions were selected from seven assessments. Crowded overall, but are some sub-dimensions orphans?

Orphan: 'Individual resources: education and work experience'
We tentatively proposed questions such as:
- Is the NSO able to attract enough candidates from the national/regional labour market to meet its staffing needs?
- What is the minimum educational attainment in statistics or a related field required to qualify for a statistician or managerial position in the NSO?
- Does the NSO have a policy requiring future hires to have experience working with statistical software in a similar environment?
- Does the NSO have a young professionals programme?
- …

Education refers to "organised and sustained communication designed" to transmit a body of knowledge and bring about learning and development in different subjects or in a specific field (OECD 2001; Merriam-Webster, 2017). The educational background, attainment and field of specialisation of graduates, especially at the university level, are important for the NSO to recruit new staff into the relevant positions. Work experience refers to the skills and familiarity with a field of knowledge gained by actual practice in a specific field or occupation. Balancing the work experience of NSO employees is relevant to ensure successful knowledge transfer from other institutions, such as academia, as well as within the organisation. Striking the right balance between junior and senior staff ensures that knowledge is accumulated in an efficient way.

The problem: what is relevant and required to be measured? This links to operationalisation.

Next steps
Further analysis:
- Further assessments? Which ones?
- Further analysis by type of assessment?
- Further "crowded category" work?
- Further "orphan category" work?
- Towards a statistical capacity question bank?
- Further document the response burden?

Thank you
Francois.Fonteneau@oecd.org
www.paris21.org