>> Debriefing D1: EU Twinning Project Support to the Israeli Central Bureau of Statistics, Jerusalem, 21-24 July 2013


>> Debriefing D1: EU Twinning Project Support to the Israeli Central Bureau of Statistics, Jerusalem, 21-24 July 2013

>> Expected results

Mandatory results for component D:
- Quality control methods and tools for monitoring field interviewers
- A manual of guidelines for interviewers

Expected output of the D.1 activity:
- Mission report on methods for managing and monitoring field interviewers
- Input to a manual for interviewers

>> Recommendations (1)

- More interviewers per supervisor, but find the balance and the optimal number of supervisors
- More flexibility in the organisation of the work
- Involve the individual interviewer and supervisor in several surveys
- To some extent, letting the interviewer choose her own working hours can help quality and efficiency
- Harmonisation of questionnaires ... and, in due course, of software and hardware

>> Recommendations (2)

- Feedback procedures to interviewers
  - More is good
  - Based on improved indicators
- Organisational / strategic issues
  - A sampling / survey expert in the Data Collection Division?
    - Dialogue becomes even easier
    - More flexibility within the very short time frame, for example when "thinning the sample" (sub-sampling) becomes necessary (see the sketch after this slide)
  - Use of administrative registers
    - Improving the quality of sampling
    - Possibility of cross-checking the quality of the collected data
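A minimal sketch of what "thinning the sample" could look like in practice: a simple random sub-sample is drawn from the units already selected, and the design weights of the retained units are scaled up accordingly. The function name and the figures below are illustrative assumptions, not part of the CBS procedure.

```python
import random

def thin_sample(sample_units, target_size, seed=None):
    """Illustrative sub-sampling: keep a simple random subset of an
    already-selected sample when the field period is too short to
    work all units (names and figures are hypothetical)."""
    if target_size >= len(sample_units):
        return list(sample_units)
    rng = random.Random(seed)
    return rng.sample(sample_units, target_size)

# Example: 1,200 sampled addresses, but time for only about 900 interviews.
original = [f"address_{i}" for i in range(1200)]
thinned = thin_sample(original, 900, seed=2013)
print(len(thinned))  # 900
# To keep estimates approximately unbiased, the design weights of the
# retained units would be multiplied by 1200 / 900.
```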

>> Recommendations (3)

- Indicators for planning and monitoring (see the sketch after this slide)
  - Capacity planning
  - Quantitative indicators
  - Qualitative indicators
  - Self-reporting by the interviewers on actual time used
- Redefine the role of the supervisor
  - Limited involvement in training
  - Emphasis on coaching interviewers
  - Control done by HQ, with action taken by supervisors
- Training
  - Only by professional trainers who specialise in this
  - Covering all facets of training, including eLearning possibilities
  - Look for methods to revisit and study the instructions/manuals
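As a hedged illustration of how self-reported time could feed a quantitative capacity-planning indicator, the sketch below turns sample size, expected response rate and average minutes per completed response into a required number of interviewers. All figures and names are assumptions for the example, not CBS values.

```python
def interviewers_needed(sample_size, expected_response_rate,
                        minutes_per_response, hours_per_interviewer):
    """Rough capacity planning: interviewer capacity required to work a
    sample, based on self-reported average time per completed response."""
    expected_interviews = sample_size * expected_response_rate
    total_hours = expected_interviews * minutes_per_response / 60
    return total_hours / hours_per_interviewer

# Example: 5,000 sampled households, 80% expected response,
# 45 minutes per completed response (travel and callbacks included),
# 60 available field hours per interviewer in the survey period.
print(interviewers_needed(5000, 0.80, 45, 60))  # -> 50.0
```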

>> Recommendations (4)

- Interviewer manuals
  - Establish a one-for-all generic manual
  - Keep the same structure for all surveys (manuals), even though there may, in some cases, be very little to say

  General                        | Survey specific
  -------------------------------|----------------------------------------
  About CBS and confidentiality  | About this survey
  Sample concept                 | This sample
  Questionnaires                 | This questionnaire
  Approach strategy              | This approach strategy
  Technical issues               | Specific technical issues, this survey

>> Recommendations (5)

- General guide (e.g. … pages)
  - About CBS and confidentiality
  - Sample concept (representativity, sample frame and registers, how to handle the address units, etc.)
  - How to handle a questionnaire
    - Interview technique
    - Interview rules (how to read, etc.)
    - Other things
  - Approach strategy: handling addresses, people who have moved, obtaining co-operation, the administrative part of the interviewing, etc.
  - Technical issues (software and laptop, data transmission, etc.)
- Survey-specific guide
  - About the survey
  - This sample
  - This questionnaire
  - Special approach issues related to this survey
  - Technical issues related to this survey

>> Indicators

What kind of indicators should be developed?

- Indicators should be
  - Measurable
  - Provided with a standard
  - Interpretable
- Further develop the two current indices used by CBS on time spent per response
- Be systematic and follow the format: name, definition, variables, standard, action, purpose, presentation source, period under review (a sketch of this format follows this slide)
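A minimal sketch of how the suggested indicator format could be captured as a structured record. The field names follow the slide above; the example entry for a time-per-response index uses hypothetical wording and a hypothetical standard, not the actual CBS definitions.

```python
from dataclasses import dataclass

@dataclass
class QualityIndicator:
    """One entry in the Quality Indicator System, following the format
    listed on the slide above."""
    name: str
    definition: str
    variables: str
    standard: str              # target value or acceptable range
    action: str                # what is done when the standard is not met
    purpose: str
    presentation_source: str   # how and where the indicator is reported
    period_under_review: str

# Hypothetical entry for one of the time-per-response indices.
time_per_response = QualityIndicator(
    name="Average interviewer time per completed response",
    definition="Self-reported interviewer hours divided by the number "
               "of completed interviews",
    variables="self-reported hours; number of completed interviews",
    standard="within 10% of the planned time for the survey",
    action="supervisor reviews the workload and coaches the interviewer",
    purpose="capacity planning and monitoring of fieldwork efficiency",
    presentation_source="monthly table per interviewer and per survey",
    period_under_review="calendar month",
)
```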

>> Work plan

- Before November 2013
  - Define the index / structure for the manuals
  - Put the two existing indices into the suggested format of the Quality Indicator System
- Before June 2014 (study visit)
  - Coordinate the CAPI work with the CATI part of the project
    - There are some differences
    - Some surveys (e.g. the LFS) use both CAPI and CATI
  - Fill in the general part of the manual
  - Fill in the specific part for two surveys that differ in nature