1 Fieldwork logistics and data quality control procedures Kathleen Beegle Workshop 17, Session 2 Designing and Implementing Household Surveys March 31, 2009

2 What makes a good survey?
Relevance
- Answers the policy questions of import
Quality
- Methodology
- Accuracy and reliability
- Adherence to international statistical guidelines
Timeliness
- Punctuality
- Production time
- Comparability over time
Sustainability
- Public dissemination
- Documentation

3 Overview
- Overview of the main actors
- Quick description of the broad activities of an LSMS
- Options for the organization of fieldwork, and quality control aspects

4 Who does what
Core team
- Principal Investigators who understand the purpose of the survey
Local team
- Will implement the survey and process the data
- For an LSMS, usually the national statistics office
- For smaller/specialized surveys, usually local firms
- Need to identify who in the local team is the project manager
Technical assistance
- Short-term consultants for specialized expertise such as sample design and data entry system development
- Long-term assistance: a possible project head if one is lacking in the local team

5 Activities
Broad activities involved in an LSMS:
- Management/logistics
- Questionnaire development
- Sampling
- Field staff: recruitment, training, payments
- Fieldwork
- Data management
- Data analysis and final documentation
Some activities are sequential (training after questionnaire finalization), but there is a lot of overlap in timing (sampling and questionnaire development; logistics is ongoing).

6 Management: Prepare a timeline covering all three phases (design, data collection, analysis)…

7 …and make it detailed, with deadlines!

8 Questionnaire development
- Pilot/pre-test questionnaires in the various settings relevant to the survey (urban, rural, etc.)
- Carefully review any translation; back translation is ideal
- Finalize the questionnaire before any training
- Prepare manuals with specific instructions for enumerators, supervisors, and other staff to accompany the questionnaire; these are also part of the final documentation to assist data users

9 Field staff
Recruit and train more enumerators than you need
- Enables a selection process during training: competition among candidates
- Provides reserves in case field staff leave early
Training in two parts (if the field staff is large)
- Training of Trainers (TOT): train the field supervisors
- Training of enumerators: main training, with active participation of supervisors as facilitators

10 Field staff
Training for an LSMS (a long, multi-topic questionnaire) takes 3-4 weeks and includes:
- in-class study of the questionnaires
- in-class mock interviews/exercises
- tests
- field practice and feedback
- more if the survey will use special instruments (anthropometric measurement, GPS, etc.)
- common training for all team members, if feasible
Payments
- Enumerators are usually paid a daily rate rather than per questionnaire
- Bonuses paid at the end of fieldwork can reduce staff attrition during fieldwork (important for longer surveys)
- Performance bonuses (if they can be objectively determined)

11 Fieldwork
There are many options for fieldwork structure. Your selection will depend on the specifics of the survey, the timeline, and the budget:
- Mobile teams with integrated DE (data entry). Improves data quality because teams get immediate feedback on inconsistencies in the questionnaire.
- Mobile teams without DE; DE in the office.
- Enumerator resident in the village; DE in the office. Difficult to supervise well! May not be cost-effective (except for surveys running over several months with ~20 households per village).
- Local office from which teams operate. Feasible for a survey focused on one region/small geographic area.
No excuse not to do concurrent DE! Don't wait until the end of fieldwork to start entering questionnaires; this is often the reason for lags in data availability.

12 Fieldwork: composition of a mobile field team
- Supervisor
- Interviewers
- Anthropometrist
- Data entry operator (maybe)


15 Other factors of survey quality
Management team for the survey
- Composed of a Project Manager, a Fieldwork Manager, and a Data Manager
- Dedicated to the project in all three phases:
  - Design and preparation
  - Implementation
  - Dataset documentation and initial tabulations

16 Other factors of survey quality
Actions of the field team supervisors
- Visual scrutiny of completed questionnaires
  - Less critical if DE is integrated with fieldwork; computers do it better
- Direct observation of interviews
  - Some of this is needed, but not too much
- Check-up visits
  - Critical, and can only be implemented by human supervisors
  - Need to be random
Supervisors must be supervised too!
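A minimal sketch of drawing the random check-up visits described above, with a fixed seed so the plan is reproducible and auditable (the enumerator and household IDs are hypothetical, and the 10% default rate is an illustrative assumption, not a recommendation from the deck):

```python
import random

def select_checkup_visits(completed, rate=0.1, seed=42):
    """Randomly pick completed interviews for supervisor re-visits.

    completed: dict mapping enumerator ID -> list of household IDs
    rate: fraction of each enumerator's workload to re-check
    Returns a dict mapping enumerator ID -> households to revisit.
    """
    rng = random.Random(seed)  # fixed seed: the plan can be re-derived later
    plan = {}
    for enumerator, households in completed.items():
        # At least one re-visit per enumerator, even for small workloads
        n = max(1, round(rate * len(households)))
        plan[enumerator] = rng.sample(households, n)
    return plan

# Example: two enumerators, each with 10 completed households
workload = {"E01": [f"HH{i:03d}" for i in range(10)],
            "E02": [f"HH{i:03d}" for i in range(10, 20)]}
plan = select_checkup_visits(workload, rate=0.2)
```

Because the draw is per enumerator, every interviewer knows any of their interviews may be re-checked, which is the point of making the visits random rather than discretionary.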

17 Other factors of survey quality
- Review data files as they come in, done by core staff (e.g. in Stata, look for inconsistencies across modules of the questionnaire, which is normally not done in DE programs; compare with baseline data if a panel; etc.)
- Send addendum notes/instructions to field staff as issues arise (for things missing from the manuals)
- Keep detailed documentation; it is needed for analysis and especially relevant for panel surveys
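The slide mentions Stata for these cross-module checks; the same logic in Python is sketched below (module contents, household ID `hhid`, and person ID `pid` are hypothetical). The check flags records in one module whose person never appears in the household roster, exactly the kind of inconsistency a module-by-module DE program will not catch:

```python
# Hypothetical (hhid, pid) keys from the roster module
roster_ids = {(1, 1), (1, 2), (2, 1)}

# Hypothetical records from the education module
education = [
    {"hhid": 1, "pid": 1, "enrolled": 0},
    {"hhid": 1, "pid": 2, "enrolled": 1},
    {"hhid": 2, "pid": 3, "enrolled": 1},  # no matching roster member
]

# Cross-module check: education records with no roster entry are
# inconsistencies to send back to the field team while it is still there
orphans = [rec for rec in education
           if (rec["hhid"], rec["pid"]) not in roster_ids]
```

Running such checks while teams are still in the field is what makes them useful: an orphan record can be resolved with a revisit, whereas after fieldwork it becomes a data-cleaning guess.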

18 Aspects for evaluation work
If the evaluation entails panel data (baseline and follow-up surveys), plan the surveys accordingly.
At baseline, think about how to minimize attrition and match people correctly:
- Get adequate address/map/phone information to locate households again
- Record the complete names of all household members on the questionnaire and enter them in DE
- Other possibilities: GPS, photos of respondents
At follow-up:
- Pre-print some baseline data (roster: names, age, sex, roster IDs) and also baseline information relevant to the evaluation for measuring changes (assets?)
- Collect some minimal information on respondents/households who drop out or move (in case you are unable to find them for a re-interview), for example: did they move, when, why…
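The matching step above can be sketched in a few lines: if both rounds carry the baseline roster IDs, follow-up members are matched back by (hhid, roster ID) key, and whoever is missing is an attriter to be followed up with the minimal "did they move, when, why" questions. All IDs and names here are hypothetical:

```python
# Hypothetical rosters keyed by (hhid, roster ID), value = recorded full name
baseline = {(1, 1): "Amina K.", (1, 2): "Joseph K.", (2, 1): "Grace M."}
followup = {(1, 1): "Amina K.", (2, 1): "Grace M."}

# Baseline members not re-interviewed at follow-up
attriters = sorted(set(baseline) - set(followup))

# Attrition rate to report alongside the panel dataset
attrition_rate = len(attriters) / len(baseline)
```

In practice the key match would be confirmed against the pre-printed names, ages, and sex, since roster IDs alone can be mis-assigned in the field.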