Implementation Issues: Program Roll-out

Implementation Issues: Program Roll-out. This presentation draws heavily on previous presentations by Dan Levy, Rachel Glennerster, Arianna Legovini, Paul Gertler, and Sebastian Martinez. Africa Impact Evaluation Initiative, World Bank.

So you’ve designed a great impact evaluation… Now it’s time to implement the program. But problems in implementation can invalidate the evaluation. Given the cost of the evaluation, we should avoid settling for second-best solutions, and avoid evaluating questions that are not of priority relevance.

Key steps:
- The best remedy is prevention: do we really have a program design that is politically and administratively feasible, in the short run and the long run?
- Monitor the roll-out carefully: demand high standards of compliance with the original design (small details make a difference), e.g. the accessibility of communities.
- Make adjustments immediately as needed.

1. Delays

Problem: The program is expected to roll out in January but does not roll out until August; funding is delayed.

Solutions: Move the follow-up survey; delay delivery of the program; collect a light baseline; look at alternative questions (plan B).

Concerns:
- Seasonality effects: if the baseline measures the end of the school year and the follow-up measures just after a holiday, student performance may not be comparable across the school calendar (see the simulation sketch below).
- Do we still have a relevant baseline, i.e. one collected close to the launch of the program?

Examples: Eritrea Malaria and ECD, Nigeria Malaria. Other examples?
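
To make the seasonality concern concrete, here is a minimal simulation sketch (all numbers and variable names are hypothetical, not from any study in this deck). A post-holiday dip hits everyone, so a naive baseline-to-follow-up comparison in the treatment group confounds the program effect with the season, while the treatment-control contrast at follow-up differences the dip out.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000              # students per group (hypothetical)
true_effect = 5.0      # assumed program effect on test scores
seasonal_dip = -8.0    # assumed post-holiday drop affecting everyone

baseline_t = rng.normal(50, 10, n)   # treatment group, end of school year
baseline_c = rng.normal(50, 10, n)   # control group, end of school year
followup_t = baseline_t + true_effect + seasonal_dip + rng.normal(0, 5, n)
followup_c = baseline_c + seasonal_dip + rng.normal(0, 5, n)

# Naive pre-post comparison in the treatment group: the seasonal dip
# masks the program effect entirely.
print("pre-post estimate:", round(followup_t.mean() - baseline_t.mean(), 1))

# Treatment-control contrast at follow-up: both groups face the same
# season, so the dip cancels and the true effect is recovered.
print("experimental estimate:", round(followup_t.mean() - followup_c.mean(), 1))
```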

2. Timing of Benefits and Multiple Treatments

Treatment 1: grants only. Treatment 2: grants + school management training. The programs should be rolled out at the same time. Suppose benefits roll out to Group 1 in June but to Group 2 in December, with a follow-up survey in January. Now the measured effect combines a different treatment with a different length of exposure (see the decomposition below). For the same reason, the effects of long-term exposure to the CCT program in Mexico could not be measured. Other examples?
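
One way to see the problem (notation introduced here for illustration, not from the slides): let $\tau_g(m)$ denote the effect of treatment $g$ after $m$ months of exposure. With the follow-up in January, Group 1 has seven months of exposure and Group 2 has one, so the measured gap decomposes as

\[
\underbrace{\tau_1(7) - \tau_2(1)}_{\text{measured gap}}
= \underbrace{\left[\tau_1(7) - \tau_2(7)\right]}_{\text{treatment contrast at equal exposure}}
+ \underbrace{\left[\tau_2(7) - \tau_2(1)\right]}_{\text{effect of six extra months}}
\]

The contrast of interest (first bracket) cannot be separated from the exposure-duration effect (second bracket) unless the two arms are rolled out at the same time.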

3. Contamination

Problem: The comparison group receives the treatment.
- Textbooks roll out to all schools (not by group).
- Trained teachers transfer to control schools.
- Parents in neighboring non-treated schools learn about innovations in treated schools.
Some contamination is almost always bound to happen when alternative programs are available or when NGOs operate in the same area.

Solutions:
- If contamination is small, we can still estimate the effect of being assigned to the treatment group. This estimates a different parameter, which may still be of interest (see the sketch below).
- If control schools received benefits, but fewer or different ones, then measure the impact of the additional benefits (redefine the question, but make sure it is still relevant).
- If contamination is random… [deworming externalities]. David’s example.

Other examples?
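
The "effect of being assigned" is the intention-to-treat (ITT) estimate; a sketch in standard notation (not taken from the slides): with random assignment $Z$, outcome $Y$, and actual treatment receipt $D$,

\[
\widehat{\mathrm{ITT}} = \bar{Y}_{Z=1} - \bar{Y}_{Z=0},
\qquad
\widehat{\mathrm{LATE}} = \frac{\bar{Y}_{Z=1} - \bar{Y}_{Z=0}}{\bar{D}_{Z=1} - \bar{D}_{Z=0}}.
\]

The Wald ratio on the right rescales the ITT by the gap in actual treatment rates; under the standard assumptions (assignment affects outcomes only through treatment, and no defiers), it recovers the effect for compliers. With heavy contamination the denominator shrinks and the estimate becomes imprecise.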

4. Heterogeneous Treatment

Sometimes the quality of the intervention is not the same across all recipients. This may happen because it is difficult to implement an intervention with uniform quality. If quality differences are not randomly assigned, the results may need some reinterpretation even when those differences are observed. At the end of the day we have to focus on the question of interest and direct the evaluation towards it: what level of quality can we deliver in practice?
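
When implementation quality is observed, one common descriptive check is to interact treatment with quality. A minimal sketch (hypothetical variable names and numbers; the caveat in the comments is the point of this slide):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 500  # hypothetical schools

df = pd.DataFrame({
    "treat": rng.integers(0, 2, n),      # randomized assignment
    "quality": rng.uniform(0, 1, n),     # observed implementation quality
})
# Hypothetical data-generating process: the effect grows with quality.
df["score"] = (50 + 4 * df["treat"] + 6 * df["treat"] * df["quality"]
               + rng.normal(0, 5, n))

# Interaction model. Assignment is randomized, but in real programs
# quality usually is NOT, so the treat:quality coefficient describes
# how the effect varies with quality rather than a causal effect of
# quality itself.
fit = smf.ols("score ~ treat * quality", data=df).fit()
print(fit.params)
```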

5. Access to Treatment and Control Areas

Because of new constraints, or constraints that were not planned for, some of the communities in the study may not be fully accessible, whether for treatment, for data collection, or both. If randomization is compromised, the whole study is compromised. Plan carefully for this (ideally to avoid it altogether), and make every effort to stick with the original design.