Steps in Implementing an Impact Evaluation


This presentation draws heavily on previous presentations by Dan Levy, Rachel Glennerster, Arianna Legovini, Paul Gertler, and Sebastian Martinez.
Aïchatou Hassane, Africa Impact Evaluation Initiative, World Bank

Step 1: Identify priorities
Examine sector plans, for example:
- Poverty Reduction Strategy Paper
- Education Long-term Strategic and Financial Framework
Identify the highest priorities for learning. In education, for example:
- Teacher incentives and management
- Learning materials
- School-based management (SBM)
If most resources are going into SBM, then that is the opportunity to learn. New policies can also present an opportunity for learning (e.g., Rwanda).

Step 2: Understand the roll-out of the intervention and the opportunities for impact evaluation
How will the school-based management program roll out?
- Piloted in a random sample of schools?
- Rolled out nationwide?
- Rolled out in schools satisfying certain clear criteria?
Each roll-out strategy yields distinct opportunities for impact evaluation.

Step 2 (continued): Understand the roll-out of the intervention and the opportunities for impact evaluation
How will the different incentives for contract teachers be implemented?
- Piloted in a random sample of districts/schools?
- Rolled out in certain districts?
- In a nationwide representative sample?
Each roll-out strategy yields distinct opportunities for impact evaluation.

Step 3: Appropriate design
Keep in mind the needs of both:
- the intervention: target needy schools
- the evaluation: take advantage of opportunities for random assignment
School grants example: 1,000 schools to receive grants over 3 years.
- Randomly assign 300 to each of Phases 1-3, or
- Identify the 500 neediest and assign them to Phases 1 and 2 (see the sketch below)
Rwanda example: 3,000 contract teachers to be hired per year.
- Use a representative, randomly selected sample of schools with contract teachers
These steps present a very simplified description of the process; the idea is to give a complete picture of how a typical experiment works.
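
The phased roll-out can itself serve as the random assignment. Below is a minimal sketch in Python of one reading of the school grants design, using made-up school IDs and poverty scores: the 500 neediest schools are guaranteed early treatment and randomly split between Phases 1 and 2, while the rest roll out in Phase 3. The data and the exact split are assumptions for illustration only.

```python
import random

random.seed(42)  # fixed seed so the assignment is reproducible and auditable

# 1,000 hypothetical schools with made-up poverty scores (higher = needier)
schools = [{"id": i, "poverty_score": random.random()} for i in range(1000)]

# The intervention's need: rank schools and guarantee the 500 neediest
# an early phase.
schools.sort(key=lambda s: s["poverty_score"], reverse=True)
neediest, rest = schools[:500], schools[500:]

# The evaluation's need: randomize timing within the neediest group,
# so Phase 2 is a valid comparison for Phase 1 in year one.
random.shuffle(neediest)
for school in neediest[:250]:
    school["phase"] = 1
for school in neediest[250:]:
    school["phase"] = 2
for school in rest:
    school["phase"] = 3

print(sum(s["phase"] == 1 for s in schools))  # 250 schools in Phase 1
```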

Step 3 (continued): More on design
Determine the scale: at scale or a small pilot?
At scale:
- Nationally representative
- More costly to implement
- Better information about national effectiveness
Small pilot (e.g., in two districts):
- Easier to implement
- Not as informative

Step 4: Random assignment
Randomly assign […] to treatment and control groups. You can randomly assign at the individual, school, clinic, or community level:
- School grants: at the school level
- Treatment package: at the individual level
- Contract teachers: at the school level
Trade-off: randomizing at a higher level means a bigger sample is needed for the same statistical power (see the sketch below).
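
A minimal sketch of school-level (cluster) randomization, with hypothetical school IDs. Because every pupil in a treated school shares the same assignment, outcomes are correlated within schools, which is why higher-level randomization needs a larger sample.

```python
import random

random.seed(7)

school_ids = list(range(1, 201))           # 200 hypothetical schools
random.shuffle(school_ids)
treatment_schools = set(school_ids[:100])  # half assigned to treatment

def arm(school_id: int) -> str:
    """Return the experimental arm a school was assigned to."""
    return "treatment" if school_id in treatment_schools else "control"

print(arm(42), arm(137))
```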

Step 5: Collect baseline data
Baseline data are not strictly necessary: randomization implies that treatment and control are similar. But baseline data:
- Allow you to verify that treatment and control appear balanced
- Provide valuable data for the impact analysis (see the sketch below): Did the program mostly benefit patients who were poor at baseline? Or pupils who were high performers at baseline?
- Allow analysis of targeting efficiency
Take advantage of ongoing data collection.
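
A minimal sketch of the kind of heterogeneity check that only baseline data makes possible: did the program mostly benefit households that were poor at baseline? All data here are simulated for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
treated = rng.integers(0, 2, size=n)           # random assignment (0/1)
poor_at_baseline = rng.integers(0, 2, size=n)  # from the baseline survey

# Simulated outcome: a +5 effect for the baseline poor, +1 otherwise.
outcome = (50
           + treated * np.where(poor_at_baseline == 1, 5, 1)
           + rng.normal(0, 10, size=n))

for flag, label in [(1, "poor at baseline"), (0, "non-poor at baseline")]:
    mask = poor_at_baseline == flag
    effect = (outcome[mask & (treated == 1)].mean()
              - outcome[mask & (treated == 0)].mean())
    print(f"{label}: estimated impact = {effect:.2f}")
```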

Step 5 (continued): Baseline questionnaires
Include the areas essential to the impact evaluation:
- Ultimate outcomes we care most about
- Intermediate outcomes we expect to change first
Take advantage of the opportunity to collect essential sector data:
- Gambia: corporal punishment
- Rwanda: double-shift teaching (location and teacher); bonus for teachers paid by parents; a focus group on contract teachers for more information
Who collects the data?
- Bureau of Statistics: integrates with existing data
- The ministry concerned (e.g., in Rwanda, the Ministry of Education)
- A private agency: sometimes makes quality monitoring easier

Step 6: Check for balance
Do treatment and control groups look similar at baseline? If not, all is not lost! Even in the absence of perfect balance, you can use the baseline data to adjust the analysis (see the sketch below).

            Poverty   Female-headed hh   Children in hh   Formal sector job
Treatment     70%           64%               3.1               20%
Control       68%           66%               2.9               18%
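
A minimal sketch of a balance check on one covariate, using simulated household data. A two-sample t-test asks whether the treatment/control difference is larger than chance alone would produce; a small p-value flags a covariate to control for in the impact analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
treat_poverty = rng.binomial(1, 0.70, size=500)    # 1 = poor household
control_poverty = rng.binomial(1, 0.68, size=500)

t_stat, p_value = stats.ttest_ind(treat_poverty, control_poverty)
print(f"difference = {treat_poverty.mean() - control_poverty.mean():.3f}, "
      f"p = {p_value:.3f}")
```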

Step 7: Roll out the intervention
Monitor the roll-out to ensure the evaluation is not compromised.
- What if the benefits are accidentally rolled out to everyone, all at once? Example: new chalkboards go to all schools. The evaluation is compromised; it needed monitoring!
- What if all the control households receive some other benefit? Example: an NGO targets control schools to receive lunches (as the WFP does for some schools in Rwanda), or PTA training is delivered at the district level. This changes the evaluation.

Step 7 (continued): Gather information on the roll-out
- In reality, who receives which benefits, and when? This could affect the impacts measured.
- Does the intervention involve something other than what was initially planned? Example: you learn that those delivering resources to clinics also gave detailed guidance on clinic management. The measured program impact now includes that guidance.

Step 8: Collect follow-up data
Collect follow-up data for both the treatment and control groups, at appropriate intervals. Consider how long it should take for outcomes to change:
- A sub-sample at six months? Captures intermediate changes
- One year: provides initial outcomes; adjust the program if needed
- Two years: changes in longer-term outcomes?
- After the end of the program: do effects endure? Example: school feeding in Kenya. What happens once teachers have, or have not, obtained a permanent position?
The school feeding example refers to http://www.nuff.ox.ac.uk/users/vermeersch/schoolmeals.pdf

Step 9: Estimate program impacts
- Randomization: simply compare average outcomes for the treatment and comparison groups (see the sketch below)
- Other methods: make statistical assumptions in order to estimate the impact of the program
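
A minimal sketch of the impact estimate under randomization, using simulated test scores. With random assignment, the difference in mean outcomes is an unbiased estimate of the average treatment effect; the accompanying t-test (Step 10) gauges whether it is distinguishable from noise.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
control = rng.normal(50, 10, size=400)    # hypothetical test scores
treatment = rng.normal(53, 10, size=400)  # simulated true effect: +3 points

impact = treatment.mean() - control.mean()
t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"estimated impact = {impact:.2f} points (p = {p_value:.4f})")
```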

Step 10: Are the effects big enough to matter?
- Are they statistically significant? A basic statistical test tells you whether the differences are due to the program or just to noisy data.
- Are they policy significant? If the anti-HIV media campaign costs a million dollars and has a positive but tiny effect, it may not be worthwhile (see the sketch below).
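
A minimal sketch of the cost-effectiveness arithmetic behind policy significance, with entirely made-up numbers: a statistically significant effect can still be too small to justify its cost.

```python
campaign_cost = 1_000_000   # hypothetical campaign budget, USD
infections_averted = 40     # hypothetical small-but-significant effect

cost_per_infection_averted = campaign_cost / infections_averted
print(f"${cost_per_infection_averted:,.0f} per infection averted")
# Compare against the cost-effectiveness of alternative interventions
# before deciding whether to scale up.
```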

Step 11: Disseminate!
If no one knows about the results, they won't make a difference to policy. Make sure the information gets into the right policy discussions:
- Ownership by government; capacity building
- Forums and workshops
- Reports
- Policy briefs

Step 12: Iterate
Re-examine sector priorities and identify the next learning opportunity. And if the effects aren't as big as you hoped:
- Test variations (e.g., different teacher or clinic officer training)
- Test other interventions aimed at the same outcomes (e.g., better equipment or school materials)
Test, test, test!

Thank you