Cross-Country Workshop for Impact Evaluations in Agriculture and Community Driven Development
Addis Ababa, April 13-16, 2009 (AIM-CDD)
Using Randomized Evaluations to Improve Policy


Using Randomized Evaluations to Improve Policy
Rachel Glennerster

Objective
To separate the impact of the program from other factors: what is the causal effect of the program?
– Requires answering the counterfactual: what would have happened without the program?
– Requires a comparison group: people who, except for the program, are just like those who received it.

[Chart] Illustration: children learning. The observed change combines the (+) program impact with the (+) effect of other factors.

[Chart] Illustration: children learning. Attributing the entire observed change to the program yields a FALSE (+) program impact.

Correlation is not causation
People who are knowledgeable about politics are more likely to vote. Two explanations fit this correlation:
1) Being knowledgeable about politics causes people to vote.
2) Being interested in politics causes both knowledge and voting.

Motivation
It is hard to distinguish causation from correlation in statistical analysis of existing data:
– However complex the statistics, they can only show that X goes with Y.
– It is difficult to correct for unobserved characteristics, like motivation.
– Motivation may be one of the most important things to correct for.
Selection bias is a major issue for impact evaluation:
– Projects are started at specific times and places for particular reasons.
– Participants may select into programs.
– The first farmers to adopt a new technology are likely to be very different from the average farmer; looking at their yields will give a misleading impression of the technology's benefits.

Motivation, cont.
Retrospective impact evaluation:
– Collecting data after the event, you don't know how participants and nonparticipants compared before the program started.
– You have to disentangle, after the event, why the project was implemented where and when it was.
Prospective evaluation:
– Lets you design the evaluation to answer the question you need to answer.
– Lets you collect the data you will need.

Experimental design
Everyone in the study has the same chance of being in the treatment or comparison group. By design, treatment and comparison groups have the same characteristics (observed and unobserved), on average:
– The only difference is the treatment.
– With a large sample, all characteristics average out.
– The result is unbiased impact estimates.
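The averaging-out claim above can be illustrated with a short simulation. This is a minimal sketch: the traits ("land", "motivation"), sample size, and seed are all hypothetical, chosen only to show that random assignment balances observed and unobserved characteristics alike.

```python
import random
import statistics

random.seed(42)

# Hypothetical population of farmers: "land" stands in for an
# observable characteristic, "motivation" for an unobservable one.
farmers = [
    {"land": random.gauss(5, 2), "motivation": random.gauss(0, 1)}
    for _ in range(10_000)
]

# Random assignment: every farmer has the same chance of treatment.
random.shuffle(farmers)
half = len(farmers) // 2
treatment, comparison = farmers[:half], farmers[half:]

# With a large sample, characteristics average out across the two
# groups, even the ones we could never have measured.
mean_land_t = statistics.mean(f["land"] for f in treatment)
mean_land_c = statistics.mean(f["land"] for f in comparison)
mean_mot_t = statistics.mean(f["motivation"] for f in treatment)
mean_mot_c = statistics.mean(f["motivation"] for f in comparison)

print(f"land:       {mean_land_t:.2f} vs {mean_land_c:.2f}")
print(f"motivation: {mean_mot_t:.2f} vs {mean_mot_c:.2f}")
```

The two group means come out nearly identical for both traits, which is exactly what makes the comparison group a valid counterfactual.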

Options for randomization
– Lottery: only some get the program.
– Random phase-in: everyone gets it eventually.
– Variation in treatment: full coverage, different options.
– Encouragement design (when take-up is partial): everyone can get it, some are encouraged to.

Examples in agriculture
– Lottery: a lottery to receive information about a new agricultural technology.
– Random phase-in (everyone gets it eventually): train some farmers' groups each year.
– Variation in treatment: some get information on a new seed, others get credit, others get a demonstration plot on their land, etc.
– Encouragement design: one farmers' support center per district; some farmers get a travel voucher to attend the center.
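Operationally, the lottery and random phase-in options both reduce to shuffling a list. A sketch, assuming a hypothetical roster of 30 farmers' groups trained over three years:

```python
import random

random.seed(0)

# Hypothetical roster of farmers' groups. Under random phase-in
# everyone is treated eventually; groups scheduled for later years
# serve as the comparison for groups trained earlier.
groups = [f"group_{i:02d}" for i in range(30)]
random.shuffle(groups)

phase_in = {
    "year_1": groups[:10],    # treated first
    "year_2": groups[10:20],
    "year_3": groups[20:],    # comparison until year 3
}

for year, cohort in phase_in.items():
    print(year, sorted(cohort))
```

Dropping the later cohorts and treating only `year_1` is the plain lottery design; keeping them all is the phase-in design.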

Lottery among the qualified
[Diagram] Split the population into three groups: those who must get the program regardless, those not suitable for the program, and the qualified remainder, among whom you randomize who gets the program.

Opportunities for randomization
– A budget constraint prevents full coverage: random assignment (a lottery) is fair and transparent.
– Limited implementation capacity: phase-in gives everyone the same chance to go first.
– No evidence on which alternative is best: random assignment to alternatives with an equal ex ante chance of success.

Opportunities for randomization, cont.
– Take-up of an existing program is incomplete: provide information or an incentive for some to sign up.
– Piloting a new program: a good opportunity to test the design before scaling up.
– Operational changes to ongoing programs: a good opportunity to test changes before scaling them up.

Units you can randomize at
– Individual
– Farm
– Farmers' association
– Irrigation block
– Village
– Women's association
– Youth group
– School

Group or individual randomization?
– Group randomization: if a program affects a whole group, you usually randomize whole communities to treatment or comparison.
– Individual randomization: it is easier to get a big enough sample if you randomize individuals.

Unit of randomization
Randomizing at a higher level is sometimes necessary:
– Political constraints on differential treatment within a community.
– Practical constraints: it is confusing for one person to implement different versions.
– Spillover effects may require higher-level randomization.
Randomizing at the group level requires many groups, because of within-community correlation.
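A minimal sketch of group-level randomization, assuming a hypothetical roster of farmers nested in villages: assignment happens once per village, so spillovers stay inside a single arm and no two neighbours end up in different groups.

```python
import random

random.seed(1)

# Hypothetical roster of (farmer_id, village) pairs: 80 farmers
# spread across 8 villages. Spillovers within a village mean we
# randomize whole villages, not individuals.
roster = [(i, f"village_{i % 8}") for i in range(80)]

# Randomize at the village level.
villages = sorted({v for _, v in roster})
random.shuffle(villages)
treated = set(villages[: len(villages) // 2])

# Every farmer inherits the assignment of their village.
assignment = {
    farmer: ("treatment" if village in treated else "comparison")
    for farmer, village in roster
}
print(sorted(treated))
```

The price of this design is the last bullet above: outcomes within a village are correlated, so 80 farmers in 8 villages carry far less information than 80 independently randomized farmers.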

Elements of an experimental design
[Diagram] From the target population (commercial farmers) to the potential participants (maize producers, rice producers) to the evaluation sample, which is randomly assigned into a treatment group and a control group; participants are distinguished from drop-outs.

External and internal validity
External validity:
– The sample is representative of the total population.
– The results in the sample represent the results in the population.
– We can apply the lessons to the whole population.
Internal validity:
– The estimated effect of the intervention on the evaluated population reflects the real impact on that population.
– i.e., the intervention and comparison groups are comparable.

External and internal validity, cont.
An evaluation can have internal validity without external validity:
– Example: a randomized evaluation of encouraging women to stand in elections in an urban area may not tell you much about the impact of a similar program in rural areas.
An evaluation without internal validity cannot have external validity:
– If you don't know whether a program works in one place, you have learned nothing about whether it works elsewhere.

Internal and external validity
[Diagram] A representative sample is drawn from the national population and then randomized into treatment and comparison groups, giving both internal and external validity.

Internal validity
[Diagram] The population is divided into strata; samples are drawn from one population stratum and then randomized. The result is internally valid for that stratum, though not necessarily representative of the whole population.
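The stratify-then-randomize step in the diagram can be sketched as randomizing separately within each stratum. The "region" variable and the sample below are hypothetical; the point is that stratification balances the stratifying variable across arms by construction, not just in expectation.

```python
import random
from collections import defaultdict

random.seed(7)

# Hypothetical evaluation sample with one stratifying variable.
sample = [{"id": i, "region": random.choice(["north", "south"])}
          for i in range(100)]

# Group units by stratum.
strata = defaultdict(list)
for unit in sample:
    strata[unit["region"]].append(unit)

# Randomize within each stratum: shuffle, then split in half, so
# each region contributes equally to treatment and comparison.
assignment = {}
for region, units in strata.items():
    random.shuffle(units)
    half = len(units) // 2
    for u in units[:half]:
        assignment[u["id"]] = "treatment"
    for u in units[half:]:
        assignment[u["id"]] = "comparison"
```

Within every region the two arms differ in size by at most one unit, whereas a single pooled shuffle would balance regions only on average.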

Representative but biased: useless
[Diagram] A representative sample drawn from the national population, followed by biased (non-random) assignment, is useless.

Example: fertilizer voucher distribution, internal validity
[Diagram] Random assignment within a sample of commercial farmers.

Example: CDD in Sierra Leone
Basic sequence of tasks for the evaluation:
– List eligible communities in target areas (communities not already served by other programs and of manageable size).
– Collect baseline data on the communities.
– Randomly assign communities to GoBifo, some via public draws.
– Implement the GoBifo project.
– Conduct a follow-up survey and participatory exercise.

Efficacy and effectiveness
Efficacy:
– Proof of concept
– Smaller scale
– Piloted under ideal conditions
Effectiveness:
– At scale
– Under prevailing implementation arrangements, i.e. "real life"
Higher or lower impact? Higher or lower costs?

Advantages of "experiments"
– Clear and precise causal impact
Relative to other methods:
– Much easier to analyze
– Cheaper (smaller sample sizes)
– Easier to explain
– More convincing to policymakers
– Methodologically uncontroversial

What if there are constraints on randomization?
– Budget constraints: randomize among the most in need.
– Roll-out capacity constraints: randomize who receives the program first (or next, if you have already started).
– Randomly promote the program to some.

When is it really not possible?
– The treatment has already been assigned and announced, with no possibility of expanding it.
– The program is over (retrospective).
– Take-up is already universal.
– The program is national and non-excludable, e.g. freedom of the press or exchange rate policy (though sometimes some components can be randomized).
– The sample size is too small to make it worth it.

Common pitfalls to avoid
– Calculating sample size incorrectly: randomizing one district to treatment and one to control, then basing the sample size on the number of people you interview.
– Collecting data differently in treatment and control.
– Counting those assigned to treatment who do not take up the program as control: don't undo your randomization!
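The first pitfall can be made concrete with the standard design-effect formula, DEFF = 1 + (m - 1) * ICC, where m is the number of interviews per cluster and ICC is the within-cluster (intra-cluster) correlation. The function name and the numbers below are illustrative:

```python
def effective_sample_size(n_clusters: int, m: int, icc: float) -> float:
    """You interview n_clusters * m people, but clustering shrinks the
    effective sample size by the design effect 1 + (m - 1) * icc."""
    deff = 1 + (m - 1) * icc
    return n_clusters * m / deff

# One district per arm, 500 interviews each, modest ICC of 0.05:
# 1,000 interviews behave like roughly 39 independent observations.
print(effective_sample_size(n_clusters=2, m=500, icc=0.05))

# The same 1,000 interviews spread over 50 villages of 20 each:
print(effective_sample_size(n_clusters=50, m=20, icc=0.05))
```

Power comes from the number of randomized units, not the number of interviews, which is why "one district each" designs fail no matter how large the survey.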

Thank You