
Africa Program for Education Impact Evaluation
Dakar, Senegal, December 15-19, 2008
Experimental Methods
Muna Meky, Economist, Africa Impact Evaluation Initiative

2 Motivation
Objective in evaluation is to estimate the CAUSAL effect of intervention X on outcome Y
– What is the effect of a housing upgrade on household income?
For causal inference, we need to understand exactly how benefits are distributed
– Assigned / targeted
– Take-up

3 Causation versus Correlation
Correlation is NOT causation
– Necessary but not sufficient condition
– Correlation: X and Y are related
  – A change in X is related to a change in Y
  – And a change in Y is related to a change in X
  – Example: age and income
– Causation: if we change X, how much does Y change?
  – A change in X is related to a change in Y
  – Not necessarily the other way around
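To make the distinction concrete, here is a minimal simulation sketch in Python (all numbers hypothetical): a third variable drives both X and Y, so they are strongly correlated even though X has no causal effect on Y.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical confounder (e.g., household wealth) drives both X and Y.
wealth = rng.normal(size=n)
x = wealth + rng.normal(size=n)        # X is caused by wealth
y = 2.0 * wealth + rng.normal(size=n)  # Y is caused by wealth, NOT by X

# X and Y are clearly correlated...
print(f"corr(X, Y) = {np.corrcoef(x, y)[0, 1]:.2f}")        # about 0.63

# ...and a naive regression of Y on X finds a large "effect"...
print(f"naive slope = {np.polyfit(x, y, 1)[0]:.2f}")        # about 1.0

# ...but holding the confounder fixed, X tells us nothing about Y.
coef, *_ = np.linalg.lstsq(
    np.column_stack([x, wealth, np.ones(n)]), y, rcond=None)
print(f"slope of X controlling for wealth = {coef[0]:.2f}")  # about 0.0
```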

4 Causation versus Correlation
Three criteria for causation:
– Independent variable precedes the dependent variable
– Independent variable is related to the dependent variable
– There are no third variables that could explain why the independent variable is related to the dependent variable

5 Statistical Analysis & Impact Evaluation
Statistical analysis: typically involves inferring the causal relationship between X and Y from observational data
– Many challenges & complex statistics
– We never know if we’re measuring the true impact
Impact evaluation:
– Retrospectively: same challenges as statistical analysis
– Prospectively: we generate the data ourselves through the program’s design → evaluation design makes things much easier!

6 How to assess impact
What is the effect of a housing upgrade on household income?
Ideally, compare the same individual with & without the program at the same point in time
What’s the problem? The need for a good counterfactual
– What are the requirements?

7 Case study: housing upgrade
Informal settlement of 15,000 households
Goal: upgrade housing of residents
Evaluation question: What is the impact of upgrading housing on household income? On employment?
Counterfeit counterfactuals

8 Gold standard: Experimental design
Only method that ensures balance in unobserved (and observed) characteristics → only difference is treatment
Equal chance of assignment into treatment and control for everyone
With large sample, all characteristics average out
Experimental design = randomized evaluation
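A minimal sketch of why this works (Python, hypothetical covariates): under random assignment, the treatment and control groups end up with nearly identical averages of both observed and unobserved characteristics.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# Hypothetical baseline characteristics, one observed, one unobserved.
age = rng.normal(35, 10, size=n)
motivation = rng.normal(size=n)  # stands in for an unobservable trait

# Random assignment: every unit has an equal chance of treatment.
treat = rng.permutation(np.repeat([True, False], n // 2))

for name, var in [("age", age), ("motivation", motivation)]:
    diff = var[treat].mean() - var[~treat].mean()
    print(f"{name}: treatment-control difference = {diff:+.3f}")  # near 0
```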

9 “Random”
What does the term “random” mean here?
– Equal chance of participation for everyone
How could one really randomize in the case of housing upgrading?
Options:
– Lottery
– Lottery among the qualified
– Phase-in
– Encouragement
– Randomize across treatments

10 Kinds of randomization
Random selection: external validity
– Ensure that the results in the sample represent the results in the population
– What does this program tell us that we can apply to the whole country?
Random assignment: internal validity
– Ensure that the observed effect on the outcome is due to the treatment rather than other factors
– Does not inform scale-up without assumptions
Example: housing upgrade in the Western Cape vs a sample from across the country

11 External vs Internal
[Diagram: randomization for external validity (random sampling) vs internal validity (identification via random assignment)]

12 Example of Randomization
What is the impact of providing free books to students on test scores?
Randomly assign a group of school children to either:
– Treatment group: receives free books
– Control group: does not receive free books
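A minimal sketch of this design (Python, all numbers hypothetical): randomly assign students to receive free books, then estimate the impact as the simple difference in mean test scores.

```python
import numpy as np

rng = np.random.default_rng(42)
n_students = 2_000

# Random assignment: half treatment (free books), half control.
gets_books = rng.permutation(np.repeat([True, False], n_students // 2))

# Hypothetical scores: baseline ability plus a true 5-point book effect.
ability = rng.normal(50, 10, size=n_students)
scores = ability + 5.0 * gets_books + rng.normal(0, 5, size=n_students)

# Under random assignment, the difference in means is an unbiased
# estimate of the causal impact of the books.
impact = scores[gets_books].mean() - scores[~gets_books].mean()
print(f"estimated impact = {impact:.1f} points")  # close to 5.0
```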

13 Randomization
[Diagram: random assignment of the sample into treatment and control groups]

14 How Do You Randomize?
1) At what level?
– Individual
– Group
  – School
  – Community
  – District
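A short sketch of group-level randomization (hypothetical schools): the random draw happens at the school level, and every student simply inherits their school's assignment.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical design: 100 schools, half assigned to treatment.
n_schools = 100
school_treat = rng.permutation(np.repeat([True, False], n_schools // 2))

# Students are not randomized individually; they inherit the
# assignment of the school they attend.
students_per_school = 40
student_school = np.repeat(np.arange(n_schools), students_per_school)
student_treat = school_treat[student_school]
print(f"share of students treated: {student_treat.mean():.0%}")  # 50%
```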

15 When would you use randomization?
The universe of eligible individuals is typically larger than the available resources at a single point in time
– A fair and transparent way to assign benefits
– Gives an equal chance to everyone in the sample
Good times to randomize:
– Pilot programs
– Programs with budget/capacity constraints
– Phase-in programs

16 Basic Setup of an Experimental Evaluation
[Flow diagram, based on Orr (1999): potential participants (all informal settlement dwellers) → evaluation sample (communities that might participate, or a targeted sub-group; select those you want to work with right now) → random assignment → treatment group (participants and no-shows) vs control group]

17 Examples…

18 Beyond simple random assignment
Assigning to multiple treatment groups
– Treatment 1, Treatment 2, Control
– e.g., upgrade housing in situ, relocation to better housing, control
– What do we learn?
Assigning to units other than individuals or households
– Health centers (bed net distribution)
– Schools (teacher absenteeism project)
– Local governments (corruption project)
– Villages (community-driven development projects)
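A sketch of assignment with multiple arms (a hypothetical three-arm version of the housing example): each unit is randomly placed in one of the two treatments or the control group.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_500

# Three equal-sized arms: in-situ upgrade, relocation, control.
arms = np.repeat(["upgrade_in_situ", "relocation", "control"], n // 3)
assignment = rng.permutation(arms)
print({arm: int((assignment == arm).sum()) for arm in np.unique(arms)})
```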

19 Unit of randomization
Individual or household randomization is the lowest-cost option
Randomizing at higher levels requires much bigger samples: within-group correlation (see the sketch below)
Political challenges to unequal treatment within a community
– But look for creative solutions: e.g., uniforms in Kenya
Some programs can only be implemented at a higher level
– e.g., strengthening school committees
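The sample-size penalty from within-group correlation can be made concrete with the standard design effect formula (an addition for illustration, not from the slides): DEFF = 1 + (m - 1)ρ, where m is the cluster size and ρ is the intraclass correlation.

```python
def design_effect(cluster_size: int, icc: float) -> float:
    """Cluster-randomized designs need roughly DEFF times the sample
    of an individually randomized design with the same power."""
    return 1.0 + (cluster_size - 1) * icc

# Hypothetical example: schools of 40 students, modest ICC of 0.10.
print(f"DEFF = {design_effect(40, 0.10):.1f}")  # 4.9: ~5x the sample
```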

20 Efficacy & Effectiveness
Efficacy:
– Proof of concept
– Pilot under ideal conditions
Effectiveness:
– At scale
– Normal circumstances & capabilities
– Lower or higher impact?
– Higher or lower costs?

21 Advantages of experiments
Clear causal impact
Relative to other studies:
– Much easier to analyze
– Cheaper! (smaller sample sizes)
– Easier to convey
– More convincing to policymakers
– Not methodologically controversial

22 What if randomization isn’t possible? It probably is…
– Budget constraints: randomize among the needy
– Roll-out capacity: randomize who receives first
– Randomly promote the program to some (an encouragement design; see the sketch below)
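Where only promotion can be randomized, impact can still be recovered. A minimal sketch (hypothetical numbers, a technique not detailed in the slides): compare outcomes by promotion status (intention-to-treat), then scale by the difference in take-up, a simple Wald/IV estimate.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 50_000

promoted = rng.permutation(np.repeat([True, False], n // 2))
# Hypothetical take-up: promotion raises participation from 30% to 70%.
takeup = rng.random(n) < np.where(promoted, 0.7, 0.3)
# Hypothetical outcome: a true program effect of 10 units.
y = 10.0 * takeup + rng.normal(0, 20, size=n)

itt = y[promoted].mean() - y[~promoted].mean()
first_stage = takeup[promoted].mean() - takeup[~promoted].mean()
print(f"ITT = {itt:.1f}, take-up diff = {first_stage:.2f}, "
      f"IV = {itt / first_stage:.1f}")  # IV close to 10
```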

23 When is it really not possible?
– The treatment has already been assigned and announced, with no possibility of expanding treatment
– The program is over (retrospective)
– Universal eligibility and universal access
  – Examples: free education, exchange rate regime