Presentation transcript:

Explanation of slide: Logos, to show while the audience arrive.

MEASURING IMPACT
Impact Evaluation Methods for Policy Makers
Christel Vermeersch, The World Bank
Slides by Sebastian Martinez, Christel Vermeersch and Paul Gertler. The content of this presentation reflects the views of the authors and not necessarily those of the World Bank.

1. Causal Inference
Counterfactuals
False counterfactuals:
- Before & After (Pre & Post)
- Enrolled & Not Enrolled (Apples & Oranges)

2. IE Methods Toolbox
- Randomized Assignment
- Randomized Offering/Promotion
- Discontinuity Design
- Difference-in-Differences (Diff-in-Diff)
- Matching (P-Score matching)


Choosing your IE method(s)
Key information you will need for identifying the right method for your program:
- Prospective or retrospective evaluation?
- Eligibility rules and criteria? Poverty targeting? Geographic targeting?
- Roll-out plan (pipeline)?
- Is the number of eligible units larger than the resources available at a given point in time? Budget and capacity constraints? Excess demand for the program?
- Etc.

Choosing your IE method(s)
Choose the best possible design given the operational context: the best comparison group you can find, with the least operational risk.
- Internal validity: Have we controlled for everything? Do we have a good comparison group?
- External validity: Local versus global treatment effect. Do the evaluation results apply to the population we're interested in? Is the result valid for everyone?


Randomized Treatments & Comparisons
Eligibles > number of benefits? Randomize! A lottery for who is offered benefits is a fair, transparent and ethical way to assign benefits to equally deserving populations.
- Oversubscription: Give each eligible unit the same chance of receiving treatment, then compare those offered treatment with those not offered treatment (comparisons).
- Randomized phase-in: Give each eligible unit the same chance of receiving treatment first, second, third..., then compare those offered treatment first with those offered it later (comparisons).
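As a concrete illustration, the oversubscription lottery described above can be sketched in a few lines of Python. The function name, unit IDs and seed are illustrative choices, not part of the original slides:

```python
import random

def lottery_assignment(eligible_ids, n_slots, seed=0):
    """Offer the program to n_slots units drawn at random from an
    oversubscribed eligible pool, giving every unit the same chance."""
    rng = random.Random(seed)  # fixed seed keeps the draw auditable
    offered = set(rng.sample(eligible_ids, n_slots))
    comparison = [u for u in eligible_ids if u not in offered]
    return sorted(offered), comparison

# Example: a lottery for 10 benefit slots among 25 eligible applicants.
treated, comparisons = lottery_assignment(list(range(25)), 10)
```

Fixing the seed makes the draw reproducible, which matters when the lottery must be defensible to program stakeholders.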

[Diagram: Randomized treatments and comparisons. 1. Population (eligible and ineligible units; external validity) → 2. Evaluation sample → 3. Randomize treatment into treatment and comparison groups (internal validity).]

Unit of Randomization
Choose according to the type of program:
- Individual/Household
- School/Health clinic/Catchment area
- Block/Village/Community
- Ward/District/Region
As a rule of thumb, randomize at the smallest viable unit of implementation.
Keep in mind:
- You need a "sufficiently large" number of units to detect the minimum desired impact (power).
- Spillovers/contamination.
- Operational and survey costs.
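When the unit of randomization is a community rather than a household, the assignment applies to whole clusters. A minimal Python sketch, with illustrative community names and an arbitrary split:

```python
import random

def randomize_communities(communities, n_treatment, seed=1):
    """Randomize at the community level: every household in a community
    shares one assignment, which limits spillovers between arms."""
    rng = random.Random(seed)
    shuffled = list(communities)
    rng.shuffle(shuffled)
    return shuffled[:n_treatment], shuffled[n_treatment:]

# Example: split 20 communities into 12 treatment and 8 comparison.
treatment, comparison = randomize_communities(
    [f"village_{i}" for i in range(20)], 12)
```

Note the trade-off the slide mentions: larger units reduce contamination but leave fewer units for detecting the minimum desired impact.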

Case 3: Randomized Assignment
Progresa CCT program. Unit of randomization: the community. 506 communities in the evaluation sample, with randomized phase-in:
- 320 treatment communities (14,446 households): first transfers in April 1998.
- 186 comparison communities (9,630 households): first transfers in November 1999.

[Timeline: the 320 treatment communities receive transfers from April 1998 onward, while the 186 comparison communities remain untreated during the comparison period.]

Case 3: Randomized Assignment
How do we know we have good clones? In the absence of Progresa, the treatment and comparison groups should be identical, so let's compare their characteristics at baseline (T=0).

Case 3: Randomized Assignment: Balance at Baseline

                                      Treatment   Comparison   T-stat
Consumption ($ monthly per capita)      233.4       233.47     -0.39
Head's age (years)                       41.6        42.3      -1.2
Spouse's age                             36.8                  -0.38
Head's education                          2.9         2.8       2.16**
Spouse's education                        2.7         2.6       0.006

Note: If the effect is statistically significant at the 5% significance level, we label the estimated impact with 2 stars (**).

Case 3: Randomized Assignment: Balance at Baseline (continued)

                                Treatment   Comparison   T-stat
Head is female (=1)                0.07                   -0.66
Indigenous (=1)                    0.42                   -0.21
Number of household members        5.7                     1.21
Bathroom (=1)                      0.57        0.56        1.04
Hectares of land                   1.67        1.71       -1.35
Distance to hospital (km)        109         106           1.02

Note: If the effect is statistically significant at the 5% significance level, we label the estimated impact with 2 stars (**).
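The balance checks above boil down to a two-sample t-statistic per characteristic. A minimal sketch using only Python's standard library (Welch form; the function name is our own):

```python
from statistics import mean, variance

def balance_t_stat(treatment_values, comparison_values):
    """Two-sample t-statistic for one baseline characteristic.
    Values near zero indicate the groups are balanced, as
    randomization intends."""
    n_t, n_c = len(treatment_values), len(comparison_values)
    se = (variance(treatment_values) / n_t
          + variance(comparison_values) / n_c) ** 0.5
    return (mean(treatment_values) - mean(comparison_values)) / se
```

Identical samples give a t-statistic of zero; in practice one would compute this for each row of the balance table and look for values small enough to be consistent with chance.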

Case 3: Randomized Assignment

                                   Treatment group             Counterfactual              Impact
                                   (randomized to treatment)   (randomized to comparison)  (Y | P=1) - (Y | P=0)
Baseline (T=0) consumption (Y)         233.47                      233.40                     0.07
Follow-up (T=1) consumption (Y)        268.75                      239.5                     29.25**

Estimated impact on consumption (Y):
- Linear regression: 29.25**
- Multivariate linear regression: 29.75**

Note: If the effect is statistically significant at the 5% significance level, we label the estimated impact with 2 stars (**).
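Under randomized assignment the impact estimate is simply the difference in mean follow-up outcomes. A toy sketch (the household-level values below are made up so that the group means match the slide's 268.75 and 239.50):

```python
from statistics import mean

def randomized_impact(treated_outcomes, comparison_outcomes):
    """With randomized assignment the comparison group is a valid
    counterfactual, so impact = mean(treated) - mean(comparison)."""
    return mean(treated_outcomes) - mean(comparison_outcomes)

# Illustrative data whose group means are 268.75 and 239.50,
# reproducing the slide's estimated impact of 29.25.
estimate = randomized_impact([268.0, 269.5], [239.0, 240.0])
```

The multivariate regression in the slide adds baseline covariates to this comparison, which tightens precision but, with good randomization, barely moves the point estimate (29.25 vs. 29.75).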

Progresa Policy Recommendation?
Impact of Progresa on consumption (Y):

                                   Linear Regression   Multivariate Linear Regression
Case 1: Before & After                                         34.28**
Case 2: Enrolled & Not Enrolled        -22**                    -4.15
Case 3: Randomized Assignment                                   29.75**

Note: If the effect is statistically significant at the 5% significance level, we label the estimated impact with 2 stars (**).

! Keep in Mind: Randomized Assignment
- Randomized assignment, with large enough samples, produces two statistically equivalent groups: a randomized beneficiary and a randomized comparison. We have identified the perfect clone.
- It is feasible for prospective evaluations with oversubscription/excess demand. Most pilots and new programs fall into this category.