Measuring Results and Impact Evaluation: From Promises into Evidence


Measuring Results and Impact Evaluation: From Promises into Evidence Paul Gertler University of California, Berkeley

How do we turn this teacher…

…into this teacher?

Answer Three Questions:
- Why is evaluation valuable?
- What makes a good impact evaluation?
- How do we implement an impact evaluation?

Why Evaluate?
- We need evidence on what works: budgets are limited, and bad policies could hurt
- To improve program/policy implementation
  - Design: eligibility, benefits
  - Operations: efficiency & targeting
- Information is key to sustainability
  - Budget negotiations
  - Informing beliefs and the press
  - The results agenda & aid effectiveness

Allocate limited resources?
Benefit-cost analysis:
- Allows comparison of choices and indicates the highest-return investment
- Benefit: the change in outcome indicators, measured through impact evaluation
- Cost: the additional cost of providing the benefit (economic versus accounting costs)
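
The comparison above can be sketched in a few lines. The program names and dollar figures below are hypothetical, chosen only to illustrate ranking options by their benefit-cost ratio, where the benefit is the impact-evaluation estimate of the outcome change (valued in dollars) and the cost is the additional economic cost of delivery.

```python
# Hypothetical benefit-cost comparison across program options.
programs = {
    "scholarships":     {"benefit": 150_000.0, "cost": 50_000.0},
    "cement_floors":    {"benefit": 120_000.0, "cost": 30_000.0},
    "teacher_training": {"benefit":  90_000.0, "cost": 60_000.0},
}

# Benefit-cost ratio per program: which option yields the highest return?
ratios = {name: p["benefit"] / p["cost"] for name, p in programs.items()}
best = max(ratios, key=ratios.get)
print(best, round(ratios[best], 2))  # cement_floors 4.0
```

The ratio ranks options per dollar spent; with very different program scales, one would also compare net benefits (benefit minus cost).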

Impact Evaluation Answers:
- What was the effect of the program on outcomes?
- How much better off are the beneficiaries because of the program/policy?
- How would outcomes change under a different program design?
- Is the program cost-effective?
Traditional M&E cannot answer these questions.

Impact Evaluation Answers…
- What is the effect of scholarships on school attendance & performance (test scores)?
- Does contracting out primary health care lead to an increase in access?
- Does replacing dirt floors with cement reduce parasites & improve child health?
- Do improved roads increase access to labor markets & raise income?

Types of Impact Evaluation
- Efficacy: proof of concept; a pilot under ideal conditions
- Effectiveness: normal circumstances & capabilities
  - Impact will be lower
  - Impact at larger scale will be different
  - Costs will be different, as there are economies of scale from fixed costs

Using impact evaluation to…
- Scale up pilot interventions/programs
- Kill programs
- Adjust program benefits
- Inform (i.e., finance ministries & the press)
Example: PROGRESA in Mexico
- Transition across presidential terms
- Expansion to 5 million households
- Change in benefits
- Battle with the press
- Educating the world (Brazil versus Mexico case)

Answer Three Questions:
- Why is evaluation valuable?
- What makes a good impact evaluation?
- How do we implement an impact evaluation?

How to assess impact
Example: how much does an education program improve test scores (learning)?
- What is a beneficiary's test score with the program compared to without the program?
- Formally, program impact is: α = (Y | P=1) - (Y | P=0)
- That is, compare the same individual with & without the program at the same point in time
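
The formula can be made concrete by writing out both potential outcomes for one person. The test scores below are hypothetical; the point is that the true individual impact is a difference between two outcomes, only one of which is ever observed in practice.

```python
# Illustration of the impact formula: alpha = (Y | P=1) - (Y | P=0).
# Hypothetical potential test scores for one scholarship recipient:
y_with_program = 72.0     # Y | P=1: score if the student receives the scholarship
y_without_program = 65.0  # Y | P=0: score if the same student does not

alpha = y_with_program - y_without_program  # true individual impact
print(alpha)  # 7.0

# In reality we observe only one of these two outcomes per person,
# so the other one (the counterfactual) must be estimated.
```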

Solving the evaluation problem
- Counterfactual: what would have happened without the program
- The estimated impact is the difference between the treated observation and the counterfactual
- We never observe the same individual with and without the program at the same point in time
- So we need to estimate the counterfactual
- The counterfactual is the key to impact evaluation

Counterfactual Criteria
- The treated and counterfactual groups have identical characteristics, except for benefiting from the intervention
- There is no other reason for differences in outcomes between the treated and the counterfactual
- The only reason for the difference in outcomes is the intervention

Two “Counterfeit” Counterfactuals
1. Before and after: the same individual before the treatment
2. Those not enrolled:
   - Those who chose not to enroll in the program
   - Those who were not offered the program
Problem: we cannot completely know why the treated are treated and the others are not.

1. Before and After: Examples
Agricultural assistance program: financial assistance to purchase inputs
- Compare rice yields before and after the program
- Before is a normal rainfall year, but after is a drought; we find a fall in rice yield
- Did the program fail? We cannot separate (identify) the effect of the financial assistance program from the effect of rainfall
- The same issue arises for a school scholarship program's effect on enrollment

1. Before and After
Compare Y before and after the intervention:
- Impact estimate: αi = (Yi,t | P=1) - (Yi,t-1 | P=0)
- Estimate of the counterfactual: (Yi,t-1 | P=0) = (Yi,t | P=0)
- Problem: does not control for time-varying factors
[Figure: outcome Y plotted against time, from t-1 (before) to t (after), with points A, B, and C illustrating the before-after comparison and the unobserved counterfactual]
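
A small simulation makes the problem concrete. All numbers below are hypothetical: a program that truly raises rice yields by 5 units coincides with a drought that lowers them by 20, and the before-after estimator attributes the combined change to the program.

```python
import random
random.seed(0)

n = 1000
true_effect = 5.0       # the program truly raises yield by 5 units
drought_effect = -20.0  # time-varying shock hitting the "after" period

# Simulated yields before and after the intervention (hypothetical units).
before = [100 + random.gauss(0, 5) for _ in range(n)]
after = [b + true_effect + drought_effect + random.gauss(0, 5) for b in before]

# Before-after estimator: mean(after) - mean(before).
before_after_estimate = sum(after) / n - sum(before) / n
print(round(before_after_estimate, 1))  # about -15, not the true +5
```

Because the estimator uses the pre-period outcome as the counterfactual, any time-varying factor (here, the drought) is wrongly counted as program impact.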

2. Non-Participants
- Compare non-participants to participants
- Counterfactual: non-participant outcomes
- Impact estimate: αi = (Yi,t | P=1) - (Yj,t | P=0)
- Assumption: (Yj,t | P=0) = (Yi,t | P=0)
- Problem: why did they not participate?

2. Non-Participants: Example 1
- A job training program is offered
- Compare employment & earnings of those who signed up to those who did not
- Who signs up? Those who are most likely to benefit, i.e., those with more ability
- They would have had higher earnings than non-participants even without the job training
- Hence a poor estimate of the counterfactual

2. Non-Participants: Example 2
- Health insurance is offered
- Compare health care utilization of those who bought insurance to those who did not
- Who buys insurance? Those who expect large medical expenditures
- Who does not? Those who are healthy
- Even with no insurance, those who did not buy would have lower medical costs than those who did
- Hence a poor estimate of the counterfactual

What's wrong? Selection bias
- People choose to participate for specific reasons
- Often those reasons are related to the outcome of interest
  - Job training: ability and earnings
  - Health insurance: health status and medical expenditures
- We cannot separately identify the impact of the program from these other factors/reasons
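
The job-training case can be simulated to show the size of the bias. The data-generating process below is hypothetical: higher-ability workers are more likely to enroll, and ability also raises earnings, so the naive participant/non-participant comparison overstates the true effect.

```python
import random
random.seed(1)

n = 10_000
true_effect = 2.0  # training truly raises earnings by 2 units

people = []
for _ in range(n):
    ability = random.gauss(0, 1)
    enrolls = ability > 0  # self-selection: higher-ability workers sign up
    earnings = (10 + 3 * ability
                + (true_effect if enrolls else 0.0)
                + random.gauss(0, 1))
    people.append((enrolls, earnings))

treated = [y for e, y in people if e]
controls = [y for e, y in people if not e]

# Naive estimate: difference in mean earnings between participants
# and non-participants. It bundles the ability gap with the program effect.
naive = sum(treated) / len(treated) - sum(controls) / len(controls)
print(round(naive, 2))  # well above the true effect of 2.0
```

The gap between the naive estimate and 2.0 is exactly the selection bias: the difference in ability between those who chose to enroll and those who did not.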

Program placement example
- The government offers a family planning program to villages with high fertility
- Compare fertility in villages offered the program to fertility in other villages
- The program was targeted based on fertility, so treatment villages have high fertility and counterfactual villages have low fertility
- The estimated program impact is confounded with the targeting criteria

What we need to know
- All the reasons why some people get the program and others do not, i.e., why individuals are in the treatment versus the control group
- If those reasons are correlated with the outcome, we cannot identify/separate the program impact from other explanations of differences in outcomes

Possible Solutions
- Guarantee comparability of the treatment and control groups, so that the ONLY remaining difference is the intervention
- In this seminar we will consider:
  - Experimental design/randomization
  - Quasi-experiments: regression discontinuity, double differences, instrumental variables

These solutions all involve…
- Knowing how the data are generated
- Randomization
  - Gives everyone an equal chance of being in the control or treatment group
  - Guarantees that all factors/characteristics will be equal, on average, between the groups
  - The only difference is the intervention
- If randomization is not possible, we need transparent & observable criteria for who is offered the program
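
Rerunning the earlier job-training simulation with a coin flip instead of self-selection shows why randomization works. The data-generating process is the same hypothetical one as before, but assignment is now independent of ability, so the simple difference in means recovers the true effect.

```python
import random
random.seed(2)

n = 10_000
true_effect = 2.0  # training truly raises earnings by 2 units

treated, controls = [], []
for _ in range(n):
    ability = random.gauss(0, 1)
    assigned = random.random() < 0.5  # coin flip, independent of ability
    earnings = (10 + 3 * ability
                + (true_effect if assigned else 0.0)
                + random.gauss(0, 1))
    (treated if assigned else controls).append(earnings)

# With random assignment, ability is balanced across groups on average,
# so the difference in means is an unbiased estimate of the impact.
estimate = sum(treated) / len(treated) - sum(controls) / len(controls)
print(round(estimate, 1))  # close to the true effect of 2.0
```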

Road map: The next 5 days
Today: The Context
- Why do results matter?
- Linking monitoring with evaluation
- Importance of evidence for policy
Today, Monday, Tuesday: The Tools
- Cost-benefit and cost-effectiveness
- Identification strategies
- Data collection
- Operational issues
Wednesday, Thursday: The Experience
- Group work on evaluation design and presentations

THANK YOU