www.worldbank.org/hdchiefeconomist — The World Bank, Human Development Network, Spanish Impact Evaluation Fund

Introduction to Propensity Score Matching
Presentation transcript:

The World Bank Human Development Network Spanish Impact Evaluation Fund

MEASURING IMPACT: Impact Evaluation Methods for Policy Makers. This material constitutes supporting material for the "Impact Evaluation in Practice" book. It is made freely available, but please acknowledge its use as follows: Gertler, P. J., Martinez, S., Premand, P., Rawlings, L. B. and Vermeersch, C. M. J., 2010, Impact Evaluation in Practice: Ancillary Material, The World Bank, Washington DC. The content of this presentation reflects the views of the authors and not necessarily those of the World Bank.

1 Causal Inference
Counterfactuals
False counterfactuals:
o Before & After (Pre & Post)
o Enrolled & Not Enrolled (Apples & Oranges)

2 IE Methods Toolbox
o Randomized Assignment
o Randomized Offering/Promotion
o Discontinuity Design
o Difference-in-Differences (Diff-in-Diff)
o Matching (P-Score matching)


Matching
Idea: For each treated unit, pick the best comparison unit (match) from another data source.
How? Matches are selected on the basis of similarities in observed characteristics.
Issue? If there are unobservable characteristics and those unobservables influence participation: selection bias!

Propensity-Score Matching (PSM)
Comparison group: non-participants with the same observable characteristics as participants.
o In practice, this is very hard: there may be many important characteristics!
Solution proposed by Rosenbaum and Rubin: match on the basis of the "propensity score".
o Compute everyone's probability of participating, based on their observable characteristics.
o Choose matches that have the same probability of participation as the treated units.
o See Appendix 2.
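As a rough illustration of the Rosenbaum–Rubin idea, the sketch below fits a plain logistic model by gradient descent to obtain each unit's participation probability, then matches every treated unit to the non-participant with the closest score. All data, covariates, and tuning values here are hypothetical; a real study would use a statistical package and check covariate balance after matching.

```python
import math
import random

def propensity_scores(X, d, lr=0.1, steps=2000):
    """Fit P(D = 1 | X) with a hand-rolled logistic regression."""
    n, k = len(X), len(X[0])
    w = [0.0] * (k + 1)                    # intercept + one weight per covariate
    for _ in range(steps):
        grad = [0.0] * (k + 1)
        for xi, di in zip(X, d):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = 1.0 / (1.0 + math.exp(-z)) - di   # log-loss gradient term
            grad[0] += err
            for j in range(k):
                grad[j + 1] += err * xi[j]
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    def score(xi):
        z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
        return 1.0 / (1.0 + math.exp(-z))
    return [score(xi) for xi in X]

# Hypothetical data: two observed characteristics loosely drive participation.
random.seed(0)
X = [[random.random(), random.random()] for _ in range(200)]
d = [1 if x[0] + x[1] + random.gauss(0, 0.5) > 1.0 else 0 for x in X]
ps = propensity_scores(X, d)

# Match each treated unit to the comparison unit with the closest score.
treated  = [i for i, di in enumerate(d) if di == 1]
controls = [i for i, di in enumerate(d) if di == 0]
matches = {i: min(controls, key=lambda j: abs(ps[i] - ps[j])) for i in treated}
```

Matching on this single score is what makes the approach practical: instead of finding a non-participant identical on every characteristic, each participant only needs a neighbour with a similar participation probability.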

[Figure: Density of propensity scores (0 to 1) for participants and non-participants; the overlapping region of the two densities is the common support.]
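Restricting the analysis to the common support can be sketched with a minimal min–max trimming rule, assuming propensity scores are already in hand (the scores below are made up):

```python
def common_support(ps, d):
    """Keep only units whose propensity score lies in the overlap of the
    treated and comparison score ranges (a simple min-max trimming rule)."""
    treated_ps = [p for p, di in zip(ps, d) if di == 1]
    control_ps = [p for p, di in zip(ps, d) if di == 0]
    lo = max(min(treated_ps), min(control_ps))
    hi = min(max(treated_ps), max(control_ps))
    return [i for i, p in enumerate(ps) if lo <= p <= hi]

ps = [0.05, 0.20, 0.35, 0.50, 0.65, 0.80, 0.95]
d  = [0,    0,    1,    0,    1,    1,    1]
kept = common_support(ps, d)   # [2, 3]: extreme scores on both sides are dropped
```

Units outside the overlap have no credible match on the other side, so keeping them would force comparisons between very dissimilar households.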

Case 7: Progresa Matching (P-Score)
Probit regression, Prob(Enrolled = 1)

  Baseline characteristic         Estimated coefficient
  Head's age (years)                **
  Spouse's age (years)              **
  Head's education (years)          **
  Spouse's education (years)       -0.03**
  Head is female = 1
  Indigenous = 1                    **
  Number of household members       0.216**
  Dirt floor = 1                    **
  Bathroom = 1                      **
  Hectares of land                  **
  Distance to hospital (km)         0.001*
  Constant                          0.664**

Note: If the effect is statistically significant at the 1% significance level, we label the estimated impact with 2 stars (**).

[Figure: Case 7 — Progresa common support: densities of Pr(Enrolled) for treated and comparison households.]

Case 7: Progresa Matching (P-Score)
Estimated impact on consumption (Y), multivariate linear regression: 7.06+
Note: If the effect is statistically significant at the 1% significance level, we label the estimated impact with 2 stars (**). If significant at the 10% level, we label the impact with +.

! Keep in Mind: Matching
Matching requires large samples and good-quality data.
Matching at baseline can be very useful:
o Know the assignment rule and match based on it.
o Combine with other techniques (e.g. diff-in-diff).
Ex-post matching is risky:
o If there is no baseline, be careful!
o Matching on endogenous ex-post variables gives bad results.

Progresa Policy Recommendation?
Impact of Progresa on consumption (Y):

  Case 1: Before & After               34.28**
  Case 2: Enrolled & Not Enrolled
  Case 3: Randomized Assignment        29.75**
  Case 4: Randomized Offering          30.4**
  Case 5: Discontinuity Design         30.58**
  Case 6: Difference-in-Differences    25.53**
  Case 7: Matching                      7.06+

Note: If the effect is statistically significant at the 1% significance level, we label the estimated impact with 2 stars (**). If significant at the 10% level, we label the impact with +.

Appendix 2: Steps in Propensity Score Matching
1. Obtain a representative and highly comparable survey of non-participants and participants.
2. Pool the two samples and estimate a logit (or probit) model of program participation.
3. Restrict the samples to ensure common support (an important source of bias in observational studies).
4. For each participant, find a sample of non-participants with similar propensity scores.
5. Compare the outcome indicators; the difference is the estimate of the gain due to the program for that observation.
6. Calculate the mean of these individual gains to obtain the average overall gain.
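Steps 4–6 above can be sketched as a nearest-neighbour pass over the treated units (the scores and outcomes below are made up; real analyses typically add calipers, matching with replacement, and proper standard errors):

```python
def att_nearest_neighbour(ps, d, y):
    """Steps 4-6: match each participant to the non-participant with the
    closest propensity score, difference the outcomes, and average."""
    controls = [i for i, di in enumerate(d) if di == 0]
    gains = []
    for i, di in enumerate(d):
        if di == 1:
            j = min(controls, key=lambda c: abs(ps[i] - ps[c]))  # step 4
            gains.append(y[i] - y[j])                            # step 5
    return sum(gains) / len(gains)                               # step 6

ps = [0.30, 0.32, 0.60, 0.61, 0.45, 0.44]
d  = [1,    0,    1,    0,    1,    0]
y  = [110.0, 100.0, 140.0, 128.0, 120.0, 115.0]
print(att_nearest_neighbour(ps, d, y))  # → 9.0
```

The averaged matched difference is an estimate of the average treatment effect on the treated, and is only credible under the assumptions emphasized in the slides: selection on observables and common support.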