Howard White, International Initiative for Impact Evaluation (3ie)


An impact evaluation seeks to attribute all, or part, of the observed change in outcomes to a specific intervention.

Level                 | Indicators
Inputs                | Resources: funds and personnel
Activities            | Teacher training; school improvements; decentralized management and local school management committees
Outputs               | Trained teachers; better school facilities; functioning school management committees
Intermediate outcomes | Higher school enrolments at all levels; teacher and parent satisfaction; better managed schools
Final outcomes        | Improved learning outcomes
Impact                | Higher productivity and earnings; empowerment

 Focus on final welfare outcomes, e.g.:
  – Infant mortality
  – Income poverty
  – Security
 Usually long-term, but not necessarily so (though then sustainability is an issue)

 Projects (or specific interventions)
  – Individual projects are the ‘backbone’ of impact analysis
  – But even then it may only be possible to do rigorous impact analysis of some components
 Programmes
  – Sector-wide programmes can be conceived of as supporting a range of interventions, many of which can be subject to rigorous impact evaluation
 Policies
  – In general, different approaches are required, such as CGE models; these are not discussed today

Pick a named intervention for an impact evaluation and make a short list of indicators (using the log frame) for evaluating this intervention.

                    | Before | After
Project (treatment) |        |   66
Control             |        |

But we don’t know if they were similar before… though there are ways of checking this.

                    | Before | After
Project (treatment) |        |   66
Control             |        |   55

Sometimes this can work… but usually not.

                    | Before | After
Project (treatment) |   40   |   66
Control             |        |

                    | Before | After
Project (treatment) |   40   |   66
Control             |   44   |   55
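With both groups observed in both periods, the double difference (difference-in-differences) uses the control group’s change to net out what would have happened anyway. Worked through with the numbers above (the arithmetic is not shown on the slide itself):

$$
\text{DiD} = \left(Y_T^{\text{after}} - Y_T^{\text{before}}\right) - \left(Y_C^{\text{after}} - Y_C^{\text{before}}\right) = (66 - 40) - (55 - 44) = 26 - 11 = 15
$$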

 Ex ante design is preferred to ex post: impact evaluation design is much stronger if baseline data are available (though evaluation may still be possible without them)
 This means collecting data before the intervention starts, which can also inform the design of the intervention
 But secondary data – that is, an existing survey – can sometimes be used

1. Confounding factors
2. Selection effects
3. Spillovers and contagion
4. Impact heterogeneity
5. Ensuring policy relevance

 Other things happen – so before-versus-after comparison is rarely sufficient
 So get a control group… but different things may happen there too
 So collect data on more than just outcome and impact indicators
 And collect baseline data
 But…

 Program placement and self-selection: program beneficiaries have particular characteristics that are correlated with outcomes, so impact estimates are biased (decomposed below)
 Experimental or quasi-experimental methods are needed to cope with this; this is what is usually meant by ‘rigorous’ impact evaluation
 But selection is just one facet of impact evaluation design
 Other things can also bias impact estimates
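The naive treated-versus-untreated comparison can be decomposed (a standard identity, not on the original slide) to show exactly where the bias enters:

$$
\underbrace{E[Y \mid T=1] - E[Y \mid T=0]}_{\text{naive comparison}}
= \underbrace{E[Y_1 - Y_0 \mid T=1]}_{\text{effect on the treated}}
+ \underbrace{E[Y_0 \mid T=1] - E[Y_0 \mid T=0]}_{\text{selection bias}}
$$

where $Y_1$ and $Y_0$ are outcomes with and without the program. Randomization makes treatment independent of $Y_0$, driving the bias term to zero; quasi-experimental methods instead try to control for the characteristics that generate it.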

 Experimental (randomized):
  – Limited application, but there are applications, and it is a powerful approach
  – Many concerns raised against it (e.g. budget and ethics) are often not valid
 Quasi-experimental designs (regression based):
  – Propensity score matching: the most common (a sketch follows below)
  – Regression discontinuity
  – Interrupted time series
  – Regression modelling of outcomes
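As an illustration of the first quasi-experimental method above, here is a minimal propensity score matching sketch in Python. It is not from the presentation: the data are simulated, and all variable names (x1, x2, t, y) are hypothetical.

```python
# Minimal propensity score matching sketch on simulated data.
# Illustrative only - not the presenter's code or data.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 1000

# Simulated 'survey': program placement depends on baseline covariates,
# so a naive treated-vs-control comparison would be biased.
x = rng.normal(size=(n, 2))                      # baseline covariates
p_true = 1 / (1 + np.exp(-(x[:, 0] + x[:, 1])))  # placement probability
t = rng.binomial(1, p_true)                      # treatment indicator
y = 2.0 * t + x[:, 0] + rng.normal(size=n)       # outcome; true effect = 2.0

df = pd.DataFrame(x, columns=["x1", "x2"])
df["t"], df["y"] = t, y

# 1. Estimate each unit's propensity score from baseline covariates.
ps = LogisticRegression().fit(df[["x1", "x2"]], df["t"])
df["pscore"] = ps.predict_proba(df[["x1", "x2"]])[:, 1]

# 2. Match each treated unit to the nearest control on the score.
treated = df[df["t"] == 1]
control = df[df["t"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
_, idx = nn.kneighbors(treated[["pscore"]])
matched = control.iloc[idx.ravel()]

# 3. Mean treated-minus-matched-control outcome estimates the ATT.
att = (treated["y"].to_numpy() - matched["y"].to_numpy()).mean()
naive = df.loc[df["t"] == 1, "y"].mean() - df.loc[df["t"] == 0, "y"].mean()
print(f"Naive difference: {naive:.2f}")
print(f"Matched ATT:      {att:.2f}  (true effect 2.0)")
```

Nearest-neighbour matching on the score is only one design choice; in practice calipers, kernel weights and common-support checks also matter.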

 Spillover – positive and negative impacts on non-beneficiaries
 Contagion – similar interventions taking place in control areas
 Need to collect data on these aspects, and may need to revise the evaluation design accordingly

WHAT ARE THE MAJOR CONFOUNDING FACTORS FOR YOUR OUTCOME AND IMPACT INDICATORS?
HOW MIGHT SELECTION BIAS, SPILLOVER AND CONTAGION AFFECT THE EVALUATION OF THE INTERVENTION YOU HAVE SELECTED?

 Impact varies by intervention (design), beneficiary and context
 ‘Averages’ can be misleading
 Strong implications for evaluation design

 Is the impact of doing X and Y together bigger than, equal to, or less than the sum of the impacts of doing X and Y separately?
 For example, hygiene promotion and sanitation facilities
 Evidence suggests they are substitutes: either one reduces the incidence of child diarrhea by 40–50%, but the reduction is no greater when the two are combined (formalized below)
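In symbols (a formalization not on the slide): writing $\Delta_X$, $\Delta_Y$ and $\Delta_{XY}$ for the impacts of X alone, Y alone and the combined package,

$$
\Delta_{XY} > \Delta_X + \Delta_Y \ \text{(complements)}, \qquad \Delta_{XY} < \Delta_X + \Delta_Y \ \text{(substitutes)}
$$

Here $\Delta_X \approx \Delta_Y \approx \Delta_{XY} \approx$ 40–50%, so hygiene promotion and sanitation facilities behave as near-perfect substitutes.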

 Nutritional deprivation in the first two years of life causes irreparable damage to physical and cognitive development
 Hence interventions aimed at infants have a greater long-run impact on many outcomes than those aimed at older children (such as school feeding programs)

                 | Expected impact
Good year        | Low
Average–bad year | High
Very bad year    | None

What sort of differences in impact would you expect for your intervention with respect to intervention (design), context and beneficiary?

 Process
  – Stakeholder engagement
  – Packaging messages
 Design
  – Theory-based approach
  – Mixed methods
  – Capture all costs and benefits, including cross-sectoral effects
  – Cost effectiveness and CBA

 Makes explicit the underlying theory of how inputs lead to intended outcomes and impacts
 Documents every step in the causal chain
 Draws on multiple data sources and approaches
 Stresses context in explaining why the intervention is or is not working

Assumption                                                        | Findings
Provide nutritional counselling to caregivers                     | Mothers are not the decision makers, especially if they live with their mother-in-law
Women know about sessions and attend                              | 90% participation, lower in more conservative areas
Malnourished and growth-faltering children correctly identified   | No – community nutrition practitioners cannot interpret growth charts
Women acquire knowledge                                           | Those attending training do so
And knowledge is turned into practice                             | No, there is a substantial knowledge–practice gap
Supplementary feeding is additional food for intended beneficiary | No, considerable evidence of substitution and leakage
Adopted changes are sufficient to improve intended outcomes       | Only sometimes (not for pregnant women)

 Need to collect survey data at the unit of intervention (child, firm, etc.)
 Will also need facility/project data
 Need data across the log frame and on confounding factors – and on your instrumental variables (lack of valid instruments is the major obstacle to performing IE; see the sketch below)
 Designing data collection instruments takes time and should be iterated with qualitative data
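Since the slide flags the lack of valid instruments as the major obstacle, here is a minimal two-stage least squares (2SLS) sketch in Python on simulated data. The variable names (z for the instrument, d for the endogenous treatment, y for the outcome) are hypothetical, not drawn from the studies below.

```python
# Minimal two-stage least squares (2SLS) sketch on simulated data.
# Illustrative only; variable names are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n = 2000

z = rng.normal(size=n)                  # instrument (affects y only via d)
u = rng.normal(size=n)                  # unobserved confounder
d = 0.8 * z + u + rng.normal(size=n)    # endogenous treatment intensity
y = 1.5 * d + u + rng.normal(size=n)    # outcome; true effect of d is 1.5

def ols(X, target):
    """OLS coefficients via least squares."""
    return np.linalg.lstsq(X, target, rcond=None)[0]

# Stage 1: project the endogenous regressor onto the instrument.
X1 = np.column_stack([np.ones(n), z])
d_hat = X1 @ ols(X1, d)

# Stage 2: regress the outcome on the fitted values.
X2 = np.column_stack([np.ones(n), d_hat])
beta_2sls = ols(X2, y)[1]

beta_ols = ols(np.column_stack([np.ones(n), d]), y)[1]
print(f"Naive OLS slope: {beta_ols:.2f} (biased upward by the confounder)")
print(f"2SLS slope:      {beta_2sls:.2f} (true effect 1.5)")
```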

Study                                              | Data sources
Rural electrification                              | 3 rural electrification surveys; 11 DHS; 2 LSMS
India irrigation and rural livelihoods             | Own survey; district-level government data; census data
Bangladesh maternal and child health and nutrition | DHS; project data + national nutrition survey
Ghana basic education                              | 1988/89 GLSS (LSMS); own follow-up survey
Kenya agricultural extension                       | 2 previous rural surveys; own follow-up survey

OUTLINE YOUR PROPOSED EVALUATION DESIGN (TIMING OF DATA COLLECTION, IDENTIFICATION OF CONTROL, IF ANY).
WHAT DATA SOURCES WOULD YOU USE FOR YOUR PROPOSED EVALUATION?
