Impact Assessment of Africa RISING: Approaches and Challenges
Beliyou Haile, IFPRI
March 2013

Outline
- Do we need impact assessment (IA) for a project? When (for which year's activities)?
- What are the feasible IA approaches, given the scale, nature, and timing of activities?
- The feasible approach is determined by what can be delivered by:
  - the researchers, with input from IFPRI
  - IFPRI, with input from the researchers

Do we need impact assessment? For which activities?
- Yes, for this year (and the coming years)
  - Need to discuss and agree on the IA approach now (survey design is not independent of the IA approach):
    - Randomized Controlled Trials (RCTs)
    - Quasi-experimental approaches
    - Farming systems modeling
    - Qualitative analyses
    - If a combination, which combination? Other?
- Yes, but perhaps later on?
  - Survey now to meet immediate data needs/RO1
  - Analyze diffusion of technologies now, and conduct IA later on (Ethiopia)

What are the feasible IE approaches...? I. Randomized Controlled Trials (RCTs)
- Enough sample size to detect an impact when in fact there is one, which depends on:
  - Sampling strategy: simple versus clustered sampling
  - Expected change in the outcome variable (e.g., production per hectare)
  - Intra-cluster correlation
  - Inter-temporal correlation
  - Desired statistical significance and power
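As a rough illustration of how these ingredients combine, the sketch below computes the per-arm sample size for a cluster-randomized comparison of means, using the standard formula n = deff x 2(z_{1-a/2} + z_{1-b})^2 (sd/delta)^2 with deff = 1 + (m - 1)rho. This is a minimal sketch, not the calculation used for the Africa RISING power tables; the cluster size, effect size, and other inputs in the example call are placeholders.

```python
# Minimal sketch: per-arm sample size for a cluster-randomized trial
# (two-sided test comparing means). Inputs below are illustrative only.
from math import ceil
from scipy.stats import norm

def required_sample(sd, delta, icc, m, alpha=0.05, power=0.80):
    """Households per arm and villages per arm for a minimum detectable
    effect `delta`, outcome std. dev. `sd`, intra-cluster correlation
    `icc`, and households sampled per village `m`."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    n_simple = 2 * (z * sd / delta) ** 2          # simple random sampling
    deff = 1 + (m - 1) * icc                      # design effect
    n_cluster = ceil(n_simple * deff)             # households per arm
    return n_cluster, ceil(n_cluster / m)         # households, villages

# Hypothetical example: detect a 20% increase over a baseline mean of 1000
# (std. dev. 1500), with 15 households per village and ICC = 0.2.
hh, villages = required_sample(sd=1500, delta=0.20 * 1000, icc=0.2, m=15)
print(f"~{hh} households per arm, in ~{villages} villages per arm")
```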

What are the feasible IE approaches...RCTs?
Enough sample size / scale of intervention: Ghana (UE, UW, and Northern regions)
- Follow-up scenarios: 20% increase in average maize harvest value per hectare
- Trade-off: large number of households per community vs. large number of communities with a small number of farmers per community
- Baseline values: avg. maize harvest value/ha: 192 GHc/ha; std. dev.: 401; Deff: 3.41 (ρ = 0.172)
[Table: sample required (N), households per village, and number of villages, by new maize harvest value/ha, correlation between measurements, power, and ρ; numeric entries not recoverable from the transcript]

What are the feasible IE approaches...RCTs?
Enough sample size / scale of intervention: Tanzania (Babati, Kongwa, and Kiteto)
- Follow-up scenarios: 10% increase in average maize yield
- Baseline values: avg. maize yield: 1660 kg/ha; std. dev.: 1311; Deff: 4.28 (ρ = 0.234)
[Table: sample required (N), households per village, and number of villages, by new maize yield, correlation between measurements, power, and ρ; numeric entries not recoverable from the transcript]

What are the feasible IE approaches...RCTs?
Enough sample size / scale of intervention: Ethiopia (4 regions)
- Follow-up scenarios: 10% increase in average wheat yield
- Baseline values: avg. wheat yield: 1171 kg/ha; std. dev.: 554; Deff: not recoverable (ρ = 0.453)
[Table: sample required (N), households per village, and number of villages, by new wheat yield, correlation between measurements, power, and ρ; numeric entries not recoverable from the transcript]

What are the feasible IE approaches...RCTs?
Enough sample size / scale of intervention: Malawi (Dedza and Ntcheu)
- Follow-up scenarios: 20% increase in average maize yield
- Baseline values: avg. maize yield: 1049 kg/ha; std. dev.: 1955; Deff: 2.36 (ρ = 0.072)
[Table: sample required (N), households per village, and number of villages, by new maize yield, correlation between measurements, power, and ρ; numeric entries not recoverable from the transcript]
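The reported design effects and intra-cluster correlations are linked through the usual relationship deff = 1 + (m - 1)ρ, where m is the number of households sampled per village. The snippet below inverts that relationship for the countries where both values survive in the transcript; the implied cluster sizes are an inference under that assumption, not figures stated in the slides.

```python
# Back out the implied households-per-village (m) from the reported design
# effect and intra-cluster correlation, assuming deff = 1 + (m - 1) * rho.
# Deff/rho pairs are taken from the baseline slides above; Ethiopia is
# omitted because its Deff value did not survive the transcript.
baselines = {
    "Ghana":    {"deff": 3.41, "rho": 0.172},
    "Tanzania": {"deff": 4.28, "rho": 0.234},
    "Malawi":   {"deff": 2.36, "rho": 0.072},
}

for country, b in baselines.items():
    m = 1 + (b["deff"] - 1) / b["rho"]   # implied households per village
    print(f"{country}: implied ~{m:.0f} households per village")
```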

What are the feasible IE approaches...RCTs?
- Random selection:
  - of target and control communities (from development domains)
  - of the timing of interventions in target communities: phased-in RCT

Phased-in design (treatment status by project year):
Community      Time = t    Time = t+1    Time = t+2
Group 1 (T1)   treated     treated       treated
Group 2 (T2)   control     treated       treated
Group 3 (T3)   control     control       treated
Group 4 (T4)   control     control       control
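A phased-in assignment like the one above can be generated with a simple random draw over the list of eligible communities. The sketch below is a generic illustration, not the project's actual assignment procedure; the community names, seed, and number of groups are placeholders.

```python
# Minimal sketch: randomly assign communities to phase-in groups,
# mirroring the design table above (group 1 treated from year t,
# group 2 from t+1, group 3 from t+2, group 4 control throughout).
import random

random.seed(20130301)  # fix the seed so the assignment is reproducible

communities = [f"community_{i:02d}" for i in range(1, 21)]  # placeholder list
random.shuffle(communities)

n_groups = 4
groups = {g: communities[g - 1::n_groups] for g in range(1, n_groups + 1)}

for g, members in groups.items():
    label = "control throughout" if g == n_groups else f"treated from year t+{g - 1}"
    print(f"Group {g} ({label}): {', '.join(members)}")
```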

What are the feasible IE approaches...RCTs?
- Random selection of farmers:
  - researchers provide IFPRI with a targeting strategy, by community
  - IFPRI prepares a household list of the target population (~sampling frame)
  - researchers/IFPRI randomize farmers into target and control groups
- Baseline survey: LSMS-type
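Once a household listing (sampling frame) exists for each community, the farmer-level randomization can be carried out community by community so that treatment and control households are balanced within villages. The sketch below illustrates one way to do this; the data frame, column names, and treatment share are placeholders, not the actual Africa RISING frame or protocol.

```python
# Minimal sketch: randomize households to treatment/control within each
# community, given a household listing (sampling frame).
import numpy as np
import pandas as pd

rng = np.random.default_rng(2013)

# Toy sampling frame: household id and the community it belongs to.
frame = pd.DataFrame({
    "household_id": range(1, 61),
    "community": np.repeat([f"village_{v}" for v in "ABC"], 20),
})

def assign_within(group, share_treated=0.5):
    """Randomly flag `share_treated` of the households in one community."""
    n_treated = int(round(share_treated * len(group)))
    treated_ids = rng.choice(group["household_id"], size=n_treated, replace=False)
    group = group.copy()
    group["treated"] = group["household_id"].isin(treated_ids)
    return group

assigned = (frame.groupby("community", group_keys=False)
                 .apply(assign_within))
print(assigned.groupby(["community", "treated"]).size())
```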

What are the feasible IE approaches...RCTs? Challenges with RCTs
- Contamination (undesirable)
  - Improper environmental control: controls 'learning' from treatments; can be minimized but not avoided
  - Delivery of treatment: "the implementation is the intervention"; can be minimized
  - If measured, it can be accounted for during data analysis
- Spillovers (desirable)
  - Different ways of measuring them:
    - using within-cluster units that remained untreated (Duflo and Saez, 2003)
    - intersecting an RCT with pre-existing networks (Oster and Thornton, 2009)
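To make the first measurement strategy concrete, one rough sketch (an illustration in the spirit of Duflo and Saez (2003), not a procedure specified in the slides) is to compare outcomes across three groups: treated households, untreated households in treated communities, and households in pure control communities. The gap between the last two groups gives a naive indication of within-community spillovers. Column names below are placeholders.

```python
# Rough sketch of the within-cluster spillover comparison: mean outcomes for
# (i) treated households, (ii) untreated households in treated communities,
# and (iii) households in control communities.
import pandas as pd

def spillover_summary(df):
    """`df` needs columns: outcome, treated (household-level 0/1),
    community_treated (community-level 0/1). Column names are placeholders."""
    def group(row):
        if row["treated"]:
            return "treated"
        return ("untreated, treated community" if row["community_treated"]
                else "control community")
    means = df.assign(group=df.apply(group, axis=1)).groupby("group")["outcome"].mean()
    spillover = (means.get("untreated, treated community", float("nan"))
                 - means.get("control community", float("nan")))
    return means, spillover

# Usage (with a hypothetical survey dataset `survey_df`):
# means, spill = spillover_summary(survey_df)
# print(means); print(f"Naive spillover estimate: {spill:.2f}")
```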

What are the feasible IE approaches...? II. Quasi-experimental approaches. For example:
- Matching, if:
  - we can identify communities that are 'similar' to target communities
  - we can collect basic socio-economic data (not LSMS-type) pre-intervention
  - we can collect detailed LSMS-type post-intervention data
- Regression discontinuity, if:
  - participation is a deterministic but discontinuous function of some observable characteristic (such as a poverty index and land holding... RO1?)
  - we have enough households "just above" and "just below" the threshold level
  - we can collect detailed LSMS-type post-intervention data
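For the matching route, a common starting point (one option among several, not a design the slides commit to) is nearest-neighbour matching on a propensity score estimated from the pre-intervention socio-economic variables. The sketch below shows the basic mechanics; the covariate matrix and variable names are placeholders.

```python
# Minimal sketch: propensity-score matching of control to target households
# using pre-intervention covariates, then averaging matched differences.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def psm_att(X, treated, outcome):
    """Average treatment effect on the treated via 1-nearest-neighbour
    matching on the propensity score. X: covariate matrix (numpy array),
    treated: 0/1 numpy array, outcome: post-intervention outcome array."""
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    t, c = treated == 1, treated == 0
    nn = NearestNeighbors(n_neighbors=1).fit(ps[c].reshape(-1, 1))
    _, idx = nn.kneighbors(ps[t].reshape(-1, 1))
    matched_controls = outcome[c][idx.ravel()]   # matched control outcomes
    return np.mean(outcome[t] - matched_controls)

# Usage (hypothetical arrays):
# att = psm_att(X_pre, treated_flag, maize_yield_post)
# print(f"Matched ATT estimate: {att:.1f}")
```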

What are the feasible IE approaches...?
III. Farming systems modeling
- Researchers will lead the effort and IFPRI will provide support (no in-house expertise)
IV. Qualitative approach
- Need clarity on purpose
- An ongoing effort
V. A combination: which combination?
- RCT and farming systems modeling
- Farming systems modeling and qualitative analysis now (to inform AR activities and an RCT later)
- Other combinations

Summary
- Is there a need for empirical evidence on the causal impact of AR activities, by project?
- Assess and determine the feasible approach(es), given the scale and timing of activities
- Determine the kind of data to be collected (at baseline) and from whom it will be collected
- Ensure a system exists on the ground (personnel, equipment, etc.) before embarking on quality (LSMS-type) data collection (university students?)
- IFPRI will conduct quantitative data collection (once the above are fulfilled) and will support qualitative data collection (ongoing)
- Need to document what has been discussed and agreed