Operational Aspects of Impact Evaluation

Presentation transcript:

Operational Aspects of Impact Evaluation
Arianna Legovini, Lead Specialist, Africa Impact Evaluation Initiative

Implementing impact evaluation
- Shift of paradigm
- Institutional setting
- Key steps
- Team capacity & Bank support
- Evaluation questions
- Choosing methodology
- Data, analysis & dissemination
- Budget & financing
- Timeline

Paradigms
- Old: the program is a set of activities defined at time zero and designed to deliver expected results; the program will either deliver or not.
- New: the program is a menu of alternatives, with a strategy to find out which work best; activities may change over time, and the project will deliver more in the future than it does at the beginning.

Retrospective & prospective
- Old: retrospective. Look back and judge.
- New: prospective. Decide what we need to learn, experiment with alternatives, measure and inform, adopt the best alternatives.

Prospective evaluation can:
- Measure the effectiveness of the operation
- Measure the effectiveness of alternatives: modes of delivery, packages, pricing schemes
- Provide rigorous evidence to modify project features over time (managing for results)
- Inform future project designs

“Infrastructure of evaluation”
Not a one-shot study, but an institutional framework linking evaluation to the policy cycle, plus an analytical framework and data system for sequential learning.

Institutional framework
- PM/Presidency communicates to constituencies: needs evidence on the effects of the government program
- Treasury/Finance allocates the budget: needs evidence on the cost-effectiveness of different programs
- Line ministries deliver programs and negotiate budgets: need evidence on the cost-effectiveness of alternatives and the effects of sector programs

Institutional framework
- M&E staff
- IE specialists
- Local researchers
- Statisticians
- Data collectors
- Policy-makers
- Program managers

Key steps
1. Identify policy questions
2. Design the evaluation
3. Prepare data collection instruments
4. Collect baseline data
5. Implement the program in treatment areas
6. Collect follow-up data
7. Analyze and feed back
8. Improve the program and implement the new design

Build team capacity
- Provide training opportunities for the team and operational staff from the outset
- Keep training the team and implementing agencies over time
- Keep the team involved in the work: learning by doing is best
Bank support:
- Cross-country training, coordination, networking & knowledge sharing
- Cross-country technical assistance (measurement, instruments, analytical tools)
- Country-specific technical assistance (project team, research team and field coordinator)

Keep it useful!
- Answer the questions that are most important to improving the program
- Use the evaluation discussion to question your program
- Use the baseline to adjust targeting and learn more about the factors that explain the problems you face
- Time pilots to give you quick answers on behavioral change (e.g. take-up rates) and adjust the program in its early stages
- Allow enough exposure time to ensure you can measure real results
- Use evaluation results to improve the program
- Get better results

Operational questions
- Question the design choices of the operation
- Develop the causal chain
- Ask whether equally likely alternatives should be considered
- Think of which choices were made on hunches rather than solid evidence
- Identify program features that are being changed and question the underlying rationale

Dynamic learning / managing for results
- Use randomized trials to test alternatives
- Focus on short-term outcomes, e.g. take-up rates, use, adoption
- Collect and analyze follow-up data 6-12 months after exposure
- Measure the impact of alternative treatments on short-term outcomes and identify the “best” one (see the sketch after this list)
- Change the program to adopt the best alternative
- Start over
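A minimal sketch of this kind of comparison, assuming a follow-up dataset with illustrative file and column names ("arm", "take_up"): under random assignment, differences in mean take-up across arms estimate the impact of each alternative.

```python
# Minimal sketch: compare a short-term outcome (take-up) across randomly
# assigned treatment arms. File and column names are illustrative.
import pandas as pd
from scipy import stats

df = pd.read_csv("followup_survey.csv")  # columns: "arm", "take_up" (0/1)

# Mean take-up and sample size by arm: under random assignment, differences
# in these means are unbiased estimates of the impact of each alternative.
print(df.groupby("arm")["take_up"].agg(["mean", "count"]))

# Compare one alternative against the control arm (Welch t-test on the
# binary outcome approximates a two-proportion test in large samples).
control = df.loc[df["arm"] == "control", "take_up"]
radio = df.loc[df["arm"] == "radio", "take_up"]
t_stat, p_value = stats.ttest_ind(radio, control, equal_var=False)
print(f"radio vs control: diff={radio.mean() - control.mean():.3f}, p={p_value:.3f}")
```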

Policy questions
- Is the operation achieving its objectives?
- Does the operation have other desirable or undesirable effects?
- Is it cost-effective?

Fiscal accountability
- Use the most rigorous method of evaluation
- Focus on higher-level outcomes, e.g. educational achievement, health status, income
- Collect and analyze follow-up data at one- or two-year intervals (1-10 year horizon)
- Measure the impact of the operation on its stated objectives and a range of other outcomes
- Compare with results from other programs
- Allocate budget toward the most effective programs and away from the least effective ones

Example: malaria
Government target for under-five child mortality = 50.
Short-term operational question: will we achieve the target?
- Experiment with alternative communication campaigns (radio, community, door-to-door) to increase use of nets for children under five
- Measure net use for each type of communication campaign
- Evaluate within the rainy season (2-4 months)
- Adopt the best-performing communication campaigns for the program
Long-term policy question: will net use secure the expected gains in mortality?
- Compare mortality over time in phase 1 communities with phase 2 communities
- Longer horizon (for as long as a control group is available)

Choosing the methodology
Methodological issues:
- Identifying the control group
- Outcome indicators
- Available data and required data
- Sampling and sample sizes
Choose the most robust strategy that fits the operational context:
- Complement operations
- Do not deny benefits for evaluation purposes

Using opportunities to find control groups
- Limited roll-out capacity: use phased roll-out and give everyone an equal chance of being in phase I or phase II (phase I is treatment, phase II is control); see the sketch after this list
- Limited budget: establish clear eligibility criteria (regression discontinuity), or give the eligible population an equal chance to participate (random assignment)
- Universal eligibility: vary the treatment, or encourage participation
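A minimal sketch of randomized phase-in assignment, assuming a list of eligible communities; file and column names are illustrative.

```python
# Minimal sketch: randomly assign eligible communities to phase I (treated
# now) or phase II (control until later roll-out). File and column names
# are illustrative.
import numpy as np
import pandas as pd

communities = pd.read_csv("eligible_communities.csv")  # column: "community_id"

# Shuffle with a fixed seed so the assignment is reproducible, then split
# in half: the first half enters in phase I, the second half in phase II.
shuffled = communities.sample(frac=1, random_state=2024).reset_index(drop=True)
shuffled["phase"] = np.where(shuffled.index < len(shuffled) // 2, "I", "II")

# Save the assignment so implementation and monitoring follow it exactly.
shuffled.to_csv("phase_assignment.csv", index=False)
print(shuffled["phase"].value_counts())
```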

Collecting data
- Measure indicators at baseline in both control and treatment groups
- Conduct follow-up surveys with enough exposure time to detect changes
How will you collect the data?
- Is it available in administrative data? Can it be added?
- Can it be collected through existing surveys?
- Do we need special surveys?

Administrative data
Typical content: lists of beneficiaries, distribution of benefits, expenditures, outputs.
The impact evaluation uses this information to verify beneficiaries, benefits, and timing.
The data can be used for the IE analysis itself if it covers the control population and the outcome indicators (see the sketch below).
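A minimal sketch of that coverage check, assuming illustrative file and column names.

```python
# Minimal sketch: check whether administrative records cover the control
# group as well as the treatment group. File and column names are
# illustrative.
import pandas as pd

admin = pd.read_csv("admin_records.csv")         # columns: "household_id", "benefit_amount"
assignment = pd.read_csv("assignment_list.csv")  # columns: "household_id", "group"

merged = assignment.merge(admin, on="household_id", how="left")

# Share of each group that appears in the administrative records: if control
# households are missing, the data can verify delivery but cannot support the
# impact estimate on its own.
coverage = merged.groupby("group")["benefit_amount"].apply(lambda s: s.notna().mean())
print(coverage)
```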

Survey data
- Whom do you survey? Households? Communities? Firms? Facilities?
- Content: the information needed for the analysis, plus covariates
- Researchers, together with survey experts, design the questionnaires
- Harmonize basic questionnaires across evaluations and add evaluation-specific modules
- Develop a set of common outcome indicators to compare results across different evaluations

Sample size
Based on power calculations. The required sample is larger:
- the rarer the outcome
- the larger the standard deviation of the outcome indicator
- the smaller the effect size you want to detect
- the larger the number of subpopulations you want to analyze (gender, regions, income groups)
A sketch of a standard power calculation follows this list.
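A minimal sketch using statsmodels; the effect size, power, and significance level below are illustrative assumptions, not recommendations.

```python
# Minimal sketch of a power calculation for a two-group comparison.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Sample size per group needed to detect a standardized effect of 0.2 SD
# with 80% power at the 5% significance level; smaller effects or more
# subgroup analyses push this number up.
n_per_group = analysis.solve_power(effect_size=0.2, power=0.8, alpha=0.05)
print(f"Required sample per group: {n_per_group:.0f}")
```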

Manage trade-offs
Method:
- Experimental: least costly, best estimates, not always feasible
- Non-experimental: more expensive, good second-best estimates, always feasible
Data:
- Administrative data (expanded to the control group): most cost-effective
- Piggy-backing on ongoing data collection: cost-effective, but coordination-intensive
- Special surveys: tailored and timely, but expensive

Qualitative and quantitative are complementary, not substitutes
- Qualitative: gain insights, develop hypotheses, explain quantitative results
- Quantitative: measure, generalize, replicate
- Mixed qual/quant: the qualitative informs and enriches the quantitative; the quantitative tests the qualitative

Monitor the intervention
Evaluation requires intensive monitoring to:
- Ensure the intervention is rolled out as planned
- Ensure the design of the evaluation is respected
This is to avoid contamination of the evaluation results.
- Inform all people involved in the program
- Train implementers: they need to know why
- Hire a supervisor: avoid, or at least record, deviations from the plan
- Strengthen reporting systems

Analysis and dissemination
- Analyze the baseline and each follow-up round
- Disseminate and discuss on an ongoing basis
- Tailor products to different audiences:
  - Politicians: one-liners
  - Policy makers: one-pagers
  - Press: one-pagers
  - Program managers: non-technical summaries
  - Researchers: academic papers, data access

Budget & financing
Budget items:
- Data collection (the big budget item)
- Consultants
- Travel
- Staff time
Financing:
- Country: establish which funds are already available as part of project funds and MAP allocations; prepare and submit a funding proposal (an IE concept note with a detailed budget)
- Bank TTL: include the IE budget in project/MAP financing; initiate the IE product code and obtain BB supervision funds

Timeline (18-24 months)
- Preparation (3-6 months): design and institutional approval, funding and procurement, questionnaire development, training, pilot testing
- Baseline data collection: field work (1-2 months), data input (1-2 months)
- Exposure period
- Follow-up data collection
- Analysis and dissemination
- Program and policy feedback loop

Operational messages
- Plan the evaluation while planning the project
- Build an explicit team
- Influence the roll-out to obtain control groups
- Strengthen monitoring systems
- Develop a budget and financing plan
- Schedule the evaluation to match policy cycles
- Early results buy interest