1
Operational Aspects of Impact Evaluation
Arianna Legovini, Lead Specialist, Africa Impact Evaluation Initiative
2
Implementing impact evaluation
Shift of paradigm
Institutional setting
Key steps
Team capacity & Bank support
Evaluation questions
Choosing the methodology
Data, analysis & dissemination
Budget & financing
Timeline
3
Paradigms
Old: the program is a set of activities defined at time zero and designed to deliver expected results. The program will either deliver or it will not.
New: the program is a menu of alternatives, with a strategy to find out which work best. Activities may change over time, and the project will deliver more in the future than it does at the beginning.
4
Retrospective & prospective
Old: retrospective. Look back and judge.
New: prospective. Decide what we need to learn, experiment with alternatives, measure and inform, adopt the best alternatives.
5
Prospective evaluation can
Measure the effectiveness of the operation
Measure the effectiveness of alternatives: modes of delivery, packages, pricing schemes
Provide rigorous evidence to modify project features over time (managing for results)
Inform future project designs
6
“Infrastructure of evaluation”
Not a one-shot study, but…
An institutional framework linking evaluation to the policy cycle
+ an analytical framework and data system for sequential learning
7
Institutional framework
PM/Presidency: communicate to constituencies (needs evidence on the effects of the government program)
Treasury/Finance: allocate budget (needs evidence on the cost-effectiveness of different programs)
Line ministries: deliver programs and negotiate budget (need evidence on the cost-effectiveness of alternatives and the effects of sector programs)
8
Institutional framework
M&E staff
IE specialists
Local researchers
Statisticians
Data collectors
Policy-makers
Program managers
9
Key steps
1. Identify policy questions
2. Design the evaluation
3. Prepare data collection instruments
4. Collect the baseline
5. Implement the program in treatment areas
6. Collect the follow-up
7. Analyze & feed back
8. Improve the program and implement the new design
10
Build team capacity
Provide training opportunities for the team and operational staff from the outset
Keep training the team and implementing agencies over time
Keep the team involved in the work: learning by doing is best
Bank support:
Cross-country training, coordination, network & knowledge sharing
Cross-country technical assistance (measurement, instruments, analytical tools)
Country-specific technical assistance (project team, research team and field coordinator)
11
Keep it useful!
Answer the questions that are most important for improving the program
Use the evaluation discussion to question your program
Use the baseline to adjust targeting and to learn more about the factors behind the problems you face
Time pilots to give quick answers on behavioral change (e.g. take-up rates) so the program can be adjusted in its early stages
Allow enough exposure time to ensure you can measure real results
Use evaluation results to improve the program
Get better results
12
Operational questions
Question the design choices of the operation
Develop the causal chain
Ask whether equally plausible alternatives should be considered
Think about which choices were made on hunches rather than solid evidence
Identify program features that are being changed and question the underlying rationale
13
Dynamic learning / managing for results
Use randomized trials to test alternatives
Focus on short-term outcomes, e.g. take-up rates, use, adoption
Collect and analyze follow-up data 6-12 months after exposure
Measure the impact of the alternative treatments on short-term outcomes and identify the "best" one (see the sketch below)
Change the program to adopt the best alternative
Start over
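To make the step above concrete, here is a minimal Python sketch of testing alternatives with a randomized trial: units are randomly assigned to alternative program arms and a short-term outcome (take-up) is compared across arms at follow-up. The arm labels, community names, and take-up numbers are all invented for illustration; in a real evaluation the outcomes would come from the 6-12 month follow-up survey.

```python
import random
from collections import defaultdict

# Hypothetical example: balanced random assignment of communities to
# alternative program arms, then comparison of take-up rates by arm.
random.seed(0)
arms = ["arm_A", "arm_B", "arm_C"]                  # alternative program designs
communities = [f"community_{i:03d}" for i in range(300)]

# Balanced random assignment: shuffle, then deal communities out to arms.
random.shuffle(communities)
assignment = {c: arms[i % len(arms)] for i, c in enumerate(communities)}

# Placeholder follow-up data: 1 if the community adopted, else 0.
# These "true" rates are simulated purely to show the comparison step.
true_rates = {"arm_A": 0.35, "arm_B": 0.50, "arm_C": 0.45}
observed = {c: int(random.random() < true_rates[assignment[c]]) for c in communities}

takeup = defaultdict(list)
for c, arm in assignment.items():
    takeup[arm].append(observed[c])

rates = {arm: sum(v) / len(v) for arm, v in takeup.items()}
best = max(rates, key=rates.get)
print({a: round(r, 2) for a, r in rates.items()}, "-> adopt:", best)
```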
14
Policy questions
Is the operation achieving its objectives?
Does the operation have other desirable or undesirable effects?
Is it cost-effective?
15
Fiscal accountability
Use the most rigorous evaluation method available
Focus on higher-level outcomes, e.g. educational achievement, health status, income
Collect and analyze follow-up data at yearly or bi-yearly intervals (1-10 year horizon)
Measure the impact of the operation on its stated objectives and on a range of other outcomes
Compare with results from other programs
Allocate budget toward the more effective programs and away from the least effective ones
16
Example: Malaria
Government target for under-five child mortality = 50
ST operational question: will we achieve the target?
Experiment with alternative communication campaigns (radio, community, door-to-door) to increase the use of nets for children under five
Measure net use under each type of communication campaign
Evaluate within the rainy season (2-4 months)
Adopt the best-performing campaign for the program
LT policy question: will net use secure the expected gains in mortality?
Compare mortality over time in phase 1 communities with phase 2 communities (a stylized sketch of this comparison follows below)
Longer horizon (for as long as a control group is available)
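A stylized sketch of the longer-term phase comparison, assuming under-five mortality is measured per 1,000 live births in phase 1 (already treated) and phase 2 (not yet treated) communities. Every number below is invented purely to illustrate the calculation; this is not data from the program.

```python
from statistics import mean

# Hypothetical under-five deaths per 1,000 live births at follow-up.
phase1_mortality = [48, 52, 45, 50, 47, 49, 44, 51]   # phase 1: already treated
phase2_mortality = [60, 58, 63, 55, 61, 59, 62, 57]   # phase 2: not yet treated (control)

target = 50  # government target for under-five child mortality

effect = mean(phase1_mortality) - mean(phase2_mortality)
print(f"Phase 1 mean: {mean(phase1_mortality):.1f}")
print(f"Phase 2 mean: {mean(phase2_mortality):.1f}")
print(f"Estimated program effect: {effect:.1f} deaths per 1,000")
print(f"Target met in phase 1 communities? {mean(phase1_mortality) <= target}")
```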
17
Choosing the Methodology
Methodological issues:
Identify the control group
Outcome indicators
Available data and required data
Sampling and sample sizes
Choose the most robust strategy that fits the operational context
Complement operations; do not deny benefits for evaluation purposes
18
Using opportunities to find Control Groups
Limited roll-out capacity: use phase-in and give people an equal chance of being in phase I or phase II (phase I = treatment, phase II = control)
Limited budget: establish clear eligibility criteria (regression discontinuity; a sketch follows below), or give the eligible population an equal chance to participate (random assignment)
Universal eligibility: vary the treatment, or encourage participation
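As an illustration of the regression-discontinuity option listed above, the following conceptual sketch builds a clear eligibility rule (a score cutoff) and compares units just below and just above the cutoff. The score, cutoff, bandwidth, and outcomes are all hypothetical; this is not the presenter's code, only a sketch of the idea.

```python
import random
from statistics import mean

random.seed(1)
cutoff = 40.0      # eligibility cutoff on a hypothetical poverty score (eligible if score < cutoff)
bandwidth = 5.0    # compare only units close to the cutoff

households = []
for i in range(2000):
    score = random.uniform(0, 100)
    eligible = score < cutoff
    # Illustrative outcome: eligible households receive a simulated program boost.
    outcome = 100 - 0.5 * score + (8.0 if eligible else 0.0) + random.gauss(0, 5)
    households.append((score, eligible, outcome))

just_below = [y for s, e, y in households if cutoff - bandwidth <= s < cutoff]
just_above = [y for s, e, y in households if cutoff <= s < cutoff + bandwidth]

# The difference in mean outcomes at the cutoff approximates the program
# effect for households near the eligibility threshold.
print(f"Mean outcome just below cutoff (treated): {mean(just_below):.1f}")
print(f"Mean outcome just above cutoff (control): {mean(just_above):.1f}")
print(f"Estimated effect at the cutoff: {mean(just_below) - mean(just_above):.1f}")
```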
19
Collecting Data
Measure indicators at baseline in both control and treatment groups
Run follow-up surveys with enough exposure time to detect changes (a difference-in-differences sketch follows below)
How will you collect the data?
Is it available in administrative data? Can it be added?
Can it be collected through existing surveys?
Do we need special surveys?
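The slides call for baseline and follow-up measurement in both groups but do not prescribe an estimator; a difference-in-differences comparison is one common way to use such data. The sketch below uses made-up outcome values purely to show the calculation.

```python
from statistics import mean

# Hypothetical outcome measurements (e.g. an outcome indicator per community).
baseline = {
    "treatment": [10.1, 9.8, 10.4, 10.0, 9.9],
    "control":   [10.0, 10.2, 9.7, 10.1, 9.8],
}
followup = {
    "treatment": [12.6, 12.1, 12.9, 12.4, 12.3],
    "control":   [10.9, 11.1, 10.6, 11.0, 10.8],
}

change_treatment = mean(followup["treatment"]) - mean(baseline["treatment"])
change_control = mean(followup["control"]) - mean(baseline["control"])
did = change_treatment - change_control  # difference-in-differences estimate

print(f"Change in treatment group: {change_treatment:.2f}")
print(f"Change in control group:   {change_control:.2f}")
print(f"Difference-in-differences: {did:.2f}")
```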
20
Administrative data
Typical content: lists of beneficiaries, distribution of benefits, expenditures, outputs
Used by the impact evaluation to verify beneficiaries, benefits, and timing
Can be used for IE analysis if it covers the control population and the outcome indicators
21
Survey data
Who do you survey? Households? Communities? Firms? Facilities?
Content: information needed for the analysis, plus covariates
Researchers design the questionnaires together with survey experts
Harmonize basic questionnaires across evaluations and add evaluation-specific modules
Develop a set of common outcome indicators to compare results across evaluations
22
Sample
Based on power calculations. The required sample size is larger:
the rarer the outcome
the larger the standard deviation of the outcome indicator
the smaller the effect size you want to detect
the larger the number of subpopulations you want to analyze (gender, regions, income groups)
(A worked power calculation follows below.)
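A worked example of the power calculation behind these rules of thumb, using the standard two-sample formula for detecting a difference in means (an assumption; the slides do not specify a formula). The significance level, power, effect size, and standard deviation are illustrative choices.

```python
from math import ceil
from statistics import NormalDist

alpha = 0.05   # two-sided significance level
power = 0.80   # desired power
sigma = 1.0    # standard deviation of the outcome indicator
delta = 0.25   # minimum effect size (difference in means) to detect

z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
z_beta = NormalDist().inv_cdf(power)

# n per arm = 2 * (z_{1-alpha/2} + z_{1-beta})^2 * sigma^2 / delta^2
n_per_arm = ceil(2 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2)
print(f"Required sample size per arm: {n_per_arm}")
# A smaller delta, a larger sigma, or more subgroups to analyze all push this number up.
```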
23
Manage trade-offs
Method:
Experimental: least costly; best estimates; not always feasible
Non-experimental: more expensive; good second-best estimates; always feasible
Data:
Administrative data (expanded to the control group): most cost-effective
Piggy-backing on ongoing data collection: cost-effective, but coordination-intensive
Collecting special surveys: tailored and timely, but expensive
24
Qualitative and quantitative are complementary, not substitutes
Qualitative: gain insights, develop hypotheses, explain quantitative findings
Quantitative: measure, generalize, replicate
Mixed qual/quant: the qualitative informs & enriches the quantitative; the quantitative tests the qualitative
25
Monitor the intervention
Evaluation requires intensive monitoring to:
Ensure the intervention is rolled out as planned
Ensure the design of the evaluation is respected
This avoids contamination of the evaluation results
Everyone involved in the program should be informed
Train implementers: they need to know why
Hire a supervisor: avoid, or at least record, deviations from the plan
Strengthen reporting systems
26
Analysis and Dissemination
Analyze the baseline and each follow-up
Disseminate and discuss on an ongoing basis
Tailor products to different audiences:
Politicians: one-liners
Policy makers: one-pagers
Press: one-pagers
Program managers: non-technical summaries
Researchers: academic papers, data access
27
Budget & financing
Budget items: data collection (the big budget item), consultants, travel, staff time
Financing:
Country: establish which funds are already available as part of project funds and MAP allocations; prepare and submit a funding proposal (IE concept note with detailed budget)
Bank TTL: include the IE budget in project/MAP financing; initiate the IE product code and obtain BB supervision funds
28
Timeline (18-24 months)
Preparation (3-6 months): design and institutional approval, funding and procurement, questionnaire development, training, pilot testing
Baseline data collection: field work (1-2 months), data input (1-2 months)
Exposure period
Follow-up data collection
Analysis and dissemination
Program and policy feedback loop
29
Operational messages
Plan the evaluation while planning the project
Build an explicit team
Influence roll-out to obtain control groups
Strengthen monitoring systems
Develop a budget and financing plan
Schedule the evaluation to match policy cycles
Early results buy interest