Slide 1: Use of Impact Evaluation for Operational Research
PREM Week 2007
Arianna Legovini, Africa Impact Evaluation Initiative (AFTRL)
Slide 2: Paradigms
Old: a project is a set of activities defined at time zero, designed to deliver expected results. Short of major upsets, we try to stick to the script; the project will either deliver or not.
New: a project is a menu of options, some better than others, with a strategy to find out which are best. Activities may change over time, and the project will deliver more by the end than it did at the beginning.
Slide 3: Evaluation
Old: retrospective. Look back and judge.
New: prospective. Decide what we need to learn, experiment with alternatives, measure and inform, and adopt the best alternative.
Slide 4: Impact evaluation (IE) as operational research (OR) can:
- Measure the effectiveness of alternatives (modes of delivery, packages, pricing schemes)
- Provide rigorous evidence to modify project features over time (managing by results)
- Inform future project designs
Slide 5: Prospective evaluation: key steps
1. Identify policy questions
2. Design the evaluation
3. Prepare data collection instruments
4. Collect baseline data
5. Implement the program in treatment areas
6. Collect follow-up data
7. Analyze and feed back
8. Improve the program and implement the new design
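Concretely, steps 4 through 8 form a loop. Below is a minimal Python sketch of that loop; every function name here is a hypothetical stub standing in for the real field activity, so the cycle runs end to end without doing any real work.

```python
# Minimal sketch of steps 4-8 as a learning loop. All functions are
# hypothetical stubs standing in for the real field activities.
def collect_survey(areas, wave):
    return {"wave": wave, "areas": areas}        # stand-in for survey data

def implement(program, areas):
    print(f"implementing '{program}' in {len(areas)} treatment areas")

def analyze(baseline, follow_up):
    return "impact estimates"                    # stand-in for the analysis

def improve(program, findings):
    return program + " (revised)"                # adopt the best alternative

def evaluation_round(program, treatment_areas):
    baseline = collect_survey(treatment_areas, "baseline")      # step 4
    implement(program, treatment_areas)                         # step 5
    follow_up = collect_survey(treatment_areas, "follow-up")    # step 6
    findings = analyze(baseline, follow_up)                     # step 7
    return improve(program, findings)                           # step 8

new_program = evaluation_round("connection subsidy pilot", ["area_1", "area_2"])
```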
Slide 6: The "infrastructure of evaluation"
Not a one-shot study: an institutional framework linking data and analysis to the policy cycle, plus an analytical framework and data system for a sequence of learning.
Slide 7: Institutional framework
[Diagram: the evaluation team (M&E staff, IE specialists, local researchers, statisticians, data collectors) works with policy-makers and program managers in a feedback loop running from program to data, data to analysis, analysis to feedback, and feedback to program change.]
Slide 8: Start building team capacity
- Provide training opportunities for the team and operational staff
- Discuss and question policies, development hypotheses, and causal linkages; investigate alternatives
- Develop the evaluation questions
- Develop the evaluation design
- Allow some time for internal discussion and agreement
Slide 9: Operational questions
- Question the design choices of the operation
- Ask whether equally likely alternatives should be considered
- Think about which choices were made on hunches rather than solid evidence
- Identify program features that are being changed and question the underlying rationale
Slide 10: Decision tree
[Diagram: a decision tree branching over time, e.g., a 20% vs. 40% subsidy, and a compact fluorescent lamp (CFL) priced at $1 vs. $1.50]
Slide 11: Sequential learning
- Use randomized trials to test alternative delivery mechanisms or packages (see the assignment sketch after this list)
- Focus on short-term outcomes
- Develop the causal chain
- Identify outcomes that change in the short term (e.g., take-up rates, use, adoption) and that are likely to lead to higher-order outcomes
- Time follow-up data collection 6, 12, and 18 months after exposure
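As a concrete illustration of the randomized-trial step, here is a minimal Python sketch that assigns study units to treatment arms; the unit names, arm labels, and sample size are all hypothetical.

```python
# Minimal sketch: randomly assign units (e.g., towns) to treatment arms.
# Unit names, arm labels, and sample size are hypothetical.
import random

def assign_arms(unit_ids, arms, seed=42):
    """Shuffle the units, then deal them out evenly across the arms."""
    rng = random.Random(seed)      # fixed seed makes the assignment reproducible
    shuffled = list(unit_ids)
    rng.shuffle(shuffled)
    return {unit: arms[i % len(arms)] for i, unit in enumerate(shuffled)}

units = [f"town_{i:03d}" for i in range(120)]
arms = ["control", "subsidy_low", "subsidy_medium", "subsidy_high"]
assignment = assign_arms(units, arms)
print(assignment["town_007"])      # the arm this town was assigned to
```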
Slide 12
- Measure impact on short-term outcomes in the alternative treatments (see the comparison sketch after this list)
- Identify the best alternative
- Change the program to adopt it
- Start with a new set of operational questions and trials
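A minimal sketch of the comparison step, assuming a binary short-term outcome such as take-up: estimate each arm's impact as the difference in mean outcomes versus the control arm, then flag the best performer. All outcome data below are invented placeholders.

```python
# Minimal sketch: difference-in-means impact per arm, relative to control.
# The outcome values (1 = took up the program, 0 = did not) are invented.
from statistics import mean

outcomes = {
    "control":        [0, 0, 1, 0, 0, 1, 0, 0],
    "subsidy_low":    [0, 1, 1, 0, 1, 0, 1, 0],
    "subsidy_medium": [1, 1, 1, 0, 1, 1, 0, 1],
    "subsidy_high":   [1, 1, 1, 1, 1, 1, 0, 1],
}

control_mean = mean(outcomes["control"])
impacts = {arm: mean(vals) - control_mean
           for arm, vals in outcomes.items() if arm != "control"}

best_arm = max(impacts, key=impacts.get)
print(impacts)              # estimated impact of each alternative
print("adopt:", best_arm)   # the alternative fed back into the program
```

In practice the comparison would come with standard errors and a formal test before any arm is adopted; the point here is only the shape of the feedback step.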
Slide 13: Example: Ethiopia
- The PRSP set a target for household electricity coverage (50%)
- The electric company set out a ten-year strategy to connect rural towns, with no subsidy for the last-mile connection. Will they achieve the target?
- Experiment with alternative subsidy values (high, medium, low) to lower connection barriers
- Measure the connection rate at each subsidy level, including the level needed to achieve 50% household coverage
- Results in 6-12 months
- Adopt a subsidy policy for the program consistent with the target, OR change the target (see the decision-rule sketch after this list)
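The Ethiopia example boils down to a simple decision rule, sketched below with invented numbers: adopt the cheapest tested subsidy whose measured connection rate meets the 50% coverage target, and if none does, revisit the subsidy menu or the target itself.

```python
# Minimal sketch of the decision rule: adopt the cheapest subsidy that meets
# the coverage target. The measured connection rates are invented numbers.
TARGET = 0.50

connection_rate = {0.20: 0.31, 0.40: 0.48, 0.60: 0.62}   # subsidy share -> rate

sufficient = [s for s, rate in sorted(connection_rate.items()) if rate >= TARGET]
if sufficient:
    print(f"adopt a {sufficient[0]:.0%} subsidy (meets the {TARGET:.0%} coverage target)")
else:
    print("no tested subsidy meets the target: change the subsidy menu or the target")
```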