Use of impact evaluation for operational research PREM Week 2007 Arianna Legovini, Africa Impact Evaluation Initiative (AFTRL)


Paradigms  Old: a project is a set of activities defined at time zero, designed to deliver expected results; short of major upsets, we try to stick to the script  The project will either deliver or not  New: a project is a menu of options, some better than others, with a strategy to find out which are best  Activities may change over time; the project will deliver more by the end than it did at the beginning

Evaluation  Old: retrospective  Look back and judge  New: prospective  Decide what we need to learn  Experiment with alternatives  Measure and inform  Adopt the best alternative

Impact evaluation (IE) as operational research (OR) can  Measure the effectiveness of alternatives (modes of delivery, packages, pricing schemes)  Provide rigorous evidence to modify project features over time (managing by results)  Inform future project designs

Prospective evaluation: Key steps 1. Identify policy questions 2. Design the evaluation 3. Prepare data collection instruments 4. Collect baseline data 5. Implement the program in treatment areas 6. Collect follow-up data 7. Analyze and feed back 8. Improve the program and implement the new design
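A minimal sketch of the assignment step behind this workflow (step 5 requires designated treatment areas): randomly dealing candidate sites into study arms. The site names, arm labels, and fixed seed are all illustrative assumptions, not part of the original design.

```python
import random

def assign_arms(sites, arms=("control", "low_subsidy", "high_subsidy"), seed=42):
    """Randomly assign each site to one study arm, balancing arm sizes."""
    rng = random.Random(seed)   # fixed seed so the assignment is reproducible
    shuffled = sites[:]         # copy so the caller's list is left untouched
    rng.shuffle(shuffled)
    # Deal sites round-robin into arms so group sizes differ by at most one.
    return {arm: shuffled[i::len(arms)] for i, arm in enumerate(arms)}

# Hypothetical example: nine candidate rural towns.
sites = [f"town_{k}" for k in range(9)]
for arm, members in assign_arms(sites).items():
    print(arm, members)
```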

“Infrastructure of evaluation”  Not a one-shot study  An institutional framework linking data and analysis to the policy cycle, plus  An analytical framework and data system for a sequence of learning

Institutional framework [Diagram: the evaluation team (M&E staff, IE specialists, local researchers, statisticians, data collectors) works with policy-makers and program managers in a cycle of data, analysis, feedback, and program change.]

Start building team capacity  Provide training opportunities for the team and operational staff  Discuss and question policies, developmental hypotheses and causal linkages; investigate alternatives  Develop evaluation questions  Develop the evaluation design  Allow some time for internal discussion and agreement

Operational questions  Question the design choices of the operation  Ask whether equally likely alternatives should be considered  Think about which choices were made on hunches rather than solid evidence  Identify program features that are being changed and question the underlying rationale

Decision tree [Diagram: a decision tree unfolding over time, with branches for subsidy levels (20% vs. 40%) and CFL prices ($1 vs. $1.5).]

Sequential learning  Use randomized trials to test alternative delivery mechanisms or packages  Focus on short-term outcomes  Develop the causal chain  Identify outcomes that change in the short term (e.g. take-up rates, use, adoption) and are likely to lead to higher-order outcomes  Time follow-up data collection months after exposure

 Measure impact on short-term outcomes under the alternative treatments  Identify the best alternative  Change the program to adopt it  Start over with a new set of operational questions and trials
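A minimal sketch of the comparison behind "identify the best alternative": difference in mean take-up across two arms of a trial, with a rough standard error per arm. The outcome values and arm names are hypothetical, chosen only to illustrate the calculation.

```python
import statistics

def arm_summary(outcomes):
    """Mean take-up and its standard error for one treatment arm."""
    n = len(outcomes)
    mean = statistics.fmean(outcomes)
    se = statistics.stdev(outcomes) / n ** 0.5
    return mean, se

# Hypothetical take-up rates (share of households) observed per site, by arm.
arms = {
    "subsidy_20": [0.31, 0.28, 0.35, 0.30, 0.33],
    "subsidy_40": [0.52, 0.47, 0.55, 0.50, 0.49],
}
for name, data in arms.items():
    mean, se = arm_summary(data)
    print(f"{name}: take-up {mean:.2f} (se {se:.2f})")

# Estimated impact of the higher subsidy relative to the lower (difference in means).
diff = arm_summary(arms["subsidy_40"])[0] - arm_summary(arms["subsidy_20"])[0]
print(f"estimated impact of 40% vs 20% subsidy: {diff:+.2f}")
```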

Example  Ethiopia's PRSP set a target for household electricity coverage (50%)  The electric company set out a ten-year strategy to connect rural towns  There is no subsidy for the last-mile connection  Will they achieve the target?  Experiment with alternative subsidy levels (high, medium, low) to lower connection barriers  Measure the connection rate at each subsidy level, including the level needed to achieve 50% household coverage  Results in 6-12 months  Adopt a subsidy policy for the program consistent with the target, OR change the target
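A minimal sketch of the decision rule at the end of that experiment: given measured connection rates per subsidy arm, adopt the cheapest subsidy that reaches the 50% coverage target, or flag that the target itself needs revisiting. All rates and per-household costs below are hypothetical.

```python
TARGET = 0.50  # PRSP household coverage target

# Hypothetical connection rates estimated from the follow-up survey, by arm.
connection_rate = {"low": 0.32, "medium": 0.48, "high": 0.61}
subsidy_cost = {"low": 10, "medium": 25, "high": 45}  # illustrative $ per household

# Cheapest subsidy whose measured connection rate meets the target.
feasible = [arm for arm, rate in connection_rate.items() if rate >= TARGET]
if feasible:
    best = min(feasible, key=subsidy_cost.get)
    print(f"adopt '{best}' subsidy: rate {connection_rate[best]:.0%} "
          f"at ${subsidy_cost[best]}/household")
else:
    print("no tested subsidy reaches the target: revisit the target or the design")
```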