Impact Evaluation for Real Time Decision Making
Arianna Legovini
Head, Development Impact Evaluation Initiative (DIME), World Bank

FIRST: What is a results chain?
- Inputs: financial and human resources
- Activities: actions to convert inputs into outputs
- Outputs: tangible products, information campaigns, trainings, studies
- Intermediate outcomes: use of outputs by the intended population (take-up, use)
- Final outcomes: the objective of the program (growth, social cohesion, employment status)
Implementation (SUPPLY SIDE) runs from inputs to outputs; results (DEMAND RESPONSE) run from outputs to final outcomes.

Example of a results chain
- Inputs: financial and human resources to build local institutions
- Activities: develop communication; set up grant mechanisms; provide technical assistance to communities
- Outputs: communication package; grant offerings; technical assistance visits
- Intermediate outcomes: grants awarded and disbursed; projects implemented; community rallies
- Final outcomes: reduced conflict; improved inclusion of ex-combatants; expanded employment
Monitor whether the program is implemented as planned; evaluate whether the program is effective.
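To make the supply/demand split concrete, here is a minimal sketch that encodes this example chain as plain data; the structure and names are illustrative, not from any World Bank tool.

```python
# A minimal sketch: the example results chain as plain data.
# Structure and names are illustrative, not from any World Bank tool.
results_chain = {
    "inputs": ["financial and human resources to build local institutions"],
    "activities": ["develop communication", "set up grant mechanisms",
                   "provide technical assistance to communities"],
    "outputs": ["communication package", "grant offerings",
                "technical assistance visits"],
    "intermediate_outcomes": ["grants awarded and disbursed",
                              "projects implemented", "community rallies"],
    "final_outcomes": ["reduced conflict", "improved inclusion of ex-combatants",
                       "expanded employment"],
}

# Monitoring covers the supply side; impact evaluation covers the
# demand response (see the next slides).
monitored = ["inputs", "activities", "outputs"]
evaluated = ["intermediate_outcomes", "final_outcomes"]
```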

The results chain is a developmental hypothesis that helps you define:
- what you are doing and for what purpose
- what needs to be monitored
- what needs to be evaluated

What is monitoring?
- Monitoring tracks indicators over time (in the treatment group)
- It is descriptive before-after analysis
- It tells us whether things are moving in the right direction
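A minimal sketch of the computation monitoring performs: a descriptive before-after comparison within the treatment group only. The data and column names are made up for illustration.

```python
import pandas as pd

# Outcome indicator for treated communities, before and after roll-out
# (illustrative values).
df = pd.DataFrame({
    "period":  ["before", "before", "after", "after"],
    "outcome": [10.0, 12.0, 15.0, 17.0],
})

# Monitoring's before-after change: descriptive, not causal.
change = (df.loc[df["period"] == "after", "outcome"].mean()
          - df.loc[df["period"] == "before", "outcome"].mean())
print(f"Before-after change: {change:+.1f}")  # tells us direction only
```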

What is impact evaluation?
- Impact evaluation tracks outcomes over time in the treatment group relative to a control group
- It measures the effect of an intervention on outcomes relative to a counterfactual: what would have happened without it?
- It identifies the causal output-outcome link, separately from the effect of other time-varying factors
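To contrast with monitoring, here is a minimal sketch of the comparison impact evaluation makes, using a simple difference-in-differences as the example method; the data are made up, and the control group's change stands in for the counterfactual.

```python
import pandas as pd

# Group means before and after the intervention (illustrative values).
df = pd.DataFrame({
    "group":   ["treatment", "treatment", "control", "control"],
    "period":  ["before", "after", "before", "after"],
    "outcome": [10.0, 17.0, 10.0, 13.0],
})

means = df.pivot_table(index="group", columns="period", values="outcome")
treated_change = means.loc["treatment", "after"] - means.loc["treatment", "before"]
control_change = means.loc["control", "after"] - means.loc["control", "before"]

# The control group's change proxies what would have happened anyway;
# netting it out isolates the effect from other time-varying factors.
impact = treated_change - control_change
print(f"Before-after change in treatment: {treated_change:+.1f}")  # +7.0
print(f"Estimated impact (diff-in-diff):  {impact:+.1f}")          # +4.0
```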

Use monitoring to track implementation efficiency (input to output); use impact evaluation to measure effectiveness (output to outcome).
[Diagram: INPUTS → OUTPUTS → OUTCOMES. "Monitor efficiency" spans the supply side, from inputs ($) to outputs; "evaluate effectiveness" spans the demand response, from outputs to outcomes.]

Pick the right method to answer your questions

Descriptive analysis (monitoring and process evaluation):
- Is the program being implemented efficiently?
- Is the program targeting the right population?
- Are outcomes moving in the right direction?

Causal analysis (impact evaluation):
- What was the effect of the program on outcomes?
- How do alternative implementation modalities compare?
- Is the program cost-effective?

Discuss among yourselves (5 min): What happens if you use monitoring to evaluate impact?

Discuss among yourselves (5 min): What happens if you use monitoring to evaluate impact? You get the wrong answer… 100% of the time.
[Chart: the outcome rises from B at t0 (before) to A at t1 (after the intervention). Monitoring reports the full before-after change, A - B. The counterfactual C is where the outcome would have been at t1 without the intervention, so the impact is only A - C.]
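A made-up numeric version of the chart: monitoring reports the full before-after change, but part of that change would have happened without the intervention.

```python
# Illustrative numbers matching the chart's labels (B, A, C).
B = 50.0  # outcome before the intervention, at t0
A = 70.0  # outcome after the intervention, at t1
C = 65.0  # counterfactual: the outcome at t1 without the intervention

before_after_change = A - B  # 20.0 -- what monitoring attributes to the program
impact = A - C               # 5.0  -- the causal effect

# Monitoring overstates the effect here by C - B = 15.0,
# the change that would have occurred anyway.
print(f"Monitoring says: {before_after_change}; impact evaluation says: {impact}")
```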

NEXT: Do we know ex ante…

On Community Driven Development, what
- information will get communities to respond?
- facilitation will result in high-quality proposals?
- rules will increase inclusion in the decision-making process?
- monitoring mechanisms and co-payments will improve local projects and their use of funds?

On Disarmament, Demobilization and Reintegration,
- are community-based or targeted approaches most effective?
- should we try to delink combatants from units or build on unit cohesion?
- is including or excluding command structures most effective?

We turn to our best judgment for guidance and pick an information campaign, a package of services, a structure of incentives. But is there any other campaign, package, or incentive structure that would do better?

The decision process is complex: a few big decisions are taken during design, but many more are taken during roll-out and implementation.

Pick up the ball: What is a results tree?
- A results tree is a representation of the set of results chains that are considered viable during program design or program restructuring.
- It is a set of competing policy and operational alternatives for reaching a specific objective.

A results tree for reintegration:

Reintegration
- Public communication campaign to reintegrate ex-combatants
  - Use combatant unit cohesion structures
    - Include command structures
    - Exclude command structures
  - Delink combatants and dismantle unit structure
    - Supply-based training
    - Demand-based training
- Community-based campaign
  - Use combatant unit cohesion structures
    - Include command structures
    - Exclude command structures
  - Delink combatants and dismantle unit structure
    - Supply-based training
    - Demand-based training

- Establish which decisions will be taken upfront and which will be tested during roll-out
- Experimentally test critical nodes: measure the impact of one option relative to another, or to no intervention (a minimal randomization sketch follows)
- Pick the better options and discard the worse during implementation
- You cannot learn everything at once: select carefully what you want to test, involving all relevant partners
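A minimal sketch of experimentally testing one node of the results tree: communities are randomly split between two competing campaign designs before roll-out. Names and counts are illustrative.

```python
import random

random.seed(2009)  # fix the seed so the assignment is reproducible

# The units over which this node will be tested (illustrative).
communities = [f"community_{i:03d}" for i in range(200)]

# The two competing options at this node of the results tree.
arms = ["public_communication_campaign", "community_based_campaign"]

# Randomize: shuffle, then split in half for a balanced assignment.
random.shuffle(communities)
half = len(communities) // 2
assignment = {c: arms[0] if i < half else arms[1]
              for i, c in enumerate(communities)}

# Roll out each arm as assigned, then compare reintegration outcomes
# across arms and keep the better-performing option for the next phase.
```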

[Diagram: the same reintegration results tree as above, annotated with the objective at each stage: maximize take-up, maximize use of services, and maximize benefits from services.]

How IE can support you

Discuss among yourselves (5 min):
- How often do you make changes to your program on the feeling that something is not working right?
- How useful would it be to know for sure which ways are best?

Why evaluate?
- Improve the quality of programs
  - Test alternatives and inform design in real time
  - Increase program effectiveness
  - Answer the "so what" questions
- Build government institutions for evidence-based policy-making
  - Plan for implementation of options, not solutions
  - Find out which alternatives work best
  - Adopt better ways of doing business and taking decisions

The market for evidence
- PM/Presidency (campaign promises): communicates to constituencies; needs evidence on the effects of government programs, for accountability
- Treasury/Finance (budget): allocates the budget; needs evidence on the cost-effectiveness of different programs
- Line ministries (service delivery): deliver programs and negotiate the budget; need evidence on the cost-effectiveness of alternatives and the effects of sector programs

Shifting the program paradigm

From:
- A program is a set of activities designed to deliver expected results
- The program will either deliver or not

To:
- A program is a menu of alternatives with a learning strategy to find out which work best
- Programs change over time to deliver more results

From retrospective, external, independent evaluation:
- Top down
- Determines whether the program worked or not

To prospective, internal, operationally driven (and externally validated) impact evaluation:
- Sets the program learning agenda bottom up
- Considers plausible implementation alternatives
- Tests scientifically and adopts the best option
- Gives just-in-time advice to improve the effectiveness of the program over time

Retrospective (designed and evaluated ex post) vs. prospective (designed ex ante, evaluated ex post)

Retrospective impact evaluation:
- Collecting data after the event, you don't know how participants and nonparticipants compared before the program started
- You have to try to disentangle, after the event, why the project was implemented where and when it was

Prospective impact evaluation:
- Design the evaluation to answer the question you need to answer
- Collect the data you will need later
- Ensure analytical validity

The evaluative process provides useful (actionable) information at each step of the impact evaluation.

Discuss among yourselves (5 min):
- What are some of the entry points in policy-making cycles?
- What are some of the answers you would like to have?

Ethical considerations
- It is not ethical to withhold a benefit that is available and that we know works (e.g., HIV medicine proven to prolong life)
- It is ethical to test an intervention before scale-up if we don't know whether it works or whether it has unforeseen consequences (e.g., food aid may destroy local markets or create perverse incentives)
- Most of the time we use opportunities created by roll-out and budget constraints to evaluate, so as to minimize ethical concerns
- We can always, and should, test alternatives to maximize our results

Questions? Comments?