Operational Issues – Lessons Learnt
So you want to do an Impact Evaluation…
Luis ANDRES, Lead Economist, Sustainable Development Department, South Asia Region

Road map for this session…
- Let's recap… four key elements for an IE
- Some of the usual concerns and how we can handle them
- Practical considerations when implementing IEs
- Final remarks
Bottom line: We can do it!

Let's recap… four key elements for an IE
- Clear understanding of the intervention
- Well-defined outcomes (impacts)
- Credible identification strategy (definition of the counterfactuals) -> methodology
- Credible data
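The counterfactual point can be made concrete with a small sketch. The snippet below is not from the original slides; all names and numbers are invented for illustration. It simulates a randomized (lottery) assignment, where the control group serves as a credible counterfactual, so the impact can be estimated as a simple difference in mean outcomes between treated and control units.

```python
# Illustrative only: randomized assignment as an identification strategy.
# With random (lottery) assignment, the control group is a credible
# counterfactual, so a simple difference in means estimates the average impact.
import numpy as np

rng = np.random.default_rng(0)
n = 2000                                  # hypothetical evaluation sample
baseline = rng.normal(50, 10, n)          # outcome absent the intervention
treated = rng.random(n) < 0.5             # lottery assignment to treatment
true_effect = 5.0                         # unknown in a real evaluation
outcome = baseline + true_effect * treated + rng.normal(0, 5, n)

impact = outcome[treated].mean() - outcome[~treated].mean()
print(f"Estimated impact: {impact:.2f} (true effect: {true_effect})")
```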

WHAT HAVE WE LEARNT ABOUT HOW TO MEASURE RESULTS? (HAVE WE?)

Some of the usual "excuses": Ethical Concerns
Concern: We cannot run "experiments" on development issues
- We don't know everything; experimenting is part of the way we learn what is working (and what is not)
Concern: We cannot leave people aside for the sake of the IE
- We cannot intervene with everybody anyway… we have to make choices [Example: rural water in Paraguay… "only" 200 communities will receive the intervention (out of 5,000 communities that need water!)]
- The evaluation may offer a fair way of assigning the intervention
Lessons learned:
- Work with counterparts from the beginning to identify and address their concerns
- Be clear and explain everything
- Offer the evaluation as a solution rather than an extra layer of complications

Some of the usual "excuses": Political Concerns
Concern: There is no interest in showing (potential) bad news
- It is worse to do something bad and hide it
- Experimenting is part of the learning process of identifying what is working (and what isn't)
- The evaluation may be designed as a tool to find "areas of improvement"
- The evaluation can be designed to pilot different options [Example: Nicaragua]
Concern: The long duration of evaluations does not fit political timelines
- Evaluations can be designed to show results within time constraints, but there are limitations
- A good design can go beyond a political cycle [Example: Progresa in Mexico]
Lessons learned:
- Understand the political concerns in order to design accordingly
- Work in phases
- Show results soon (even with some limitations in the analysis); it keeps politicians interested and engaged with the IE

Some of the usual "excuses": Technical Concerns
Concern: We already "know" what works… there is no need for evaluation
- Arrogance… basic questions are not answered yet [Example: water and health outcomes… only a few evaluations attempt to establish the causal relationship between them]
Concern: The project is already complicated and we don't want to add more complexity
- Same as before… it may be complicated, but if we are not evaluating, we are not learning
- Complex programs can be decomposed into simpler activities
- Not all activities have to be evaluated
Concern: The concept of the project is already agreed upon and this is what the government wants
- In most cases, the so-called "agreements" are just basic features of the project's concept [Example: rural water project in Paraguay]
- The design can be worked out in such a way that these agreements are maintained
Concern: Evaluations are too expensive; we cannot afford them
- Different designs have different costs… and teams can apply for trust funds!
Lessons learned:
- Projects in the preparation stage are better candidates than those under implementation
- Be pragmatic

Some of the usual "excuses": "Internal" Concerns
All of the above… and:
Concern: Task team leaders (and managers!) are not recognized for doing good evaluations
- Evaluations do not have good marketing… we have to turn this around and sell them as a learning and communication tool
- Good evaluations (even those with bad results) should be recognized and rewarded, since ALL of them generate knowledge… we are, indeed, a knowledge bank
Concern: The impact evaluation is of academic interest, not a practical one
- No! IEs are based on policy questions, so the goal is to influence policy decisions
- Good evaluations bring us business
Concern: An IE is too "dangerous" because the team may be penalized for bad results
- The IE evaluates interventions, not teams… again, teams with good IEs have to be recognized even with adverse results, since they are generating knowledge
Lessons learned:
- Our organizations need a corporate decision about IEs… task team leaders, managers, and projects with good evaluations should be recognized (and rewarded!)

SOME PRACTICAL CONSIDERATIONS…

Practical considerations (1)
- Impact evaluation is not for every intervention: be selective, be opportunistic
- The "gold standard" is plausible causality, not a single impact evaluation method
- Recognize constraints: be flexible, be creative
- Start early; work the IE into the design of the program
- Think hard about benefits (what impacts to measure): link to project objectives, choose indicators carefully, understand the time frame for outcomes to materialize
- Identify logical axes of disaggregation (e.g., income groups, gender) and plan the sample accordingly (see the sketch below)
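The sketch below is not part of the original slides; it only illustrates how one might size the sample for each disaggregation cell, assuming the statsmodels library is available. The effect size, power, significance level, and number of cells are placeholder assumptions, not recommendations.

```python
# Illustrative sample-size planning for disaggregated analysis.
# Effect size, power, significance level, and number of cells are placeholders.
from statsmodels.stats.power import TTestIndPower

n_per_arm = TTestIndPower().solve_power(
    effect_size=0.2,   # standardized minimum detectable effect (assumption)
    alpha=0.05,        # significance level
    power=0.8,         # desired statistical power
)
n_cells = 4            # e.g., two income groups x two gender groups
total = int(round(n_per_arm)) * 2 * n_cells
print(f"~{int(round(n_per_arm))} observations per arm in each cell; total ~ {total}")
```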

Practical considerations (2)
- Monitor the implementation of the program – policy does not always equate to reality (know what you are evaluating). The same holds true for data collection
- The task of implementing an IE does not end with a sound design…
- Mix methods – qualitative and quantitative: qualitative data provides information on the actual mechanisms that caused the impacts; it may also provide intuition, new questions, and anecdotal stories that will enhance the final evaluation
- Watch for contamination of the treated and comparison groups: to the extent possible, bulletproof the control and treated groups
- Stay on top of the implementation so that unforeseen events can be addressed with sound solutions as quickly as possible

Practical considerations (3)
- It is important to work on Monitoring and Evaluation: this can link the IE with the ongoing efforts to monitor the project, and it generates more ownership of the IE by the project team
- It may be worth considering the implementation of information systems for targeting, program implementation, and evaluation
- Discuss your design with other IE colleagues: we learn from each other and you may get new ideas; this also helps with later dissemination
- Work together with local partners: this can build local capacity for future evaluations and validates the design and results
- Having people on the ground helps to preserve the design; they tend to stay on through changes in government

FINAL REMARKS (FINALLY!)

Final remarks
- We can do more of what we are doing! But we can do less of what we would like to do!
- Projects under preparation are better candidates for good IEs
- Be selective… go for good, relevant, and answerable policy questions
- Be opportunistic… (some) good evaluations come from unexpected places
- Be pragmatic… we live in a real world
- Understand and tackle (to the extent possible) all the concerns
- Show results ASAP… this keeps politicians (and our managers) interested
- Engage with all the stakeholders and explain why, what, and how
- Look for partnerships with local institutions

Thanks!
Luis Alberto ANDRES
Sustainable Development Department, South Asia Region

Q & A