Impact of evaluations matters IDEAS Conference 2011, Amman “Evidence to Policy: Lessons Learnt from Influential Impact Evaluations” Presenter: Daniel Svoboda, Czech Republic



Background There are never-ending discussions about sophisticated evaluation methods, competencies and independence. Yet in this pursuit of perfect results, the actual use of evaluation results remains underestimated.

Critical Gaps
- Decision makers often have no idea how (or why) to use evaluation results.
- The results of development interventions are assessed by donors, while final beneficiaries have little chance to say what their real priorities are.
- Evaluators accept no responsibility for the intervention being evaluated.
- There are problems in communicating evaluation findings.

Challenge A focus on sustainable benefits must lead to a shift from accountability to fund managers towards accountability for real results: positive (or negative) impacts on people's lives. The same paradigm shift applies to evaluations. Their purpose is not to provide unchallenged findings but to contribute to benefits for the target groups. So far, however, "influential evaluations" seem to be treated as only one specific category of evaluations.

Questions What is the real purpose of impact evaluations?
- Is it to satisfy the clients by proving their attribution?
- Is it to test the theory of change in order to reuse the discovered causalities next time?
- Is it to discover any impact whatsoever of an intervention?
- Or is it to confirm or revise the intended sustainable benefits for the target groups by discovering the real needs, mechanisms, motivations and assumptions?

Questions  Who are the real clients of impact evaluations?  Can any donor or independent evaluator understand the genuine changes in lives and minds of the target groups?  Can the benefits be imposed or evaluated from outside? The impacts are directly linked to the target groups - and these are the only actors who can confirm the needs and priorities, and the existence, relevance, extent and reach of the benefits; as well as the key actors who must be engaged in any development intervention and impact evaluation.

Methods Participatory approaches do not mean using participants as objects of formal approvals or as subjects of evaluators' focus groups, questionnaires or counterfactual experiments. Beneficiaries must participate in setting the priorities of development policies, programs and projects and in setting the indicators of the benefits that are to improve their lives; they must also be able to enforce their own evaluation questions, evaluation methods, and use of lessons learned.

Methods Similarly, a country-led evaluation system does not mean using local staff trained, paid and directed by donors. It means full local understanding of the need for evaluations and the introduction of a country's own national systems, structures and mechanisms for the effective performance and use of evaluations.

Methods The evaluator must recognize (or revise) the theory of change and must confront the needed, achieved or achievable results with the intervention's design and approach. If these key aspects are not covered, it is probably not a real evaluation. In the same way, if there is no theory of what change the evaluation can or should bring, and no expected use of its results, there is no reason to call it an evaluation or to carry it out.

Methods Evaluators must be aware of, and responsible for, the impacts of their work and must be able to evaluate them as well. They should therefore apply the same approaches to evaluations as are required of intervention planners. This includes an appropriate evaluation logic model (theory of change) with all crucial assumptions and risks, and all development effectiveness criteria, starting with ownership and ending with accountability for results.

Impact of evaluations What is the purpose of any evaluation? The use of its recommendations and lessons learned. Why should the results be applied? To contribute to the common goal of the intervention and its evaluation: sustainable benefits for the target groups.

Impact of evaluations Result-based evaluations?
Goal = sustainable benefits for the target groups (the final beneficiaries, not the donors who commission the evaluation!)
Outcome = a changed situation or behavior: the proper and timely use of the evaluation outputs (recommendations) to enhance the developmental effectiveness of the intervention or of the ODA system in general
Output = evaluation findings and recommendations (the report)
Activities = the evaluation itself

Impact of evaluations
- What are the indicators of the evaluation's success or failure?
- What is the added value of the evaluation?
- What are the sources of verification at each result level?
- What are the assumptions and risks at each level?
- What are the key preconditions?
- What means and costs are needed?
- And is it not a typical "killing assumption" that the intended user is not ready to use the evaluation results?
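The result-based structure above (Goal, Outcome, Output, Activities, each with indicators, sources of verification, and assumptions) can be sketched as a simple logframe data model. This is purely a hypothetical illustration: the field names and example entries are assumptions for the sketch, not part of the presentation.

```python
from dataclasses import dataclass, field

@dataclass
class ResultLevel:
    """One row of a result-based evaluation logframe."""
    level: str                 # "Goal", "Outcome", "Output" or "Activities"
    statement: str             # what the evaluation should achieve at this level
    indicators: list = field(default_factory=list)    # evidence of success or failure
    verification: list = field(default_factory=list)  # sources of verification
    assumptions: list = field(default_factory=list)   # assumptions and risks

# Hypothetical logframe for an evaluation, following the slide's result chain
evaluation_logframe = [
    ResultLevel("Goal", "Sustainable benefits for the target groups",
                indicators=["benefits confirmed by the beneficiaries themselves"],
                assumptions=["beneficiaries can voice their real priorities"]),
    ResultLevel("Outcome", "Timely use of recommendations to improve the intervention",
                indicators=["recommendations acted upon by the intended user"],
                assumptions=["intended user is ready to use the results"]),
    ResultLevel("Output", "Evaluation findings and recommendations (report)"),
    ResultLevel("Activities", "The evaluation itself"),
]

# The "killing assumption" from the slide: if the intended user will not use
# the results, the outcome (and hence the goal) cannot be achieved.
for row in evaluation_logframe:
    print(f"{row.level}: {row.statement}")
```

The point of the sketch is only that an evaluation, like an intervention, can be planned against the same matrix of levels, indicators and assumptions.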

Conclusions
- There is no sense in carrying out non-influential evaluations, however great their technical quality may be.
- Evaluators must be fully aware of the impact of their work and must take their share of responsibility for it.
- Every evaluation must have its own specific theory of change, and this theory must clearly relate to the intervention being evaluated.
- Only the evaluation's results matter: not the evaluation activities or the report, but the contribution of the evaluation to sustainable benefits for the target groups.

Example In 2008, the “Evaluation of EC aid delivery through Civil Society Organisations” was completed. The overall recommendation calls upon the EC to drastically improve the overall use of civil society as a channel for aid delivery. This implies:
(i) ensuring greater consistency between official EC policy objectives towards civil society and current practices in using the CSO channel;
(ii) better identifying and tapping the full added value of CSOs in helping to achieve key EC development objectives in various contexts;
(iii) improving the conditions for achieving sustainable impact with aid delivered through CSOs;
(iv) removing the political and institutional barriers at the level of the EC (HQ and Delegations) to an effective and efficient use of the CSO channel.

Example What has been the response to this concrete evaluation? What is its impact? All recommendations were agreed, and political support for cooperation with CSOs was confirmed. There are already some achievements at the “activity” level, such as the ongoing structured dialogue with CSOs, member states and the Parliament, or the civil society help desk. At the “outcome” level, on the other hand, the space for CSOs from both European and developing countries is being radically reduced; the financial mechanisms applied undermine CSOs' right of initiative and exclude them from political consultations; the predictability of aid has worsened since 2008; and CSOs still cannot fully play their roles as agents of social and developmental change.

Example What, then, was the reason for this concrete evaluation? The positive and irreplaceable role of CSOs was confirmed by the evaluators and agreed by the client, yet the situation of CSOs has worsened. Whose fault is that? Did the client expect different results? Did the evaluator make a mistake by formulating inapplicable recommendations? Did the evaluator underestimate the external assumptions or the effects of the global crisis? Are there other critical constraints on the client's side? Is there any chance of achieving a positive impact? Who knows? And who should know?

Impact of evaluations? Thank you for your attention! Daniel Svoboda: