Role of Evaluation Societies in Nurturing M&E Systems
Daniel Svoboda, Czech Evaluation Society
IDEAS Global Assembly, 26-30 October 2015
Czech Evaluation Society
Established in 2007; currently 34 members. Originally, the main focus was on capacity building for evaluators and on exchanging experience. We now also share responsibility for nurturing the national evaluation system for both pillars of development programs in the Czech Republic and for the Czech Official Development Assistance (ODA).
Czech Evaluation Society (2)
Since 2011, we have been engaged in the Working Group on Evaluations of the Czech Advisory Council for International Development Cooperation. In 2013 we piloted a peer-review system of mutual review and consultation between evaluators. Since 2014, we have been directly engaged in the Reference Group on Evaluations at the Ministry of Foreign Affairs. Since 2015, we have been discussing similar cooperation with other Czech ministries.
DWW – Development Worldwide
A civic association established in 2001; currently 28 members. Member of the Czech NGO platform FoRS, the Czech Evaluation Society, the European Evaluation Society, and IDEAS – the International Development Evaluation Association. Coordinator of the annual summer school EPDET – European Program for Development Evaluation Training – since 2007.
Peer Review 2013 – First Pilot
At the end of 2013, the Czech Evaluation Society launched a voluntary peer review of ODA evaluation reports completed in 2012 and 2013. Eight evaluators assessed 16 reports by their peers, using the Standards for Conducting an Evaluation. The results and recommendations were shared and discussed with the evaluators, with the managers of the evaluated projects, and with the Ministry of Foreign Affairs and other development actors. In 2014, a similar peer review was prepared for evaluations of the European Structural Funds.
Peer Review 2013 – Key Findings
The key findings included:
1. Overly general ToR: key evaluation questions missing, foreseen use of the results unclear
2. Missing coherence between ToR, findings, conclusions, and recommendations (interpretation of results is not evidence-based)
3. Insufficient quality of some reports (or conflicts of interest) complicates presentation and use of the results; some recommendations are not realistic
Peer Review 2013 – Key Findings (2)
The following topics need expert discussion:
- Understanding and evaluation of the Theory of Change
- Inappropriate indicators (in particular for outcomes and impacts)
- Evaluation of cross-cutting issues
- Presentation (and addressees) of evaluation results
- Appropriate evaluation methods; assessment and selection of evaluation bids
- Introduction of evaluation ethics and standards
Introduced Good Practice
All evaluation reports are published on the MFA website. Plans of ODA evaluations are published one year in advance. The Reference Group also participates in formulating the main evaluation questions and preparing the ToR. The Ethical Code of Evaluator, the Standards for Conducting an Evaluation, and the Guidelines for evaluating gender aspects have become an integral part of the ToR. The ToR and some evaluations are already published in English or in the language of the partner country.
Introduced Good Practice (2)
New templates and procedures for evaluation reports are in use. Besides open presentations of evaluation results, a special expert workshop was piloted in 2015 to discuss the implementation of evaluation recommendations. The MFA and the Czech Development Agency (CZDA) annually prepare written responses to the evaluation recommendations. Evaluators voluntarily participate in peer reviews.
Remaining Challenges
- Limited predictability (credible evaluations need quality planning well in advance)
- Insufficient monitoring (incoherent approaches to monitoring by implementing organizations, the CZDA, and the embassies)
- The Recommendation Tracking System is not published
- The quality of evaluations still varies; more communication and capacity building is needed
- Missing guidelines for evaluating cross-cutting aspects (e.g. good governance, environment…)
2014 Review – "Rapid Metaevaluation"
In 2014, sectoral evaluations were carried out in Bosnia and Herzegovina, Ethiopia, Georgia, Moldova, and the Palestinian Autonomous Territories. A metaevaluation was prepared covering all evaluations carried out in 2012 and 2013. The Czech Evaluation Society prepared a summary review of lessons learned from all 2014 evaluations and formulated several systemic recommendations.
2014 Key Results & Recommendations
- The ToR should better specify the foreseen use of evaluation results and the key evaluation questions
- A programmatic approach to ODA is necessary
- New procedures for evaluation procurement are needed (Framework Agreements?)
- ToR, evaluation reports, and comments should be prepared primarily in English
- Evaluation criteria and evaluability should be reflected already in the formulation and implementation of development interventions
2014 Key Results & Recommendations (2)
- The ToR should also include a template for the inception report; the template for the final report should be updated
- Evaluation reports must reflect the ToR; proper editing of reports is often missing
- Evaluations must clearly describe the evaluation methods and their limits; links between questions, findings, conclusions, and recommendations must be strengthened; evidence (results of data analysis) is often missing
2014 Key Results & Recommendations (3)
- Recommendations must be well formulated, understandable, and clearly addressed; the evaluator must consider the feasibility of recommendations
- A methodology for recognizing and evaluating cross-cutting aspects is needed (besides gender, also good governance, human rights, environmental sustainability and impacts, etc.)
- The discussion between evaluators and clients must continue
Capacity Building Efforts – EPDET
European Program for Development Evaluation Training
Other Capacity Building and Networking Efforts
- Specialized demand-driven trainings (e.g. for the Czech Development Agency or the Czech NGO platform FoRS) and direct cooperation with several ministries (Regional Development; Labor and Social Affairs; Education, Youth and Sports…)
- The peer-reviewed magazine Evaluation Theory and Practice and a monthly newsletter
- Cooperation within NESE – the Network of Evaluation Societies in Europe (chaired by the Czech and Slovak Evaluation Societies in 2015)
- Annual (international) conferences in Prague
Conclusions
Direct cooperation and consultation among evaluation societies, evaluation clients, evaluators, and project managers is the best way to promote change and to strengthen shared responsibility for results. Peer review is a valuable tool for learning-by-doing education, for sharing best practices and progressive methods, and for learning from mistakes. It also increases ownership of evaluation ethics and competences.
Conclusions (2)
Evaluation societies should focus not only on increasing the professional capacities of their members but above all on strengthening evaluation systems. However high the quality of evaluation reports and however brilliant the methods applied, evaluation results are meaningless if they are not properly used. Evaluators must take their share of responsibility for the usability and actual use of evaluation results…
Thank you for your attention!
Daniel Svoboda
DWW, Machova 469/23, 120 00 Prague 2, Czech Republic
Phone: (+420) 724 179 562
svoboda@dww.cz
http://www.dww.cz