Reflections on Revising the Guidance: An Evaluation

Presentation transcript:

Reflections on Revising the Guidance: An Evaluation
Dr. Thania Paffenholz, Oslo, 17 February 2011

Results chain (slide diagram):
- Input: 2 DAC networks + experts
- Output: Draft Guidance
- Outcomes: awareness for/of evaluation and the CPPB field; improved evaluation practice
- Impact: contribution to better quality in CPPB work

Evaluation along Criteria
Relevance
- Is the Draft Guidance responding to the needs of the CPPB field?
- Is the Draft Guidance responding to the needs of the evaluation field?
Effectiveness
- Intended outcomes: How effectively has the Draft Guidance responded to the needs of the two main target groups, evaluators and evaluation managers?
- Unintended outcomes: What other outcomes (positive and negative) has the Draft Guidance produced so far?
Sustainability
- How can the Guidance be used for sustainable learning in CPPB?
- What processes have been built in to ensure follow-up?

Revision: Main Points
Structure of Guidance
The new overall structure needs to serve the purposes of its audiences:
- Evaluation managers
- Evaluators (DEV + CPPB)
- The broader CPPB field
New overall structure:
- Introduction to the CPPB context
- Introduction to evaluation in general
- Specificities of CPPB evaluation (incl. conflict sensitivity as evaluation goal, transversal theme or conduct issue)
- Managing/preparing an evaluation
- Conducting an evaluation
- Preconditions for evaluations -> planning for results, evaluability and closing the strategic gap

Revision: Main Points
Chapter 2, Managing/Preparing an Evaluation: points to be added or changed
- The evaluation's general focus
- Evaluation criteria (DAC + 3C) + transversal themes (e.g. conflict sensitivity, gender)
- Process design
- Elaborate on the conflict analysis topic
- Built-in quality control
- Phases + reporting (distinguished by type of evaluation)
- Politics and other real-world risks
- Built-in reference/steering group + ombudsperson
- Feedback, dissemination, learning, etc.
- TORs + how they will be adapted after the inception phase
- (Flexible) budgets
- Request (potential) evaluators to develop a proposal covering evaluation design, approaches, methodologies and feasibility

Revision: Main Points
Chapter 3, Conducting an Evaluation: much more focus on HOW
- Overall evaluation designs
- Distinction between different types + scopes of evaluations
- Evaluation approaches: purposes + HOW + best practice
- Linking elements + methodology to criteria
  - Relevance: need for conflict analysis + theory of change + HOW to do it, with a set of options + examples (incl. sampling)
  - Effectiveness (theory of change), etc.
  - Clarification about impact assessment (impact versus outcomes versus conflict effects)
- Core challenges
  - Data gathering under constraints, including overreliance on interviews + reality-based options on HOW, incl. sequencing + feasibility
  - Politics

Revision: Main Points
Conflict / Context Analysis
- Transparency about HOW, ownership and USE in the evaluation
- Adjusting the type of analysis to the evaluation goals
- Elements: historical and socio-economic context, etc.; national + local level
- Conflict analysis alone is insufficient; more elements are needed:
  - Analysis of the peacebuilding context and short-, medium- and long-term needs
  - Assessing conflict sensitivity of activities: general + context-specific definition and assessment (+ options for how, e.g. coverage/partners, power relations => link to conflict analysis)
  - Assessing conflict monitoring capacity/performance
  - Assessing adaptation capacities
- 'Conflict' is not always the right term!

Sustainability
Draft Guidance follow-ups:
- Revision
- Dissemination in different forms
- Capacity building/training for evaluation managers + (potential) evaluators
- Work on evaluation culture: awareness building in different communities
- DAC EVAL Net to draft harmonised SUPER Guidance
How to use the Guidance for learning in CPPB:
- INCAF to make use of policy lessons
- An ongoing feedback loop needs to be institutionalised