Policy and Operations Evaluation Department (IOB) Evaluability Assessment: Preparatory Steps before Starting an Evaluation. Prof. dr. Ruerd Ruben, Director.


1 Policy and Operations Evaluation Department (IOB) Evaluability Assessment: Preparatory Steps before Starting an Evaluation. Prof. dr. Ruerd Ruben, Director, Policy & Operations Evaluation Department (IOB)

2 Rationale
- Growing number of pseudo-evaluations
- Repositioning evaluation in the PM&E cycle
- Concept of 'evaluability' (Wholey 1979; Smith 1989)
- Procedures for Evaluability Assessment (EA)
- Further steps

3 Pseudo-evaluation
- Growing number of aid projects/programs
- DAC Evaluation Resource Centre (DEReC): > 1,000 evaluations
- MinBuza: substantial decentralized & local (embassy) evaluations
- Broad definition of evaluation: a systematic assessment of whether and how activities contributed to stated goals

4 Loss of Evaluability
Review of Dutch private sector programs: 2/3 of executed 'evaluations' cannot be used. Major reasons:
- evaluation agency not fully independent
- stated objectives too broad/vague
- no clear indicators defined
- data at too aggregate a level
- absence of baseline data
- no representative sampling
- intervention theory too general

5 Evaluation in the PM&E Cycle
- Ex-post assessment of program performance
- Separation of 'monitoring' (internal) and 'evaluation' (external)
- Tracking of evaluation results
- Ex-ante guarantees for effective monitoring/evaluation
(Diagram: EA positioned between the ex-ante and ex-post stages of the cycle)

6 Theory of Evaluability
Evaluability assessment is a systematic process to determine whether a program evaluation is feasible, and whether an evaluation would provide timely, relevant findings for decision makers.
- Wholey (1979): initial steps to assess whether a program can be evaluated: (a) a set of clear objectives (+ side effects); (b) measurable performance indicators
- Smith (1989): a ten-step procedure for Evaluability Assessment, serving as initial diagnostics

7 Evaluability Assessment
1. Identify relevant stakeholders
2. Define the boundaries of the program
3. Analyze available program documents
4. Clarify the intervention theory (goals, resources, activities, outcomes)
5. Analyze stakeholders' perceptions of the program
6. Assess the target population(s)
7. Discuss differences in outcome perceptions
8. Determine the plausibility of the intervention model
9. Discuss the validity of the program
10. Decide about continuation (= full evaluation)
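The ten-step procedure above can be tracked as a simple checklist, with the final go/no-go decision (step 10) derived from the earlier steps. This is only an illustrative sketch, not part of the IOB procedure; the step labels and the all-steps-must-pass rule are assumptions made for the example.

```python
# Illustrative sketch: steps 1-9 of the Evaluability Assessment as a
# checklist; step 10 (decide about continuation) is derived from them.
EA_STEPS = [
    "Identify relevant stakeholders",
    "Define the boundaries of the program",
    "Analyze available program documents",
    "Clarify the intervention theory",
    "Analyze stakeholders' perceptions of the program",
    "Assess the target population(s)",
    "Discuss differences in outcome perceptions",
    "Determine the plausibility of the intervention model",
    "Discuss the validity of the program",
]

def decide_full_evaluation(completed: dict) -> bool:
    """Step 10: proceed to a full evaluation only if every prior step passed.
    (Assumed decision rule, for illustration only.)"""
    return all(completed.get(step, False) for step in EA_STEPS)

status = {step: True for step in EA_STEPS}
status["Determine the plausibility of the intervention model"] = False
print(decide_full_evaluation(status))  # False: one failed step blocks the evaluation
```

In practice the decision would of course weigh steps qualitatively rather than as strict booleans; the sketch only shows how the diagnostic steps feed the continuation decision.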

8 Key EA Questions
1. Does the program serve the population for whom it was designed?
2. Does the program have the resources (available/used) as scheduled in the program design?
3. Are the program activities being implemented as designed?
4. Does the program have the capacity to provide data for an evaluation?
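The four key questions can be treated as named yes/no checks, where each "no" flags a weakness in evaluability. A minimal sketch, assuming invented key names and example answers (none of this is from the slides):

```python
# Illustrative sketch: the four key EA questions as named boolean checks.
# Key names and example answers are invented for the example.
KEY_QUESTIONS = {
    "serves_target_population": "Does the program serve the intended population?",
    "resources_as_designed": "Are resources available/used as scheduled?",
    "activities_as_designed": "Are activities implemented as designed?",
    "data_capacity": "Can the program provide data for an evaluation?",
}

def unmet_conditions(answers: dict) -> list:
    """Return the questions answered 'no' -- each one weakens evaluability."""
    return [q for key, q in KEY_QUESTIONS.items() if not answers.get(key, False)]

answers = {
    "serves_target_population": True,
    "resources_as_designed": True,
    "activities_as_designed": False,
    "data_capacity": True,
}
print(unmet_conditions(answers))  # ['Are activities implemented as designed?']
```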

9 Crucial Steps
EA has five crucial tasks that an evaluator must successfully complete:
Task 1. Study the program's history, design, and operation;
Task 2. Watch the program in action (participants);
Task 3. Determine the program's capacity for data collection, management, and analysis;
Task 4. Assess the likelihood that the program will reach its goals and objectives (consistency);
Task 5. Show whether an evaluation is feasible (will it help the program and its stakeholders?).

10 Towards EA Operationalization
- Screening of programs with expenditures > € 5 million
- Develop an EA protocol (e.g. UNIFEM's Checklist for Programme Evaluability; the UNEG evaluability assessment of Delivering as One United Nations)
- Systematic review by sector, by channel, by country, etc.
- Ex-ante assessment (part of the project approval procedure)
- Yearly reporting (feedback to program design & operations)
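The screening rule on this slide is mechanical enough to sketch in code: select for evaluability assessment only programs whose expenditures exceed € 5 million. The program names and figures below are invented for illustration; only the threshold comes from the slide.

```python
# Illustrative sketch of the ex-ante screening rule: programs with
# expenditures above EUR 5 million are selected for an EA.
# Program names and expenditure figures are invented.
SCREENING_THRESHOLD_EUR = 5_000_000

programs = [
    {"name": "Private Sector Development", "expenditure_eur": 12_500_000},
    {"name": "Local Governance Pilot", "expenditure_eur": 1_800_000},
    {"name": "Basic Education Support", "expenditure_eur": 7_200_000},
]

to_screen = [p["name"] for p in programs
             if p["expenditure_eur"] > SCREENING_THRESHOLD_EUR]
print(to_screen)  # ['Private Sector Development', 'Basic Education Support']
```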