Lessons Learned from Evaluation: A platform for sharing knowledge Mike Spilsbury, Catrina Perch, Seg Norgbey, Ganesh Rauniyar and Cristina Battaglino.



Common views on lessons learned
'Lessons' are:
- often of variable quality and generally under-utilised;
- often "platitudes borne of a felt need to demonstrate engagement in the 'knowledge society' or to satisfy evaluation requirements".
'Lessons learned' should more accurately be regarded as 'lessons to be learned'.

Key problems with lessons
1. Lessons are often poorly formulated (low quality).
2. Processes to promote the dissemination and uptake of lessons are often weak.
3. Lessons are often considered individually, so patterns across lessons go unrecognised.

Improving lessons: developing a platform for sharing lessons
EOU embarked on a process to:
1. screen all existing lessons, applying minimum quality standards based on the lesson definition;
2. classify lessons in a 'problem tree' framework;
3. use the framework as a tool for enhancing the uptake of, dissemination of, and access to UNEP evaluation lessons.

Screening existing lessons
What constitutes a lesson? A quality lesson must:
- concisely capture the context from which it is derived;
- be applicable in a different context (generic), have a clear 'application domain' and identify target users;
- suggest a prescription and guide action.
Low-quality lessons were identified and removed from the UNEP lessons database.
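As an illustration, the screening step amounts to a simple filter over candidate lessons against the criteria above. The sketch below is hypothetical: the `Lesson` fields and function names are invented for this example and do not reflect UNEP EOU's actual database schema.

```python
from dataclasses import dataclass

@dataclass
class Lesson:
    """A candidate lesson with the quality attributes used for screening.

    Field names are illustrative, not UNEP EOU's actual schema.
    """
    text: str
    captures_context: bool        # concisely captures its source context
    is_generic: bool              # applicable beyond the original context
    has_application_domain: bool  # clear domain and identified target users
    is_prescriptive: bool         # suggests a prescription / guides action

def meets_quality_standard(lesson: Lesson) -> bool:
    """A lesson is retained only if it satisfies every criterion."""
    return all([
        lesson.captures_context,
        lesson.is_generic,
        lesson.has_application_domain,
        lesson.is_prescriptive,
    ])

def screen(lessons: list[Lesson]) -> list[Lesson]:
    """Return only the lessons that pass the minimum quality standard."""
    return [l for l in lessons if meets_quality_standard(l)]
```

A lesson failing any one criterion is dropped, which mirrors the all-or-nothing screening that removed nearly half of the database's entries.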

Screening existing lessons
UNEP EOU held a database of approximately three hundred 'lessons' from evaluations (1999 to date). Nearly 50% of all the lessons analysed failed to satisfy the criteria. This led us to specify the attributes of quality lessons in terms of reference (TORs), evaluation guidelines and evaluation quality-control processes. The whole exercise took four evaluation professionals about two weeks each, roughly two person-months of effort, spread over five months.

Developing a framework of lessons
A 'problem tree' approach was adopted. Since the bulk of UNEP's lessons are derived from evaluations of projects or sub-programmes, a generic or 'central' problem was defined as: "UNEP projects and programmes have sub-optimal impact".

'Central' and 'cornerstone' problems for clustering lessons
Lessons were debated and their 'underlying' problems were identified (or inferred). The problems were then clustered and organised into a hierarchy of causality.
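In data-structure terms, this hierarchy of causality is a tree of problem nodes rooted at the central problem, with lessons attached to the nodes they illustrate. The sketch below is a minimal illustration; the class design and the 'cornerstone' problem names used are assumptions for this example, not UNEP EOU's actual tool.

```python
from collections import defaultdict

class ProblemTree:
    """A hierarchy of problems; each lesson may attach to several nodes."""

    def __init__(self, central_problem: str):
        self.central = central_problem
        self.parent = {central_problem: None}  # problem -> parent problem
        self.lessons = defaultdict(set)        # problem -> lesson IDs

    def add_problem(self, problem: str, parent: str) -> None:
        """Add an underlying problem beneath an existing one."""
        if parent not in self.parent:
            raise KeyError(f"unknown parent problem: {parent!r}")
        self.parent[problem] = parent

    def attach(self, lesson_id: int, *problems: str) -> None:
        """Non-exclusive classification: one lesson, several problems."""
        for p in problems:
            if p not in self.parent:
                raise KeyError(f"unknown problem: {p!r}")
            self.lessons[p].add(lesson_id)

    def ancestry(self, problem: str) -> list[str]:
        """Walk up the hierarchy of causality to the central problem."""
        chain = []
        while problem is not None:
            chain.append(problem)
            problem = self.parent[problem]
        return chain

# Hypothetical example: Lesson 112 (quoted below) touches on project design.
tree = ProblemTree("UNEP projects and programmes have sub-optimal impact")
tree.add_problem("Weak project design",
                 "UNEP projects and programmes have sub-optimal impact")
tree.add_problem("Unclear internal logic", "Weak project design")
tree.attach(112, "Unclear internal logic", "Weak project design")
```

Because `attach` accepts several problems, the structure deliberately avoids the mutually exclusive ('taxonomic') classification the slides argue against.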

Lesson I.D. numbers

Lesson 112
"It is critical that the internal logic of the project be very clearly spelled out in the project document and that the strategic linkages between outcomes and objectives are made very clear. Those implementing or supervising a project are frequently completely different people from those who developed the project. The Project Document needs to be a self-explanatory, stand-alone document."
Source: Mid-Term Evaluation of the UNEP UNDP GEF Project "Botswana, Kenya and Mali: Management of Indigenous Vegetation for the Rehabilitation of Degraded Lands in Arid Zones of Africa" GF/

Advantages of the lessons framework
- Multiple lessons can be clustered around commonly occurring issues (or 'root causes'), which provides 'triangulation' for commonly articulated lessons.
- A lesson can be associated with more than one issue or problem, rather than being forced into a mutually exclusive ('taxonomic') classification.
- The framework aids identification of commonly occurring problems across a project/programme portfolio.
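The 'triangulation' point can be made concrete: because classification is non-exclusive, counting how many distinct lessons attach to each problem surfaces the recurring root causes across the portfolio. The tallying function and the sample lesson tags below are illustrative assumptions, not real data.

```python
from collections import Counter

def commonly_occurring_problems(lesson_tags: dict[int, set[str]],
                                top: int = 3) -> list[tuple[str, int]]:
    """Rank problems by how many distinct lessons cluster on them.

    `lesson_tags` maps a lesson ID to the set of problems it was
    associated with. Problems that attract many independent lessons
    are 'triangulated' candidate root causes across the portfolio.
    """
    counts = Counter()
    for problems in lesson_tags.values():
        counts.update(problems)
    return counts.most_common(top)

# Hypothetical tags for three lessons (IDs and problems invented):
tags = {
    112: {"Unclear internal logic", "Weak project design"},
    87:  {"Weak project design"},
    203: {"Poor stakeholder engagement", "Weak project design"},
}
```

Here three independent lessons all point at "Weak project design", flagging it as a commonly occurring problem worth portfolio-level attention.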

Application of the lessons framework
- The framework is NOT a definitive statement on causality.
- EOU regards the categorisation of lessons in the framework as much less important than the process of discussion and debate surrounding that categorisation.
- Classifying lessons within the framework together with key intended users provides an interactive means of promoting their uptake or 'influence', as new lessons are examined in the context of all others.
- The application of the 'lessons platform' is currently in a pilot phase.

Findings and conclusions
- Many lessons failed to meet the criteria developed for high-quality lessons and were deleted from the EOU evaluation lessons database. This prompted the unit to specify more clearly the requirements for drafting lessons in our standard evaluation guidelines, TORs and evaluation quality-control processes.
- The lessons screening process reduced the volume of information and significantly raised the overall quality and relevance of our lessons.
- The framework of lessons learned from evaluation provides a means to visualise all lessons at once, and to see how different lessons relate to common problems and to one another. It is intended to be a user-friendly way of presenting and storing information on lessons from evaluation.

Findings and conclusions
- We regard the framework of lessons from evaluation as a useful 'platform' for both collating and disseminating lessons; it provides a tool for discussing evaluation lessons with intended users.
- The problem-oriented nature of the lessons framework provides an intuitive and interactive 'user interface' to the databases of lessons that evaluation units commonly collate.