Monitoring in Outcome Mapping: Principles, Design & Practice


Monitoring in Outcome Mapping: Principles, Design & Practice
Steff Deprez & Kaia Ambrose, OM Lab, Dar es Salaam, Tanzania, 22-23 September 2014

[Speaker notes: dimensions for ice breaker: P, M or E; evaluation for self vs. evaluation for a client; development? tangible challenges in M&E; experience of OM; intention to use OM; projects in the room]

Monitoring in Outcome Mapping
1. Core principles of monitoring in Outcome Mapping
2. Monitoring design issues: experiences from practice
3. Monitoring practice: approaches, tools & instruments

1. Core principles of monitoring in Outcome Mapping (based on the OM Manual)


Outcome Mapping monitoring:
- Systematic collection of data on outcomes and performance
- A regular learning & improvement cycle
- Credits a program for its contribution to bringing about change
- Encourages the program to challenge itself

Outcome Mapping monitoring emphasises:
- Flexibility
- A participatory approach
- Evaluative thinking
- Organisational & social learning
- The power of self-assessment
- Regular face-to-face meetings


Design boldly within the broadest development context or sphere of interest; monitor and evaluate modestly within the program's sphere of influence. Keep a clear separation between the two.

Outcome Mapping offers a system/process to gather data and encourage reflection on:
1. The progress of external partners towards the achievement of outcomes (progress markers)
2. The internal performance of the program (strategy maps)
3. The program's functioning as an organisational unit (organisational practices)

Monitoring in Outcome Mapping focuses on three nested spheres:
- Sphere of interest (indirect influence): the beneficiaries; outside the focus of M&E
- Sphere of influence (direct influence): the boundary partners; outcomes as behavioural changes, and the intervention strategies directed at them
- Sphere of control (direct control): the implementing team and its organisational practices; efficiency, relevance and viability

Three types of monitoring journals map onto these spheres:
- Outcome journal: behavioural changes of the boundary partners (sphere of influence)
- Strategy journal: the intervention strategies of the program (sphere of influence)
- Performance journal: the organisational practices of the implementing team (sphere of control)
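The three journal types above can be pictured as simple structured records. The following is a minimal sketch, not from the OM Manual; all field names (`boundary_partner`, `progress_marker_ratings`, etc.) are hypothetical choices for illustration.

```python
from dataclasses import dataclass, field
from datetime import date

# Hypothetical record types for the three OM monitoring journals.
# Field names are illustrative, not prescribed by the OM Manual.

@dataclass
class OutcomeJournal:
    """Tracks behavioural change of one boundary partner."""
    boundary_partner: str
    period_end: date
    progress_marker_ratings: dict          # e.g. {"PM1": "high", "PM2": "low"}
    change_description: str = ""
    contributing_factors: str = ""

@dataclass
class StrategyJournal:
    """Tracks an intervention strategy of the implementing team."""
    strategy: str
    period_end: date
    activities: list = field(default_factory=list)
    effectiveness_notes: str = ""

@dataclass
class PerformanceJournal:
    """Tracks the organisational practices of the implementing team."""
    practice: str
    period_end: date
    observations: str = ""

# Example entry for one boundary partner (invented content):
journal = OutcomeJournal(
    boundary_partner="Boundary partner 1",
    period_end=date(2014, 9, 22),
    progress_marker_ratings={"PM1": "medium", "PM2": "low"},
    change_description="Started convening monthly farmer meetings.",
)
print(journal.boundary_partner)
```

Keeping the three journals as separate record types mirrors the separation of spheres: outcome journals describe partners, not the program itself.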

Monitoring Plan

Critical questions and programme responses:
- What should we keep doing?
- What do we need to change in order to improve?
- Are we still working with the right boundary partners?
- What strategies/practices do we need to add?
- What strategies do we need to end?
- What should be evaluated in more depth?

2. Monitoring design issues: experiences from practice

Making learning explicit: the use of spaces & rhythms (*)
Conventional M&E design: information needs derived from the programme framework (objectives, results, outcomes, … + indicators); data collection methods; reporting.
Outcome Mapping (as presented in the OM Manual): based on the principles of Utilisation-Focused Evaluation; focus on monitoring priorities (who will use it? for what purpose?); use of outcome, strategy and performance journals.
(*) 'Spaces & rhythms' introduced by Guijt & Ortiz (2008)

Monitoring Plan

Assumptions about monitoring (in OM):
- Monitoring process = learning process > reflection and analysis happen automatically
- An actor-centred design leads to a participatory monitoring process
- M&E results will be used
- Users have the capacity, time and willingness to participate in or facilitate the monitoring process
- Using outcome journals & strategy journals is enough to pave the way forward
- The monitoring process is embedded in organisational or programme management cycles
- …

Making learning explicit: for learning to happen, data is not the starting point.
- Start from the intended use
- Start with defining the spaces that are crucial for debate, sharing, reflection and decision-making
- Make the monitoring integral to the thinking and doing of the organisation and programme

Learning-oriented monitoring (Source: Seeking Surprise, Guijt, 2008)

Three core steps in the design of a learning-oriented monitoring system:
1. BE CLEAR ON PURPOSE, USES & USERS
2. DEFINE ORGANISATIONAL SPACES & RHYTHMS
3. DECIDE ON INFORMATION NEEDS
Which information is required, for whom, at what time/event, in what form, to do what?

1. BE CLEAR ON PURPOSE, USES & USERS: e.g. intended uses of the M&E process (Patton)

1. BE CLEAR ON PURPOSE, USES & USERS: e.g. the Wheel of Learning Purposes (Guijt, 2008)

1. BE CLEAR ON PURPOSE, USES & USERS: e.g. the PLA (planning, learning, accountability) system of VECO, which balances:
- ACCOUNTABILITY: programmatic & financial accountability; upward & downward accountability
- LEARNING: programme improvement; knowledge creation (VECO & partners); negotiation & understanding among chain actors; evidence building & upscaling
- PLANNING: short-term planning; long-term & strategic planning

2. DEFINE ORGANISATIONAL SPACES & RHYTHMS
What are the spaces and rhythms central to planning, learning, accountability, debate, decision-making, …? (Guijt & Ortiz, 2007)
How can we ensure that monitoring is integral to the thinking and doing of the organisation and programme?

2. DEFINE ORGANISATIONAL SPACES & RHYTHMS
Spaces: formal and informal meetings and events which bring organisations and programmes to life.
Rhythms: patterns in time, the regular activities or processes which provide a structure-in-time through which an organisation can direct, mobilise and regulate its efforts, i.e. the regular weekly, monthly and annual activities that characterise the tempo of organisational functioning.
When do people interact, share information and make sense of what is happening?

Description of the main spaces & rhythms: for each EVENT, note the purpose of the event; its time/frequency; who participates and who coordinates; the expected output; and which data/information is required. > Group exercise

3. DECIDE ON INFORMATION NEEDS
Which data & information is required? What type of data/information?
From 'nice-to-know' to 'must-know' information.


Information needs linked to the main spaces & processes: for each EVENT, note the purpose of the event; its time/frequency; who participates and who coordinates; the expected output; and which data/information is required.

Masterlist of info needs, with columns: general info need; specific information need (indicators or questions); frequency / by when; who has the info, or where is the data generated; data collection method/approach; who will use the info, at which event.
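A masterlist like this is easy to hold as one row per information need, which then lets you pull out exactly what a given space or event requires. This is an illustrative sketch only; the row contents and key names are invented, mirroring the column headings above.

```python
# Hypothetical "masterlist of info needs": one dict per row of the table
# described above; keys mirror the column headings.
masterlist = [
    {
        "general_need": "Progress of boundary partners",
        "specific_need": "Progress marker ratings per BP",
        "frequency": "every 6 months",
        "source": "outcome journals",
        "method": "self-assessment + team observation",
        "used_at": "programme reflection meeting",
    },
    {
        "general_need": "Performance of the implementing team",
        "specific_need": "Time and budget spent per strategy",
        "frequency": "monthly",
        "source": "strategy journals",
        "method": "activity records",
        "used_at": "monthly team meeting",
    },
]

def needs_for_event(rows, event):
    """Return the specific information needs linked to one space/event."""
    return [r["specific_need"] for r in rows if r["used_at"] == event]

print(needs_for_event(masterlist, "monthly team meeting"))
```

Filtering by `used_at` is the point of the exercise: every need in the list must name the event where it will actually be used, which keeps 'nice-to-know' items from accumulating.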

PLAN FOR SENSEMAKING
- Plan how the data is used and analysed > make it usable for action
- Focus on social interaction: sharing, reflection, debate, decision
- Should be well planned & facilitated > it will not happen by itself

Institutionalising a learning-oriented M&E practice: how do you make sure that your monitoring principles and design are translated into an effective monitoring practice?
1. Creating organisational conditions: motives, means & opportunities
2. The 'web of institutionalisation'
> Should be reflected in the organisational practices

Creating the motives, means & opportunities (Steff Deprez, 2009)
Creating motives: guiding ideas; support by management; developing a learning culture; providing incentives.
Creating means: human capacities; specialist support; concepts, methods and tools; budget.
Creating opportunities: integration in planning and management; clear M&E plans and responsibilities; a responsive information management system; trust and respect (speak out, challenge, give feedback).

2. The web of institutionalisation (Levy, 2006). See: 'The process of institutionalising gender in policy and planning: the web of institutionalisation' (Levy, 2006).

3. Monitoring practice in OM: experiences from practice

Monitoring practice in OM:
- Working with progress markers, boundary partners and organisational practices
- Sense-making with boundary partners
- Ongoing challenges
[Speaker note: …by showing you a methodology that IDRC developed to respond to its needs: OM.]

Thinking through the different aspects of monitoring:
- M&E plan (Performance Measurement Framework): more narrative
- Unpack the different moments
- Data from grantee surveys, quantitative and qualitative
- Data from programme officers' trip reports
- Collation: who? where? (database)
- Team interpretation: a process for doing this (rubrics / dashboard), feeding into the programme level

Working with progress markers:
- What is your purpose and use?
- What is your monitoring culture?
- What resources do you have for monitoring?
- What qualitative and quantitative data needs do you have?

GrOW program [slide: matrix]

Logic model (example):
Ultimate outcome: improved social and economic status of 35,000 low-income women in 11 districts in East Hararghe.
Intermediate outcomes: increased financial inclusion and entrepreneurship of low-income women; increased participation of low-income women in household and community-level decision making.
Immediate outcomes: improved business and technical skills/capacities of low-income women for enterprise development; increased access of low-income women to financial institutions and services; increased access of low-income women to legal and social service providers. (Outputs come below; other actors: men? FSP leaders.)
These intended/desired results get translated into indicators of increase and decrease in:
- # of F/M accessing financial services from a variety of different sources
- % of F/M VS&L members reporting having utilised at least one type of financial service (bank, MFI, insurance, etc.)
- # of VSLA members, men and women, with accounts (number, type, size)
- % or percentage-point gap between women's and men's participation in household decision making

Outcome Mapping + logic model: this is where OM came in, to help us unpack pieces of the logframe for our monitoring. We saw this fusion as being like an accordion.

Project monitoring now involves:
- Baseline and endline
- Output monitoring
- Outcome monitoring using OM
- Qualitative inquiry (using the Rolling Profiles tool)
- Knowledge + practice surveys (drawing from some of the progress markers)
- Observation (using field diaries based on the progress markers)
- Regular, facilitated reflection and sense-making processes to feed into program decision-making
Using the tools and the reflection, the focus is first and foremost on the change, including unexpected change, and then on the reasons for the change. Each change is tagged to the appropriate indicator and outcome in the logic model (it tells the back story). The diaries and the rolling profiles are also meant to capture unexpected change, negative and positive (you need to know whom to ask, how to ask, and how to observe).

First look for change (positive, negative and unexpected), then go back to your pathways and see whether the hypothesis has changed, and why or why not. What are the influencing factors and actors?

Challenges:
- Qualitative data collection: informal interviews, observation (including looking for the unexpected, positive and negative)
- Qualitative analysis: looking for patterns and trends
- Critical analysis and sense-making: the need for facilitated, well-constructed (agenda, exercises) spaces and processes
- Usage of information: observing and note-taking; informal interviews; looking for patterns, trends and deviants; using different data to tell a story; gathering real-time data to inform ongoing decision-making and adaptation from the beginning to the end of the development of the innovation

Evolving lessons:
- Monitoring beyond outputs
- Start with 'good enough' (in terms of tools and capacity) and build from there
- M&E > 'mande' > evaluative thinking > explicit sense-making spaces

Working with progress markers:
- Use progress markers as a checklist to track progression against pre-defined behavioural changes for a specific partner in a specific period of time; use scoring (L/M/H, 1-2-3-4, colour)
- Write a qualitative description of change (e.g. every 4-6 months) for each pre-defined PM for the respective period
- Use other monitoring tools, qualitative or quantitative, whose results are then cross-referenced with the pre-defined PMs (and new PMs added)
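The checklist-plus-scoring idea can be sketched in a few lines. This is one possible reading of the L/M/H scale mentioned above, not a prescribed OM method; the marker texts, scores and the averaging step are all invented for the example.

```python
# Illustrative scoring of progress markers on the L/M/H scale mentioned
# above. Marker texts and scores are invented; averaging is just one
# simple way to watch progression across successive periods.
SCALE = {"L": 1, "M": 2, "H": 3}

progress_markers = {
    "Attends network meetings": "H",
    "Shares data with other partners": "M",
    "Initiates joint activities": "L",
}

def summarise(markers):
    """Convert letter scores to numbers and compute a simple average."""
    scores = {pm: SCALE[s] for pm, s in markers.items()}
    average = sum(scores.values()) / len(scores)
    return scores, average

scores, average = summarise(progress_markers)
print(average)  # 2.0
```

A numeric summary like this only supports the qualitative description of change; the slides are explicit that the score should trigger discussion, not replace it.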

Working with progress markers: who?
- BPs describe their own change, then send it to the implementing team
- The implementing team describes change based on its own observations
- A mutual reflection process between the team and the BPs
- An external evaluator judges the progression in change

Working with progress markers: what?
- Every single PM monitored, or only the PMs that are relevant for a specific period
- PMs and/or outcome challenges used to trigger discussion during the reflection process; key changes documented
- The depth of analysis can vary: across different BPs (comparison); in combination with strategy maps (effective intervention?)

Working with progress markers at an organisational scale (across programmes):
- Different geographical regions
- Different thematic foci
- Different types, and bigger numbers, of boundary partners

Working with progress markers: for whom?
- General & standard progress markers for each type of BP: less (or not) useful for the individual projects and their actors
- Tailor-made progress markers for individual BPs: relevant to guide and steer the local projects and their actors, but less useful for higher levels, where the overload of detailed data makes aggregation difficult

Sensemaking with BPs: regular (every 4-6 months) reflection with staff and one or more BPs, an external evaluator and other stakeholders, embedded in the spaces and rhythms of the programme.

Working with strategy maps:
- Useful for monitoring the relevance, effectiveness and efficiency of the supporting activities of the implementing team
- Understand how you spend time and money within a programme team
- Get direct feedback from BPs on the required/requested support

e.g. VECO Indonesia: 9 categories of strategy maps. Every activity carried out by VECO can be linked to one of the 9 support strategies.

9 categories of strategy maps: example monthly overview (I1, I2, I3, I4, E1, E2, E5).

Working with organisational practices:
- Use as-is in the manual
- Use of outcome challenges and PMs at the level of the implementing organisation
- Motives, means and opportunities
- Web of institutionalisation

Reports and journals:
- Simplified journal formats
- Combined PM and SM reports
- Integrate elements from the journals into regular programme reports
- Integration in MIS, databases, etc.
- Used as a final report, or as a report embedded within another report
- OM journal as a database
- SM as an activity guide

Evaluation using OM: OM in retrospect, whether the initiative has used Intentional Design or not.
- Oxfam Mid-Term Learning Review: asking BPs to reconstruct the most significant changes, the actors and factors influencing those changes, and the effects/consequences of those changes; modified OM journals + workshops for group reflection
- Reconstruction of results using OM as a framework
- McKnight Foundation evaluation: tracking the progression of change / reconstructing the pathway of change ('then what happened?'); creating outcome journals for future monitoring
- Outcome Harvesting

Combined use of OM and the logical framework

Three ways of combining LFA & OM:
1. OM+: modifications of the OM framework with elements of LFA
2. LFA+: modifications of an LFA with elements of OM
3. Combined use: OM frameworks that are linked to / feed into an LFA

OM+: an additional layer of objectives (+ indicators) to pinpoint exactly what BPs are contributing to. Adaptations:
- An OM framework for each programme objective
- The specific objective 'translates the vision' into something tangible and measurable (use of SMART indicators); this is key for integration with the LFA
- Some programmes add an 'output' layer between the strategy maps and the outcomes

OM+: an additional layer of objectives (+ indicators) plus intermediate tangible results (+ indicators) to clearly state what BPs are contributing to.

LFA+ (VECO): the logical framework of a specific project, with the standard matrix (intervention logic, indicators, sources of verification, assumptions) and rows for the goal/development objective, project objective/purpose, results/outcomes and activities. The rows map onto the OM framework of the specific project: ≈ specific objectives + indicators; ≈ outcome challenge + progress markers; ≈ activities. Actor-centred results: results describing changes in behaviour.

Combined use of LFA & OM (VECO): the logical framework of the overall programme (same matrix: intervention logic, indicators, sources of verification, assumptions; goal/development objective, project objective/purpose, results/outcomes, activities) is a synthesis of the different projects: ≈ specific objectives + indicators; ≈ chain results + indicators; ≈ PMs, BPs + activities. The OM frameworks guide the PM&E of the specific projects.