Monitoring in Outcome Mapping Principles, Design & Practice


1 Monitoring in Outcome Mapping Principles, Design & Practice
Steff Deprez & Kaia Ambrose
OM Lab, Dar es Salaam, Tanzania, 22-23 Sept 2014
[Speaker notes: dimensions for ice breaker: P, M or E? Eval for self, eval for client? Development? Tangible challenges in M&E; experience of OM; intention to use OM; projects in the room.]

2 Monitoring in Outcome Mapping
1. Core principles of monitoring in Outcome Mapping
2. Monitoring design issues: experiences from practice
3. Monitoring practice: approaches, tools & instruments

3 1. Core principles of monitoring in Outcome Mapping
Based on OM Manual


5 Outcome Mapping Monitoring
Systematic collection of data on outcomes and performance
A regular learning & improvement cycle
Credits a program for its contribution to bringing about change
Encourages the program to challenge itself

6 Outcome Mapping Monitoring
Flexibility
Participatory
Evaluative thinking
Organisational & social learning
Power of self-assessment
Regular face-to-face meetings



9 Design boldly, M&E modestly
Design boldly within the broadest development context or sphere of interest.
M&E modestly within the program's sphere of influence.
Clear separation between the two.

10 Outcome Mapping offers a system/process to gather data & encourage reflection on:
1. The progress of external partners towards the achievement of outcomes (progress markers)
2. The internal performance of the program (strategy maps)
3. The program's functioning as an organizational unit (organisational practices)

11 Monitoring in Outcome Mapping
[Diagram: three nested spheres. Sphere of interest (indirect influence): beneficiaries 1-3. Sphere of influence (direct influence): boundary partners 1-3 and their behavioural changes (outcomes), plus the intervention strategies aimed at them; this is the focus of M&E. Sphere of control (direct control): the implementing team and its organisational practices; monitored for efficiency, relevancy and viability.]

12 3 types of monitoring journals
[Diagram: the three spheres annotated with the OM journals. Outcome Journal: the behavioural changes (outcomes) of boundary partners 1-3 in the sphere of influence. Strategy Journal: the intervention strategies of the implementing team. Performance Journal: the organisational practices of the implementing team in the sphere of control (efficiency, relevancy, viability).]
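The three journals can be thought of as three simple record types. A minimal sketch in Python; the field names here are illustrative assumptions, not the OM manual's exact journal formats:

```python
from dataclasses import dataclass, field

# Hypothetical record types mirroring the three OM monitoring journals.
# Field names are invented for illustration; the OM manual prescribes
# richer formats for each journal.

@dataclass
class OutcomeJournal:
    boundary_partner: str
    period: str                                   # e.g. "2014-H1"
    progress_marker_ratings: dict = field(default_factory=dict)  # marker -> "low"/"medium"/"high"
    description_of_change: str = ""
    contributing_factors: str = ""

@dataclass
class StrategyJournal:
    strategy: str                                 # an intervention strategy aimed at a BP
    period: str
    activities: list = field(default_factory=list)
    effectiveness_notes: str = ""

@dataclass
class PerformanceJournal:
    organisational_practice: str                  # one of the OM organisational practices
    period: str
    reflection: str = ""

oj = OutcomeJournal("Boundary partner 1", "2014-H1",
                    {"PM1: attends planning meetings": "medium"})
print(oj.boundary_partner)  # -> Boundary partner 1
```

A record per partner (or strategy, or practice) per period is enough to support the regular reflection cycle described above.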

13 Monitoring Plan

14 Critical questions & programme response
What should we keep doing?
What do we need to change in order to improve?
Are we still working with the right BPs?
What strategies/practices do we need to add?
What strategies do we need to end?
What should be evaluated in more depth?

15 2. Monitoring design issues
Experiences from practice

16 Making learning explicit: use of spaces & rhythms (*)
Conventional M&E design: information needs related to the programme framework (objectives, results, outcomes, … + indicators); data collection methods; reporting.
Outcome Mapping (as presented in the OM manual): based on the principles of Utilisation-Focused Evaluation; focus on monitoring priorities (who will use it, and for what purpose?); use of outcome, strategy and performance journals.
(*) introduced by Guijt & Ortiz (2008)

17 Monitoring Plan

18 Assumptions about monitoring (in OM)
Monitoring process = learning process: reflection and analysis happen automatically
An actor-centred design leads to a participatory monitoring process
M&E results will be used
Users have the capacity, time and willingness to participate in or facilitate the monitoring process
Using outcome journals & strategy journals is enough to pave the way forward
The monitoring process is embedded in organisational or programme management cycles

19 Making learning explicit
For learning to happen, data is not the starting point:
Start from the intended use
Start with defining the spaces that are crucial for debate, sharing, reflection and decision-making
Make the monitoring integral to the thinking and doing of the organisation and programme

20 Learning-oriented monitoring
(Source: Seeking Surprise, Guijt, 2008)

21 Three core steps in the design of a learning-oriented monitoring system
1. BE CLEAR ON PURPOSE, USES & USERS
2. DEFINE ORGANISATIONAL SPACES & RHYTHMS
3. DECIDE ON INFORMATION NEEDS
Which information is required, for whom, at what time/event, in what form, to do what?

22 1. BE CLEAR ON PURPOSE, USES & USERS
E.g. Intended uses of the M&E process (Patton)

23 1. BE CLEAR ON PURPOSE, USES & USERS
e.g. Wheel of Learning Purposes (Guijt, 2008)

24 1. BE CLEAR ON PURPOSE, USES & USERS
e.g. the PLA system of VECO
[Diagram: the PLA system of VECO, relating three purposes (ACCOUNTABILITY, LEARNING, PLANNING) to: programme improvement; knowledge creation (VECO & partners); programmatic & financial accountability; negotiation & understanding between chain actors; upward & downward accountability; evidence building & upscaling; short-term planning; long-term & strategic planning.]

25 1. BE CLEAR ON PURPOSE, USES & USERS
2. DEFINE ORGANISATIONAL SPACES & RHYTHMS
3. DECIDE ON INFORMATION NEEDS
What are the spaces and rhythms central to planning, learning, accountability, debate, decision-making, …? (Guijt & Ortiz, 2007)
How can we ensure that monitoring is integral to the thinking and doing of the organisation and programme?

26 2. DEFINE ORGANISATIONAL SPACES & RHYTHMS
Spaces: formal and informal meetings and events which bring organisations and programmes to life.
Rhythms: patterns in time, the regular activities or processes which provide a structure-in-time through which an organisation can direct, mobilise and regulate its efforts, i.e. the regular weekly, monthly and annual activities that characterise the tempo of organisational functioning.
When do people interact, share information and make sense of what is happening?

27 Description of the main spaces & rhythms
Table columns: Event | Purpose of the event | Time/frequency | Who participates/coordinates | Expected output | Data/information required
> Group exercise

28 1. BE CLEAR ON PURPOSE, USES & USERS
2. DEFINE ORGANISATIONAL SPACES & RHYTHMS
3. DECIDE ON INFORMATION NEEDS
Which data & information is required? What type of data/information?
From 'nice-to-know' to 'must-know' information


30 Information needs linked to the main spaces & processes
Table columns: Event | Purpose of the event | Time/frequency | Who participates/coordinates | Expected output | Data/information required

31 Masterlist of info needs
Table columns: General info need | Specific information need (indicators or questions) | Frequency / by when? | Who has the info? / Where is the data generated? | Data collection method/approach | Who will use the info, at which event?
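One way to hold such a masterlist is as a list of plain records that can be filtered per event (space/rhythm). A minimal sketch; the row contents and the `info_needs_for` helper are invented for illustration, only the column names follow the slide:

```python
# Each row of the masterlist as a dict keyed by the slide's columns.
# The example rows are hypothetical.
masterlist = [
    {"general_need": "Progress of BPs", "specific_need": "Progress marker scores",
     "frequency": "every 6 months", "source": "outcome journals",
     "method": "self-assessment + reflection meeting", "used_at": "annual review"},
    {"general_need": "Team performance", "specific_need": "Time spent per strategy",
     "frequency": "monthly", "source": "strategy journals",
     "method": "activity log", "used_at": "monthly team meeting"},
]

def info_needs_for(event, rows):
    """Return the information needs that feed a given space/rhythm (event)."""
    return [r for r in rows if r["used_at"] == event]

print([r["specific_need"] for r in info_needs_for("monthly team meeting", masterlist)])
# -> ['Time spent per strategy']
```

Filtering the masterlist by event makes the link between step 2 (spaces & rhythms) and step 3 (information needs) explicit: every need must name the event where it will be used.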

32 Plan for sensemaking
1. BE CLEAR ON PURPOSE, USES & USERS
2. DEFINE ORGANISATIONAL SPACES & RHYTHMS
3. DECIDE ON INFORMATION NEEDS
Plan how data is used and analysed: make it usable for action
Focus on social interaction: sharing, reflection, debate, decision
Should be well planned & facilitated: it will not happen by itself

33 Institutionalising a learning-oriented M&E practice
How to make sure that your monitoring principles and design are translated into an effective monitoring practice?
1. Creating organisational conditions: motives, means & opportunities
2. The 'web of institutionalisation'
> Should be reflected in the Organisational Practices

34 Creating the motives, means & opportunities
Creating motives: guiding ideas; support by management; develop a learning culture; provide incentives
Creating means: human capacities; specialist support; concepts, methods and tools; budget
Creating opportunities: integration in planning and management; clear M&E plans and responsibilities; responsive information management system; trust and respect (speak out, challenge, feedback)
(Steff Deprez, 2009)

35 2. The web of institutionalisation (Levy, 2006)
See: 'The process of institutionalising gender in policy and planning: the web of institutionalisation' (Levy, 2006)

36 3. Monitoring Practice in OM
Experiences from practice

37 Monitoring Practice in OM
Working with progress markers, boundary partners and organizational practices
Sense-making with boundary partners
Ongoing challenges
[Speaker note: …by showing you a methodology that IDRC developed to respond to its needs: OM.]

38 Thinking through the different aspects of monitoring
M&E plan (Performance Measurement Framework): more narrative
Unpack different moments
Data from grantee surveys (quantitative and qualitative)
Data from POs' trip reports
Collate: who? where? (database)
Team interpretation: process for doing this; rubrics/dashboard; feed into program level

39 Working with progress markers
What is your purpose and use?
What is your monitoring culture?
What resources do you have for monitoring?
What qualitative and quantitative data needs do you have?

40 GrOW program
[Speaker note: show matrix]

41 Improved social and economic status of 35,000 low-income women in 11 districts in East Harague
[Logic model:]
Intermediate outcome: increased financial inclusion and entrepreneurship of low-income women
Immediate outcome: improved business and technical skills/capacities of low-income women for enterprise development
Immediate outcome: increased access of low-income women to financial institutions and services
Intermediate outcome: increased participation of low-income women in household and community level decision making
Immediate outcome: increased access of low-income women to legal and social service providers
[Speaker notes: men? FSP leaders; outputs come below.]
These intended/desired results get translated into indicators of increase and decrease in:
# of F/M accessing financial services from a variety of different sources
% of F/M VS&L members reporting having utilized at least one type of financial service (bank, MFI, insurance, etc.)
# of VSLA members, M and W, with accounts (number, type, size)
% or points of gap in women's and men's participation in HH decision making

42 Outcome Mapping Logic Model
So this is where OM came in to help us unpack pieces of the LF for our monitoring. We saw this fusion like an accordion.

43 Project monitoring now involves:
Baseline and endline
Output monitoring
Outcome monitoring using OM
Qualitative inquiry (using the Rolling Profiles tool)
Knowledge + Practice surveys (drawing from some of the progress markers)
Observation (using field diaries based on progress markers)
Regular, facilitated reflection and sense-making processes to feed into program decision-making
Using the tools and the reflection: a focus first and foremost on the change, including unexpected change, and then on the reasons for change
Each of the changes is tagged to the appropriate indicator and outcome in the logic model (tells the back story)
The diaries and the rolling profiles are also meant to collect unexpected change, negative and positive (need to know who to ask, how to ask, or how to observe)
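Tagging each observed change to a logic-model indicator, as described above, can be sketched as a grouping step. The indicator codes, record fields and `by_indicator` helper below are invented for illustration:

```python
# Observed changes from field diaries / rolling profiles, each tagged
# to a logic-model indicator. Indicator codes are hypothetical.
changes = [
    {"source": "field diary", "change": "Three VSLA members opened bank accounts",
     "expected": True, "indicator": "IND-3"},
    {"source": "rolling profile", "change": "Husband now shares market decisions",
     "expected": False, "indicator": "IND-4"},
]

def by_indicator(records):
    """Group change records by the indicator they are tagged to,
    so each indicator's 'back story' can be read together."""
    grouped = {}
    for r in records:
        grouped.setdefault(r["indicator"], []).append(r)
    return grouped

story = by_indicator(changes)
print(sorted(story))        # -> ['IND-3', 'IND-4']
print(len(story["IND-3"]))  # -> 1
```

Keeping the `expected` flag on each record preserves the unexpected (positive and negative) changes that the diaries and rolling profiles are meant to catch.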


45 First look for change (positive, negative and unexpected), then go back to your pathways and see if that hypothesis has changed and why or why not. What are the influencing factors and actors?

46 Challenges
Qualitative data collection – informal interviews, observation (including looking for the unexpected, positive and negative)
Qualitative analysis – looking for patterns and trends
Critical analysis and sense-making – the need for facilitated, well-constructed (agenda, exercises) spaces and processes
Usage of information – challenges: observing and note taking; informal interviews; looking for patterns, trends and deviants; using different data to tell a story
Gathering real-time data to inform ongoing decision-making and adaptation from the beginning to the end of the development of the innovation

47 Evolving lessons
Monitoring beyond outputs
'Good enough' (in terms of tools, capacity), and build from there
From M&E to 'MandE' to evaluative thinking
Explicit sense-making spaces

48 Working with progress markers
Use progress markers as a checklist to track progression against pre-defined behavioural changes for a specific partner in a specific period of time; use of scoring (L/M/H, 1/2/3/4, colour)
Write a qualitative description of change (e.g. every 4-6 months) for each pre-defined PM for a respective period
Other monitoring tools, qualitative or quantitative, that are then cross-referenced with pre-defined PMs (new ones added)
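The checklist use of progress markers (scoring each pre-defined marker per partner per period) can be sketched as follows; the markers and the `progression` summary are invented for illustration, only the L/M/H scale comes from the slide:

```python
# Hypothetical progress-marker checklist for one boundary partner
# and one period, scored on the L/M/H scale mentioned on the slide.
SCALE = {"L": 1, "M": 2, "H": 3}

checklist = {
    "Expect to see: attends network meetings": "H",
    "Like to see: shares data with peers": "M",
    "Love to see: lobbies local government": "L",
}

def progression(scores):
    """Summarise a period's scores as (total, maximum),
    so progression can be compared across periods."""
    total = sum(SCALE[s] for s in scores.values())
    return total, 3 * len(scores)

print(progression(checklist))  # -> (6, 9)
```

Comparing the (total, maximum) pair across periods gives the progression-over-time view the checklist is meant to support; the qualitative description of change remains the companion to the score, not a replacement for it.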

49 Working with progress markers: who?
BPs describe their own change, then send it to the implementing team
The implementing team describes change based on its own observations
Mutual reflection process with the team and BPs
An external evaluator judges progression in change

50 Working with progress markers: what?
Every single PM monitored
Only the PMs that are relevant for a specific period
PMs and/or OCs used to trigger discussion during the reflection process; key changes documented
Depth of analysis can vary:
Across different BPs (comparison)
In combination with SMs (effective intervention?)

51 Working with progress markers
Using progress markers at an organisational scale (across programmes):
Different geographical regions
Different thematic foci
Different types and larger numbers of boundary partners

52 Working with progress markers
Use of progress markers for whom?
General & standard progress markers for each type of BP: less (or not) useful for the individual projects and their actors
Tailor-made progress markers for individual BPs: relevant to guide and steer local projects and their actors, but less useful for higher levels; overload of detailed data & aggregation is difficult

53 Sensemaking with BPs
Regular (every 4-6 months) reflection with staff + BPs (one or more), external evaluator, other stakeholders
Embedded in the spaces and rhythms of the programme

54 Working with strategy maps
Useful for monitoring the relevance, effectiveness and efficiency of the supporting activities of the implementing team
Understand how you spend time and money within a programme team
Get direct feedback from BPs on the required/requested support

55 e.g. VECO Indonesia: 9 categories of strategy maps
Every activity carried out by VECO can be linked to one of the 9 support strategies.

56 9 categories of strategy maps
[Chart: example monthly overview of activities tallied per strategy-map category (I and E codes)]
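A monthly overview like the one on the slide is essentially a tally of activities per strategy-map category. A minimal sketch; the activities are invented, and the category codes follow OM's I/E naming since the slide does not list VECO's 9 categories:

```python
from collections import Counter

# One month of activities, each linked to a strategy-map category.
# Activities and category codes (I1, E1, ...) are hypothetical.
activities = [
    ("Training workshop for farmer group", "I2"),
    ("Policy brief dissemination", "E1"),
    ("Coaching visit to cooperative", "I1"),
    ("Radio campaign", "E2"),
    ("Coaching visit to cooperative", "I1"),
]

# Tally how often each category was used this month.
monthly_overview = Counter(cat for _, cat in activities)
print(monthly_overview["I1"])          # -> 2
print(sum(monthly_overview.values()))  # -> 5
```

Over several months, such tallies show where the team's time actually goes, which is the "understand how you spend time and money" use of strategy maps described above.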

57 Working with organisational practices
Use as-is in the manual
Use of OCs and PMs at the level of the implementing organization
Motives, means and opportunities
Web of institutionalisation

58 Reports and journals
Simplified journal formats
Combined PM and SM reports
Integrate elements from journals into regular programme reports
Integration in MIS, databases, etc.
Used as a final report, or as a report embedded within another report
OM journal as a database
SM as an activity guide

59 Evaluation using OM
OM in retrospect – whether the initiative has used Intentional Design or not
Oxfam Mid-Term Learning Review – asking BPs to reconstruct the most significant changes, the influencing actors and factors of that change, and the effects/consequences of that change; modified OM journals + workshops for group reflection
Reconstruction of results using OM as a framework
McKnight Foundation evaluation – tracking the progression of change / reconstructing the pathway of change ("then what happened"); creating outcome journals for future monitoring
Outcome Harvesting

60 Combined use of OM and the logical framework

61 Three ways of combining LFA & OM
OM+: modifications of the OM framework with elements of LFA
LFA+: modifications of an LFA with elements of OM
Combined uses: OM frameworks that are linked to / feed into an LFA

62 OM+
Additional layer of objectives (+ indicators) to pinpoint exactly what BPs are contributing to
Adaptations:
An OM framework for each programme objective
The specific objective 'translates the vision' into something tangible and measurable (use of SMART indicators); this is key for integration with LFA
Some programmes add an 'output' layer between strategy maps and outcomes

63 OM+
Additional layer of objectives (+ indicators) + intermediate tangible results (+ indicators) to clearly state what BPs are contributing to

64 LFA+: Logical framework of a specific project
[Diagram (VECO): the logframe matrix of a specific project, with columns Intervention logic | Indicators | Source of verification | Assumptions and rows Goal / development objective, Project objective / purpose, Results / outcomes, Activities, set beside the OM framework of the same project: ≈ specific objectives + indicators; ≈ outcome challenge + progress markers; ≈ activities. Actor-centred results: results describing changes in behaviour.]

65 Combined use of LFA & OM
[Diagram (VECO): the logical framework of the overall programme (columns Intervention logic | Indicators | Source of verification | Assumptions; rows Goal / development objective, Project objective / purpose, Results / outcomes, Activities) as a synthesis of the different projects: ≈ specific objectives + indicators; ≈ chain results + indicators; ≈ PMs, BPs + activities. OM frameworks guide the PM&E of the specific projects.]

