OGB Partner Advocacy Workshop 18th & 19th March 2010

Presentation transcript:

Introduction to M&E
OGB Partner Advocacy Workshop, 18th & 19th March 2010

What do we mean by…

Monitoring?
- Systematic assessment over time
- Ongoing collection and review of data
- Provides stakeholders with indications of progress against plans and towards objectives

Evaluation?
- Complements and builds on ongoing monitoring
- In-depth, verifiable assessment at a particular point in time
- Makes a judgement about the relevance, efficiency, effectiveness, impact and sustainability of an intervention

Why monitor & evaluate?
- To acknowledge that we are testing theories about how best to contribute to positive change
- To encourage critical questioning, and satisfy curiosity
- To improve the effectiveness of our interventions
- To build an evidence base of reliable, verifiable information to inform decisions
- To engage with donor demands for evaluative information

Why monitor & evaluate?
- If we aren’t rewarding success, we risk repeating failure
- What is the cost of not knowing if we are making a difference?
- Maximise effectiveness: using data in real time to learn which strategies or tactics are working well and where midcourse corrections may be needed, and to understand which strategies and tactics are effective under which conditions
- Accountability: being responsible for the resources with which we have been entrusted, and making a credible and defensible case that we contributed to the outcomes and goals we said we would achieve

M&E is ongoing and cyclical
M&E information may lead to a decision to change the programme. As the programme changes, the M&E plan is revised to reflect the change. M&E then assesses whether these changes are having the desired effect, and so on. The mantra: create, validate, update.
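To make the cycle concrete, here is a toy Python sketch (all names and values are hypothetical, not from the workshop materials) of the create/validate/update loop: monitoring data informs a review, the programme changes, and the M&E plan is updated so it keeps tracking the changed programme.

programme = {"strategy": "v1"}
me_plan = {"tracks": "v1", "indicators": ["meetings held with officials"]}  # create

def desired_effect(prog):
    # Stand-in for a review of monitoring data; pretend strategy v1 underperforms
    return prog["strategy"] != "v1"

for review in range(3):                            # periodic moments for review
    if not desired_effect(programme):              # validate
        programme["strategy"] = "v2"               # the programme changes...
        me_plan["tracks"] = programme["strategy"]  # ...so the M&E plan is updated to match
    print(f"review {review}: programme={programme['strategy']}, plan tracks={me_plan['tracks']}")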

Underlying principles of M&E in OGB
- Use different processes for different needs
- Integrate M&E into everyday work
- Link learning and decision-making
- Secure adequate resources
- Involve key stakeholders

Basic elements of good M&E
- Clear logic – what are you trying to achieve, and how? What does success look like?
- Indicators – how will you know if you have implemented well, are making progress, or have succeeded?
- Data collection – how will you collect information against your indicators?
- Moments for review – how and when will you use this information?
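A minimal Python sketch of how these four elements fit together (the objective, indicators and figures below are invented for illustration): clear logic defines an objective, indicators define success in measurable terms, data collection records observations, and a moment for review compares the data against targets.

from dataclasses import dataclass, field

@dataclass
class Indicator:
    name: str
    target: float                                   # what success looks like, in measurable terms
    observations: list = field(default_factory=list)

    def record(self, value):
        # Data collection: add a new observation against this indicator
        self.observations.append(value)

    def on_track(self):
        # Moment for review: compare the latest data with the target
        return bool(self.observations) and self.observations[-1] >= self.target

# Clear logic: what are we trying to achieve, and what does success look like?
objective = "Decision-makers adopt policy X"
indicators = [Indicator("officials briefed", target=20),
              Indicator("coalition partners signed on", target=10)]

indicators[0].record(12)   # data collected over time
indicators[1].record(11)
for i in indicators:
    print(f"{objective} / {i.name}: {'on track' if i.on_track() else 'needs attention'}")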

Experience with M&E
- What has been your experience with monitoring and evaluation to date?
- What particular challenges have you faced in monitoring and evaluating your work?

Challenges of M&E of advocacy
- Advocacy objectives are often imprecise or unclear, with buried assumptions
- Time frame – social and political change is slow and complex, and the ultimate goal of advocacy efforts is often long term, but campaigns tend to be relatively short
- Advocacy interventions usually operate in fragile, complex and/or fast-changing contexts and require flexibility
- Contribution, not attribution
- Collecting and managing data from multiple actors / across a coalition can be difficult
- Ensuring rights-holders’ voices are heard
- Incentives – prioritising M&E of outcomes in a culture that manages to activities/outputs

Overcoming the Challenges
- Clear theory of change – a roadmap of how we will get from “here” to “there”
  - Clarifies thinking – focus on outcomes, not activities
  - Lifts up the importance of advocacy’s ‘interim outcomes’, recognising that much of the progress occurs in the landscape along the way
  - Identifies our unique contribution
- Infuse M&E into your work from the beginning
  - Collect data systematically over time – consider ways to integrate evaluation into your everyday activities
  - Ensure sufficient space to use data to adapt and evolve
- Develop measures of success – take the informal cues that signal success in your work and convert them into measurable statements
- Contribution, not attribution
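As an illustration only (the outcomes and measures below are invented, not from the workshop), a theory of change can be written down in Python as an explicit chain from “here” to “there”, with each interim outcome paired with a measurable statement so progress is visible along the way rather than only at the end.

from dataclasses import dataclass

@dataclass
class Outcome:
    description: str
    measure: str          # an informal cue of success, converted to a measurable statement
    achieved: bool = False

theory_of_change = [
    Outcome("Coalition aligned on key asks", "Joint position paper signed by all partners"),
    Outcome("Issue on the public agenda", "Monthly media mentions above baseline"),       # interim outcome
    Outcome("Decision-makers engaged", "Number of meetings held with target officials"),  # interim outcome
    Outcome("Policy adopted", "Policy X passed into law"),                                # long-term goal
]

# Review: check progress at each interim step, not only against the end goal
for step in theory_of_change:
    print(f"[{'x' if step.achieved else ' '}] {step.description} -> {step.measure}")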