
Measuring Mission – Revisiting the Premise. AEA, October 2014. Presented by Barbara Willett, Director of Monitoring, Evaluation and Learning, Mercy Corps.

Presentation transcript:

1 Measuring Mission – Revisiting the Premise. AEA, October 2014. Presented by Barbara Willett, Director of Monitoring, Evaluation and Learning, Mercy Corps

2 What Was Mission Metrics? An agency initiative to understand how we – collectively – are achieving our mission. Mission Statement: Mercy Corps exists to alleviate suffering, poverty and oppression by helping people build secure, productive and just communities.

3 The Challenges!
o 42 countries – more than 350 active programs
o Average length of grant – 24 months
o Wide, ever-changing program scope
o Field driven (no centralized M&E systems)

4 Themes, Definitions, Indicators

5 What do results look like? MI 4.1: Number and percentage of households reporting greater prosperity. Example of results: 52,355 out of 96,954 households (54%) showed evidence of increased prosperity.
Types of improvement:
o 19,642 households participated in loan or savings programs – average loan amount $552.50
o Average increase in paddy production of 3.3 MT per hectare (18,520 households benefitting)
o Among 6,570 farmers in Helmand, surveys showed a 33% increase in total sales
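The headline prosperity figure on this slide can be sanity-checked with a quick calculation (a sketch only; the household counts are taken directly from the slide text):

```python
# Check the reported percentage: 52,355 of 96,954 households showing
# evidence of increased prosperity (figures as stated on the slide).
improved = 52_355
surveyed = 96_954

pct = improved / surveyed * 100
print(f"{improved:,} of {surveyed:,} households = {pct:.0f}%")  # → 54%
```

The ratio works out to 53.999…%, which rounds to the 54% reported on the slide.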

6 Mission Metrics Learning Cycle (cycle of learning): Incoming Data → Reflect/Discuss → Analyze → Review Results/Learn → Make Decisions → Take Action

7 What else went wrong?
o Mission focus too abstract – no ownership or process for use
o Incomplete dataset – not useful to the field
o Complicated and time-consuming – technology, conversion, validation

8 LESSONS LEARNED
o Focus early and often on use – what questions do you want to answer, and who will listen and act? Is this about PR ('telling our story') or about improvement?
o Think hard about what it will take and what you will give up to make it happen – consider organizational culture: does it fit?
o Link closely to program M&E processes but limit extra data or work – put what you can on the front end, even if it's cost-loaded
o Allocate quality IT and data management systems – invest more up front to avoid re-building and ongoing maintenance later
o Limit expectations as you go – it can't be all things to all people, but people will still expect it to be, so keep reminding them
o Be very careful about terminology – how you use 'impact'; agency-level systems versus agency-level results
o Keep track of successes and examples of how it is used – you will be asked to explain and justify it again and again
o Give it time – it may take up to 10 years to develop and fully embed. Will your organization have that patience?

9 What We're Doing Now
Web-based data management system
o People still want a searchable 'master dataset'
o Field needs a common system for indicator tracking
o Program managers need to track better
o Country directors need higher-level analysis
Standard indicators
o Sector specialists define and own them
o Linked to Theories of Change
o Less focus on aggregation (limit expectations)
o More focus on contextual use and understanding
o Measurement guidance and improved quality
Link to Adaptive Management
o Platform for sharing information and requirements – team ownership of program logic
o Focus on practice and culture – critical (and periodic) reflection and improvement
o Documentation of changes and innovations

10 So what about agency impact? We have prioritized Three I's as our leadership cornerstones: Ideas/Innovation, Impact, and Influence. There is no 'agency-level impact' – there are specific examples of 'impact':
o Program results (with a focus on outcomes)
o Effective country strategies (work in progress)
o On time, on scope, on budget
More important is a culture in which leaders and managers can articulate impactful results, and in which programs drive toward targets, metrics, meaning, and maximum positive impact.

