Measuring Mission – Revisiting the Premise. AEA – October 2014. Presented by Barbara Willett, Director of Monitoring, Evaluation and Learning, Mercy Corps.

Presentation transcript:

Measuring Mission – Revisiting the Premise
AEA – October 2014
Presented by Barbara Willett, Director of Monitoring, Evaluation and Learning, Mercy Corps

What Was Mission Metrics?
An agency initiative to understand how we – collectively – are achieving our mission.
Mission Statement: Mercy Corps exists to alleviate suffering, poverty and oppression by helping people build secure, productive and just communities.

The Challenges!
42 countries – more than 350 active programs
Average length of grant – 24 months
Wide, ever-changing program scope
Field driven (no centralized M&E systems)

4 Themes Definitions Indicators

What do results look like? Examples of Results
MI 4.1: Number and percentage of households reporting greater prosperity
52,355 out of 96,954 households (54%) showed evidence of increased prosperity…
Types of improvement…
19,642 households participated in loan or savings programs – average loan amount $
Average increase of paddy production of 3.3 MT per hectare (18,520 households benefitting)
Among 6,570 farmers in Helmand, surveys showed there was a 33% increase in total sales
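A minimal sketch of the arithmetic behind an indicator like MI 4.1 – counting households that report greater prosperity and expressing them as a percentage of all households surveyed. The record layout and field name (reports_greater_prosperity) are illustrative assumptions, not Mercy Corps' actual data model.

```python
# Illustrative sketch: computing an indicator like MI 4.1 (number and percentage
# of households reporting greater prosperity) from household survey records.
# The field name "reports_greater_prosperity" is an assumption for this example.
from typing import Iterable, Mapping, Tuple


def mi_4_1(records: Iterable[Mapping]) -> Tuple[int, int, float]:
    """Return (households reporting greater prosperity, total households, percentage)."""
    total = 0
    improved = 0
    for rec in records:
        total += 1
        if rec.get("reports_greater_prosperity"):
            improved += 1
    pct = 100.0 * improved / total if total else 0.0
    return improved, total, pct


if __name__ == "__main__":
    # Reproduce the figures from the slide: 52,355 of 96,954 households -> ~54%.
    sample = [{"reports_greater_prosperity": i < 52_355} for i in range(96_954)]
    n, total, pct = mi_4_1(sample)
    print(f"{n:,} of {total:,} households ({pct:.0f}%) reported greater prosperity")
```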

Mission Metrics Learning Cycle (diagram): a cycle of learning connecting Incoming Data, Analyze, Review Results/Learn, Reflect/Discuss, Make Decisions, and Take Action.

Mission focus too abstract – no ownership or process for use
Incomplete dataset – not useful to field
Complicated and time-consuming – technology, conversion, validation
What else went wrong?

Lessons Learned
Focus early and often on use – what questions do you want to answer, and who will listen/act? Is this about PR ('telling our story') or about improvement?
Think hard about what it will take and what you will give up to make it happen – consider organizational culture: does it fit?
Link closely to program M&E processes but limit extra data or work – put what you can on the front end, even if it's cost-loaded
Allocate quality IT and data management systems – invest more up front to avoid re-building and ongoing maintenance later
Limit expectations as you go – you can't be all things to all people, but they will still expect it – keep reminding them
Be super careful about terminology – how to use 'impact'; agency-level systems versus agency-level results
Keep track of the successes and examples of how it is used – you will be asked to explain and justify again and again
Give it time – it may take up to 10 years to develop and fully embed. Will your organization have that patience?

What We're Doing Now
Web-based data management system
o People still want a searchable 'master dataset'
o Field needs a common system for indicator tracking
o Program managers need to track better
o Country directors need higher-level analysis
Standard indicators
o Sector specialists define and own
o Linked to Theories of Change
o Less focus on aggregation (limit expectations)
o More focus on contextual use and understanding
o Measurement guidance and improved quality
Link to Adaptive Management
o Platform for sharing info and requirements – team ownership of program logic
o Focus on practice and culture – critical (and periodic) reflection and improvement
o Documentation of changes and innovations
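A minimal sketch of what "less focus on aggregation, more focus on contextual use" could look like in a simple indicator-tracking structure: standard indicators are rolled up across programs, but the per-program detail that gives the totals their context is kept alongside. The program names, indicator keys, and values below are illustrative assumptions, not actual Mercy Corps data or systems.

```python
# Illustrative sketch only: aggregating standard indicators across programs
# while retaining per-program context. All names and values are made up.
from collections import defaultdict

program_results = [
    {"program": "Program A", "indicator": "households_reporting_greater_prosperity", "value": 1200},
    {"program": "Program B", "indicator": "households_reporting_greater_prosperity", "value": 800},
    {"program": "Program C", "indicator": "farmers_reporting_increased_sales", "value": 450},
]


def aggregate(results):
    """Sum each standard indicator across programs, keeping per-program detail."""
    totals = defaultdict(int)
    detail = defaultdict(list)
    for r in results:
        totals[r["indicator"]] += r["value"]
        detail[r["indicator"]].append((r["program"], r["value"]))
    return totals, detail


totals, detail = aggregate(program_results)
for indicator, total in totals.items():
    print(f"{indicator}: {total} total")
    for program, value in detail[indicator]:
        print(f"  {program}: {value}")
```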

So what about agency impact?
We have prioritized Three I's as our leadership cornerstones: Ideas/Innovation, Impact, and Influence.
There is no 'agency-level impact'; there are specific examples of 'impact':
Program results (with a focus on outcomes)
Effective country strategies (work in progress)
On time, on scope, on budget
More important is a culture in which leaders and managers can articulate impactful results [see above], and in which programs drive toward targets, metrics, meaning, and maximum positive impact.