
1 Integrated Managing for Results
Managing for Results: USAID Nigeria & IPs
Day 3: Performance Information

2 Integrated Managing for Results
Task 5: Selecting indicators (mark "X" against the rejection criteria: Direct, Objective, Useful, Practical, Attributable, Timely, Adequate)

Result: Growth of Agribusiness
- Annual sales of assisted enterprises
- Annual sales attributed to USAID assistance
- # customers for 1 year or more
- Annual customer sales attributed to USAID assistance

Result: Effective Community-Based Resource Management Implemented
- # hectares of target area using the collaborative management model
- # protected areas using the collaborative management model

Result: Improved Food Security of Vulnerable Groups
- % of target group population with access to emergency relief supplies
- % of target group population consuming the minimum daily food requirement
- # person-days of employment created

3 Integrated Managing for Results
Task 5: Selecting indicators (mark "X" against the rejection criteria: Direct, Objective, Useful, Practical, Attributable, Timely, Adequate)

Result: Family health improved
- Under-5 mortality rate in target areas
- Ratio of the under-5 mortality rate in targeted areas to the national average
- National under-5 mortality rate

Result: Increased contraceptive security
- Average % of warehouses with stock-outs of 1 or more contraceptives
- Average % of points-of-sale with stock-outs of 1 or more contraceptives

Result: Increased private sector-led economic growth
- Cumulative dollar volume of private investment in target area
- Present value of cumulative dollar volume of private investment in target area
- Value of investment in target area

4 A few words about baselines and targets

Baseline: the condition or level of performance that exists prior to implementation of the program or intervention.

Target: the expected level of achievement of the result, stated in terms of the performance indicator, within a given period of time. Targets convey an understanding of the anticipated magnitude of change vis-à-vis USAID's investment.

5 Integrated Managing for Results
Performance Baseline: the value of the performance indicator at the beginning of the planning period.

Baselines can/should be:
- Set just prior to the implementation of USAID-supported activities that contribute to the achievement of the relevant SO or IR
- Measured using the same data collection method that the SO team will use to assess progress
- Changed if the data collection method changes (document the change!)

6 Target and Baseline: An Illustration
Strategic Objective: A competitive, private financial sector that is more responsive to the needs of a market-oriented economy
Key Indicator: Value of credit/equity provided to small and medium enterprises by private financial institutions
Baseline (2001): $10 million
Target (2007): $50 million

7 Integrated Managing for Results
Performance Target: commitments made by the SO team about the level and timing of results to be achieved in a specified time period.

Targets:
- Can be expressed in terms of quantity, quality, or efficiency
- May be determined by setting the final target first, then interim targets (one way to derive interim targets is sketched below)
- May need to be set after activities or sites are selected
- Can be adjusted over time
- Should be realistic!
- Should fall outside the margin of error of the historical trend

"If you don't know where you're going, you'll end up somewhere else." - Yogi Berra
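One common way to derive interim targets once the final target is set is straight-line interpolation between the baseline and the final target. Below is a minimal Python sketch using the figures from the slide 6 illustration ($10 million in 2001, $50 million in 2007); the function name and the linear assumption are ours, and real interim targets should also reflect historical trends and the best practices on the next slide.

    # Illustrative sketch only: interim targets by linear interpolation.
    def interim_targets(baseline_year, baseline_value, target_year, target_value):
        """Return {year: interim target} for each year after the baseline."""
        step = (target_value - baseline_value) / (target_year - baseline_year)
        return {year: baseline_value + step * (year - baseline_year)
                for year in range(baseline_year + 1, target_year + 1)}

    # Baseline 2001 = $10M; final target 2007 = $50M (slide 6 illustration).
    for year, value in interim_targets(2001, 10_000_000, 2007, 50_000_000).items():
        print(f"{year}: ${value:,.0f}")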

8 Integrated Managing for Results
Baseline and Target Setting - Best Practices
- Look at historical trends
- Consider partner and customer expectations of performance
- Think about social norms and cultural factors
- Consult experts/research findings
- Benchmark accomplishments elsewhere
- Disaggregate where relevant and possible

9 Assessing Data Quality
USAID Performance Management Workshop

"There are three kinds of lies: lies, damned lies, and statistics." - Mark Twain

Contract AEP-C
The client has determined that this document (please check one):
___ Is appropriate for public distribution on the Internet
___ Is not appropriate for public distribution on the Internet

10 Integrated Managing for Results
Issues

Management
- Can you make decisions based on these data?
- Better-quality data lead to better-informed management and planning.

Reporting
- Are these data believable?
- Audiences want to know how credible your data are so they can trust your analysis and conclusions.

11 Integrated Managing for Results
Five standards for data quality:
- Validity
- Reliability
- Precision
- Timeliness
- Integrity

12 Integrated Managing for Results
Validity
Key question: Do the data clearly and directly measure what we intend?

Issue: Directness
- Result: Poverty of vulnerable communities in conflict region reduced
- Indicator: Number of people living in poverty
- Source: Government statistics office
- Problem: The government doesn't include internally displaced people (IDPs) in the poverty statistics.

Issue: Bias
- Result: Modern sanitation practices improved
- Indicator: Number of residents in targeted villages who report using "clean household" practices
- Source: Door-to-door survey conducted three times a year
- Problem: Most of the people in the targeted region work long hours in the fields during the harvest season, so the survey systematically misses them.

13 Integrated Managing for Results
Reliability
Key question: If you repeated the same measurement or collection process, would you get the same data?

Issue: Consistency or Repeatability
- Result: Employment opportunities for targeted sectors expanded
- Indicator: Number of people employed by USAID-assisted enterprises
- Source: Structured interviews with USAID-assisted enterprises, as reported by implementing partners AAA, BBB, and CCC
- Problem: The SO Team found that the implementing partners were using different definitions:
  AAA - "employee" means anyone who receives wages from the enterprise
  BBB - "employee" means anyone who receives full-time wages from the enterprise
  CCC - "employee" means anyone who works at least 25 hours a week
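The remedy the reliability standard implies is a single operational definition, documented in the PMP and applied identically by every partner, so a repeated count yields the same number. A minimal Python sketch with hypothetical records; the 25-hour threshold is illustrative, not an Agency standard.

    # Sketch: one shared definition of "employee" applied to raw records
    # from all partners, so the count is repeatable across reporters.
    MIN_HOURS_PER_WEEK = 25  # agreed, documented definition (illustrative)

    def count_employees(records):
        """Count workers meeting the shared definition, whoever reports them."""
        return sum(1 for r in records
                   if r["receives_wages"] and r["hours_per_week"] >= MIN_HOURS_PER_WEEK)

    reports = [
        {"partner": "AAA", "receives_wages": True,  "hours_per_week": 40},
        {"partner": "BBB", "receives_wages": True,  "hours_per_week": 20},
        {"partner": "CCC", "receives_wages": False, "hours_per_week": 30},
    ]
    print(count_employees(reports))  # -> 1 under the shared definition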

14 Integrated Managing for Results
Timeliness
Key question: Are data available soon enough, and often enough, to inform management decisions?

Issue: Frequency
- Result: Use of modern contraceptives by targeted population increased
- Indicator: Number of married women of reproductive age reporting use of modern contraceptives (CPR)
- Source: DHS survey
- Problem: The DHS survey is conducted approximately every 5 years.

Issue: Currency
- Result: Primary school attrition in targeted region reduced
- Indicator: Rate of student attrition for years 1 and 2 at targeted schools
- Source: Enrollment analysis report from the Ministry of Education
- Problem: In July 2002 the MOE published the full enrollment analysis for school year August 2000 - June 2001.

15 Integrated Managing for Results
Precision
Key question: Are the data precise enough to inform management decisions?

Issue: Enough detail
- Result: CSO representation of citizen interests at national level increased
- Indicator: Average score of USAID-assisted CSOs on the CSO Advocacy Index
- Source: Ratings made by Partner XXX after interviews with each CSO
- Problem: The SO team reported this data to the Mission Director: 1999 = , 2000 = , 2001 =

Issue: Margin of error
- Result: Primary school attrition in targeted region reduced
- Indicator: Rate of student attrition for years 1 and 2 at targeted schools
- Source: Survey conducted by partner; the survey is informal and has a margin of error of +/- 10%
- Problem: The USAID intervention is expected to cause 5 more students (for every 100) to stay in school longer, a change smaller than the survey's margin of error.
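The margin-of-error example reduces to a simple check: if the change an intervention is expected to produce is smaller than the survey's margin of error, the indicator cannot show whether the result occurred. A minimal Python sketch (function name ours, figures from the attrition example above):

    # Sketch: flag indicators whose expected change is lost in survey noise.
    def detectable(expected_change_points, margin_of_error_points):
        """True if the expected change exceeds the survey's margin of error."""
        return abs(expected_change_points) > margin_of_error_points

    # Expected effect: ~5 more students per 100 stay in school;
    # the informal survey's margin of error is +/- 10 points.
    print(detectable(5, 10))   # False: the change is inside the noise
    print(detectable(15, 10))  # True: a 15-point change would be detectable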

16 Integrated Managing for Results
Integrity
Key question: Are there mechanisms in place to reduce the possibility that data are manipulated for political or personal gain?

Issue: Intentional manipulation
- Result: Financial sustainability of targeted CSOs improved
- Indicator: Dollars of funding raised from local sources per year
- Source: Structured interviews with targeted CSOs
- Problem: When an SO Team member conducted spot checks with the CSOs, she found that organizations CCC and GGG counted funds from other donors as part of the "locally raised" funds.

17 Integrated Managing for Results
Techniques to Assess Data Quality

Why? The goal is to ensure the SO team is aware of:
- Data strengths and weaknesses
- The extent to which data can be trusted when making management decisions and reporting

All data reported to Washington must have had a data quality assessment at some time in the three years before submission (ADS).
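The three-year rule lends itself to a mechanical check. A minimal Python sketch of one reading of it (a DQA is current if it is dated within the three years before the submission date; the function name and date handling are ours):

    # Sketch: is the most recent DQA within 3 years of the submission date?
    from datetime import date

    def dqa_current(last_dqa: date, submission: date) -> bool:
        """True if the DQA falls inside the 3 years before submission."""
        try:
            cutoff = submission.replace(year=submission.year - 3)
        except ValueError:  # submission falls on Feb 29
            cutoff = submission.replace(year=submission.year - 3, day=28)
        return last_dqa >= cutoff

    print(dqa_current(date(2001, 6, 1), date(2003, 10, 1)))  # True
    print(dqa_current(date(1999, 6, 1), date(2003, 10, 1)))  # False: overdue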

18 Integrated Managing for Results
How? Steps to Conduct an Assessment
1. Review performance data
   - Examine data collection, maintenance, and processing procedures and controls
2. Verify performance data against Agency data quality standards
   - Validity, reliability, precision, timeliness, integrity
3. If data quality limitations are identified, take actions to address them
   - Triangulate; supplement with data from multiple sources
   - Report the limitations
   - Revise the indicator
4. Document the assessment and the limitations in the Performance Indicator Reference Sheet
5. Retain supporting documentation in files
   - Decisions and actions concerning data quality problems
   - Approach for conducting the data quality assessment
6. If data will be included in the annual report, disclose the DQA findings in the "data quality limitations" section of the Annual Report

19 Integrated Managing for Results
Task 6 (1 hour): DQA on Middleland Data
1. Collect the rating sheet from the back of the MEMS DQA form
2. Each person selects one indicator from the Middleland PMP in tab 1
3. Each person reviews the Indicator Reference Sheet for the indicator selected
4. Check off the yes/no column for the indicator and note any issues/recommendations in the appropriate column
5. List those indicators with problems on a flip chart

20 Integrated Managing for Results
Analyze data for a single indicator
- Compare actual performance against target(s)
- Compare current performance against the prior year
- Compare current performance to baseline(s)
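The three comparisons are straightforward arithmetic once actuals, targets, and the baseline are in hand. A minimal Python sketch with invented figures (all names and values are ours):

    # Sketch: the three single-indicator comparisons, with invented data.
    actuals = {2001: 10.0, 2002: 14.0, 2003: 19.0}  # $ millions
    targets = {2002: 15.0, 2003: 18.0}
    baseline_year, current_year = 2001, 2003

    actual = actuals[current_year]
    print(f"vs. target:     {actual - targets[current_year]:+.1f}")
    print(f"vs. prior year: {actual - actuals[current_year - 1]:+.1f}")
    print(f"vs. baseline:   {actual - actuals[baseline_year]:+.1f}")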

21 Integrated Managing for Results
Analyze trends in performance
- Analyze target trend(s) against actual trend(s)
- Examine performance (met, exceeded, or fell short of target) of lower-level results in relation to higher-level results
- Examine trend data from critical-assumption monitoring to help interpret results

22 Integrated Managing for Results
Assess USAID contribution
- Examine the timing of results in relation to the timing of USAID efforts
- Compare trends in results to trends in changes in level of effort
- Compare performance to control groups or benchmarks in similar environments

23 Integrated Managing for Results
What do the data tell you?
- Do you trust your PMP data?
- Is the hypothesis working? Why and/or why not?
- Do you need more information?

24 Integrated Managing for Results
Performance monitoring and evaluation

Performance Monitoring
- Focuses on whether results are being achieved or not
- Ongoing, routine
- Often quantitative
- A process that involves identifying indicators, baselines, and targets; collecting actual results data; and comparing performance against targets
- Contributes to management decision making

Evaluation
- Focuses on why/how results are achieved or not
- Occasional, selective
- Quantitative and qualitative
- A structured, analytical effort to answer managers' questions about: validity of the hypothesis, unexpected progress, customer needs, sustainability, unintended impacts, lessons learned
- Makes management recommendations

25 An Evaluation Statement of Work
- States the purpose, audience, and use of the evaluation
- Clarifies the evaluation question(s)
- Identifies the activity, program, or approach to be evaluated
- Provides a brief background on implementation
- Identifies existing performance information sources

26 An Evaluation Aims to Produce...
- Findings: facts and evidence collected during the evaluation
- Conclusions: interpretations and judgments based on the findings
- Recommendations: proposed actions for management based on the conclusions

27 Data analysis and use
What is happening with respect to each individual result?
- Were performance targets met?
- How did performance compare with last year's?
- What is the trend from the baseline year?
- How does the trend compare with expectations?

28 Rules of thumb...
- Program was implemented as planned, and results were achieved as intended: implementation success, strategy success
- Program was implemented as planned, but results were not achieved as intended: implementation success, strategy failure
- Program was not implemented as planned, so it is not clear whether results would have been achieved: implementation failure, strategy uncertainty
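The rules of thumb amount to a small decision table; a minimal Python sketch (naming ours) makes the mapping explicit:

    # Sketch: the slide's rules of thumb as a decision function.
    def rule_of_thumb(implemented_as_planned, results_achieved=None):
        if implemented_as_planned:
            if results_achieved:
                return "implementation success, strategy success"
            return "implementation success, strategy failure"
        # Without faithful implementation, the strategy goes untested.
        return "implementation failure, strategy uncertainty"

    print(rule_of_thumb(True, True))    # strategy success
    print(rule_of_thumb(True, False))   # strategy failure
    print(rule_of_thumb(False))         # strategy uncertainty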

29 Implementation problems?
- Are the activities on track, and are outputs being produced on time?
- Are outputs being produced in sufficient quantity?
- Is the quality of the activities and outputs adequate?
- Are our partners achieving the critical results and outputs for which they are responsible?

30 Data analysis and use
What are we going to do about it? Do we change the...
- Program?
- Results framework?
- Activities?
- Indicators?
- Targets?

31 Integrated Managing for Results
Task 7 (1 hour): Performance Information and Use
1. Review the SOT analysis of performance information for Bangladesh.
2. Consider these questions: Do you trust the data? Does the development hypothesis work?
3. Record on a flipchart what decisions you might make, including decisions to evaluate questions that arise.