1 Integrated Managing for Results Managing for Results: USAID Nigeria & IPs Day 2: PMP Performance Management Plan.

Presentation transcript:

1 Integrated Managing for Results Managing for Results: USAID Nigeria & IPs Day 2: PMP Performance Management Plan

2 Managing for Results
- Results: the results that USAID wants to achieve
- Programs & activities: USAID implements programs & activities
- Information: information to assess & improve performance
Context: GPRA, Core Values & USG priorities; Joint State & USAID Strategic Plan

3 Integrated Managing for Results: What is a Performance Monitoring Plan?
A PMP is:
- A tool for management
- A living document
- A constant desk reference
- A tool for organizational learning
- A tool to tell your story better
A PMP should not be:
- Done solely to satisfy USAID/Washington
- Something to satisfy Congress
- Done just to "be a team player"
- Filed away to gather dust

4 Integrated Managing for Results: Elements of a PMP (ADS, p. 10)
MANDATORY:
- A PMP must define at least one indicator to measure progress toward the SO and at least one indicator to measure progress toward each IR.
- Each of those indicators must include baseline levels and targets.
A complete PMP should include:
- A calendar of performance management tasks
- Performance indicators
- Baseline and target values for each indicator
- Data source and method of data collection
- Schedule for data collection
- Description of known data limitations
- Description of data quality assessment procedures
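Below is a minimal sketch, in Python, of how one PMP indicator row covering these elements might be recorded; the field names and example values are illustrative assumptions, not an official ADS schema.

```python
from dataclasses import dataclass

@dataclass
class IndicatorPlan:
    """One PMP row: an indicator with its baseline, target, and data plan.
    Field names are illustrative, not an official ADS format."""
    result: str                 # SO or IR the indicator measures
    indicator: str
    baseline: float
    target: float
    data_source: str
    collection_method: str
    collection_schedule: str    # e.g. "annually, end of fiscal year"
    known_limitations: str = ""
    dqa_procedure: str = ""

# Hypothetical example entry
example = IndicatorPlan(
    result="IR 1: Farmers' access to commercial capital increased",
    indicator="% of target farmers receiving a commercial loan",
    baseline=12.0,
    target=35.0,
    data_source="Partner loan records",
    collection_method="Annual review of partner and bank records",
    collection_schedule="Annually, end of fiscal year",
    known_limitations="Covers participating banks only",
    dqa_procedure="Periodic data quality assessment per ADS guidance",
)
```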

5 Integrated Managing for Results
Task 3: ADS criteria for a PMP (Page 10)
ADS criteria that "should" be met               Middleland (Yes/No)   Nigeria Mission (Yes/No)
A. Calendar of performance management tasks     X                     X
B. Performance indicators
C. Baseline and target values
D. Data source and method of data collection
E. Schedule for data collection
F. Known data limitations
G. Quality assessment procedures

6 Strategic Planning and Reporting Process
Issues:
- Need for more effective cross-Agency planning processes
- Agency overseers require clear, aggregated USAID results reporting
- How do we consult effectively with stakeholders within this new model?
Deliverables:
- Bureaus communicate their priorities in light of the Agency Planning Framework (Joint State/USAID Strategic Plan; Regional Strategic Frameworks and Technical Strategies)
- The OU articulates a Strategy Statement (3 to 10+ years), citing SOs and related development hypotheses
- The OU develops and annually updates the Strategy Statement via the Operational Plan (3-year horizon), as part of the Annual Report and budget submission
- The OU reports on Agency results (via common indicators) and program results (via SO indicators)

7 Strategic Planning

8 Technical Analyses
- The newly revised ADS 201 will shift the focus of technical analyses to activity planning
- Strategy Statements must address "statutorily required analysis"
- Bureau Frameworks will help provide analytical support for Operating Unit Strategic Objectives

9 Environmental Analyses
- Biological diversity: conservation of biological diversity
- Tropical forests: actions necessary to achieve conservation and sustainable management of tropical forests

10 Gender Analysis
Should ask:
- How will gender roles/relations affect the achievement of sustainable results?
- How will proposed results affect the relative status of men and women?

11 Additional technical analyses for Strategy Statements and SOs
- Donor coordination analysis
- Humanitarian relief/food aid assessments
- Macro-economic analysis
- Sector assessments
- Social soundness analysis: determines the compatibility of an SO or activity with the socio-cultural environment, and the anticipated impact on different groups

12 The Development Hypothesis
The results hierarchy, read upward by asking "Why?" and downward by asking "How? What else? Assuming what?":
- Joint SOs and Strategic Goals
- Performance Goal
- Operating Unit Strategic Objective
- Intermediate Results
- Outputs
- Activities

13 An illustrative Results Framework
Strategic Goal: Economic prosperity and security
Performance Goal: Enhanced food security and agricultural development
Operating Unit Strategic Objective: Increased production by farmers in the Upper River Zone (6 years)
- IR 1: Farmers' access to commercial capital increased (5 years)
  - IR 1.1: Farmers' capacity to develop bank loan applications increased (4 years)
  - IR 1.2: Banks' loan policies become more favorable for the rural sector (3 years)
- IR 2: Farmers' transport costs decreased (5 years)
  - IR 2.1: Village associations' capacity to negotiate contracts increased (4 years)
  - IR 2.2: Additional local wholesale market facilities constructed (5 years), with the World Bank
- IR 3: Farmers' knowledge about production options increased (4 years)
  - IR 3.1: New technologies available (4 years), World Bank
  - IR 3.2: Farmers' exposure to on-farm experiences of peers increased (3 years)
Key: USAID responsible; USAID + partner(s) responsible; partner(s) responsible
Critical assumptions:
1. Market prices for farmers' products remain stable or increase
2. Prices of agricultural inputs remain stable or decrease
3. Roads needed to get produce to market are maintained
4. Rainfall and other critical weather conditions remain stable
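As an illustration of the nesting above, a results framework can also be captured as structured data for later use in a PMP or reporting tool. The partial sketch below is an assumed representation; the keys, the responsibility labels, and the subset of IRs shown are placeholders, not drawn from the slide's arrows.

```python
# Illustrative nested representation of part of the results framework above.
# Keys, responsibility labels, and structure are assumptions for illustration.
results_framework = {
    "strategic_objective": "Increased production by farmers in the Upper River Zone (6 years)",
    "intermediate_results": [
        {
            "name": "IR 1: Farmers' access to commercial capital increased (5 years)",
            "responsible": "USAID",  # placeholder label
            "sub_irs": [
                "IR 1.1: Farmers' capacity to develop bank loan applications increased (4 years)",
                "IR 1.2: Banks' loan policies become more favorable for the rural sector (3 years)",
            ],
        },
        {
            "name": "IR 2: Farmers' transport costs decreased (5 years)",
            "responsible": "USAID + partner(s)",  # placeholder label
            "sub_irs": [
                "IR 2.1: Village associations' capacity to negotiate contracts increased (4 years)",
                "IR 2.2: Additional local wholesale market facilities constructed (5 years)",
            ],
        },
    ],
    "critical_assumptions": [
        "Market prices for farmers' products remain stable or increase",
        "Prices of agricultural inputs remain stable or decrease",
    ],
}
```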

14 Results framework "do's"
Start at the top, with an SO that:
- Is within USAID's "manageable interest"
- Can be achieved within the planning timeframe
- Has a defined geographical focus
Make sure the SO and all IRs are stated as results, not processes, and are clear, precise, uni-level, unidimensional and measurable.
Make sure that the linkages between the SO and IRs are causal; use arrows to demonstrate this.

15 Results framework "do's"
- Include only IRs and sub-IRs that are necessary to achieve the SO
- Include all IRs that together are sufficient to achieve the SO
- Distinguish USAID-only, partner-only, and joint USAID-and-partner IRs

16 Results framework "do's"
Identify and assess the critical assumptions governing the causal linkages:
- What external conditions, over which USAID has no control, are necessary for success?
- What are the external risks inherent in the development hypothesis?

17 Intermediate Result: More transparent local governments
Output: Improved local government financial management skills (among other outputs)

18 Activity…
A set of actions through which inputs, such as commodities, technical assistance, training, or resource transfers, are mobilized to produce specific outputs

19 Project…
- A structured undertaking, often involving considerable money, personnel, and equipment
- Of limited duration
- Developed through various administrative, analytical, and approval processes in order to achieve a tangible objective
- Such as a school construction project or an adult literacy project

20 How might we organize the following outputs into a "causal hierarchy"? (The IR is "Improved M&E skills/knowledge among USAID personnel & IPs.")
3. Training Venue Identified & Scheduled
2. Information on Skills & Knowledge Needed by Mission Personnel and IPs Identified
4. Complete Set of Training Materials & Instructional Methods in Place
1. Training Course Objectives & Design Developed
5. Trained Trainers in Place
7. Commitments From Participants to Attend Workshops Received
6. Managing For Results Workshop Delivered

21 The outputs arranged into a causal hierarchy (top of the hierarchy first):
6. Managing For Results Workshop Delivered
3. Training Venue Identified & Scheduled
2. Information on Skills & Knowledge Needed by Mission Personnel and IPs Identified
4. Complete Set of Training Materials & Instructional Methods in Place
1. Training Course Objectives & Design Developed
7. Commitments From Participants to Attend Workshops Received
5. Trained Trainers in Place
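The same exercise can be expressed as a dependency graph and ordered automatically. In the sketch below, the causal links among the numbered outputs are assumptions for illustration only, since the slide's arrows are not captured in this transcript; a topological sort then recovers a workable bottom-to-top ordering.

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Assumed causal links (output -> the outputs it depends on); illustrative only.
depends_on = {
    "1 Training course objectives & design developed": {
        "2 Information on skills & knowledge needed identified"},
    "4 Complete set of training materials & methods in place": {
        "1 Training course objectives & design developed"},
    "5 Trained trainers in place": {
        "4 Complete set of training materials & methods in place"},
    "6 Managing for Results workshop delivered": {
        "3 Training venue identified & scheduled",
        "5 Trained trainers in place",
        "7 Commitments from participants received"},
}

# Print the outputs in an order that respects the assumed causal links.
for output in TopologicalSorter(depends_on).static_order():
    print(output)
```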

22 Performance Standards for Outputs Are…
- Targeted indicators that tell ourselves and potential implementing partners our minimum requirements or expectations in terms of the QUANTITY, QUALITY and TIMELINESS of the outputs called for in our SOW or Program Description
- Key tools for post-award monitoring and assessment of our project's progress and contribution to program results
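A minimal sketch of how quantity, quality, and timeliness standards for a single output could be recorded side by side; the example output and values are hypothetical, not from an actual SOW.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class OutputStandard:
    """Performance standard for one output: quantity, quality, timeliness.
    The example values below are hypothetical."""
    output: str
    quantity: str
    quality: str
    due_date: date

training_output = OutputStandard(
    output="Managing for Results workshops delivered",
    quantity="At least 4 workshops, 25 participants each",
    quality="Post-workshop skills test average score of 80% or higher",
    due_date=date(2026, 9, 30),
)
```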

23 "The cause-and-effect linkage between outputs and intermediate results is exactly the thing that drafters of a statement of work critically need in order to say 'what' a contractor will do, rather than 'how' it will be done…. This makes the work of drafting a statement of work so very much more simple. It also makes the work of implementing the A&A instruments much simpler because we don't have to keep modifying contracts to say what it is we want. It helps to eliminate the 'managing by contract modification,' which occurs when we don't know what we want or how to say it."
-- Mary Reynolds (one of our favorite Contracting Officers of all time)

24 Performance standards at the project level are much like performance indicators at the program level…
- Direct
- Objective
- Useful for management
- Timely
- Practical
- Adequate

25 Should your project include higher-level program performance data/information as additional "outputs"?

26 Integrated Managing for Results: Building a Results Framework (RF)
A problem is a discrepancy in somebody's head between a perception (based on data) and an ideal (based on ideal data), floating on an emotion and supported by values.
Key questions to build an RF:
1. Whose head(s) is it in? -> Where to start
2. How do you know? -> Data supporting the perception
3. How bad is it? What measure? -> Baseline
4. What measure counts as "solved"? -> Ideal data and target
5. What is the phenomenon? -> Indicator
6. What is the "problem solved"? -> SO result
7. What are the contributing problems? -> IR results

27 Integrated Managing for Results: Task 4 (1 hour): Build a Results Framework
1. One person volunteers a real problem that is really in their head.
2. The group asks that person, "How do you know?" until the data are identified.
3. The group asks that person for a measure of "how bad it is" until a baseline is identified.
4. The group asks that person for a measure of that same data indicating the "problem is solved" until a target is identified.
5. The group asks the person to identify the phenomenon the person is observing that relates to the baseline and target, for an indicator statement.
6. The group states the original problem as if it were solved, for a results statement.
7. Write the result as an SO in a box at the top of the flip chart, along with the indicator, baseline and target.
8. The group creates Intermediate Results by asking, "Why hasn't this problem been solved already? What are the contributing problems?"
9. Start again at step 2 with each IR-level problem, with the person who suggested it.

28 Integrated Managing for Results: Levels of Indicators
- Agency level: decision making; external reporting (GPRA, special reports, e.g., HIV/AIDS)
- Mission level: decision making; internal management; reporting to Washington
- Activity level: decision making; management of activities; reporting to the Mission
Indicators used for management at any level should be maintained in a PMP format.

29 A performance indicator is...
An observable or measurable characteristic that shows, or "indicates," the extent to which an intended result is being achieved (problem being solved).
A performance indicator answers the question, "How will we know achievement when we see it?"

30 A few examples...
Intended result: Increased sustainability of NGOs
Performance indicator: Percentage of annual revenues from non-donor sources
Intended result: Improved cleanliness of city streets
Performance indicator: Degree of difference in appearance between baseline and subsequent photos of randomly selected city streets
Intended result: Increased accountability of government agencies
Performance indicator: Percentage of government agencies conducting and publicizing standard annual financial audits
Intended result: Increased responsiveness of local government
Performance indicator: Average ratings of local government responsiveness on an annual citizens' survey

31 Integrated Managing for Results: STEP 1: Develop a List of Potential Indicators
The most important source of indicators: your portfolio of activities.
- What problems do they solve?
- How do you know they are a problem?

32 Integrated Managing for Results: STEP 2: Assess Potential Indicators
Use USAID's criteria for performance indicators (ADS):
1. Direct
2. Objective
3. Useful for management
4. Practical
5. Attributable to USAID efforts
6. Timely
7. Adequate

33 Integrated Managing for Results: Direct?
The indicator closely tracks the result it is intended to measure.
Result: Citizens' knowledge of their rights expanded
Direct indicator: % of survey respondents able to identify 3 or more key civil rights
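As a worked example of how such a direct indicator would be computed, the sketch below counts qualifying survey responses; the response data are invented.

```python
# Hypothetical survey records: number of key civil rights each respondent
# could identify.
rights_identified = [0, 2, 3, 5, 1, 4, 3, 0, 2, 3]

# Indicator: % of survey respondents able to identify 3 or more key civil rights
numerator = sum(1 for n in rights_identified if n >= 3)
indicator_value = 100 * numerator / len(rights_identified)
print(f"{indicator_value:.1f}% of respondents identified 3 or more key civil rights")
```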

34 Integrated Managing for Results: Proxy Indicators
- Indirect measures of the intended result.
- Use when data for a direct indicator are unavailable or not feasible to collect.
Result: Citizens' knowledge of their rights expanded
Direct indicator: % of survey respondents able to identify 3 or more key civil rights
Proxy indicator: Number of civil rights cases brought to court by targeted community organizations

35 Integrated Managing for Results: Objective?
Objective indicators are:
- Unambiguous and operationally precise about what is being measured and what data are being collected
- Uni-dimensional
- Consistent over time
Result: Performance of export firms improved
Non-objective indicator: Number of successful export firms
Objective indicator: % of export firms experiencing an annual increase in revenues of at least 5%

36 Integrated Managing for Results: Useful?
- Is the indicator useful for management?
- Which indicators are most meaningful at a given point in time, in the given context?

37 Integrated Managing for Results: Practical?
An indicator is practical if:
- Data are available when required for decision making
- Data can be obtained at reasonable cost

38 Integrated Managing for Results: Attributable?
- Indicators selected for the PMP should measure changes that are clearly and reasonably attributable, at least in part, to USAID effort.
- Attribution exists when the links between USAID outputs and the results being measured are clear and significant.
- A simple way to assess attribution: if there had been no USAID activity, would the measured change have been different?

39 Integrated Managing for Results: Timely?
Indicators should be available when they are needed for decision making.
- Are the data available frequently enough?
- Are the data current?

40 Integrated Managing for Results: Adequate?
- Taken as a group, the indicator and its companion indicators should be the minimum necessary to capture progress towards the desired result.
- How many indicators? As many as are necessary and cost-effective for management purposes.
- Too many indicators result in information overload; too few could be misleading.
RULE OF THUMB: 2-3 per result

41 Integrated Managing for Results: Gender & Performance Indicators
MANDATORY: Performance management systems and evaluations at the SO and IR level MUST include gender-sensitive indicators and sex-disaggregated data when supporting technical analyses demonstrate that:
- The activities or their anticipated results involve or affect women and men differently; and
- If so, this difference would be an important factor in managing for sustainable program impact.
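A minimal sketch of producing sex-disaggregated values for a single indicator from record-level data; the records and field names are invented for illustration.

```python
from collections import Counter

# Hypothetical participant records for one indicator:
# farmers receiving a commercial loan, with sex recorded for each person.
loan_recipients = [
    {"name": "A", "sex": "F"},
    {"name": "B", "sex": "M"},
    {"name": "C", "sex": "F"},
    {"name": "D", "sex": "F"},
    {"name": "E", "sex": "M"},
]

# Report the indicator total and its sex-disaggregated breakdown.
by_sex = Counter(person["sex"] for person in loan_recipients)
total = sum(by_sex.values())
print(f"Loan recipients: {total} total (F: {by_sex['F']}, M: {by_sex['M']})")
```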

42 Integrated Managing for Results: Quantitative or Qualitative Indicators?
- USAID guidance: use the most appropriate type for the result being measured.
- The PMP should represent your overall strategy for getting the right mix of the two to effectively track progress.

43 Integrated Managing for Results: Quantifying Qualitative Indicators
- Milestone scales
- Rating scales
- Indexes

44 Integrated Managing for Results: Milestone Scale
- Tracks incremental progress in a series of steps
- Clearly defines each critical step
Example: "Key law is changed." What are some milestones?

45 Integrated Managing for Results: Rating Scale
A measurement device that captures a range of subjective responses on a single issue or dimension of an issue.
Ratings can be done by:
- You or your team, e.g. rating the financial management capacity of USAID-supported NGOs: "On a scale of 0-5, where 0 = no capacity and 5 = strong, sustainable capacity, rate the financial management capacity of NGO A and NGO B."
- Beneficiaries, e.g. rating by women who have received services from a health clinic: "On a scale of 1-5, where 1 is strongly agree and 5 is strongly disagree, please respond to the following: The staff were professional during your visit."
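A minimal sketch of turning beneficiary ratings like these into summary indicator values (an average score and the share agreeing); the responses are invented.

```python
from statistics import mean

# Hypothetical responses to "The staff were professional during your visit"
# on a 1-5 scale (1 = strongly agree, 5 = strongly disagree).
responses = [1, 2, 1, 3, 2, 1, 4, 2, 1, 2]

average_rating = mean(responses)
share_agree = 100 * sum(1 for r in responses if r <= 2) / len(responses)
print(f"Average rating: {average_rating:.2f}")
print(f"{share_agree:.0f}% of respondents agreed or strongly agreed")
```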

46 Integrated Managing for Results: Types of Rating Systems
Expert panels, e.g. rating by an expert panel on progress in fiscal reform: "Rate each element of progress in the fiscal reform agenda on a scale of -5 to +5, where -5 represents serious/significant backward movement, 0 represents no movement, and +5 represents serious/significant forward movement."
Elements rated: fiscal deficit, taxes, social sector spending

47 Integrated Managing for Results: Index
A combination of two or more ratings collapsed into a single measure.
Indicator: score of target CSO on the "CSO Capacity Index"
Components, each rated 1 to 5 (1 is low, 5 is outstanding; minimum 0, maximum 25):
- Organizational capacity
- Financial management
- Networking
- Strategic planning/vision
- Impact (in selected CSO area)
Actual total = 16/25, or 64%
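A small sketch of the index arithmetic; the component ratings below are invented so that the total reproduces the slide's 16/25 (64%) example.

```python
# Hypothetical component ratings (1 = low, 5 = outstanding), chosen so the
# total matches the 16/25 example above.
cso_capacity_index = {
    "Organizational capacity": 3,
    "Financial management": 4,
    "Networking": 3,
    "Strategic planning/vision": 3,
    "Impact (in selected CSO area)": 3,
}

total = sum(cso_capacity_index.values())
max_score = 5 * len(cso_capacity_index)
print(f"CSO Capacity Index: {total}/{max_score} ({100 * total / max_score:.0f}%)")
```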