Detailed Implementation and Management Planning (DIMP) Workshop Kampala, Uganda 8-11 December 2009.


OBJECTIVES:
- To understand how the Grant PMP supports Performance Management
- To practice using the 6 key questions to build the development hypothesis of each Grant Agreement
- To examine what makes good indicators, and apply that to Grant indicators
- To apply a practical approach to target setting, given the work plan of activities, to Grant indicators
- To complete selected sections of a Performance Indicator Reference Sheet (PIRS) using the instructions

What Connects Activities to an Objective?
- What is your Grant Objective?
- List 3-4 activities that you will do to achieve it.
- When will your Grant Objective be achieved? 1st year? 2nd year? 3rd year?
- What connects the activities to the Grant Objective?
- If dominoes standing on end serve as an analogy, and the initial pushes are activities, why does the last domino (the objective) fall?

USAID Response
- Impact on the ultimate customer, or the customer's environment: 1-20 years
- Efforts do not always succeed, but they produce learning from experience
- Produce data on the road of causality traveled to benefiting customers: new knowledge and impact

Let's Call It Performance Management
- Performance Management (Def. p 67)
- USAID Performance Report (Def. p 68)
Pre-selected indicators must report data against targets. The data are produced by implementation of a PMP based on a Development Hypothesis expressed by a Results Framework with corresponding indicators.

What Is a Development Hypothesis? (Def. p 61)
DEVELOPMENT HYPOTHESIS = A theory about how a specific development result will be achieved. It is a proposed model of reality around the desired development result. It expresses the causal linkages among contributing problems which, if solved, would cause the specific development result to be achieved.
Let's read the development hypothesis of the PMP Case.

What Is a Results Framework?
- Graphic representation of a strategy for achieving a specific objective (Development Hypothesis)
  – Includes the objective, necessary intermediate results, and any critical assumptions that must hold
  – Conveys the implicit development hypothesis (cause-and-effect linkages)
  – Used as a planning, communications, and management tool

Example:
Objective: Water resources management in the watershed basins of Santa/Piura Chira improved
  Indicators: 1. Number of hectares (ha) classified as "restored"; 2. Number of ha classified as "conserved"; 3. Number of water resource management investments validated both scientifically and politically
IR1: Models for restoration/conservation validated
  Indicators: 1. Number of models replicated; 2. Number of replications
IR2: Water management decision making restructured
  Indicators: 1. Cases of highland/lowland decision-making processes functioning; 2. Shared planning tools support decision making; 3. Scientific input reviewed
IR3: Investment in improved water resources management increased
  Indicator: 1. Dollar value of qualified projects
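
Read top-down, the example is simply a tree: an objective with its indicators, supported by intermediate results with their own indicators. A minimal sketch of that structure in Python; the class and field names are illustrative, not part of any USAID tool:

```python
# A minimal sketch of the example Results Framework as a data structure.
# Class and field names are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Result:
    statement: str                      # the result, stated as a problem solved
    indicators: list[str]               # performance indicators for this result
    intermediate_results: list["Result"] = field(default_factory=list)

framework = Result(
    statement="Water resources management in the watershed basins of Santa/Piura Chira improved",
    indicators=[
        'Number of hectares classified as "restored"',
        'Number of hectares classified as "conserved"',
        "Number of water resource management investments validated scientifically and politically",
    ],
    intermediate_results=[
        Result("Models for restoration/conservation validated",
               ["Number of models replicated", "Number of replications"]),
        Result("Water management decision making restructured",
               ["Cases of highland/lowland decision-making processes functioning",
                "Shared planning tools support decision making",
                "Scientific input reviewed"]),
        Result("Investment in improved water resources management increased",
               ["Dollar value of qualified projects"]),
    ],
)
```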

Characteristics of Results Framework Logic
[Diagram: an Assistance Objective at the top; Intermediate Results 1 and 2 (plus "What else?") support it through IF-THEN logic; Activities A-E feed the Intermediate Results. Reading down the chain asks "How?"; reading up asks "Why?". Context, assumptions, and risks surround the framework.]

The Results Framework and the Development Hypothesis
- Let's look at the Results Framework in the PMP Case.
- What is the difference between the Results Framework and the Development Hypothesis?

What Is a Result?
A RESULT = a problem solved
(Def. p 69, Result; Def. p 60, Customer)

Characteristics of Good Result Statements
- Clearly expresses the intended Result
- Measurable
- Uni-dimensional
- Realistic

Types of Results: USAID Definitions
Assistance Objective (Def. p 59, Strategic Objective – same)
- The most ambitious result that a USAID Operating Unit, along with its partners, can materially affect, and for which it is willing to be held accountable.
- AOs can be designed for an Operating Unit to provide analytic, technical, logistical, or other types of support to the AOs of other Operating Units (whether bilateral, multilateral, or global in nature).
Intermediate Result
- An important result that is seen as an essential step to achieving a Strategic Objective.
- IRs are measurable results that may capture a number of discrete and more specific results.
- IRs may also help to achieve other IRs.

If a Result Is a Problem Solved: What Is a Problem?
A discrepancy in somebody's head between a Perception (based on current data) and an Ideal (based on ideal data), floating on an Emotion, supported by Values.
[Diagram: P(cd) / I(id), over E, over V, resting on Reality and Data.]

Task: Turning the Grant Objective into a Result Statement
- Key Question #1: Whose head(s) contain(s) the problem?
- Key Question #2: How would you state your Grant Objective as if the problem were already solved?
- State your Objective as if the problem were already solved.
- Write it in a box at the top of a flip chart as the beginning of a Results Framework.
- Leave room for one or more indicators in the same box.

What Is an Indicator? (Def. p 67, Performance Indicator)
An indicator is a phenomenon we observe that tells us there is a problem and whether it is being solved.

Task: Creating Indicators for the Grant Objective
- Key Question #3: How do you know?
  – Ask this question until you get to the data that generated the problem in your heads.
- Key Question #4: What phenomenon are you observing that generated the data?
  – Ask this question to define the indicator(s).
- Use these questions to create one or more indicators for your Grant Objective and write them in the box.

Context Indicators
- The reality surrounding a development problem is usually greater than USAID's manageable interest.
- Example: jobs for victims of trafficking or domestic violence are more difficult to find when unemployment is increasing.
- Tracking unemployment as a context indicator informs the analysis of USAID efforts to reintegrate victims into society.

What Is a Critical Assumption?
- A general condition under which the development hypothesis, or strategy for achieving the objective, will hold true. Critical assumptions are outside the control or influence of USAID and its partners.
- May exist from one level of the causal chain of a results framework to the next.
- May apply to the entire strategy.
Enough definitions!

Building a Results Framework
1. Whose head(s) contain(s) the problem? = Focus
2. How would the problem be stated if solved? = Result
3. How do you know it is a problem? = Data
4. What phenomenon are we measuring? = Indicator
5. What are the contributing problems which, if solved, would produce the above result? (Apply questions 2-4 to those problems to build a Results Framework.)
6. What are the relevant context indicators and critical assumptions?

Task: Creating a Results Framework
1. Key Question #5: What are the contributing problems which, if solved, would be necessary and sufficient to cause the Grant Objective to be achieved? For each one:
   – Question #2: How would you state it as if the problem were already solved? (Result Statement)
   – Question #3: How do you know? (Data)
   – Question #4: What phenomenon are we measuring? (Indicator)
2. Create Intermediate Results boxes with indicators.
3. Use the necessary and sufficient rule at each level. Add any relevant context indicators and critical assumptions.

Selecting Performance Indicators
OBJECTIVES:
1. Understand the indicator job description
2. Understand the criteria for selecting indicators
3. Understand indicator types and levels
4. Apply that understanding to Grant indicators

Indicator Job Description and Criteria
Job description: to tell the most truth about whether a result is being achieved, at the least cost, when you need to hear it.
USAID's characteristics of good performance indicators (ADS, p 16-17):
- Objective
- Practical
- Useful for management
- Direct
- Attributable to USAID efforts
- Timely
- Adequate

Objective?
- Unambiguous and operationally precise about
  – what is being measured
  – what data are being collected
- Uni-dimensional: measures only one thing
- Consistent over time

Result: Performance of CSOs active in local governance improved
Imprecise indicator: Number of successful democracy-building CSOs
Precise indicator: Number of CSOs that achieve at least 1 measurable objective in increased government transparency

Practical?
- Data are available when required for decision making.
- Data can be obtained at reasonable cost.

Useful?
- Is the indicator useful for management?
- Which indicators are most meaningful at a given point in time?

Direct?
- The indicator closely tracks the result it is intended to measure.

Result: Citizens' knowledge of their rights expanded
Direct indicator: % of survey respondents able to identify 3 or more key civil rights
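
To see how directly that indicator measures the result, here is a minimal sketch of computing it from survey responses; the data and the threshold of 3 rights are made-up assumptions:

```python
# Hypothetical survey records: number of key civil rights each respondent identified.
rights_identified = [0, 2, 5, 3, 1, 4, 3, 0, 2, 3]

# Direct indicator: % of respondents able to identify 3 or more key civil rights.
knowledgeable = sum(1 for n in rights_identified if n >= 3)
indicator_value = 100 * knowledgeable / len(rights_identified)
print(f"{indicator_value:.0f}% of respondents identified 3+ key civil rights")  # 50%
```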

Proxy Indicators
- Indirect measures of the intended result.
- Use when data for a direct indicator are unavailable or not feasible to collect.

Result: Citizens' knowledge of their rights expanded
Direct indicator: % of survey respondents able to identify 3 or more key civil rights
Proxy indicator: Number of civil rights cases brought to court by targeted community organizations

Attributable?
- Indicators selected for the PMP should measure changes that are clearly and reasonably attributable, at least in part, to USAID effort.
- Attribution exists when the links between USAID outputs and the results being measured are clear and significant.
- A simple way to assess attribution: if there had been no USAID activity, would the measured change have been different?

Timely?
- Indicators should be available when they are needed for decision making.
  – Are the data available frequently enough?
  – Are the data current?

Adequate?
- Taken as a group, the indicator and its companion indicators should be the minimum necessary to capture progress towards the desired result.
- How many indicators? As many as are necessary and cost-effective for management purposes.
  – Too many indicators result in information overload
  – Too few indicators could be misleading
  – Use the "necessary and sufficient" rule to select the minimum number per result
RULE OF THUMB: 2-3 PER RESULT

Indicators: Quantitative, Qualitative, or Both?
QUALITATIVE: Expert opinion on the comprehensiveness of a law
QUANTITATIVE: Dollar value of plantains exported
BOTH: Country score on the "Corruption Perceptions Index"

Quantifying Qualitative Indicators (ADS p 14)
- Milestone scales
- Rating scales
- Indexes
Examples?
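
As one illustration of a milestone scale, the sketch below converts a qualitative stage of progress into a reportable number. The draft-law example, its stages, and the scores are invented for illustration, not ADS prescriptions:

```python
# A minimal sketch of a milestone scale for a qualitative indicator
# (e.g., progress of a draft law). Stages and scores are illustrative.
MILESTONES = {
    1: "Law drafted",
    2: "Law reviewed by expert panel",
    3: "Law submitted to parliament",
    4: "Law passed",
    5: "Implementing regulations issued",
}

def milestone_score(completed_stage: str) -> int:
    """Convert the highest completed milestone into a reportable number."""
    for score, stage in MILESTONES.items():
        if stage == completed_stage:
            return score
    raise ValueError(f"Unknown milestone: {completed_stage}")

print(milestone_score("Law submitted to parliament"))  # 3
```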

Example of Indicators at Different Levels
- Input/Activity: Provide vaccines to clinics
- Output: No. of children vaccinated against diphtheria (D); no. of children vaccinated against pertussis (P); no. of children vaccinated against tetanus (T)
- Outcome: Reduced incidence of diphtheria, pertussis, and tetanus in young children
- Impact: Decline in the under-five mortality rate

Disaggregation and Gender
ADS (p 17-18): Performance management systems and evaluations at the AO and project or activity levels must include gender-sensitive indicators and sex-disaggregated data when the technical analysis supporting the AO, project, or activity to be undertaken demonstrates that:
- the activities or their anticipated results involve or affect women and men differently; and
- if so, this difference would be an important factor in managing for sustainable program impact.
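
Where sex-disaggregated data are indicated, the reporting step itself is simple. A minimal sketch with hypothetical trainee records for a count indicator:

```python
# A minimal sketch of sex-disaggregated reporting for a simple count
# indicator ("number of people trained"). Records are hypothetical.
from collections import Counter

trainees = [
    {"name": "A", "sex": "F"}, {"name": "B", "sex": "M"},
    {"name": "C", "sex": "F"}, {"name": "D", "sex": "F"},
]

by_sex = Counter(t["sex"] for t in trainees)
print(f"Trained: {sum(by_sex.values())} total "
      f"({by_sex['F']} female, {by_sex['M']} male)")  # 4 total (3 female, 1 male)
```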

TASK: SELECTING INDICATORS (mark "X" where a criterion rejects the indicator)
Result: Citizens' awareness increased
Rate each candidate indicator against the columns: Direct, Objective, Useful, Practical, Attributable, Timely, Adequate.
1. No. of people trained
2. Ratio of people trained/target population
3. % of people trained to target population
4. Ratio of cumulative no. of people trained to target population
5. No. of courses delivered
6. Average pre- and post-test scores of citizen awareness
7. Random focus group pre- and post-test scores of citizen awareness
8. Key informant interviews
9. No. 4 and No. 7

TASK: Selecting Performance Indicators
- Use the table: put your indicators on the left, with the 7 standards across the top.
- Do your indicators meet all the criteria? Fill in the boxes with Yes or No.
- Did the exercise cause you to change anything?
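
The same screening table can be kept as a simple data structure. In the sketch below the Yes/No entries are illustrative judgments, not the workshop's answer key:

```python
# A minimal sketch of the indicator selection matrix: each candidate
# indicator is screened against the seven USAID criteria.
CRITERIA = ["Direct", "Objective", "Useful", "Practical",
            "Attributable", "Timely", "Adequate"]

screening = {
    "No. of people trained":
        {"Direct": False, "Objective": True, "Useful": True,
         "Practical": True, "Attributable": True, "Timely": True,
         "Adequate": False},
    "Random focus group pre- and post-test scores of citizen awareness":
        {"Direct": True, "Objective": True, "Useful": True,
         "Practical": True, "Attributable": True, "Timely": True,
         "Adequate": True},
}

for indicator, marks in screening.items():
    verdict = "keep" if all(marks[c] for c in CRITERIA) else "reject"
    print(f"{indicator}: {verdict}")
```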

Baselines and Targets
OBJECTIVES:
- Understand their role in Performance Management
- Review the ADS definitions
- Understand how to set baselines and targets
- Explore the Matrix approach to target setting

Role in Performance Management
How will you know whether your Development Hypothesis works?

Performance Baseline
The value of the performance indicator at the beginning of the planning period.
Baselines can/should be:
- Set just prior to the implementation of USAID-supported activities that contribute to the achievement of the relevant Result
- Measured using the same data collection method that the NGO will use to assess progress
- Changed if the data collection method changes (document the change!)

Performance Target
Commitments made by the NGO about the level and timing of results to be achieved in a specified time period.
Targets:
- Can be expressed in quantity, quality, or efficiency
- May be determined by setting the final target first, then interim targets
- May need to be set after activities or sites are selected
- Can be adjusted over time
- Should be realistic!
- Should be outside the margin of error of the historical trend
"If you don't know where you're going, you'll end up somewhere else." – Yogi Berra

How Do We Set Targets?
When setting targets, review:
- Baseline (condition before intervention)
- Historical trends
- Expert judgments
- Research findings
- Achievements of similar programs elsewhere
- Stakeholder expectations
- Objectives and Results Frameworks
- Prospective budgets
- WORKPLAN ACTIVITIES: Review the Case Example Matrix and Summary Data Table
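
One simple way to operationalize "set the final target first, then interim targets" is linear interpolation from the baseline. This is only a sketch: real interim targets should follow the timing of work plan activities, and all numbers here are made up:

```python
# A minimal sketch of interim target setting: fix the final target first,
# then interpolate annual targets from the baseline. Numbers are illustrative.
baseline, final_target = 200, 800   # e.g., people trained per year
years = [2010, 2011, 2012]

step = (final_target - baseline) / len(years)
targets = {year: round(baseline + step * (i + 1))
           for i, year in enumerate(years)}
print(targets)  # {2010: 400, 2011: 600, 2012: 800}
```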

TASK: The Indicator/Work Plan Matrix Approach to Setting Targets
- Select 2 Intermediate Results and list their indicators.
- For the activities in your Work Plan or Grant Proposal that impact each of the indicators, draw the timeline for when they occur.
- What target would you set for 2010? 2011? 2012?
- Set targets for 3 years and fill in the Summary Data Table.

Performance Indicator Reference Sheets (PIRS)
OBJECTIVES:
- Review the relationship between the ADS PMP requirements and the PIRS
- Understand the PIRS form and instructions
- Complete selected sections of a PIRS using the instructions

ADS Contents of a Complete PMP (p. 11)
- Full set of performance indicators (1 per result in the RF)
- Baseline and targeted values (disaggregated by sex if indicated)
- Source and method for data collection, schedule, data limitations, DQA procedures, estimated cost, evaluation
- Calendar of Performance Management Tasks
How much of the above is in the PIRS? Look at the PIRS and Instructions handouts.

Indicator Job Description
- Tell the most truth about the related result, when we need to hear it, at the least possible cost.
- Let's look at the headings of an Indicator Resume (PIRS) to see what is included. (See the PMP Case PIRS example.)
- Can it do the job?

Indicator Resume Sections
- DESCRIPTION (What is it made of?)
- PLAN FOR DATA ACQUISITION (How are data born, and who are the midwives?)
- DATA QUALITY ISSUES (Does it have any defects?)
- PLAN FOR DATA ANALYSIS, REVIEW, & REPORTING (What is its job in life? How does it do it?)
- BASELINE, TARGETS & ACTUALS (Expected/actual performance on the job?)
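
Seen as data, a PIRS is a structured record. A minimal sketch with field names taken from the section headings above; the real sheet has more fields, and all values here are invented:

```python
# A minimal sketch of a PIRS as a structured record. Field names follow
# the section headings above; values are hypothetical.
from dataclasses import dataclass

@dataclass
class PIRS:
    # DESCRIPTION
    indicator: str
    precise_definition: str
    unit_of_measure: str
    disaggregated_by: str
    justification: str
    # PLAN FOR DATA ACQUISITION
    data_collection_method: str
    # BASELINE, TARGETS & ACTUALS
    baseline: float
    targets: dict[int, float]

sheet = PIRS(
    indicator="Number of people trained",
    precise_definition="Count of individuals completing all course modules",
    unit_of_measure="People",
    disaggregated_by="Sex",
    justification="Tracks the main training output of the Grant",
    data_collection_method="Attendance registers, verified quarterly",
    baseline=0,
    targets={2010: 400, 2011: 600, 2012: 800},
)
```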

TASK: Complete Sections of a PIRS
1. Select two indicators from your Grant Results Framework.
2. Review the PIRS instructions and apply them to complete the following sections on a flip chart or computer form:
   – Indicator
   – Precise Definition
   – Unit of Measure
   – Disaggregated by
   – Justification & Management Utility
   – Data Collection Method
3. Test: give the sheet to another NGO and ask whether they could use it to gather good data with no further explanation or information.


Putting It All Together in a PMP
- Review the Case PMP.
- Review which sections participants have and have not produced.

Summary Thoughts
- Reality doesn't have any problems; we do.
- Reality doesn't lie; we do.
- "The Lord God [reality] is very complicated; but not downright mean." – Albert Einstein
- "No amount of data can prove me right; any amount can prove me wrong." – Albert Einstein
- Blessed are those who know what they are doing, for they shall know whether they have done it.