Detailed Implementation and Management Planning (DIMP) Workshop Kampala, Uganda 8-11 December 2009.


1 Detailed Implementation and Management Planning (DIMP) Workshop Kampala, Uganda 8-11 December 2009

2 OBJECTIVES:
• To understand how the Grant PMP supports Performance Management
• To practice using the 6 key questions to build the development hypotheses of each Grant Agreement
• To examine what makes good indicators and apply that understanding to Grant indicators
• To apply a practical approach to target setting for Grant indicators, given the work plan of activities
• To complete selected sections of a Performance Indicator Reference Sheet (PIRS) using the instructions

3 What Connects Activities to an Objective?
• What is your Grant Objective?
• List 3-4 activities that you will do to achieve it.
• When will your Grant Objective be achieved? The 1st year? The 2nd? The 3rd?
• What connects the activities to the Grant Objective?
• If dominoes standing on end serve as an analogy, and the initial pushes are activities, why does the last domino (the objective) fall?

4 USAID Response
• Impact on the ultimate customer, or the customer's environment, takes 1-20 years
• Efforts do not always succeed, but they produce learning from experience
• Efforts produce data along the road of causality traveled toward benefiting customers: new knowledge and impact

5 Let's Call It Performance Management
• Performance Management (Def. p 67)
• USAID Performance Report (Def. p 68)
The report presents data against targets for pre-selected indicators. Those data are produced by implementing a PMP based on a Development Hypothesis, which is expressed as a Results Framework with corresponding indicators.

6 What Is a Development Hypothesis? (Def. p 61)
DEVELOPMENT HYPOTHESIS = A theory about how a specific development result will be achieved. It is a proposed model of reality around the desired development result. It expresses the causal linkages among contributing problems which, if solved, would cause the specific development result to be achieved.
Let's read the development hypothesis of the PMP Case.

7 What Is a Results Framework?
• Graphic representation of a strategy for achieving a specific objective (Development Hypothesis)
– Includes the objective, necessary intermediate results, and any critical assumptions that must hold
– Conveys the implicit development hypothesis (cause-and-effect linkages)
– Used as a planning, communications, and management tool
Example:
Objective: Water resources management in the watershed basins of Santa/Piura Chira improved
1. Number of hectares (HA) classified as "restored"
2. Number of HA classified as "conserved"
3. Number of water resource management investments validated both scientifically and politically
IR1: Models for restoration/conservation validated
1. Number of models replicated
2. Number of replications
IR2: Water management decision making restructured
1. Cases of highland/lowland decision-making processes functioning
2. Shared planning tools support decision making
3. Scientific input reviewed
IR3: Investment in improved water resources management increased
1. $ value of qualified projects
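For groups that keep their framework in electronic form, the tree above can also be captured as a small data structure. The sketch below is illustrative only, not part of the workshop materials; the class name, fields, and abbreviated indicator text are assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class Result:
    """One box in a Results Framework: a result statement plus its indicators."""
    statement: str
    indicators: list[str]
    children: list["Result"] = field(default_factory=list)  # contributing IRs

# Abbreviated encoding of the watershed example above.
objective = Result(
    statement="Water resources management in the watershed basins improved",
    indicators=["HA classified as 'restored'", "HA classified as 'conserved'",
                "Validated water resource management investments"],
    children=[
        Result("Models for restoration/conservation validated",
               ["Number of models replicated", "Number of replications"]),
        Result("Water management decision making restructured",
               ["Highland/lowland decision-making processes functioning"]),
        Result("Investment in improved water resources management increased",
               ["$ value of qualified projects"]),
    ],
)

def print_framework(result: Result, depth: int = 0) -> None:
    """Walk the framework top-down, printing each result with its indicators."""
    indent = "  " * depth
    print(indent + result.statement)
    for ind in result.indicators:
        print(indent + "  [indicator] " + ind)
    for child in result.children:
        print_framework(child, depth + 1)

print_framework(objective)
```

One payoff of this shape: the "necessary and sufficient" test from later slides becomes a question you ask of each node's children.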

8 Characteristics of Results Framework Logic
[Diagram: an Assistance Objective sits above Intermediate Result 1, Intermediate Result 2, and "What else?"; these sit above Activities A-E. Reading down the chain asks "How?"; reading up asks "Why?". The IF... THEN links between levels prompt: Context? Assumptions? Risks?]

9 The Results Framework and the Development Hypothesis
• Let's look at the Results Framework in the PMP Case.
• What is the difference between the Results Framework and the Development Hypothesis?

10 What Is a Result?
A RESULT = A problem solved
(Def. p 69, Result; Def. p 60, Customer)

11 Characteristics of Good Result Statements
• Clearly expresses the intended Result
• Measurable
• Uni-dimensional
• Realistic

12 Types of Results: USAID Definitions
Assistance Objective (Def. p 59, Strategic Objective – same)
The most ambitious result that a USAID Operating Unit, along with its partners, can materially affect, and for which it is willing to be held accountable. AOs can be designed for an Operating Unit to provide analytic, technical, logistical or other types of support to the AOs of other Operating Units (whether bilateral, multilateral or global in nature).
Intermediate Result
An important result that is seen as an essential step toward achieving a Strategic Objective. IRs are measurable results that may capture a number of discrete and more specific results. IRs may also help to achieve other IRs.

13 If a Result Is a Problem Solved: What Is a Problem?
• A discrepancy in somebody's head between a Perception (based on current data) and an Ideal (based on ideal data), floating on an Emotion supported by Values.
[Diagram notation: P(cd) / I(id) over E over V, grounded in Reality/Data.]

14 Task: Turning the Grant Objective into a Result Statement
• Key Question #1: Whose head(s) contain the problem?
• Key Question #2: How would you state your Grant Objective as if the problem were already solved?
• State your Objective as if the problem were already solved.
• Write it in a box at the top of a flip chart as the beginning of a Results Framework.
• Leave room for one or more indicators in the same box.

15 What Is an Indicator? (Def. p 67, Performance Indicator)
An indicator is a phenomenon we observe that tells us there is a problem and whether it is being solved.

16 Task: Creating Indicators for the Grant Objective
• Key Question #3: How do you know?
– Ask this question until you get to the data that generated the problem in your heads.
• Key Question #4: What phenomenon are you observing that generated the data?
– Ask this question to define the indicator(s).
• Use these questions to create one or more indicators for your Grant Objective and write it/them in the box.

17 Context Indicators
• The reality surrounding a development problem is usually greater than USAID's manageable interest.
• Example: jobs for victims of trafficking or domestic violence are harder to find when unemployment is rising.
• Tracking unemployment as a context indicator informs the analysis of USAID efforts to reinsert victims into society.

18 What Is a Critical Assumption?
• A general condition under which the development hypothesis, or strategy for achieving the objective, will hold true. Critical assumptions are outside the control or influence of USAID and its partners.
• May exist from one level of the causal chain of a results framework to the next.
• May apply to the entire strategy.
• Enough definitions!

19 Building a Results Framework
1. Whose head(s) contain(s) the problem? = Focus
2. How would the problem be stated if solved? = Result
3. How do you know it is a problem? = Data
4. What phenomenon are we measuring? = Indicator
5. What are the contributing problems which, if solved, would produce the above result? (Apply questions 2-4 to those problems to build a Results Framework.)
6. What are the relevant context indicators and critical assumptions?

20 Task: Creating a Results Framework
1. Key Question #5: What are the contributing problems which, if solved, would be necessary and sufficient to cause the Grant Objective to be achieved? For each one:
• Question #2: How would you state it as if the problem were already solved? (Result Statement)
• Question #3: How do you know? (Data)
• Question #4: What phenomenon are we measuring? (Indicator)
2. Create Intermediate Results boxes with indicators.
3. Use the necessary and sufficient rule at each level. Add any relevant context indicators and critical assumptions.

21 Selecting Performance Indicators
OBJECTIVES:
• Understand:
1. The indicator job description
2. The criteria for selecting indicators
3. Indicator types and levels
• Apply that understanding to Grant indicators

22 Indicator Job Description and Criteria
Job description: To tell the most truth about whether a result is being achieved, at the least cost, when you need to hear it.
USAID's characteristics of good performance indicators (ADS 203.3.4.2, p 16-17):
• Objective
• Practical
• Useful for Management
• Direct
• Attributable to USAID efforts
• Timely
• Adequate

23 Objective?
• Unambiguous and operationally precise about
– What is being measured
– What data are being collected
• Uni-dimensional – measures only one thing
• Consistent over time
Result: Performance of CSOs active in local governance improved
Imprecise Indicator: Number of successful democracy-building CSOs
Precise Indicator: Number of CSOs that achieve at least 1 measurable objective in increased government transparency

24 Practical?
• Data are available when required for decision making.
• Data can be obtained at reasonable cost.

25 Useful?
• Is the indicator useful for management?
• Which indicators are most meaningful at a given point in time?

26 Direct?
• The indicator closely tracks the result it is intended to measure.
Result: Citizens' knowledge of their rights expanded
Direct Indicator: % of survey respondents able to identify 3 or more key civil rights

27 Proxy Indicators
• Indirect measures of the intended result.
• Use when data for a direct indicator are unavailable or not feasible to collect.
Result: Citizens' knowledge of their rights expanded
Direct Indicator: % of survey respondents able to identify 3 or more key civil rights
Proxy Indicator: Number of civil rights cases brought to court by targeted community organizations

28 Attributable?
• Indicators selected for the PMP should measure changes that are clearly and reasonably attributable, at least in part, to USAID effort.
• Attribution exists when the links between USAID outputs and the results being measured are clear and significant.
• A simple way to assess attribution: if there had been no USAID activity, would the measured change have been different?

29 Timely?
• Indicators should be available when they are needed for decision making.
– Are the data available frequently enough?
– Are the data current?

30 Adequate?
• Taken as a group, the indicator and its companion indicators should be the minimum necessary to capture progress toward the desired result.
• How many indicators? As many as are necessary and cost effective for management purposes:
– Too many indicators result in information overload
– Too few indicators could be misleading
– Use the "necessary and sufficient" rule to select the minimum number per result
RULE OF THUMB: 2-3 PER RESULT

31 Indicators: Quantitative, Qualitative or Both?
• QUALITATIVE: Expert opinion on the comprehensiveness of a law
• QUANTITATIVE: Dollar value of plantains exported
• BOTH: Country score on the "Corruption Perceptions Index"

32 Quantifying Qualitative Indicators (ADS 203.3.4.1, p 14)
• Milestone Scales
• Rating Scales
• Indexes
• Examples?
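As one hedged example of a rating scale, the sketch below turns expert opinion on the comprehensiveness of a law into a reportable number; the scale points and the averaging rule are invented for illustration, not taken from the ADS:

```python
import statistics

# Hypothetical 4-point rating scale for "comprehensiveness of a law".
RATING_SCALE = {
    1: "Addresses none of the key provisions",
    2: "Addresses a minority of the key provisions",
    3: "Addresses most of the key provisions",
    4: "Addresses all key provisions, including enforcement",
}

def rate(expert_scores: list[int]) -> float:
    """Average several expert ratings into one indicator value."""
    if any(score not in RATING_SCALE for score in expert_scores):
        raise ValueError("every score must be on the defined 1-4 scale")
    return statistics.mean(expert_scores)

print(rate([3, 4, 3]))  # -> 3.33...
```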

33 Example of Indicators at Different Levels
Input/Activity: Provide vaccines to clinics
Output: No. of children vaccinated against diphtheria (D); no. vaccinated against pertussis (P); no. vaccinated against tetanus (T)
Outcome: Reduced incidence of diphtheria, pertussis and tetanus in young children
Impact: Decline in the under-five mortality rate
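The same chain can be written down as a plain mapping, which is handy when tagging indicators by level in an electronic PMP. This is an illustrative sketch using the vaccination example above:

```python
# Result-chain levels mapped to the vaccination example (illustrative).
indicator_levels = {
    "input/activity": ["Vaccines provided to clinics"],
    "output": [
        "No. of children vaccinated against diphtheria",
        "No. of children vaccinated against pertussis",
        "No. of children vaccinated against tetanus",
    ],
    "outcome": ["Reduced incidence of diphtheria, pertussis and tetanus"],
    "impact": ["Decline in the under-five mortality rate"],
}

for level, indicators in indicator_levels.items():
    print(level, "->", "; ".join(indicators))
```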

34 Disaggregation and Gender
• ADS 203.3.4.3 (p 17-18): Performance management systems and evaluations at the AO and project or activity levels must include gender-sensitive indicators and sex-disaggregated data when the technical analysis supporting the AO, project or activity to be undertaken demonstrates that:
– The activities or their anticipated results involve or affect women and men differently; and
– If so, this difference would be an important factor in managing for sustainable program impact.
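A minimal sketch of what sex-disaggregated reporting for one indicator can look like as data; the figures are invented:

```python
# Sex-disaggregated values for one indicator in one reporting period (invented).
people_trained = {"female": 62, "male": 48}

total = sum(people_trained.values())
for sex, count in people_trained.items():
    print(f"{sex}: {count} ({count / total:.0%} of {total})")
```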

35 TASK: SELECTING INDICATORS (mark "X" for each rejection criterion)
Result: Citizens' awareness increased
Criteria (columns): Direct | Objective | Useful | Practical | Attrib. | Timely | Adequate
Candidate indicators (rows):
1. No. of people trained
2. Ratio of people trained/target population
3. % of people trained to target population
4. Ratio of cumulative no. of people trained to target population
5. No. of courses delivered
6. Average pre- and post-test scores of citizen awareness
7. Random focus group pre- and post-test scores of citizen awareness
8. Key informant interviews
9. No. 4 and No. 7
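The rejection matrix lends itself to a simple data structure. In the sketch below, each candidate maps to the set of criteria it fails; the marks shown are placeholders, not the exercise's answer key:

```python
# The seven screening criteria used as column headings on the slide.
CRITERIA = ["Direct", "Objective", "Useful", "Practical",
            "Attrib.", "Timely", "Adequate"]

# Each candidate indicator maps to the criteria it fails (placeholder marks).
rejections = {
    "No. of people trained": {"Direct", "Adequate"},
    "No. of courses delivered": {"Direct"},
    "Average pre- and post-test scores of citizen awareness": set(),
}

for indicator, failed in rejections.items():
    verdict = "keep" if not failed else "reject (" + ", ".join(sorted(failed)) + ")"
    print(indicator, "->", verdict)
```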

36 TASK: Selecting Performance Indicators
• Use the table to put your indicators on the left, with the 7 standards across the top.
• Do your indicators meet all the criteria? Fill in the boxes with Yes or No.
• Did the exercise cause you to change anything?

37 Baselines and Targets
OBJECTIVES:
• Understand the role of baselines and targets in Performance Management
• Review the ADS definitions
• Understand how to set baselines and targets
• Explore the Matrix approach to target setting

38 Role in Performance Management
How will you know whether your Development Hypothesis works?

39 Performance Baseline
The value of the performance indicator at the beginning of the planning period.
• Baselines can/should be:
– Set just prior to the implementation of the USAID-supported activities that contribute to achieving the relevant Result
– Measured using the same data collection method that the NGO will use to assess progress
– Changed if the data collection method changes (document the change!)

40 Performance Target
A commitment made by the NGO about the level and timing of results to be achieved in a specified time period.
• Targets:
– Can be expressed in terms of quantity, quality or efficiency
– May be determined by setting the final target first, then interim targets
– May need to be set after activities or sites are selected
– Can be adjusted over time
– Should be realistic!
– Should lie outside the margin of error of the historical trend (see the sketch below)
"If you don't know where you're going, you'll end up somewhere else." – Yogi Berra
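The "margin of error" bullet can be read as simple arithmetic on the indicator's history. The sketch below is an illustrative assumption, not workshop guidance: it projects the average year-on-year change and treats one standard deviation of those changes as the noise a credible target should clear:

```python
import statistics

# Hypothetical history of an indicator (e.g., annual hectares restored).
history = [100, 110, 108, 118, 121]

# Average year-on-year change gives a crude trend; its spread, a margin of error.
steps = [later - earlier for earlier, later in zip(history, history[1:])]
trend = statistics.mean(steps)    # expected change per year
margin = statistics.stdev(steps)  # typical noise around that change

projected = history[-1] + trend   # where the trend alone would land next year
floor = projected + margin        # a target should clear trend plus noise

print(f"trend {trend:+.1f}/yr, margin ±{margin:.1f}, set the target above {floor:.1f}")
```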

41 How Do We Set Targets?
When setting targets, review:
• The baseline (the condition before the intervention)
• Historical trends
• Expert judgments
• Research findings
• Achievements of similar programs elsewhere
• Stakeholder expectations
• Objectives and Results Frameworks
• Prospective budgets
• WORK PLAN ACTIVITIES: Review the Case Example Matrix and Summary Data Table

42 TASK: The Indicator/Work Plan Matrix Approach to Setting Targets
• Select 2 Intermediate Results and list their indicators.
• For the activities in your Work Plan or Grant Proposal that affect each indicator, draw a timeline showing when they occur.
• What target would you set for 2010? For 2011? For 2012?
• Set targets for the 3 years and fill in the Summary Data Table.

43 Performance Indicator Reference Sheets (PIRS)
OBJECTIVES:
• Review the relationship between the ADS PMP requirements and the PIRS
• Understand the PIRS form and instructions
• Complete selected sections of a PIRS using the instructions

44 ADS 203.3.3.1 Contents of a Complete PMP (p. 11)
• Full set of performance indicators (1 per result in the RF)
• Baseline and target values (disaggregated by sex if indicated)
• Source and method for data collection, schedule, data limitations, DQA procedures, estimated cost, evaluation
• Calendar of Performance Management Tasks
How much of the above is in the PIRS? Look at the PIRS and Instructions handouts.

45 Indicator Job Description
• Tell the most truth about the related result, when we need to hear it, at the least possible cost.
• Let's look at the headings of an Indicator Resume (PIRS) to see what is included. (See the PMP Case PIRS example.)
• Can it do the job?

46 Indicator Resume Sections
• DESCRIPTION (What is it made of?)
• PLAN FOR DATA ACQUISITION (How are data born, and who are the midwives?)
• DATA QUALITY ISSUES (Does it have any defects?)
• PLAN FOR DATA ANALYSIS, REVIEW, & REPORTING (What is its job in life? How does it do it?)
• BASELINE, TARGETS & ACTUALS (Expected/actual performance on the job?)

47 TASK: Complete Sections of a PIRS
1. Select two indicators from your Grant Results Framework.
2. Review the PIRS instructions and apply them to complete the following sections on a flip chart or computer form (a data-structure sketch follows below):
– Indicator
– Precise Definition
– Unit of Measure
– Disaggregated by
– Justification and Management Utility
– Data Collection Method
3. Test: give the sheet to another NGO and ask whether they could use it to gather good data with no further explanation or information.
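For NGOs completing the form electronically, the sections in step 2 can be held as a simple record. This is an illustrative sketch only, not the official PIRS template; the sample values are invented:

```python
from dataclasses import dataclass

@dataclass
class PIRSSections:
    """Selected PIRS sections from the task; field names follow the slide."""
    indicator: str
    precise_definition: str
    unit_of_measure: str
    disaggregated_by: str
    justification_and_management_utility: str
    data_collection_method: str

entry = PIRSSections(
    indicator="Number of hectares classified as 'restored'",
    precise_definition="Hectares meeting the project's written restoration criteria",
    unit_of_measure="Hectares",
    disaggregated_by="Watershed basin",
    justification_and_management_utility="Tracks direct progress toward the Objective",
    data_collection_method="Annual field verification against the restoration checklist",
)
print(entry)
```

A quick test of the sheet's completeness, per step 3, is whether another NGO could collect the data from these fields alone.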

49 Putting It All Together in a PMP
• Review the Case PMP.
• Review which sections participants have and have not produced.

50 Summary Thoughts
• Reality doesn't have any problems; we do.
• Reality doesn't lie; we do.
• "The Lord God [reality] is very complicated, but not downright mean." – Albert Einstein
• "No amount of data can prove me right; any amount can prove me wrong." – Albert Einstein
• Blessed are those who know what they are doing, for they shall know whether they have done it.

