
1 LIBERIA: Evaluations. Why/when do we have to do them? What does a good one look like?

2 Evaluation: Objectives
• Understand monitoring vs. evaluation
• Know some of the evaluation triggers
• Use the 3 key questions for evaluation design
• Assess an evaluation

3 Let's start with the word: e·val·u·ate. What does it mean?
1. To ascertain or fix the value or worth of;
2. To examine and carefully judge.
Source: The American Heritage Dictionary

4 Evaluation is the systematic collection of information about the characteristics and outcomes of Assistance Objectives, projects, or activities in order to make judgments, improve effectiveness, and/or inform decisions about current and future programming.
Source: ADS 200.6 Definitions (p. 61); see the Evaluation Policy in Tab 11.

5 Relationship: Monitoring to Evaluation

Monitoring (or Performance Measurement) | Evaluation
What happened?                          | Why? Why not?
Planned results                         | Planned & unplanned results
Predetermined indicators                | Wide variety of measures
Usually accepts design                  | Usually free to challenge design
Continuous                              | Periodic

6 Evaluation Goes Beyond What Monitoring Can Tell Us
• Why were results above/below expectations, and what is facilitating/impeding activity success?
• Are the planned results the only results the activity is producing? Are there any side effects?
• Did the activity actually "cause" the results we see, or were other factors responsible?
• Is there any evidence to suggest that results achieved by the activity will be sustained?

7 USAID "Triggers" for Evaluation:
• A key management decision is required, and there is inadequate information;
• Performance information indicates an unexpected result (positive or negative) that should be explained (such as gender-differential results);
• Customer, partner, or other informed feedback suggests that there are implementation problems, unmet needs, or unintended consequences or impacts;

8 USAID "Triggers" for Evaluation (continued):
• Issues of sustainability, cost-effectiveness, or relevance arise;
• The validity of Results Framework hypotheses or critical assumptions is questioned;
• Project reviews have identified key questions that need to be answered or that need consensus;
• Extracting lessons is important for the benefit of other ongoing or future programs, here or elsewhere.

9 What questions need answers, and when? (Source: ADS 203; MSI Management Systems International)

10 What Do We Want to Know, and When?
Mid-Term (Formative) Evaluations
• Process questions
• Initial results
Final (Summative) Evaluations
• Intended end-of-project results
• Plans for sustaining the activity
Ex Post Facto (Impact) Evaluations
• Long-term impact
• Sustainability

11 One useful technique for controlling the total number of evaluation questions is clustering (e.g., impact questions, design questions, cost questions):
• Put a limit on the total number of questions.
• Cluster similar questions together.
• Eliminate redundancy.
• Put the remaining questions in priority order.
• Drop the least important questions in each cluster.
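The winnowing steps above can be sketched as a small routine. This is a minimal illustration only; the cluster labels, priorities, and sample questions are hypothetical, not taken from the case.

```python
# Minimal sketch of the question-clustering technique described above.
# Cluster labels and sample questions are hypothetical.

def winnow_questions(questions, limit):
    """Cluster, deduplicate, prioritize, then trim to a total limit."""
    clusters = {}
    for cluster, priority, text in questions:
        bucket = clusters.setdefault(cluster, [])
        if text not in (t for _, t in bucket):   # eliminate redundancy
            bucket.append((priority, text))
    # Within each cluster, put questions in priority order (1 = highest).
    for bucket in clusters.values():
        bucket.sort()
    # Drop the least important questions until under the total limit.
    while sum(len(b) for b in clusters.values()) > limit:
        largest = max(clusters.values(), key=len)
        largest.pop()                            # lowest priority goes first
    return {c: [t for _, t in b] for c, b in clusters.items()}

candidates = [
    ("impact", 1, "Did miner safety practices change?"),
    ("impact", 2, "Were there unintended side effects?"),
    ("impact", 2, "Were there unintended side effects?"),  # redundant
    ("design", 1, "Is the strategy valid?"),
    ("cost",   1, "Were project resources adequate?"),
    ("cost",   2, "Was the training cost-effective?"),
]
print(winnow_questions(candidates, limit=4))
```

The routine applies the slide's steps in order: cluster, remove duplicates, sort by priority, then trim the largest cluster from the bottom until the limit holds.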

12 Evaluation Design
The term "design" refers to the conceptual structure we use to carry out an evaluation. An evaluation's design or plan responds to:
• The evaluation's purpose, audience, and intended uses.
• The evaluation questions it must address.
• The evidence required to answer those questions.

13 The 3 Key Questions for Evaluation Design
1. What are the evaluation questions?
2. What data sources can respond to the questions? (What is the reality out there?)
3. How do I get them to tell the truth with evidence? (Design)

14 Let's look at the case
• Read III. Project Description (pp. 6-7); discuss for common understanding.
• Read IV. Purpose of the Evaluation (p. 7): what are the evaluation questions?
• Read the 1st paragraph of V. Methodology (p. 8): what are the data sources?

15 Evaluation design matrix

Question                       | Data sources                      | Design
1. Is the strategy valid?      | Docs, Staff                       | Doc./staff review for clear strategy model; build it into miner/org. interview form
2. Activity impact?            | Docs, Staff, Course, Miners, Mine | Doc./staff review for activity list; build into miner interview form & mine checklist
3. Project resources adequate? | Docs, Staff, Miners               | Docs/staff review for org chart and span of control; compare to miner interview results
4. Participants interested?    | Miners                            | Activity list built into miner interview form; ask what should be continued
5. Improvements sustainable?   | Miners, Orgs                      | Miner interview form asks question after improvements identified; ask also of mine orgs
6. Impact monitoring?          | Docs                              | Look for evidence in doc./staff review meetings
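A matrix like this is straightforward to hold as a data structure so it can be queried from either side. The sketch below is illustrative; the lowercase source keys are abbreviations I have assumed for the slide's columns, and the source assignments follow the design notes rather than a verified original.

```python
# Minimal sketch of the question-to-data-source matrix as a data structure.
# Source keys are assumed abbreviations of the slide's columns
# (Docs, Staff, Course, Miners, Mine, Orgs).
design_matrix = {
    "Is the strategy valid?":      {"docs", "staff"},
    "Activity impact?":            {"docs", "staff", "course", "miners", "mine"},
    "Project resources adequate?": {"docs", "staff", "miners"},
    "Participants interested?":    {"miners"},
    "Improvements sustainable?":   {"miners", "orgs"},
    "Impact monitoring?":          {"docs"},
}

def questions_using(source):
    """All evaluation questions a given data source can help answer."""
    return [q for q, srcs in design_matrix.items() if source in srcs]

print(questions_using("miners"))
```

Inverting the matrix this way shows at a glance which questions each instrument (miner interview form, mine checklist, document review) must cover.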

16 Prepare a Detailed Data Collection Worksheet

Data collection method              | Q's        | Instruments                                         | Scale of application              | Timing & LOE
Document review                     | 1, 2, 3, 6 | Checklist: strategy, activities, org. chart, impact | Proposal, reports, pamphlets      | ½ day Mon, 1 pd
Project personnel interviews        | 1, 2, 3, 6 | Checklist: doc. review products & impact monitoring | Director, 4 engineers, trainer    | ½ day Mon, 1 pd
Miner interviews                    | 1-5        | Standard questionnaire for Qs 1-5                   | 40+ in 6 mines in Oruro & Atocha  | Oruro 2, Atocha 2, Tu-Th, 8 pd
Course attendance & mine inspection | 2          | Observation checklists                              | 1 course and 2 mines, ± 10 people | Atocha Th-Fri, 1 pd
Support org. interviews             | 1, 5       | Standard questionnaire                              | 5 organizations                   | 1 hr. ea., M-F, 0 pd

17 Findings
Findings should be factual representations of the data collected and the relationships among them.
• Everybody: review VI. Findings (top of p. 10) on CEPROMIN project organization.
• Then divide your group in half: one sub-group reads Miner Interviews, Oruro (pp. 10-12); the other sub-group reads Miner Interviews, Atocha (pp. 12-15).

18 Conclusions/Recommendations
These should be evaluator judgments backed up by data in the findings.
• One sub-group reads VII. Conclusions/Recommendations for Qs 1 and 2 (pp. 15-18); the other sub-group reads for Qs 3-6 (pp. 18-20).
• Discuss: Are the questions really answered? Is there sufficient evidence to support the answers?

19 Evaluation Pyramid (top to bottom): Decisions; Recommendations; Interpretations/Conclusions; Findings; Data. The pyramid is built on the evaluation questions, the evaluation design, and the data collection methods.

20 Key Elements of an Evaluation SOW
Key substantive sections:
• Program or project description
• Purpose of the evaluation
• Questions
• Evaluation methods (optional)
Main contractual sections:
• Description of deliverables
• Evaluation schedule
• Evaluation budget

21 Scope of Work in Workbook (mark Yes or No for each element):
• Program/project description adequate? Yes __ No __
• Purpose of the evaluation clear? Yes __ No __
• Evaluation questions clear? Yes __ No __
• Evaluation methods match data sources? Yes __ No __
• Deliverables clear? Yes __ No __
• Evaluation schedule makes sense? Yes __ No __
• Budget complete? Yes __ No __

22 Task XIII: Rate the Evaluation Using the Policy Checklist
1. The evaluation report should represent a thoughtful, well-researched, and well-organized effort to objectively evaluate what worked in the project, what did not, and why. YES ___ NO ___
2. Evaluation reports shall address all evaluation questions included in the scope of work. YES ___ NO ___
3. The evaluation report should include the scope of work as an annex. YES ___ NO ___
4. The evaluation methodology shall be explained in detail, and all tools used in conducting the evaluation, such as questionnaires, checklists, and discussion guides, will be included in an annex in the final report. YES ___ NO ___
5. Evaluation findings will assess outcomes and impact on males and females. YES ___ NO ___

23 Task XIII: Rate the Evaluation Using the Policy Checklist (continued)
6. Limitations to the evaluation shall be disclosed in the report, with particular attention to the limitations associated with the evaluation methodology (selection bias, recall bias, unobservable differences between comparison groups, etc.). YES ___ NO ___
7. Evaluation findings should be presented as analyzed facts, evidence, and data, not as anecdotes, hearsay, or a compilation of people's opinions. Findings should be specific, concise, and supported by strong quantitative or qualitative evidence. YES ___ NO ___
8. Sources of information need to be properly identified and listed in an annex. YES ___ NO ___
9. Recommendations need to be supported by a specific set of findings. YES ___ NO ___
10. Recommendations should be action-oriented, practical, and specific, with defined responsibility for the action. YES ___ NO ___
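The ten-point checklist can be applied mechanically once each item has a yes/no answer. The sketch below is a minimal illustration; the abbreviated item wording and the simple pass-count scoring rule are assumptions for the example, not part of the policy.

```python
# Minimal sketch of applying the ten-point evaluation policy checklist.
# Item wording is abbreviated from the slides; the pass-count scoring
# rule is an assumption for illustration.
CHECKLIST = [
    "Thoughtful, well-researched, well-organized, objective",
    "Addresses all SOW evaluation questions",
    "Includes the SOW as an annex",
    "Methodology and tools explained in an annex",
    "Assesses outcomes/impact on males and females",
    "Discloses methodological limitations",
    "Findings based on analyzed facts and evidence",
    "Sources identified and listed in an annex",
    "Recommendations supported by specific findings",
    "Recommendations action-oriented, practical, specific",
]

def rate_report(answers):
    """answers: list of True/False, one per checklist item, in order."""
    if len(answers) != len(CHECKLIST):
        raise ValueError("one answer per checklist item required")
    failed = [item for item, ok in zip(CHECKLIST, answers) if not ok]
    return {"score": sum(answers), "out_of": len(CHECKLIST), "failed": failed}

result = rate_report([True] * 8 + [False, True])
print(result["score"], "/", result["out_of"])   # 9 / 10
```

Listing the failed items, rather than just the score, points the reviewer straight to what the report must fix.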

24 Session 8 Summary
• Monitoring and evaluation
• Evaluation triggers
• Evaluation questions
• 3 key questions for evaluation design
• Evaluation pyramid
• Elements of an evaluation SOW
• Evaluation checklist
We do not know what we do not know (even if we think we do).

