Module 7 – Evaluation: Quality and Standards
Overview of the Module
– How the evaluation will be done
– Questions and criteria
– Methods and techniques
– Quality
Qualities of an Evaluation
For an evaluation to be credible (trustworthy), relevant, accessible and useful, it must demonstrate:
– Reproducibility of findings (another evaluator would reach the same conclusions), and a clear and total distinction between conclusions and recommendations
– Objectivity, impartiality and pluralism in the implementation, and independence from project management or policy bodies
– Relevance of the initial questions put to the evaluators in relation to the concerns of the sponsors, all services involved and all stakeholders in the project or the policy
– High-quality and transparent methods, and competent evaluators
– Implementation at the right time, in response to a demand from the people in charge of the project or policy, with the improvement of the design or implementation of their programs as the goal
– Communication of results
– Quality of the report: organization, size and style (clear, concise)
– Clear, exhaustive and detailed presentation of all the arguments, with a clear distinction between observations, hypotheses and opinions
Determinants of Quality
This is a process: quality must be ensured all along the chain.
– Quality of the demand
– Quality of the terms of reference
– Quality of the evaluation questions
– Quality of the evaluators
– Quality of the preparation
– Quality of the execution
– Quality of the analysis
– Quality of the reporting
Quality of the Demand
– Justified by a will to improve
– In a climate of transparency and partnership
– Relevance of the questions
– Timing
– Coherence between the question and the means
– Quality of the terms of reference (which are an expression of the demand)
Quality of the TOR
Terms of reference (TOR) are a statement of expectations for the evaluation. They generally include the issues to be addressed and details about the required methodology, scheduling, cost and evaluator qualifications.
Terms of Reference (TOR) Should Include…
– Scope / focus: the issues
– Stakeholders
– Requirements
– Cost
– Schedule
– Qualifications of evaluators
– Deliverables / products
– Who the clients are
Why Are TOR Necessary?
– To clarify the reasons for the evaluation
– To flag issues that have become apparent
– To indicate the general depth and scope required
– To indicate any imperatives
– To protect the evaluator
Quality of the Questions and "Evaluability" of the Program
Discussion: Qualities of the Evaluators
– What should be the profile of an evaluator?
– What skills, knowledge and attitudes should an evaluator have?
Qualities Required of an Evaluator
– Initiative and innovation
– Independence
– Knowledge of the field
– Analytical skills
– Interpersonal skills
– Project management skills
– Synthesis and writing skills
Quality of the Preparation
– At the core of the preparation is the choice of the approach to be used (this approach must be adapted), but also the quality of the program of activities, well-prepared tools and instruments, etc.
– A process is followed to prepare for an evaluation, and all the stages must be respected and properly carried out (see Module 8 for a description of the various stages)
Quality of the Implementation
– What was planned was implemented without any surprises or accidents (the final report will recount how the implementation unfolded)
– Quality of the actions taken
– Quality of the argumentation and the conclusions
Quality of the Analysis: How to Make Judgments
Criteria
– Agree on criteria and assess the case with respect to those criteria
– The criteria need to be known and visible
Norm
– Look at what other good organizations in the setting do and use that as a benchmark
Expert panel
– Eminent people look at the data and use their judgement, based on their experience
Multiple Lines of Evidence: Triangulation
– Interview data
– Focus group data
– Questionnaire data
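The point of triangulation is that a theme only becomes a finding when it is corroborated by more than one line of evidence. The sketch below is purely illustrative and not part of the module: the theme names and data are hypothetical, and it simply counts in how many of the three sources each theme appears.

```python
# Illustrative only: triangulating themes across three hypothetical data sources.
interview_themes = {"supervisor support", "lack of follow-up", "cost overruns"}
focus_group_themes = {"supervisor support", "cost overruns", "training too short"}
questionnaire_themes = {"cost overruns", "training too short", "supervisor support"}

sources = {
    "interviews": interview_themes,
    "focus groups": focus_group_themes,
    "questionnaire": questionnaire_themes,
}

# Count in how many sources each theme appears.
support = {}
for themes in sources.values():
    for theme in themes:
        support[theme] = support.get(theme, 0) + 1

# A theme corroborated by at least two lines of evidence is a candidate finding.
for theme, count in sorted(support.items(), key=lambda item: -item[1]):
    status = "corroborated" if count >= 2 else "single source - treat with caution"
    print(f"{theme}: seen in {count} source(s) -> {status}")
```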
The Data Analysis Must Be Carried Out with Care
Group discussion:
– Based on your experience, what are the factors that can affect the analysis of the data collected? Give examples from your countries.
Issues in Data Analysis
Problems:
– A lot of data, but difficulty in making use of it
– Contradictory data
– Insufficient data
– Unreliable data
Formulating Data
– Facts
– Findings
– Conclusions
– Recommendations
– Lessons learned
Fact Versus Finding
A fact is a piece of information that has been verified:
– There has been a 20% increase in program costs in the last 3 years
A finding is an analysis of related facts:
– Although the cost of the program has increased, there has been a 10% increase in productivity
Conclusions
– A conclusion covers a major aspect of the evaluation and is generally based on a collection of findings
– Conclusions are often saved for the concluding chapter of an evaluation report
Recommendations
– Directed to a responsible person or body
– State clearly what is to be done
– State when it is to be done by
Lessons Learned
– A lesson is a hypothesis based on the findings of one or more evaluations
– A lesson is presumed to relate to a general principle that may be applied more widely
Lessons Learned
Example from an evaluation of a corporate training program:
– "The outcomes of training are more likely to be transferred to the job when the immediate supervisor supports the transfer process by meeting with the employee and developing a plan."
Quality of Reporting: When Do You Communicate?
Before the evaluation
– To ensure that people are informed of the purpose and objectives of the evaluation and their role in it
During the evaluation
– To ensure that people are informed of progress
After the evaluation
– To disseminate results, decisions made and follow-up
Effective Reports
– Respond to the questions and issues defined in the TOR
– Short, succinct and well organized
– Judicious use of graphs, tables and charts
– Developed with stakeholder participation
– Delivered on time
Formal Report Outline
– Executive summary
– Introduction
– Program description
– Evaluation questions
– Methods
– Findings
– Conclusions and recommendations
– Appendix: instruments, TOR
Ways to Communicate in an Evaluation
Informal
– Phone calls
– E-mails
– Quick faxes
– Internal correspondence
Formal
– Briefings
– Presentations
– Written reports
Different Audiences Have Different Needs
– Internal staff might need a verbal report and a memo with key points
– Donors and external stakeholders might need a full report
– Ministries might need an abstract
– The public at large might need a précis of the findings
Know your audience and match your reporting approach to it.
Effective Communication of Evaluation Results
– Captures the data in its conclusions
– Speaks the language of the users
– Takes a detached, non-possessive stance
– Objective: tells "truth" to power, but
– Pragmatic: goes only as far as the key stakeholders will accept