Types of Evaluation – For whom and why?
Instrumental use – input to decision making
Conceptual use – deeper understanding, learning
Legitimisation – mobilise official support
Tactical use – gain time
Ritual use – empty, to see the "big picture"
Process use
Evaluation questions and the responsibility of the evaluation – the ToR are critically important.
When?
Mid-term evaluation – learning
On-going – performance monitoring
At the end of the project
Ex-post – impact
By whom?
Management
Superior structure
Donor
Joint evaluation (mostly donor driven)
Evaluations often perceived as instruments of donor control – partnership and ownership!
From Sida's Evaluation Manual: Reporting format
Recommended Outline
EXECUTIVE SUMMARY
INTRODUCTION
THE EVALUATED INTERVENTION
FINDINGS
EVALUATIVE CONCLUSIONS
LESSONS LEARNED
RECOMMENDATIONS
ANNEXES
EXECUTIVE SUMMARY
Summary of the evaluation, with particular emphasis on main findings, conclusions, lessons learned and recommendations. Should be short!
INTRODUCTION
Presentation of the evaluation's purpose, questions and main findings.
FINDINGS
Factual evidence, data and observations that are relevant to the specific questions asked by the evaluation.
EVALUATIVE CONCLUSIONS
Assessment of the intervention and its results against given evaluation criteria, standards of performance and policy issues.
LESSONS LEARNED
General conclusions that are likely to have potential for wider application and use.
RECOMMENDATIONS
Actionable proposals to the evaluation's users for improved intervention cycle management and policy.
ANNEXES
Terms of reference, methodology for data gathering and analysis, references, etc.
Some questions to be asked
Was there a specific objective for the evaluation – also to be found in the Terms of Reference (ToR)?
Were the ToR attached to the evaluation?
Were the qualifications of the evaluators explicitly stated?
Were there OVIs (Objectively Verifiable Indicators)?
Were there any specific references to guidelines, manuals or methods in the ToR and in the evaluation itself?
Cont'd
Is it clear from the document when, where and by whom the evaluation was made?
Was a baseline study needed? If so, was it carried out?
Has poverty alleviation explicitly been dealt with (the overarching objective of the GoM)?
Have other cross-cutting issues been adequately dealt with (environment, gender, HIV/AIDS, good governance)?
Has the issue of cost-effectiveness been dealt with? Is there any discussion of costs and benefits in the evaluation?