PROGRAM EVALUATION: A TOOL FOR LEARNING AND CHANGE
Evaluation
Subjective conjecture vs. disciplined inquiry
Evaluation research: a definition
Determination of the worth of something, based on verifiable evidence.
Evaluation research
What is of value here? Is it judgment? Is it learning?
Types of Evaluation
– Student achievement: tests, examinations
– Product quality: consumer reports
– Government policies: policy reviews
– Personnel performance: personnel reviews
– Programs or projects: program evaluation
Program Evaluation
Overall unit being examined: a program or project
Purposes/need (rationale):
– to account to donors or government or…
– to inform funding decisions
– to direct program improvement activity
– to address the implementing organization's needs
– to address beneficiary (learning) needs
– to drive learning and promote change
On what to focus?
Focus is a function of purpose/need:
– Cost/benefit
– Process
– Outcomes
– Impact
– Systems
– Policies
Reporting, or learning?
Approaches
– Midterm vs. final
– Internal vs. external
– Participatory vs. prescribed
– Quantitative vs. qualitative
– In-field vs. distance
Tools
Information gathering:
– Interview guides
– Observation guides
– Questionnaires
– Tests
– Visual recording: photography
– Audio recording
Processing:
– Mapping
– Problem/objective trees
– Log frames
Interview guide
A simple list of the main questions to be asked, highlighting the key focus/purpose of each question
Criteria for expanding (if the answer is…)
Observation guide
A list of observations to be made in a particular place or of a particular activity
Guides observation so things aren't missed
Standardizes and disciplines observation
Questionnaire
The instrument used to conduct a survey: a survey is administered by means of a questionnaire.
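To make this concrete, here is a minimal sketch of tabulating closed-ended questionnaire responses in Python. The question IDs, answers, and the tabulate helper are all invented for illustration; they are not part of the presentation.

```python
from collections import Counter

# Hypothetical closed-ended responses, keyed by question ID.
# In practice these would come from the administered questionnaire.
responses = [
    {"q1_satisfaction": "high",   "q2_attended_training": "yes"},
    {"q1_satisfaction": "medium", "q2_attended_training": "yes"},
    {"q1_satisfaction": "high",   "q2_attended_training": "no"},
]

def tabulate(responses, question_id):
    """Count how often each answer to one question appears."""
    return Counter(r[question_id] for r in responses if question_id in r)

for qid in ("q1_satisfaction", "q2_attended_training"):
    counts = tabulate(responses, qid)
    total = sum(counts.values())
    print(qid)
    for answer, n in counts.most_common():
        print(f"  {answer}: {n} ({n / total:.0%})")
```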
Some terms
– Monitoring vs. evaluation
– Outputs vs. outcomes
– Impact vs. outcomes
– Activity vs. inputs
Designing an evaluation: Some Steps
– Determine the audience and their information needs
– Define the scope (breadth/depth) of the information need
– Determine the aspects of the program that need to be examined
– Develop main evaluation questions and sub-questions
Evaluation design steps--continued
Determine the best methods for answering each question (surveys, interviews, observations, time-series controls…)
Ask yourself: What method or tool will give the most accurate (valid) view? What method or process will promote the most learning? What will give the information in a timely manner?
Evaluation design steps--continued
Define the evaluation's process:
1. Preparation
2. In-field information gathering
3. Analysis
4. Discussions
5. Reporting (what format? what contents?)
Specify the what, who, when, and how of each step; a sketch of one way to record this follows below.
LEARNING, LEARNING, LEARNING: include as many stakeholders as possible.
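One way to keep the "what, who, when, and how of each" explicit is to write the five phases down as a simple plan structure. The sketch below is a hypothetical illustration; every field name, role, and timing is an assumption, not something from the presentation.

```python
# Illustrative plan for the five-phase evaluation process above.
# All field names and contents are assumptions for the sake of example.
evaluation_plan = [
    {"phase": "preparation", "what": "draft questions, recruit team",
     "who": "evaluation lead", "when": "weeks 1-2", "how": "desk review"},
    {"phase": "in-field gathering", "what": "interviews, observations",
     "who": "field team and stakeholders", "when": "weeks 3-5",
     "how": "interview/observation guides"},
    {"phase": "analysis", "what": "code and map the data",
     "who": "whole team", "when": "week 6", "how": "maps, trees, log frame"},
    {"phase": "discussions", "what": "validate findings",
     "who": "all stakeholders", "when": "week 7", "how": "workshops"},
    {"phase": "reporting", "what": "write and share report(s)",
     "who": "evaluation lead", "when": "week 8", "how": "multiple formats"},
]

for step in evaluation_plan:
    print(f"{step['phase']}: {step['what']} ({step['who']}, {step['when']})")
```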
The only lasting value of e-valuation is in the learning that takes place.
Evaluation design steps--continued
Gather data
Analyze data:
– Make "maps": transport chains, activity chains, people chains, activity interactions, etc.
– Construct a problem/objective tree
– Construct a log frame
Put together report(s) in multiple formats
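Since the analysis step ends with a log frame, the sketch below shows one minimal way such a matrix could be represented and printed. The four levels and column headings follow the common logframe convention; the example content is invented, not drawn from the presentation.

```python
# A minimal logframe matrix, represented as rows of
# (level, narrative summary, indicators, means of verification, assumptions).
# The example content is invented for illustration.
logframe = [
    ("Goal", "Improved literacy in district", "literacy rate",
     "national survey", "stable school funding"),
    ("Purpose", "Children complete primary school", "completion rate",
     "school records", "families keep children enrolled"),
    ("Outputs", "Teachers trained", "number trained",
     "training registers", "trainees stay in post"),
    ("Activities", "Run training workshops", "workshops held",
     "attendance sheets", "venues available"),
]

headers = ("Level", "Narrative summary", "Indicators",
           "Means of verification", "Assumptions")
for row in [headers, *logframe]:
    print(" | ".join(row))
```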
Evaluation Report
You need to plan your report before you begin your evaluation!
Evaluation Report
– Program background
– Problem statement
– Response to the problem
– Purpose of the evaluation
– Evaluation design and methods, time frame
– Project description: goals, objectives, etc., narrative summary
Evaluation Report--continued
– Project outputs and outcomes
– Impact
– Lessons learned
– Conclusions
– Recommendations
– Annexes/appendices:
– Evaluation activity conducted and when (people interviewed, places observed, etc.)
– Visuals (photos, graphs, charts; these can also appear in the body of the report)
– Case studies