1
Monitoring and Evaluation of Postharvest Training Projects
Dr. Lisa Kitinoja, The Postharvest Education Foundation
2
WHY do we want to do M&E?
Why do we want to monitor and evaluate postharvest projects and programs?
- Accountability purposes (may be required by donors)
- Making program improvements
- Future project planning / new proposal development
3
KEY FOCUS
- What changes have occurred in the participant population since the beginning of the program?
- To what extent are these changes attributable to the program?
4
Attribution of change
- Typically we want to measure changes that can be attributed to the program or project, so we need a BASELINE measurement of INDICATORS to characterize the current situation.
- Changes can be positive or negative, intended or unintended.
- A theory of action or logic model can help us attribute any measured changes to the program's activities, using if-then logic (for example: if participants are trained, then their knowledge and skills improve; if they apply those skills, then their practices change).
5
Types of OBJECTIVES (S.M.A.R.T.)
- OUTPUTS are under the direct control of the project/program.
- OUTCOMES are short-term or medium-term effects.
- IMPACTS are long-term effects which may take years to develop.
- Objectives should be S.M.A.R.T.: Specific, Measurable, Achievable, Relevant, and Time-bound.
6
Bennett’s Hierarchy of Evidence
One example of a logical framework, where inputs and activities create outputs that can lead to outcomes and impacts. Having INDICATORS for each level will help to establish a plausible link to explain any measured changes (see the sketch after this list).
1. Inputs
2. Activities
3. Participation (OUTPUTS)
4. Reactions of participants (short-term OUTCOMES)
5. Changes in participant Knowledge/Attitudes/Skills/Aspirations (short-term OUTCOMES)
6. Changes in participant Behaviors or Adoption of new Practices (medium-term OUTCOMES)
7. End results (long-term IMPACTS)
Objectives can be written in reference to any of the 7 levels of Bennett's Hierarchy. A complex objective may include indicators that are related to two or more levels.
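As a rough illustration only, indicators can be organized by hierarchy level so that any measured change can be traced back to a level. The level names follow the slide, but the indicator examples here are hypothetical and not taken from the deck:

```python
# Hypothetical mapping of Bennett's Hierarchy levels to example indicators.
# The level names follow the slide; the indicators are illustrative only.
bennett_indicators = {
    "Inputs": ["funds budgeted", "trainer days"],
    "Activities": ["number of workshops held"],
    "Participation (outputs)": ["number of people trained"],
    "Reactions (short-term outcomes)": ["participant satisfaction ratings"],
    "KASA changes (short-term outcomes)": ["change in knowledge test scores"],
    "Practice changes (medium-term outcomes)": ["number of ZECCs constructed"],
    "End results (long-term impacts)": ["change in postharvest losses"],
}

def indicators_for(level: str) -> list[str]:
    """Return the indicators registered for a given hierarchy level."""
    return bennett_indicators.get(level, [])

print(indicators_for("Practice changes (medium-term outcomes)"))
# ['number of ZECCs constructed']
```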
7
Counting the numbers of people trained
ACTIVITY LEVEL
We can count the number of people who participate in our training programs (see the tallying sketch after this list):
- # of men
- # of women
- # of youths
- # of groups (cooperatives or associations)
- # of leaders
- # of students
- # of extension workers
- Etc.
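A minimal tallying sketch, using invented participant records (the field names and values are hypothetical, not project data):

```python
from collections import Counter

# Hypothetical participant records for illustration only.
participants = [
    {"name": "A", "sex": "F", "role": "extension worker"},
    {"name": "B", "sex": "M", "role": "student"},
    {"name": "C", "sex": "F", "role": "leader"},
]

total = len(participants)
by_sex = Counter(p["sex"] for p in participants)    # disaggregate by sex
by_role = Counter(p["role"] for p in participants)  # disaggregate by role

print(f"Total trained: {total}")
print(f"Women: {by_sex['F']}, Men: {by_sex['M']}")
print(dict(by_role))
```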
8
Results for our Hort CRSP ToT program (2010-12)
- 36 young professionals trained as postharvest specialists (19 women, 17 men)
- 35% overall increase in knowledge
- 45% overall increase in skills
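The slides do not say how the knowledge and skills increases were calculated; one common approach is to compare mean pre-test and post-test scores. A minimal sketch with hypothetical scores:

```python
def percent_increase(pre_scores, post_scores):
    """Percent increase of the mean post-test score over the mean pre-test score."""
    pre_mean = sum(pre_scores) / len(pre_scores)
    post_mean = sum(post_scores) / len(post_scores)
    return (post_mean - pre_mean) / pre_mean * 100

# Hypothetical test scores for illustration only (not the actual ToT data).
knowledge_pre = [50, 60, 55]
knowledge_post = [70, 80, 73]
print(f"Overall increase in knowledge: {percent_increase(knowledge_pre, knowledge_post):.0f}%")
```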
9
Counting indicators of the adoption of improved postharvest practices
PRACTICE CHANGE LEVEL
- # of ZECCs (zero energy cool chambers) constructed
- # of solar driers constructed or purchased
- # of new processed products being produced
- # of people using shade
- # of people using storage facilities
- # of different types (qualities, sizes, packaged produce, fresh cut, etc.) of fresh produce being eaten at home or sold
- # of people selling new products in the market
- Etc.
10
Number of ZECCs constructed in SSA (sub-Saharan Africa)
A good choice of indicator, since there were zero ZECCs in SSA when the project started in 2010.
21 new ZECCs have been constructed and are in use as of 2014:
7 in and near Arusha, 4 in Ghana, 3 in Ethiopia, 2 in Nigeria, 2 in Zambia, and 3 in Kenya.
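The country counts listed on the slide can be tallied directly to confirm the total; a small sketch using the figures above:

```python
# ZECC counts by location, as listed on the slide (as of 2014).
zeccs = {
    "Arusha and nearby": 7,
    "Ghana": 4,
    "Ethiopia": 3,
    "Nigeria": 2,
    "Zambia": 2,
    "Kenya": 3,
}

total = sum(zeccs.values())
print(f"Total new ZECCs in SSA: {total}")  # 21
```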
11
Evaluation Design: what is the overall plan for making comparisons?
- Experimental (requires random selection and assignment and large sample sizes, so it is usually not possible to achieve)
- Quasi-experimental (may be possible if you have a lot of time and funding)
- Comparison to baseline (if the indicators are expected to change over time)
- Comparison to a control group (if the program participants are expected to show changes that are greater than or different from those of a similar group that did not participate; see the sketch after this list)
Baseline = pre-intervention measurement
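A minimal sketch of the two simpler designs, comparison to baseline and comparison to a control group, using invented indicator values (e.g., the share of farmers using shade); the numbers are hypothetical, not project data:

```python
# Hypothetical indicator values for illustration only.
baseline_participants = 0.10  # participants, pre-intervention
endline_participants = 0.45   # participants, post-intervention

baseline_control = 0.12       # similar group that did not participate
endline_control = 0.15

change_vs_baseline = endline_participants - baseline_participants
change_in_control = endline_control - baseline_control

# The difference between the two changes suggests how much of the measured
# change is plausibly attributable to the program rather than to outside trends.
change_net_of_control = change_vs_baseline - change_in_control

print(f"Change vs. baseline: {change_vs_baseline:.2f}")
print(f"Change in control group: {change_in_control:.2f}")
print(f"Change net of control group: {change_net_of_control:.2f}")
```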
12
Types of data and typical data collection methods
- Quantitative data (statistics, counts, numbers, percentages, costs, etc.) can be collected by making direct measurements, conducting formal surveys, and analyzing secondary databases.
- Qualitative data (on perceptions, beliefs, ideas, aspirations, categories, behaviors, etc.) can be collected via observations, interviews, rapid rural appraisals, CSAM (Commodity Systems Assessment Methodology), and focus groups.
13
Impact Evaluation
Impacts or end results will often take years to develop, and may occur after the completion of the program.
14
Evaluation Reports
- Include stakeholders in reviews of preliminary and final reports.
- M&E findings
- Recommendations
- What questions has the M&E process not been able to answer? What additional research is needed?
- What are the major lessons from the assessment?
Including STAKEHOLDERS in the M&E planning and implementation process will improve the chances that the evaluation results are utilized for decision making and future planning.
KEY FOCUS
- What changes have occurred in the participant population since the beginning of the program?
- To what extent are these changes attributable to the program?