Slide 1: FAST TRACK PROJECT: IMPACT EVALUATION DESIGN
Musa Kpaka & Kevin Leiby | Component Leaders Meeting | 3 Aug 2015
Slide 2: Impact Evaluation Context
- Project Monitoring and Experiential Learning (all project locations)
- Intensive Impact Evaluation (a few locations)
The impact evaluation will contribute to broader project learning and evaluation by rigorously measuring impact and cost-effectiveness in select areas. M&E resources will be spread across the project but will be concentrated in study areas.
Slide 3: Impact Evaluation Overview
Evaluation question: How cost-effective is the fast-track intervention at increasing uptake of improved sweet potato varieties?
Methodology: Clustered randomized controlled trial (RCT)
Primary outcome: Uptake of improved sweet potato varieties
Secondary outcomes: Yield; good agricultural practices; gradual replacement of local varieties; consumption and income; cost of implementation
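As a rough illustration of how the cost-effectiveness question might eventually be answered once impact and cost data are in hand, the sketch below divides implementation cost by the estimated number of additional adopting households. Every number and name in it is an illustrative assumption, not a project figure.

```python
# A minimal sketch of a cost-per-additional-adopter calculation for the
# evaluation question above. All inputs are illustrative assumptions.

def cost_per_additional_adopter(total_cost, uptake_treat, uptake_comp, n_households):
    """Implementation cost divided by the estimated number of extra adopting households."""
    extra_adopters = (uptake_treat - uptake_comp) * n_households
    return total_cost / extra_adopters

# Illustrative inputs: $100,000 spent, 45% vs 30% uptake, 10,000 households reached.
print(round(cost_per_additional_adopter(100_000, 0.45, 0.30, 10_000), 2))
```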
Slide 4: Impact Evaluation Locations
Impact evaluation locations: Central Uganda (districts TBD), Bukoba & Misenyi Districts, Mkuranga District

Country | Region | Districts
Uganda | Central | Wakiso, Mukono
Tanzania | Coast | Mkuranga
Tanzania | Lake | Bukoba, Misenyi

- Both project countries covered
- Basic conditions for success
- Two growing seasons per year
- Both wet and non-wet areas
Slide 5: Tentative Sample Size

Location | Implementation Schools | Comparison Schools
Wakiso / Mukono | 16 | 8
Bukoba / Misenyi | 16 | 8
Mkuranga | 8 | 4
Total | 40 | 20
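The sample-size table above ties into the "Sample calculations" appendix slides. As a rough illustration of how such numbers are typically checked, the sketch below computes the minimum detectable effect for a two-arm clustered RCT using the standard design-effect formula; the intracluster correlation, comparison-group uptake rate, and households surveyed per cluster are illustrative assumptions, not project values.

```python
# A minimal sketch of a minimum detectable effect (MDE) calculation for a
# two-arm clustered RCT. The ICC, baseline uptake rate, and number of
# households surveyed per cluster are illustrative assumptions.
from scipy.stats import norm

def cluster_mde(k_treat, k_comp, m, icc, sigma2, alpha=0.05, power=0.80):
    """Standard MDE for a difference in means under cluster randomization."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    var_diff = sigma2 * (icc + (1 - icc) / m) * (1 / k_treat + 1 / k_comp)
    return z * var_diff ** 0.5

# Illustrative inputs: 40 implementation vs 20 comparison clusters,
# 20 surveyed households each, an assumed ICC of 0.10, and a binary
# uptake outcome with an assumed 30% rate in the comparison group.
p = 0.30
print(round(cluster_mde(k_treat=40, k_comp=20, m=20, icc=0.10, sigma2=p * (1 - p)), 3))
```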
Slide 6: Sampling Schools / Villages
Schools / villages will be randomly selected to receive the intervention:
1. Establish a list of eligible villages with implementers
2. Conduct a baseline survey in a random sub-sample of eligible villages
3. Use baseline data to match similar pairs of villages
4. Within each matched pair, randomly assign one village to the intervention group and the other to the comparison group
Key message: It is important to hold off on selecting implementation schools until after the baseline survey.
(Example-district diagram: matched village pairs, each with one implementation location and one comparison location with no implementation.)
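A minimal sketch of the matched-pair randomization in steps 3-4, assuming baseline data with a single matching covariate per village. The covariate name, the sort-and-pair matching rule, and the seed are simplifying assumptions, not the project's actual protocol.

```python
# A minimal sketch of matched-pair randomization (steps 3-4 above),
# assuming one baseline covariate per village.
import random

def assign_matched_pairs(villages, key, seed=2015):
    """Pair villages with similar baseline values, then randomize within each pair."""
    rng = random.Random(seed)
    ordered = sorted(villages, key=lambda v: v[key])  # similar villages end up adjacent
    assignments = {}
    for i in range(0, len(ordered) - 1, 2):
        pair = [ordered[i], ordered[i + 1]]
        rng.shuffle(pair)  # coin flip within the matched pair
        assignments[pair[0]["name"]] = "implementation"
        assignments[pair[1]["name"]] = "comparison"
    return assignments  # an odd village out, if any, is left unassigned

villages = [
    {"name": "Village A", "baseline_uptake": 0.12},
    {"name": "Village B", "baseline_uptake": 0.10},
    {"name": "Village C", "baseline_uptake": 0.35},
    {"name": "Village D", "baseline_uptake": 0.33},
]
print(assign_matched_pairs(villages, key="baseline_uptake"))
```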
Slide 7: Sampling Households
1. Obtain a roster of all households in each study village
2. Randomly sample up to 20 households per village for surveying
The evaluation will assess uptake of improved varieties by a random sub-sample of all households in a village, not just those that received vines directly from the project.
Key question for implementers: How should we obtain household rosters?
(Example-village diagram: sampled vs. non-sampled households.)
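To make step 2 concrete, here is a minimal sketch of drawing up to 20 households per village from the rosters. The roster format (a simple list of household identifiers per village), the identifiers, and the seed are assumptions for illustration only.

```python
# A minimal sketch of step 2 above: sampling up to 20 households per
# village from a roster of household identifiers.
import random

def sample_households(rosters, n_per_village=20, seed=2015):
    """Randomly sample up to n_per_village households from each village roster."""
    rng = random.Random(seed)
    return {village: rng.sample(households, min(n_per_village, len(households)))
            for village, households in rosters.items()}

# Illustrative rosters of 150 and 60 households.
rosters = {"Village A": [f"A-{i:03d}" for i in range(1, 151)],
           "Village B": [f"B-{i:03d}" for i in range(1, 61)]}
print({village: len(ids) for village, ids in sample_households(rosters).items()})
```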
Slide 8: Study Design
Two study arms:
1. Comparison arm (15-25 villages): exposure to external promotion & markets; no intervention
2. Refined intervention arm (20-40 villages): exposure to external promotion & markets; refined Fast-Track intervention
Discussion question: Are there other add-on components worth evaluating in a third study arm?
Slide 9: Evaluation Timeline
2015
- Aug: Tentative evaluation design
- Sep: Development of research protocol and survey tools; piloting of key methodologies (e.g., identifying varieties); establishment of sample frames
- Oct: Submission of research protocol for IRB approval; finalization of research protocol and survey tools
- Nov-Dec: Field work preparations, training, and piloting
2016
- Jan-Feb: Baseline survey
- Mar: Treatment / control arm assignments (early March); beginning of vine distribution
- Apr: Vine distribution in relevant districts
Note: The baseline survey could occur earlier in some locations if the survey team is prepared.
Slide 10: Next Steps
- Complete Phase II of the process evaluation
- Work with individual implementers to plan and discuss roles/responsibilities in more detail
- Obtain a list of schools and villages in each district
- Finalize the evaluation sample size for each location
- Pilot and finalize survey methods
- Receive IRB approval
Slide 11: Discussion
Slide 12: Appendix
Slide 13: Location considerations
Slide 14: Sample calculations (i)
Slide 15: Sample calculations (ii)