What Works in Multi-Site Evaluations of Nutrition Education Interventions?
Andy Fourney, Andrew Bellow, Patrick Mitchell, Sharon Sugerman, Angie Keihner
California Nutrition Network for Healthy Active Families, Research and Evaluation Unit
California Nutrition Network for Healthy, Active Families
Targets low-income families
Conducts a social marketing campaign to:
increase fruit and vegetable consumption
increase physical activity
promote the Food Stamp Program
Contracts with over 170 agencies
Receives almost $106 million from the USDA Food Stamp Program
Strategies
Factors that influence FVC
Change in behavior
Evaluation for contractors receiving >$350k:
Process
Short-term impact
Long-term impact
Network-Developed Methodology
1. SOW impact evaluation objective
2. Compendium of Surveys
3. Impact Evaluation Handbook
4. Trainings and workshops
5. Data entry templates
6. Evaluation report template
7. Monitoring
Scope of Work Impact Evaluation Objective
Impact objective: By Sep. 30, 20xx, a sample of 4th- and 5th-grade students at intervention sites will report an increase in fruit and vegetable consumption and an increase in one or more factors related to fruit and vegetable consumption, such as knowledge, preferences, outcome expectations, and self-efficacy.
Impact Evaluation Handbook
Compendium of Surveys
Trainings and Workshops
One-on-one TA
Data Entry Templates
Created in MS Excel
Provide the same results as SPSS
Calculate the pretest mean, posttest mean, difference, and p-value for a paired t-test
Customized for each survey
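The arithmetic the templates perform can be sketched in a few lines of Python. This is an illustrative reconstruction, not the Network's actual workbook: the function name and the sample pre/post scores below are invented, and the p-value itself would come from the t distribution (e.g., scipy.stats.ttest_rel) rather than this snippet.

```python
import math
from statistics import mean, stdev

def paired_t_summary(pre, post):
    """Return pretest mean, posttest mean, mean difference, and the
    paired t statistic -- the quantities the Excel template reports.
    The p-value is looked up from the t distribution with n-1 degrees
    of freedom (e.g., via scipy.stats.ttest_rel)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_diff = mean(diffs)
    std_err = stdev(diffs) / math.sqrt(n)  # standard error of the mean difference
    t_stat = mean_diff / std_err
    return mean(pre), mean(post), mean_diff, t_stat

# Invented pre/post fruit-and-vegetable scores for five students
pre_scores = [2, 3, 4, 3, 2]
post_scores = [3, 4, 5, 4, 4]
pre_m, post_m, diff, t = paired_t_summary(pre_scores, post_scores)
```

Because each student's posttest is paired with their own pretest, the test is run on the per-student differences, which is why only the difference column's mean and spread enter the t statistic.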
Evaluation Report Template
18 questions covering:
Design
Methods
Results
Reflection
Successes
Report Template - Design
1. Please type your impact objective.
2. What were the factors you measured?
3. What intervention and strategies were implemented to change the factors?
4. What survey tool did you use?
5. Were there questions on the survey tool that were not addressed in the intervention? If so, why?
6. Describe challenges faced during the intervention implementation.
7. What type of design was used, e.g., pretest and posttest, pre-posttest with a control, posttest only, or other?
Report Template - Evaluation Survey Implementation
8. What type of design was used, e.g., pretest and posttest, pre-posttest with a control, posttest only, or other?
9. Who were the survey respondents? (Please give age range.)
10. Where were the surveys administered? (Name and type of locations, e.g., health department, community center, mobile home park, etc.)
11. When were the pretest and posttest surveys administered? (Please give dates.)
12. How many surveys were administered? (If the number of pretests and posttests differs, please explain.)
13. How long did it take the respondents to answer the survey? (Please give range in minutes.)
Report Template - Evaluation Results
14. What were the results?
15. How would you interpret these results?
16. How might these results be used?
Report Template - Reflection
17. What were the big challenges you encountered during the evaluation process?
18. Now, reflect on the successes you've had over the past year. Describe the best moment you had as a nutrition educator this year. Think about a moment when you knew your nutrition education had made a difference in someone's life, or give an example of a time when you were proud to be a nutrition educator.
19. Share a story about one of the best moments you had while conducting the evaluation this year. Think about one of the highlights.
20. If Aladdin appeared with his magic lamp and offered you a wish to have more of those best moments or highlights, what would you ask for?
Monitoring sheet (in MS Excel) includes columns for:
Contractor Name
Contact Name
Region
Channel
Federal Share
Program Manager
Required/Volunteer Contract
External Evaluator
Number of years in impact evaluation
Monitoring sheet (continued)
Evaluation Objective
Factors (to be)/(that were) measured
Design (posttest only, posttest with control, pre-posttest, pre-post with control)
Participant's age (grade, adults/youth, in years)
Survey Used
Have a copy of survey
Pretest date(s)
Posttest date(s)
Number of surveys expected (collected)
Monitoring sheet (continued)
Need data entry template (date)
Sent data entry template (date)
Received report (in email from & date)
Report late / received draft
Extension granted (notes)
Data received
Need training
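As a sketch of how the monitoring sheet is organized, its columns can be reproduced outside Excel as a plain CSV. The column names are taken from the slides above; the function name and the sample row are hypothetical, added only for illustration.

```python
import csv
import io

# Column names from the monitoring-sheet slides; the real sheet is an
# MS Excel workbook, so this CSV version is only an illustrative sketch.
COLUMNS = [
    "Contractor Name", "Contact Name", "Region", "Channel",
    "Federal Share", "Program Manager", "Required/Volunteer Contract",
    "External Evaluator", "Number of years in impact evaluation",
    "Evaluation Objective", "Factors measured", "Design",
    "Participant's age", "Survey Used", "Have a copy of survey",
    "Pretest date(s)", "Posttest date(s)",
    "Number of surveys expected (collected)",
    "Need data entry template (date)", "Sent data entry template (date)",
    "Received report (in email from & date)", "Report late / received draft",
    "Extension granted (notes)", "Data received", "Need training",
]

def monitoring_sheet_csv(rows):
    """Render contractor rows (dicts keyed by column name) as CSV text.
    Missing columns are left blank; this keeps one row per contractor."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=COLUMNS)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Hypothetical example row -- not a real contractor
sheet = monitoring_sheet_csv([{"Contractor Name": "Example Agency",
                               "Region": "Example Region"}])
```

One row per contractor with dated hand-off columns ("sent template", "received report") is what lets the Network track each agency's evaluation status at a glance.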
What were the indicators of success?
Number of contractors that measured change in each factor, by year:

Factor                  03-04   04-05   05-06
Knowledge                   7       7      19
Preferences                10      13      22
Self-efficacy               3       4      18
Outcome expectations        -       2       2
Consumption                 1       7      22
PA preferences              -       0       1
Who has participated?

Channel                      03-04        04-05        05-06
                             n=12 (44%)   n=24 (67%)   n=45 (98%)
Schools                           8           12           24
Colleges and universities         2            3            4
Health departments                1            6           11
County Offices of Ed.             -            2            6
Cooperative Extension             1            1            1
First 5                           -            -            1
Funding (% of total budget)
03-04: 19%
04-05: 34%
05-06: 47%
Resources
www.ca5aday.com (Research and Evaluation; see the impact evaluation section at the bottom of the page)
www.harvestofthemonth.com
Hoorah! – More Matters