Planning and Focusing an Evaluation: Program Evaluation Basics Webinar Series. Mary E. Arnold, Ph.D., Associate Professor and Youth Development Specialist.

Presentation transcript:

Planning and Focusing an Evaluation
Program Evaluation Basics Webinar Series
Mary E. Arnold, Ph.D., Associate Professor and Youth Development Specialist, Oregon State University
4-H Professional Development Webinar, January 10, 2013

Webinar Agenda
Building on the previous month's topic of logic modeling, we will:
- Review the basic components of a logic model, using the YA4-H! Teens as Teachers program as an example
- Learn the importance of identifying the purpose and stakeholders of a program evaluation
- Learn the Four Level Model of educational program evaluation: 1) Reaction, 2) Learning, 3) Transfer of learning, 4) Results
- Introduce three important parts of evaluation planning: 1) Evaluation questions, 2) Evaluation design, 3) Data collection methods
- Learn about IRB regulations for evaluation projects

YA4-H! Teens as Teachers Program
We received a $50,000 competitive grant to Washington State University for 18 months of program funding.
Program Outline
1. YA4-H! Teens as Teachers
- Develop the YA4-H! Teens as Teachers curriculum
- Provide three days of training to teens and adults
- Participants learn about:
  - Developing youth-adult partnerships
  - Forming a county YA4-H! Teens as Teachers team
  - Learning and implementing the Choose Health: Fun, Food, and Fitness (CHFFF) curriculum
Again, this is educational content. Do you think the younger youth are the only ones who will learn?
2. Nutrition Education for Youth Ages 9-12
- Teens teach the six CHFFF lessons to younger youth
- Younger youth learn about:
  - The amount of sugar in drinks
  - Eating more fruits and vegetables
  - How to read a nutrition label
  - Whole grains
  - High-fat and high-sugar foods
  - Eating breakfast
This is the educational part of the program, and where we will look for short-term outcomes.
No! The teens will learn these things as well, simply through teaching it to others!

YA4-H! Teens as Teachers Program
We have received a $50,000 competitive grant to Washington State University for 18 months of program funding.
3. Medium-Term Outcomes (Behaviors)
Younger Youth:
- Replace sweetened drinks with low-fat milk and water
- Eat more fruits and vegetables
- Eat fewer high-fat and high-sugar foods and more nutrient-rich and high-fiber foods
- Eat only as often and as much as needed to satisfy hunger
- Play actively 60 minutes a day
- Limit screen time to two hours or less a day
Teens as Teachers:
- All of the above outcomes, AND
- Actively promote healthy behaviors
- Develop and practice teaching and leadership skills
- Increase PYD (positive youth development)
- Act as a role model for younger youth
Note that these outcomes are focused on what we want to see HAPPEN! ACTION!

YA4-H! Teens as Teachers Program
We have received a $50,000 competitive grant to Washington State University for 18 months of program funding.
4. Long-Term Outcomes (Results)
- Youth are less obese and more active
- Healthier food choices are available for youth
- There is a reduction in the prevalence of sugary drinks and low-nutrition food
- Communities are united to provide healthy spaces for youth
- There is a reduction in chronic diseases related to obesity over time

YA4-H! Teens as Teachers Program Evaluation (Logic Model)
Inputs: Staff; Money; Partners; Teens as Teachers curriculum
Outputs, what is done: Provide training to youth-adult teams
Outputs, who is reached: Youth-adult teams are trained
Teams then return to the community and implement the CHFFF program, which leads to the outcomes described on the previous slides.
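One way to keep that chain reviewable during evaluation planning is to write it out explicitly. The sketch below, in Python, is a minimal illustration (not part of the webinar materials) of the logic model above captured as a plain data structure; the stage labels and entries are paraphrased from the slides, not an official model.

```python
# A minimal sketch (not from the webinar) of the YA4-H! logic model as a
# plain data structure, so the chain from inputs to long-term outcomes can
# be reviewed link by link. Entries are paraphrased from the slides above.

logic_model = {
    "Inputs": ["Staff", "Money", "Partners", "Teens as Teachers curriculum"],
    "Outputs - what is done": [
        "Provide training to youth-adult teams",
        "Teams implement the CHFFF program in their communities",
    ],
    "Outputs - who is reached": ["Youth-adult teams", "Younger youth ages 9-12"],
    "Short-term outcomes": ["Younger youth and teens learn CHFFF nutrition content"],
    "Medium-term outcomes": [
        "Youth replace sweetened drinks with water and low-fat milk",
        "Teens develop teaching and leadership skills",
    ],
    "Long-term outcomes": ["Youth are less obese and more active"],
}

# Walking the model in order makes the "critical links" in the theory of
# change explicit: each stage should follow logically from the one before it.
for stage, items in logic_model.items():
    print(stage)
    for item in items:
        print(f"  - {item}")
```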

Program theory is important, but it also has to make practical sense in order for evaluations to be meaningful! That is why logic models are so useful: they can identify critical links in the program's theory of change. If the links aren't "logical," then the program may have little practical value.

Why Do We Evaluate?
- Help others understand the program (stakeholders)
- Understand the need for a program
- Improve the program
- Improve teaching
- Understand the program's impact
- Determine if the program is progressing as planned
- Determine if the program is worth the cost
- Meet grant reporting criteria
- Meet administrative requirements

Why Do We Evaluate?
Poll: What is the number ONE reason you evaluate your programs?
Share Your Experience: What motivates you to do evaluation, and how do you feel about that motivator?

Ahhhh…. Stakeholders! Stakeholders will have different needs for evaluative evidence!

Who Cares?
- People affected by the program, either directly or indirectly (youth, parents, volunteers)
- County boards, elected officials
- Community leaders
- Colleagues, volunteers, supporters, collaborators
- Extension administrators
- Grantors
- Tenure committees
- Other stakeholders
It is essential to think about who cares about the evaluation results, and to determine how the results will be used, early in the evaluation planning process.

Poll: Who are your most important stakeholders? (Check all that apply)
Share Your Experience: Share an experience of needing different evaluation evidence for different stakeholders.

The Four Level Model of Educational Program Evaluation (Kirkpatrick & Kirkpatrick, 2006)
1. Participant Reaction to the program: you want a favorable response to the program; people are more motivated to participate and learn if the program is a positive experience for them.
2. Participant Learning: the extent to which participants change attitudes, improve knowledge, and increase skills.
3. Participant Behavior Change (Transfer of Learning): four conditions must be met for transfer of learning to take place:
   1) The person must have a desire to change
   2) The person must know what to do and how to do it
   3) The person must be in a climate that supports the change
   4) The person must receive some type of reward for changing (intrinsic and/or extrinsic)
4. Participant Results: one way to guide this part of the evaluation is to ask, "What is the main reason for this program?" Then we also have to determine the links between the learning and behavior changes and these results (yep, this is a cloaked logic model!).

An introduction to three important aspects of evaluation planning:
1. Evaluation questions
2. Evaluation design
3. Data collection methods

We begin with the evaluation questions, because the questions determine the design and data collection methods

Focusing Evaluation Questions
Questions of participant reaction:
- Who actually participated in the program?
- Are there barriers to participation?
- Were participants satisfied with the program?
- Was the content of the program relevant to the participants?
Questions of participant learning:
- Did participants learn the intended program content?
- Did the program change participant attitudes?
- Did participants leave the program with new or enhanced skills?
Questions about behavior change:
- Do participants plan to use what was learned? (potential change)
- Do participants have a supportive climate for implementing what was learned?
- Did participants actually implement new behaviors? Why or why not?
Questions about results:
- What difference has this program made for the audience, community, and other stakeholders? (the SO WHAT question)

Choosing an Evaluation Design
Critical to understand: your evaluation question determines your evaluation design and data collection methods! (Too often I hear, "I need a survey to evaluate my program.")

Evaluation Designs
1) Post-only design: X O
2) Post-only control group design:
   E: X O
   C:   O
3) Retrospective pretest: X O
Key: O = "observation" (data collection); X = "intervention" (program); E = experimental group (program participants); C = control group (non-participants)

Evaluation Designs (continued)
4) One-group pretest-posttest design: O X O
5) Control group pretest-posttest design:
   E: O X O
   C: O   O
6) Time series design (with control group):
   E: O O O X O O O
   C: O O O   O O O
Key: O = "observation" (data collection); X = "intervention" (program); E = experimental group (program participants); C = control group (non-participants)
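To make the notation concrete, here is a minimal sketch in Python, using made-up scores, of how the observations (O) from a one-group pretest-posttest design might be summarized. It illustrates the notation only; it is not an analysis plan from the webinar.

```python
# A small illustration of design 4, the one-group pretest-posttest design
# (O X O): the same group is observed before and after the program (X).
# All scores below are invented for the example.

from statistics import mean

pretest = [2, 3, 2, 4, 3, 2, 3]    # first O: observation before the program
posttest = [4, 4, 3, 5, 4, 3, 4]   # second O: observation after the program

gains = [post - pre for pre, post in zip(pretest, posttest)]

print(f"Mean pretest score:  {mean(pretest):.2f}")
print(f"Mean posttest score: {mean(posttest):.2f}")
print(f"Mean gain:           {mean(gains):.2f}")

# In design 5 (control group pretest-posttest), the same summary would be
# computed for a control group that did not receive the program, and the two
# mean gains compared, which helps rule out change that would have occurred
# without the intervention.
```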

Choosing an Evaluation Data Collection Method
Some common methods:
- Mailed survey
- Online survey
- Individual face-to-face interviews
- Focus group interviews
- Phone survey or interview
- Observation
- Archival data (records and documents)
- Test (e.g., scenarios or skill/knowledge tests)
Stay tuned on February 14th for our next webinar, which will focus in more depth on design and methods!

Poll: What data collection methods have you used? (Check all that apply)
Share Your Experience: Share an experience of using a data collection method that challenged you or gave you great information.

Oregon State University Institutional Review Board (IRB)
AKA... do I have to do that?
Whether an activity is research is determined by three qualities:
1. Systematic inquiry into a phenomenon
2. That is designed to develop or contribute to
3. Generalizable knowledge
Human subjects are living individuals about whom an investigator conducting research obtains:
1. Data through an intervention or interaction with the individual, or
2. Identifiable private information

Institutional Review Board (IRB)
Okay, I have to do it... now what?
Go to the IRB website at:
1. Complete the "Does Your Study Require IRB Review?" form
2. Complete the online ethics training modules
3. Complete the steps listed under "Preparing an Initial Submission" at initial-submission

That's all for now! Join in next month for: Evaluation Designs and Methods.
Don't forget to complete an evaluation of today's webinar at: