
ASNNA One Time Interventions
Justin Fast, Social Initiatives Specialist, Michigan Fitness Foundation
Helen Idzorek, SNAP-Ed & EFNEP Coordinator, Cooperative Extension Service, University of Alaska Fairbanks
Shailja Mathur, Senior Project Administrator, NJ SNAP-Ed/EFNEP, NJ SNAP-Ed Support Networks, New Jersey Agricultural Experiment Station-RCE, Rutgers, The State University of New Jersey
Ana Claudia Zubieta, Ohio SNAP-Ed Director, Departments of Human Sciences and Extension, The Ohio State University

ASNNA What is a “One Time Intervention” (OTI)?
– Formal or informal messages targeting participants who are not accessible beyond the initial intervention
– Administered to meet very specific needs of a community or in association with another event
– Face to face or using technology

ASNNA Why “One Time Interventions”?
– A great way to increase reach
– Reach people in non-traditional settings
– Reach people who lack motivation or face other barriers to attending a formal class
– Motivate participants to take part in more formal nutrition education

ASNNA What Do We Know?
USDA FNS stresses the need to:
– Increase reach
– Increase the use of practice-tested and evidence-based interventions
– Improve outcome evaluation of all interventions
SNAP-Ed implements direct or indirect OTI out of necessity.
No evidence base or community of best practices exists.
The introduction of PSE (policy, systems, and environmental change) approaches in the FY13 SNAP-Ed guidance has set the stage for expanded use of OTI.

ASNNA Why a Survey?
– OTI are well suited to large and diverse groups (higher reach)
– OTI are difficult to implement with fidelity and to evaluate with rigor
– There is a lack of “best practices” for OTI

ASNNA How We Did It
A workgroup to address these questions was formed after the annual ASNNA conference in February 2014:
– Justin Fast (Chair)
– Helen Idzorek
– Shailja Mathur
– Ana Claudia Zubieta
The workgroup met several times via conference call, developed a survey distributed through the SNAP-Ed and EFNEP listservs, and analyzed and discussed the results.

ASNNA Survey Results

ASNNA What is your role as it relates to nutrition education?

ASNNA Do you implement any interventions which are one-time only?

ASNNA Which situation best describes your intervention?

ASNNA How long are these interventions?

ASNNA How are participants recruited?

ASNNA OTI Reach

ASNNA Type of Evaluation

ASNNA Which outcome indicators are evaluated?

ASNNA What types of data are collected?

ASNNA What is the evaluation design?

ASNNA Is demographic information collected?

ASNNA How is demographic information collected?

ASNNA Are OTI reported to EARS as “Direct” or “Indirect”?

ASNNA Successes of OTI
1. Provide information for future planning
2. Assess the needs of the population
3. Parents are more engaged and eager to participate when food tastings include their kids
4. Engage participants in experiential and relevant learning

ASNNA Successes of OTI (continued)
5. Recommendations for more classes
6. Often the last booth to close
7. Participants indicate an intention to use the resources

ASNNA Challenges
1. Lack of top-down support to enable an evaluation plan for multi-faceted interventions
2. Difficulties administering evaluation tools
   – Time
   – Setting
   – Lack of motivation
3. Inability to measure behavior change
4. Inability to comply with FNS requirements

ASNNA Implications for Practitioners
– OTI are widely used and are an important part of SNAP-Ed programming.
– A proportion of OTI programs implement one or more “best practices” in program implementation or evaluation.
– Establish the evidence base for the OTI required of SNAP-Ed programmers nationwide.
– Some common practices do need to be addressed.
– To ensure comparability between programs, it would be helpful to identify or establish a consistent “inventory” of potential SNAP-Ed outcome indicators from which practitioners could choose.

ASNNA Implications for Practitioners (continued)
– Validated tools are needed to evaluate the very concrete, individual learning objectives often taught in OTI settings.
– More process evaluations are needed, specifically aimed at identifying best practices for performing on-site evaluations in difficult settings.
– Broad program outcome evaluations are needed, with internal comparison groups, i.e., “with” and “without” the addition of an OTI component.

ASNNA Thank You!