Intervention, Evaluation, and Reporting Activities Max Young, RD, LDN

Presentation transcript:

ASNNA 2017 Census: Intervention, Evaluation, and Reporting Activities. Max Young, RD, LDN

Many thanks…
Star Morrison, Regional Coordinator, MPRO
Sue Foerster, Director of the ASNNA Census
ASNNA Design Team: Pamela Bruno, Sue Foerster, Karen Franck, Kimberly Keller, Laura Kettle Khan, Star Morrison, Andrew Naja-Riese, Jini Puma, Marci Scott
Census Pilot Testers: Pamela Bruno, Sara Beckwith, Carrie Draper, Karen Franck, Sarah Jones, Laurel Jacobs, Sarah Panken, Jon Perrott, Lauren Wheltsone
FNS Regional Coordinators/State Agencies: Pamela Griffin (NERO), Veronica Bryant (SERO), Eric Meredith (MWRO), Tara Griep (WRO), Doris Chin (MARO)
University of Colorado Denver: Julie Atwood and Jini Puma
Colorado Department of Human Services: Karen Smith
The 124 SNAP-Ed agencies that completed the Census
The survey was sent to 136 SNAP-Ed implementing agencies; 124 completed it, a 91% response rate. This is phenomenal: I was expecting 60%, and we were hoping for 70%. A 91% response means we can feel confident that the findings represent SNAP-Ed agencies relatively accurately.
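The response-rate arithmetic above can be checked with a trivial sketch (nothing here is agency-specific data beyond the two counts quoted in the slide):

```python
# Census response rate: 124 completions out of 136 agencies surveyed.
sent = 136
completed = 124
response_rate = completed / sent * 100
print(f"{response_rate:.0f}%")  # prints "91%"
```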

The Goals
- Describe the objectives of the Census
- Review the highlights of the results
- Identify next steps

Objectives
1. Obtain baseline information about use of the Framework and agencies' intent to impact, evaluate, and report on the indicators.
2. Describe variations in the intended use of the Framework by factors such as type of SIA, region of the country, and years of experience with SNAP-Ed.
3. Understand how agencies are using the Framework in planning and setting priorities, working with partners, communicating results, and other program objectives.
4. Identify which reporting systems to track outcomes are planned, under development, or in use.
5. Project agency needs for future training and technical assistance to successfully achieve outcomes in the SNAP-Ed Evaluation Framework.

Types of Implementing Agency
Let's start by looking at who responded to the Census. What you see here is a breakdown of the type of institution each agency works within. Over 40% of respondents classified themselves as working in Cooperative Extension with land-grant universities. After Cooperative Extension, the next three institution types were:
1) Non-profit, general, which includes faith-based organizations and public-health-focused non-profits.
2) Non-profit, food-specific, such as a food bank. Combining the two non-profit categories, 23% of respondents classified themselves as a non-profit.
3) Department of State Government, Public Health. The survey had two state-government categories: Department of State Government, Public Health, and Department of State Government other than Public Health. 11% of respondents fell into Public Health, and only 2% classified themselves as another department of state government.

Use of the Interpretive Guide
As I explained, we wanted to understand the ways agencies were using the Interpretive Guide. This slide showcases the findings:
- 82% use the guide to inform data collection instruments
- 76% to define, count, or measure accomplishments
- 61% to report results nationally
- 59% to inform intervention topics
- 58% to showcase the larger SNAP-Ed mission
- 37% to garner support from partners
Over time, as more is understood about the Framework and the Interpretive Guide, I believe we will see a shift in how agencies use the guide for program planning and evaluation.

Use of Reporting Systems
Again, in alignment with our objectives, we asked agencies whether they have a reporting system to track program outcomes related to the Framework. The question explained that a reporting system should exclude EARS but includes systems like WebNEERS, PEARS, and Salesforce. A reporting system was defined as a single system, or a combination of systems, that allows you to input program data, aggregate the data, and export the results for some or all of the indicators you are planning to impact. As you can see, the majority of respondents have an existing system or are creating one. However, 25% of respondents reported not having a system. There are systems out there right now that align with the Framework indicators and will allow agencies to easily pull reports showing how Framework indicators align with programming outcomes. PEARS is a great example of a system that is in place and developing, and that can help these agencies. We encourage you to find systems that allow this type of outcome tracking.

Type of Reporting System
Respondents who said they have an existing reporting system, or are creating one, were asked to list the type of system; they could select as many options as applied. 50% use Excel, and 36% selected "Other," which respondents classified as Qualtrics, Access, SurveyMonkey, or a state-specific reporting system. WebNEERS was next at 19%, with PEARS at 16%.

Technical Assistance Needs
Here, agencies were able to identify their top technical assistance needs, led by choosing and using evaluation tools, with 57% of agencies saying yes. That was quickly followed by identifying and using reporting systems, and aligning activities with the Interpretive Guide. Our hope is that through sessions like this one, and through other conferences and webinars, these technical assistance needs will be addressed. Within the "Other" responses, the only trend was help evaluating PSEs.

Now we are in the heart of the data. Here we asked two questions for each of the 51 indicators:
1. Does your program intend to impact the indicator?
2. If so, do you plan to evaluate the indicator in any way?
What you see here are the aggregated national results. In this analysis we combined those two questions to create three categories on one graph: intent to impact and evaluate (dark purple); intent to impact but not evaluate (green); and gold, which is a mix of several responses: those who said "I don't know," those who said no, and those who skipped the question. The gold is what I will refer to as our golden opportunity; it's where our opportunity lies in understanding and impacting the indicators more. I also want to be clear that the Interpretive Guide was released in June, and agencies were asked their intent to impact all 51 indicators by October. From the technical assistance questions, we know agencies are looking for help understanding and applying these indicators. But you will see, as we go through these results, that nationally all indicators are being addressed. Since no one agency is expected to impact all 51 indicators, I think this shows the phenomenal work we are doing as a program.
All right, let's look at the individual indicators. This chapter of the Framework represents our direct education for healthy eating, food resource management, physical activity, and food safety. Starting at the top and moving downward, you can see our four short-term indicators at the top and our long-term indicators at the bottom. It's very exciting to see that SNAP-Ed's priority indicators MT1, MT2, and MT3 have the highest intent to impact and evaluate. Nice work, us!
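The two-question rollup described above can be sketched in a few lines. This is a hypothetical illustration, not the Census's actual coding scheme: the function name and sample responses are invented, and the category labels are paraphrased from the chart legend.

```python
from collections import Counter

def categorize(impact, evaluate):
    """Map answers to the two census questions onto the three chart categories.

    impact, evaluate: "yes", "no", "don't know", or None (question skipped).
    """
    if impact == "yes" and evaluate == "yes":
        return "impact and evaluate"   # dark purple
    if impact == "yes":
        return "impact, not evaluate"  # green
    return "golden opportunity"        # no / don't know / skipped

# Invented sample responses for one indicator, one tuple per agency.
responses = [("yes", "yes"), ("yes", "no"), ("no", None), ("don't know", None)]
counts = Counter(categorize(i, e) for i, e in responses)
```

With the sample data above, the gold category absorbs both the "no" and the "don't know" responses, which is exactly why the speaker treats it as an opportunity rather than a refusal.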

Here are our environmental indicators. This is the start of our PSE work and is truly focused on the domains where people eat, learn, live, play, shop, and work. This chart is organized the same way, from short term to long term. And again, our SNAP-Ed priority indicators, ST7 (organizational partners) and MT5 (nutrition supports), have the most intent to impact and evaluate.

Sectors of Influence is the chapter of the Framework that recognizes that all sectors of society can contribute to creating opportunities for healthy living. This is also where we see more gold in the slides. But if you focus on the SNAP-Ed priority indicator, you see around 50% of agencies impacting it.

Population-level results are the long-term outcomes and impacts measured at the population level. They relate to the work that SNAP-Ed prioritizes, such as improving overall diet quality and fruit and vegetable intake. There is one priority indicator here, R2 (fruits and vegetables), and around 50% of agencies are impacting it. Keep in mind that what we are looking at is not outcome data; this is what agencies believe they will be impacting. Use this information to inform a discussion around the indicators, but acknowledge that this is not a report but a baseline assessment giving us a glimpse of our impact.

Next Steps
- Providing technical assistance to increase agencies' capacity to implement the Framework indicators
- Framework Ambassador, Mentor, and Mentee programs (creating learning communities)
- Publication
In conclusion, I hope everyone leaves with an understanding of why this initial baseline assessment is important, and how it has provided us with these golden opportunities for technical assistance and for understanding the application of the Framework. I hope you take this information home with you and begin to find ways to address this work within your home agency. Again, thank you for the amazing opportunity.