Innovations in Tracking, Managing, & Reporting SNAP-Ed Impact Data

Presentation transcript:

Innovations in Tracking, Managing, & Reporting SNAP-Ed Impact Data ASNNA 2019

Warning: PEARS puns ahead…

Overview
- Progress made tracking SNAP-Ed impact data
- Working towards better consistency
- Tips for:
  - Gathering reliable data
  - Engaging stakeholders

PEARS without the “P” Background: For at least several years, EARS was the primary mechanism to aggregate data across the entire nation. So far this tool has focused mostly on metrics related to reach, lists of activities, and partnerships.

A Shared Blueprint In recent years, we have all been working to adopt the more comprehensive Evaluation Framework. How many of you contributed in some way to writing or reviewing the Framework?

51 indicators and ~306 outcome measures I think the biggest innovation in tracking, managing, and reporting SNAP-Ed impact is the Evaluation Framework. The Framework provided a much-needed blueprint, a guide to the specific information to track when evaluating SNAP-Ed. There are 51 indicators and about 306 outcome measures (depending on how you count them). When we first developed PEARS (before it was PEARS), our primary guide was EARS: it determined what data we needed to track because we knew the system had to generate those numbers with a few clicks. When the Evaluation Framework came out, we had the blueprint we needed to begin tracking more in-depth details about SNAP-Ed activities. If you only remember one thing from this presentation: Use the Framework! It may not always be PEARfect, but together we can work to make continual improvements.

Live demonstration of the PEARS indicator metrics module, which aligns closely with Framework outcome measures. Example screenshots of what a national-level report could look like follow…

Example Data

Example Data

Example Data

Consistency Consistency is key to aggregating data.

PEARS Records

Metric               2015     2016     2017     2018
# organizations         1       22       43       52
program activities  1,335    3,624   17,529   57,977
success stories       158      358      800    3,057
PSEs                         1,030    2,569    4,772

With tools like PEARS, we are making some progress in that direction.

1,339 However, we still have a ways to go. This is the number of unique SNAP-Ed surveys currently active in PEARS across all organizations.

It’s Getting Better…
- MPR Regional report in 2018 led to the MPR Data and Reporting Workgroup
- Kansas’ success moving to a small set of pre-determined surveys
- PEARS workgroup is developing a comprehensive & unified list of PSE changes

Tips

Tip #1: Reliable & Relevant Data
- If at all possible, don’t re-invent the wheel
- Flexibility in data collection can be the enemy of data analysis
- Establish clear processes & expectations
- Frequently check for data quality

Organizations that didn’t employ regular data checks submitted almost twice as many support requests per user during the final quarter of FY 2018.

Tip #2: Engage All Stakeholders
- Over-communicate
- Hold advisory committee meetings
- Solicit internal feedback about tools & processes
  - Reflection section of the PSE module in PEARS
  - User feedback survey

According to the Pell Institute (2019), given that it may not be feasible to include all stakeholder groups in the process, special consideration should be given to those that “enhance the credibility of an evaluation, are able to provide technical guidance regarding the evaluation process/needs/expectations, have influence over the program’s day-to-day operations, can influence how recommendations from the evaluation are utilized, and have the ability to fund or deny funding.”

User Feedback Survey
- Distributed to 2,562 active users in May 2018
- 1,187 responses, a 46.3% response rate

Frequency of use was a key driver of a user’s positive perception of PEARS. In other words, the more they use the system, the more likely they are to recommend it.

The Road Ahead
- Continue aligning data management systems with Evaluation Framework outcome measures
- Prioritize established and pearscribed surveys
- Remove flexibility = better data for analysis
- Work towards nationally utilized curriculum kits?
- Leverage & expand regional and national-level collaborations