ASNNA 2019 Census of Intervention, Evaluation and Reporting Activities


1 ASNNA 2019 Census of Intervention, Evaluation and Reporting Activities
ASNNA & University of Colorado Denver Jini Puma, PhD 2019 ASNNA Conference Arlington, Virginia February 4-7th

2 Thank You!
Sue Foerster (Emeritus), Kimberly Keller (MO), Pamela Bruno (ME), Sue Sing Lim (KS), Jennie Quinlan (CO), Deanna LaFlamme (CO) – Census Project Workgroup
Max Young, Star Morrison & the rest of the 2017 Census Project Workgroup members
ASNNA respondents

3 Goal of the Census
For FFY 2019, are SIAs planning more comprehensive programming approaches and/or evaluations, especially longer-term and in the outer spheres of the Framework?
Does SIAs' intent to impact and evaluate the Framework indicators vary by region?
How do these results compare to the 2017 Census results, which captured the time when the Framework was introduced?

4

5 2019 Census

6 2019 Census Methodology
The targeted respondents for the survey were the program directors of the SIAs (Total = 143).
Names and email addresses of the program directors were identified.
Census survey respondents were asked to use their FFY 2019 state plan and its evaluation activities to inform their responses.
For the most part, the survey content was the same as in 2017.

7 2019 Census Methodology
The ASNNA listserv was used to publicize the Census.
The survey was administered through a REDCap (Research Electronic Data Capture) database survey link in October. The survey data collection protocol followed the Tailored Design Method (Dillman, 2007) and included:
An advance notice, followed by a survey link emailed one week after the notice.
Up to 3 follow-up emails (one per week) with the survey link sent to non-responders.
After 4 emailed survey attempts, any non-responders were contacted personally by ASNNA colleagues or UCD staff and encouraged to participate in the survey.

8 2019 Census Methodology
This data collection approach resulted in 129 (out of 143 known) SIAs completing the survey (90% response rate).
Remarkably similar to the response rate in 2017 (n = 124; 91% response rate).
All states, the District of Columbia, and both Territories were represented, except Idaho.

9 2019 Census Results Highlights
Out of the 51 indicators, SIAs intend, on average, to impact 19 indicators and evaluate 12 indicators.
There is a gap between the intent to impact and the intent to evaluate for all indicators; the gap widens for long-term indicators at the Individual level.
In every sphere of influence of the Framework, the indicators with the highest percentage of SIAs intending to impact and evaluate them are the core indicators (with the exception of ST1 at the Individual level).

10 2019 Census: SIAs Planning to Impact and/or Evaluate Indicators at the Individual-Level (n = 12 Indicators)

11 2019 Census: SIAs Planning to Impact and/or Evaluate Indicators at the Environmental Settings-Level (n = 12 Indicators) Generally lower uptake

12 2019 Census: SIAs Planning to Impact and/or Evaluate Indicators at the Sectors of Influence-Level (n = 16 Indicators) Highlight social marketing

13 2019 Census: SIAs Planning to Impact and/or Evaluate Indicators at the Population-Level (n = 11 Indicators) Population-level results represent the cumulative impact of interventions in the other 3 spheres of influence. Population results may be reported at the local, regional or statewide levels.

14 2019 Census: Barriers to Evaluating Indicators
POPULATION:
Not Enough Staff Time/Personnel (31%); Budget constraints, in general (23%); Lack of funds to pay for comparison studies (21%); Outcomes cannot be linked to intervention (17%); Secondary data sources not available (12%)

Not Enough Staff Time/Personnel (58%); Lack of outside funds to pay respondents (37%)

Not Enough Staff Time/Personnel (63%); Budget constraints, in general (40%); Lack of training/expertise (33%)

Not Enough Staff Time/Personnel (41%); Budget constraints, in general (29%); Lack of training/expertise (29%); Outcomes cannot be linked to intervention (21%); Another entity is conducting the evaluation (13%)

The intent to evaluate differs greatly among the 4 spheres, so while the barriers look similar, how serious the barriers to measurement are varies by sphere:
§ For Individual, over 2/3 to 80%+ of SIAs are evaluating Priority Indicators. Pretty good! But this drops to one-quarter for LT indicators.
§ For Environmental Settings, it drops to 2/3 for the 2 Priority Indicators out of 12. For LT Indicators, many are only in the 20% range.
§ For Sectors of Influence, the single Priority Indicator is <50%; for most of the 15 in this sphere, it's in single digits!
§ For Population Results, the Priority Indicator is only 30%; of the remaining 10, many are single digits.
The need to address barriers appears most pressing for Sectors of Influence and Population Results, e.g., new expertise, methods, databases, and funds to support personnel, state budgets, and training.

15 2019 Census Results Highlights
Mean % of SIAs Intending to Impact and Evaluate Indicators Across Framework Levels (Proximal to Distal)
The intent to impact and evaluate the indicators is, on average, higher in the more proximal spheres than in the distal spheres.

16 2019 Census Results Highlights
Mean % of SIAs Intending to Impact and Evaluate Indicators Across the Individual, Environmental and Sectors of Influence Framework Levels
The intent to impact and evaluate short-term indicators is greater than for medium-term indicators, which in turn is greater than for long-term indicators.

17 2019 Census: Were there regional differences in the number of indicators that SIAs intended to impact?
n = 23 SIAs (Total # Indicators: 51; Average # Indicators: 16)
n = 12 SIAs (Total # Indicators: 45; Average # Indicators: 23)
n = 14 SIAs (Total # Indicators: 49; Average # Indicators: 22)
n = 7 SIAs (Total # Indicators: 41; Average # Indicators: 19)
n = 27 SIAs (Total # Indicators: 51; Average # Indicators: 19)
n = 22 SIAs (Total # Indicators: 51; Average # Indicators: 18)
n = 24 SIAs (Total # Indicators: 51; Average # Indicators: 21)
No apparent differences. State-by-state analyses are needed, for example to see how states compare, or which Indicators each focuses on, and where different states might bring strengths that would help everyone.

18 Comparison of Results from 2017 - 2019

19 Mean % of SIAs Intending to Impact and Evaluate Indicators Across Framework Levels Across Years
Comparison of Results from 2017 – 2019
There were no statistically significant differences in Framework levels across the years. Results are remarkably similar between the two years, which supports the reliability and validity of the Census survey tool.

20 Mean % of SIAs Intending to Impact and Evaluate Indicators Across the Individual, Environmental and Sectors of Influence Framework Levels Across Years
Comparison of Results from 2017 – 2019
There were no statistically significant differences in Framework levels across the years, but there are two exceptions at the indicator level (next slide).

21 Trends in Results from 2017 – 2019
Exceptions: The Indicators with Significant Increases Over Time (*p<.05; **p<.01)

22 Evaluation Policy Take-Aways
Intent to impact and to evaluate Framework outcomes decreased as SIAs moved to longer-term outcomes and those in the outer Spheres of Influence.
USDA's core indicators have, by far, the highest percent of SIAs that intend to impact and evaluate them, with little change reported over 2 years. SIAs appear to focus on SNAP-Ed Guidance and annual reporting requirements.
From 2017 to 2019, the expected increase toward more comprehensive approaches and evaluation of Framework indicators was not seen. Among all 51 Indicators, efforts related to ST8-Partnerships and MT12-Social Marketing were the only significant changes noted over time.
States were far more likely to attend to 'priority' indicators; if the goal is for SIAs to move toward statutorily recommended 'comprehensive and public health approaches', greater understanding is needed of the barriers and incentives that would assist states.

23 Evaluation Policy Take-Aways
Need for training and expertise is noted across all levels of the Framework (in addition to staff and budget constraints):
➣ SIAs may benefit from more interventions and evaluation tools for long-term and outer-sphere outcomes in searchable, online resources like the SNAP-Ed Toolkit and SNAP-Ed Connection
➣ Ongoing, multi-disciplinary technical assistance would address gaps in knowledge and effort
Further examination of barriers to and incentives for use of the Framework is needed, including within the annual USDA Guidance, state plans and reporting.

24 Specific Opportunities
The Evaluation Committee encourages ASNNA members to present and publish their work to widen understanding and adoption of the Framework:
The SNAP-Ed Evaluation Framework: Demonstrating the Impact of a National Framework for Obesity Prevention in Low-Income Populations, Translational Behavioral Medicine (Submitted January, )
2017 and 2019 Census regional, state and SIA results
In conclusion, I hope everyone leaves with an understanding of why this initial baseline assessment is important, and how it has provided us with these golden opportunities for technical assistance and for understanding the application of the Framework. I hope that you take this information home with you and begin to find ways to address this work within your home agency. Again, thank you for the amazing opportunity.

25 Specific Opportunities
To increase Framework capacity and use, the need for technical assistance and training could be addressed through:
Further analysis of the 2019 Census and Social Marketing Profile results for gaps to address at regional, state, and SIA levels
Identifying champions and mentors for peer learning at outer levels of the Framework
Joining efforts with other stakeholders such as NOPREN (National Obesity Prevention, Research, and Evaluation Network) to advance practice
Entering into a new partnership between NCCOR (National Collaborative on Childhood Obesity Research) and ASNNA to update the Framework's Interpretive Guide
Seeking outside grant support for Framework-related projects

26 Thank You! Jini Puma, PhD Assistant Professor
Colorado School of Public Health University of Colorado Denver
