Data-driven Approaches to Monitoring and Evaluating Environmental Strategies Training Workshop for Vermont Community Prevention Coalitions March 20, 2012.

Presentation transcript:

Data-driven Approaches to Monitoring and Evaluating Environmental Strategies Training Workshop for Vermont Community Prevention Coalitions March 20, 2012 PIRE

TODAY'S TRAINERS AND PRESENTERS…
VDH: Lori Uerz, Suzanne Kelley
PIRE: Bob Flewelling, Amy Livingston
Vermont Center for Rural Studies: Erin Roche

AND WORK GROUP #1 FACILITATORS…
Cindy Hayford, Beth Crane, Maryann Morris

And now…. Introductions

Just What Is Environmental Prevention, Anyway?
Environmental strategies in a community seek:
1. To bring about system-level change (including physical space, local community policies, availability of drugs and alcohol, etc.) in order
2. To reduce substance abuse problems (increase health and safety) at the population level. That is public health.
Both conditions must be met.

One of many examples…..

What about Strategies Designed to Change Community Norms?
Yes, norms are an attribute of the community environment.
Communications strategies (e.g., media campaigns) can contribute to normative change, but they are most effective when combined with other environmental change strategies; for example:
– Alcohol-free events
– Bans on outdoor advertising
– Stricter enforcement of alcohol laws

Environmental Strategies: Implications for Evaluation
Long-term outcome measures should be at the population level.
Short-term outcome measures (intervening variables) will typically be environmental conditions, or perceptions of such conditions.
Implementation is often less well prescribed, creating challenges for planning, implementation, and process evaluation.

Evaluating Community-Based Prevention Strategies: For Whom?
First and foremost, the evaluation serves the community organization that's doing the prevention work:
By documenting activities
By monitoring implementation and making adjustments as needed
By indicating whether short- and long-term objectives are being achieved (or moving in the right direction)
By providing empirical data on activities and progress to be shared with stakeholders

Evaluation Components
Evaluation involves collecting and interpreting data on:
Process (information regarding implementation of strategies)
Short-term outcomes (also referred to as intervening variables)
Longer-term outcomes (indicators of the target problem being prevented or reduced)

Assessing Outcomes
Short- and long-term outcomes need to be measured at multiple points in time (or at least two points).
Focus is on whether and how measures change over time.
Comparisons to the state (or nation) may enhance interpretation (see the sketch below).
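Illustrative only (not from the original slides): a minimal Python sketch of how a coalition analyst might compare a local outcome measure across two time points and against a statewide benchmark. The file names, column names, and values are hypothetical.

```python
# Hypothetical example: compare a local outcome measure at two time points
# and against a statewide figure. File names, column names, and numbers are
# invented for illustration; a real analysis would use the coalition's own data.
import csv

def load_rates(path):
    """Read year -> rate (%) pairs from a simple two-column CSV."""
    rates = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            rates[int(row["year"])] = float(row["rate"])
    return rates

local = load_rates("local_past30day_use.csv")   # e.g., {2009: 24.0, 2011: 21.5}
state = load_rates("state_past30day_use.csv")   # e.g., {2009: 23.0, 2011: 22.0}

local_change = local[2011] - local[2009]
state_change = state[2011] - state[2009]
print(f"Local change: {local_change:+.1f} points; state change: {state_change:+.1f} points")
# A larger local decline than the state's is suggestive, not conclusive,
# since the community is an N of 1 and many external factors are at play.
```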

Challenges and Limitations of Evaluating Community-Level Strategies Focused on Population-Level Change
The community is an N of 1.
Many external factors influence outcomes.
Population-level outcome data require different approaches to collect (no "captive" audience).
Standards for implementation (especially intensity) are less well known.

Implications for Interpreting Findings from Community-Level Evaluations
You cannot definitively attribute positive changes to your interventions.
But positive changes can suggest your interventions deserve some of the credit.*
A lack of positive changes does not mean your interventions were ineffective.
* Especially if evidence-based strategies were well implemented, intervening variables also changed, and you have some reasonable comparison data.

Essential Foci of Any Community-Level Program Evaluation
Whether your evidence-based strategies were appropriate and well implemented (hence the importance of process evaluation)
Whether short- and long-term outcome indicators are moving in the "right direction" or the "wrong direction" (this is also referred to as "program monitoring" or "performance monitoring")

Logic Models
Simply a way of showing the connections between:
Strategies
Desired outcomes
The mechanisms through which the strategies are designed to work (i.e., intervening variables)
May be expanded to also show the measures for each intervening variable and outcome
Provide a blueprint for outcome evaluation

Outcomes-Based Prevention: Implementing the Strategic Prevention Framework (diagram)
Programs/policies/practices → intermediate variables → substance use and substance-related consequences, with ongoing planning, monitoring, evaluation, and replanning.

Important Attributes of Logic Models
Can be depicted with a variety of formats
Show the connections between each strategy and each IV the strategy is expected to change
Show the measure(s) to be used for each IV and outcome
Identifying the "contributing factors" for each IV will help define appropriate measures
Usually best to focus each model on a single ultimate outcome
(A simple data-structure sketch of these connections follows below.)
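As a hedged illustration (not part of the original workshop materials), the strategy-to-IV-to-outcome connections in a logic model can be captured in a simple data structure. The outcome, intervening variables, measures, and strategies below are hypothetical examples, not an actual Vermont logic model.

```python
# Hypothetical logic model: each intervening variable (IV) lists its measure(s)
# and the strategies expected to change it; the outcome lists its own measure(s).
logic_model = {
    "outcome": {
        "name": "Underage drinking (past-30-day use)",
        "measures": ["YRBS past-30-day alcohol use"],
    },
    "intervening_variables": {
        "retail availability": {
            "measures": ["Compliance check failure rate"],
            "strategies": ["Responsible beverage server training",
                           "Increased compliance checks"],
        },
        "social availability": {
            "measures": ["Parent survey: providing alcohol to minors"],
            "strategies": ["Parent media campaign", "Party patrols"],
        },
    },
}

# Show which strategies are expected to work through each intervening variable.
for iv, details in logic_model["intervening_variables"].items():
    print(iv, "->", ", ".join(details["strategies"]))
```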

One Example of a Logic Model Template

One Example of a Logic Model Template (Filled In)

Measuring Intervening Variables
Each intervening variable may have multiple possible measures (i.e., "contributing factors")
Typically there is more than one way to measure an IV
Desirable attributes of measures:
Valid
Periodic collection (with consistency)
Available (free or for minimal cost)
Reflective of the target population

Data Sources for Intervening Variables
Established surveys (e.g., YRBS)
Other surveys (e.g., a local parent survey)
Archival data (e.g., compliance check results from Dept. of Liquor Control)
Public records (e.g., liquor outlet locations, enactment of policies, enforcement activities)
Observation (e.g., community scans)
Active surveillance (e.g., purchase surveys)
Focus groups
Interviews with key informants

Measuring Outcomes
Same criteria and sources as for IVs
In addition, keep in mind:
Outcomes based on small numbers of events are not very sensitive (e.g., fatalities)
Measures that may reflect the intensity of intervention efforts are not good measures (e.g., arrest rates, treatment admissions)
The further a measure is from the strategy in the logic model, the more external influences there are

Monitoring Implementation What did you do? How much did you do? And how well did you do it?

Why is it important to track and monitor implementation activities?
Document what was done so:
You can determine how well it was done
You and/or others can repeat what was done
The information can be used to adjust and improve implementation going forward (formative evaluation)
The information can be used to help understand outcome findings

Monitoring Helps Us Understand Why Strategies Succeed or Fail

It's not a miracle! If you are using appropriate evidence-based strategies and implementing them well (following available implementation guidelines closely, completing all core activities with minimal deviations, and delivering sufficient dosage and intensity), you can expect that they will lead to the desired outcome!

Steps for effective program monitoring
1. Identify the core components/activities for your strategy.
2. Develop an implementation plan that lists the core activities and a timeline for completion.

Steps for effective program monitoring (cont'd)
3. As you implement the activities, document progress, including anything that can be measured (number of meetings, number and types of media spots, how many people reached, etc.). A sketch of a simple process log follows below.
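A minimal sketch, assuming the coalition keeps a structured process log of implementation activities; the activity names, dates, and counts are invented for illustration.

```python
# Hypothetical process log: each record documents one implementation activity
# with a date, a countable measure, and notes on deviations from the plan.
from collections import defaultdict
from datetime import date

activity_log = [
    {"date": date(2012, 4, 3),  "activity": "media spot",        "count": 12, "notes": "radio"},
    {"date": date(2012, 4, 17), "activity": "merchant training", "count": 8,  "notes": ""},
    {"date": date(2012, 5, 1),  "activity": "media spot",        "count": 6,  "notes": "print"},
]

# Summarize how much of each activity has been delivered to date.
totals = defaultdict(int)
for entry in activity_log:
    totals[entry["activity"]] += entry["count"]

for activity, total in totals.items():
    print(f"{activity}: {total} to date")
```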

Steps for effective program monitoring (cont'd)
4. Assess progress toward short-term outcomes and make adjustments as needed. Is this moving us toward our end goal?

RECAP OF KEY POINTS
Environmental strategies focus on:
Changing the context, not the individual
Population-level outcomes
Environmental strategies require different (and creative) approaches for measuring process and outcomes.
Although outcome monitoring is important, conducting conclusive outcome evaluations is not the goal for community-level efforts.
Implementing evidence-based strategies with the highest possible fidelity should be a goal for every community.
Dramatic change usually takes time and perseverance (e.g., smoking rates in the U.S.).