Evaluating Workplace Health and Wellbeing Interventions
Georgia T. Karuntzos, Ph.D., Jeremy Bray, Ph.D., Jesse M. Hinde
RTI International
© 2012 Work, Family & Health Network

Steps for Conducting Effective Evaluations
The CDC Framework for Program Evaluation in Public Health identifies six steps: engage stakeholders; describe the program; focus the evaluation design; gather credible evidence; justify conclusions; and ensure use and share lessons learned.
Source: Centers for Disease Control and Prevention

Types of Evaluations
Adapted from: Norland, E. (2004, September). From education theory... to conservation practice. Presented at the Annual Meeting of the International Association of Fish and Wildlife Agencies, Atlantic City, NJ.

Theoretical/Logic Model
■ A diagram of the theory of how a program is supposed to work: a graphic depiction of the relationships between activities and results (conceptual/theoretical model)
■ A logical chain of connections showing what a program intends to accomplish
■ Increases intentionality and purpose
■ Guides prioritization and allocation of resources
■ Helps identify the important variables to measure, so evaluation resources are used wisely
■ Supports replication

WFHN Evaluation Design
■ Single protocol
■ Work redesign, supervisor training, and self-monitoring
■ Multiple industries and worksites
■ Healthcare (blue collar)
■ Telecommunications (white collar)
■ Group-randomized field experiment
■ Intervention and comparison group assignment at the worksite or work-group level
■ Adaptive randomization to balance covariates (see the sketch below)
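The adaptive randomization bullet can be made concrete with a covariate-adaptive (minimization) scheme in the spirit of Pocock and Simon (1975). The sketch below is illustrative only, not the WFHN study's actual algorithm: the covariates, the 80% biased-coin probability, and the scoring rule are all assumptions.

```python
"""Covariate-adaptive (minimization) assignment sketch. Hypothetical
covariates and tie-breaking rule; not the WFHN study's algorithm."""
import random

def minimize_assignment(new_unit, assigned, covariates, p_follow=0.8):
    """Assign `new_unit` (a dict of covariate values) to 'intervention'
    or 'control' so that marginal covariate totals stay balanced.

    assigned: list of (unit, arm) pairs already randomized.
    p_follow: probability of following the balance-preferred arm.
    """
    imbalance = {}
    for arm in ("intervention", "control"):
        # Count already-assigned units in this arm that share each
        # covariate level with the new unit (a marginal-balance score).
        score = 0
        for cov in covariates:
            for unit, a in assigned:
                if a == arm and unit[cov] == new_unit[cov]:
                    score += 1
        imbalance[arm] = score
    if imbalance["intervention"] == imbalance["control"]:
        return random.choice(["intervention", "control"])
    preferred = min(imbalance, key=imbalance.get)  # arm with fewer matches
    other = "control" if preferred == "intervention" else "intervention"
    return preferred if random.random() < p_follow else other

# Example: balance worksites on (hypothetical) industry and size strata.
assigned = [({"industry": "healthcare", "size": "large"}, "intervention")]
arm = minimize_assignment({"industry": "healthcare", "size": "large"},
                          assigned, ["industry", "size"])
print(arm)
```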

Outcome Evaluation Design
■ Nested cohort design
■ Worksite partners are randomized within industry to the intervention or control group
■ Outcomes are evaluated at multiple levels: employees, workgroups, and worksites
■ Allows for multiple levels of clustering (e.g., workgroups within worksites) and a variety of outcome types (e.g., discrete, continuous, count); a modeling sketch follows
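One way to honor the nesting described above is a linear mixed model with worksite random intercepts and a workgroup variance component. The statsmodels sketch below is a minimal illustration under assumed column names (outcome, treat, time, worksite, workgroup) and a hypothetical input file; discrete or count outcomes would call for a generalized model instead.

```python
"""Multilevel model sketch for the nested cohort design:
employees within workgroups within worksites."""
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("wfhn_outcomes.csv")  # hypothetical analysis file

# Random intercepts for worksites (the grouping factor), plus a
# variance component for workgroups nested within worksites.
model = smf.mixedlm(
    "outcome ~ treat + time + treat:time",
    data=df,
    groups="worksite",
    re_formula="1",
    vc_formula={"workgroup": "0 + C(workgroup)"},
)
result = model.fit()
print(result.summary())
```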

Formative (Process) Evaluation Framework: Proctor's Model of Implementation Outcomes
■ What? QIs, ESTs. How? Implementation strategies, studied with implementation research methods
■ Implementation Outcomes: acceptability, adoption, appropriateness, cost, feasibility, fidelity, penetration, sustainability
■ Service Outcomes (per IOM Standards of Care): efficiency, safety, effectiveness, equity, patient-centeredness, timeliness
■ Individual Outcomes: satisfaction, function, health status/symptoms

Process Evaluation Data
■ Document reviews: provide information to build an a priori understanding of program content, operations, context, and stakeholders
■ Review reports, instruments, protocols, promotional materials, patient materials, resource lists, organizational documents (org charts, flow charts, operations manuals), and websites
■ Observational studies: provide empirical evidence to assess program fidelity and generate service flow and timing data to inform the outcome and cost analyses
■ Key informant interviews: provide contextual information on program utility and on the contextual factors that influence program implementation, service delivery, dispersion, and sustainability
■ Practitioner and consumer surveys: provide systematic data on service delivery experiences and program-related perceptions

Qualitative Analysis Methods
■ Recursive abstraction (document summaries)
■ Iterative process that generates summaries, classifications, lists, rates, or groupings
■ Deductive and inductive analysis (data coding)
■ Deductive: a priori framework (a coding sketch follows)
■ Inductive: "grounded theory" analysis
■ Results in taxonomies, themes, categories, and orderings
■ Comparative analysis
■ Documented vs. observed processes, behaviors, or outputs
■ Documented changes over time (e.g., model migration)
■ Variation between processes or outputs
■ Mixed methods (triangulation and convergence)
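For the deductive, a priori branch above, a first-pass screening step can be automated. The sketch below tags interview excerpts against a predefined codebook by keyword matching; the codebook and excerpts are invented, and real coding would be done by trained analysts in qualitative software, with a script like this serving only as rough triage.

```python
"""Deductive (a priori) coding sketch: hypothetical codebook and data."""
from collections import defaultdict

codebook = {  # hypothetical a priori framework
    "barriers": ["workload", "staffing", "schedule conflict"],
    "facilitators": ["supervisor support", "flexibility"],
    "fidelity": ["protocol", "training session", "self-monitoring"],
}

def code_excerpts(excerpts):
    """Return {code: [matching excerpts]} for every keyword hit."""
    coded = defaultdict(list)
    for text in excerpts:
        lower = text.lower()
        for code, keywords in codebook.items():
            if any(kw in lower for kw in keywords):
                coded[code].append(text)
    return coded

excerpts = [
    "The training session helped, but workload made it hard to attend.",
    "My supervisor support made the schedule change stick.",
]
print(dict(code_excerpts(excerpts)))
```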

Process Evaluation Results
 Comprehensive description of program components
 Performance indicators and proficiency scores for program delivery
 Common barriers and facilitators across worksites
 Descriptive taxonomy of program settings
 Construction of moderator variables for use in the outcome and economic analyses
 Program delivery protocols as used at each worksite

Outcome Evaluation Questions
■ Program outputs
■ To what extent is the program actually delivered, as measured by, for example, the number of health risk appraisals completed, the percentage of employees participating in workshops, and the number of follow-up services delivered?
■ Proximal and distal outcomes
■ How effective is the program on the outcomes of interest (e.g., changes in sleep quality, changes in biomeasures)?
■ What stakeholder and employee-level characteristics moderate the willingness and ability of performance sites and practitioners to adopt the program?

WFHN Study Data

Outcome Evaluation Methods
■ Descriptive statistics generating classifications or groupings of the participating worksites, provider characteristics, and employee characteristics
■ Multivariate regression examining relationships between groups
■ Statistical modeling measuring changes in outcomes over time

Outcome Evaluation Results
■ Rates and frequencies of program participation
■ Performance-site descriptive characteristics (including staff characteristics) associated with the outcomes of interest
■ Within-group pre-to-post changes in health and wellness outcomes
■ Between-group comparative changes in health and wellness outcomes at multiple time points (see the sketch below)
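The within- and between-group contrasts in the last two bullets reduce, in their simplest form, to group-by-wave means and a difference-in-differences. A minimal pandas sketch with invented numbers:

```python
"""Within-group change and difference-in-differences sketch.
All data values are invented for illustration."""
import pandas as pd

df = pd.DataFrame({
    "group": ["intervention"] * 4 + ["control"] * 4,
    "wave":  ["pre", "pre", "post", "post"] * 2,
    "score": [6.0, 6.4, 7.1, 7.5, 6.1, 6.3, 6.2, 6.4],
})

means = df.groupby(["group", "wave"])["score"].mean().unstack()
within = means["post"] - means["pre"]             # within-group pre-to-post change
did = within["intervention"] - within["control"]  # between-group comparison
print(means, "\nWithin-group change:\n", within, "\nDiff-in-diff:", did)
```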

Intervention Effect on Work Environment: FSWE Hypotheses
■ H1. There are changes in the work environment for the
■ H1a. TOMO intervention group
■ H1b. TOMO control group
■ H1c. LEEF intervention group
■ H1d. LEEF control group
■ H2. There are baseline differences in the work environment between intervention and control groups for the
■ H2a. TOMO industry
■ H2b. LEEF industry
■ H3. There are differences in the rate of change in climate between intervention and control groups for the
■ H3a. TOMO industry
■ H3b. LEEF industry

FSWE Factor Analysis

Growth Model Results by Industry (numeric values omitted in the source transcript)

Hypothesis | Industry | Group                    | Random Effect | Estimate | Standard Error | p
H1a        | TOMO     | Intervention             | Slope         |          |                |
H1b        | TOMO     | Control                  | Slope         |          |                |
H1c        | LEEF     | Intervention             | Slope         |          |                |
H1d        | LEEF     | Control                  | Slope         |          |                |
H2a        | TOMO     | Intervention vs. Control | Intercept     |          |                |
H2b        | LEEF     | Intervention vs. Control | Intercept     |          |                |
H3a        | TOMO     | Intervention vs. Control | Slope         |          |                |
H3b        | LEEF     | Intervention vs. Control | Slope         |          |                |
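A growth model of the kind summarized in the table can be written as a linear mixed model fit separately by industry: the time slope tests within-group change (H1), the group main effect tests baseline differences (H2), and the group-by-time interaction tests differential rates of change (H3). The statsmodels sketch below is a plausible reading of the table's structure, not the study's exact specification; the column names (fswe, group, time, worksite, industry) and the input file are hypothetical.

```python
"""Growth model sketch mapping H1-H3 to mixed-model terms."""
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("fswe_long.csv")  # hypothetical long-format file

for industry in ("TOMO", "LEEF"):
    sub = df[df["industry"] == industry]
    model = smf.mixedlm(
        "fswe ~ group * time",  # group main effect ~ H2; group:time ~ H3
        data=sub,
        groups="worksite",      # worksite random intercepts
        re_formula="~time",     # random slopes let sites change at different rates (H1)
    )
    result = model.fit()
    print(industry, result.summary())
```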

Economic Evaluation Questions
■ What is the program cost to worksites and to other stakeholders?
■ What is the cost-effectiveness of the program? (see the ICER sketch below)
■ What characteristics moderate the cost-effectiveness of the program?
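The cost-effectiveness question is conventionally answered with an incremental cost-effectiveness ratio (ICER): the difference in costs divided by the difference in effects between the intervention and control conditions. A minimal sketch with invented numbers:

```python
"""ICER sketch; all dollar and effect values are invented."""

def icer(cost_intervention, cost_control, effect_intervention, effect_control):
    """ICER = incremental cost per unit of incremental effect."""
    delta_cost = cost_intervention - cost_control
    delta_effect = effect_intervention - effect_control
    if delta_effect == 0:
        raise ValueError("No incremental effect; ICER is undefined.")
    return delta_cost / delta_effect

# Hypothetical: $450 vs. $200 per employee, improving a sleep-quality
# score by 1.5 vs. 0.5 points.
print(icer(450.0, 200.0, 1.5, 0.5))  # -> 250.0 dollars per point gained
```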

Translational Study
 An effective intervention is only useful if it is communicated to and adopted by workplaces
 Complementary methods:
 Use participant feedback to inform post-study messaging
 Analyze process data on employee perceptions to develop portrayals of the intervention
 Assess potential dissemination channels
 Stimulate market demand for the effective intervention

Thank You
For more information on the Work, Family, and Health Network Study: