Measuring QI Intervention Implementation: Helping the Blind Men See?
EQUIP (Evidence-Based Practice in Schizophrenia) QUERI National Meeting Working Group
December 12, 2008

QI Intervention Example
EQUIP (Enhancing QUality of care In Psychosis)
– Evidence-based quality improvement to implement effective care in specialty mental health
– Alex Young, MD & Amy Cohen, PhD (Co-PIs)

EQUIP Effective Schizophrenia Care
[Diagram: the EBQI model, building on the TMAP evidence base (EQUIP-1); components include provider/patient education, a quality manager, QI informatics support, performance feedback, and leadership support, spanning "infrastructure" and "priority-setting"]

Context Matters: Design for It
EQUIP
– 4 VISNs: one intervention and one control site in each VISN
– Sites chosen collaboratively based on interest
– Each VISN asked to select evidence-based care targets for the intervention: all selected Wellness & Supported Employment
– Availability and quality of these care targets vary across sites
– Structure of care for patients with schizophrenia varies across sites
– Formative evaluation methods used to understand variable implementation

What is Formative Evaluation?
Formative evaluation = an assessment process designed to identify potential and actual influences on the progress and effectiveness of implementation efforts
– Data collection occurs before, during, and after implementation
– Need to be able to answer questions about context, adaptations, and responses to change

Four Stages of Formative Evaluation
– Developmental evaluation
– Implementation-focused evaluation (process evaluation)
– Progress-focused evaluation
– Interpretive evaluation

Simpson Transfer Model
[Diagram: Simpson Transfer Model (STM) stages (Exposure, Adoption, Implementation, Practice), as referenced in the next slide]

Stages of FE (STM) & EQUIP FE Measures
Pre-Implementation (STM: Exposure & Adoption): Developmental
– Field notes
– Documents (minutes, etc.)
– ORC & Burnout Inventory
– Key stakeholder interviews
Implementation (STM: Implementation): Implementation-Focused
– Field notes
– Quality Coordinator logs
– Documents
– Key stakeholder interviews
Implementation (STM: Implementation): Progress-Focused
– QI tools
Post-Implementation (STM: Practice): Interpretive
– Field notes
– Key stakeholder interviews
– ORC & Burnout Inventory
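A minimal sketch of the crosswalk above as a data structure, e.g., to seed a per-stage data collection checklist; the name FE_MEASURES is hypothetical, not from the EQUIP toolkit.

```python
# Hypothetical crosswalk: Simpson Transfer Model (STM) stages mapped to the
# EQUIP formative evaluation (FE) measures listed in the table above.
FE_MEASURES = {
    "Pre-Implementation (Exposure & Adoption): Developmental": [
        "Field notes", "Documents (minutes, etc.)",
        "ORC & Burnout Inventory", "Key stakeholder interviews",
    ],
    "Implementation: Implementation-Focused": [
        "Field notes", "Quality Coordinator logs",
        "Documents", "Key stakeholder interviews",
    ],
    "Implementation: Progress-Focused": ["QI tools"],
    "Post-Implementation (Practice): Interpretive": [
        "Field notes", "Key stakeholder interviews", "ORC & Burnout Inventory",
    ],
}

# Echo the crosswalk as a checklist of measures per stage.
for stage, measures in FE_MEASURES.items():
    print(f"{stage}: {', '.join(measures)}")
```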

Multiple Data Sources: Measuring Implementation
EQUIP data sources, with examples of what each measures:
– Semi-structured interviews (leaders, clinicians, managers): participation, level of implementation
– Organizational site surveys (administrators & staff): clinic structure, processes, change
– Field journals: group-level dynamics, implementation details
– Administrative data: visits, prescriptions
– Patient surveys: PAS
– Activity logs: time spent on aspects of the study

Multiple Data Sources: Strengths and Challenges
– Semi-structured interviews (leaders, clinicians, managers). Strengths: rich data, diverse perspectives. Challenges: expensive, time-consuming.
– Organizational site surveys (administrators & staff). Strengths: site profiles; faster and easier to analyze. Challenges: limited discovery, key-informant view.
– Field journals. Strengths: detailed contextual data. Challenges: variation between observers.
– Administrative data. Strengths: readily available, historical value. Challenges: not QII-specific, local coding differences.
– Patient surveys. Strengths: validate experience, exposure, outcomes. Challenges: expensive, highly sensitive to sample.
– Activity logs. Strengths: capture clinical implementation and dose of effort/time. Challenges: global measure, no detailed dose information.

EQUIP Organizational Climate Measures
– Organizational Readiness for Change (ORC): staff and administrator versions
– Maslach Burnout Inventory
– Administered online
– Pre- and post-implementation

Organizational Readiness for Change
Uses scales related to:
– Motivation for change (program needs, training needs, pressures for change)
– Staff attributes (growth, adaptability)
– Organizational climate (mission, cohesion, autonomy, communication, change)
Purpose is descriptive and to assess change in readiness from pre- to post-implementation
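A minimal scoring sketch, assuming simple averaging of Likert-type items into subscales; the item names and subscale map below are hypothetical, and the published ORC has its own items and scoring rules.

```python
# Minimal sketch (not the official ORC scoring): average Likert-type item
# responses into subscale scores and compare pre- vs. post-implementation.
from statistics import mean

SUBSCALES = {  # hypothetical item-to-subscale map
    "program_needs": ["pn1", "pn2", "pn3"],
    "cohesion": ["co1", "co2", "co3"],
    "communication": ["cm1", "cm2"],
}

def subscale_scores(responses):
    """Average the items belonging to each subscale."""
    return {name: mean(responses[item] for item in items)
            for name, items in SUBSCALES.items()}

pre  = {"pn1": 3, "pn2": 4, "pn3": 3, "co1": 2, "co2": 3, "co3": 3, "cm1": 4, "cm2": 3}
post = {"pn1": 4, "pn2": 4, "pn3": 5, "co1": 3, "co2": 4, "co3": 3, "cm1": 4, "cm2": 4}

pre_s, post_s = subscale_scores(pre), subscale_scores(post)
change = {k: round(post_s[k] - pre_s[k], 2) for k in SUBSCALES}
print(change)  # positive values indicate increased readiness on that subscale
```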

EQUIP Semi-Structured Interviews
– Conducted pre-, mid-, and post-implementation
– Versions for providers, administrators, and VISN leaders
– Covered in consent
– Face-to-face recorded interviews
– Professionally transcribed
– Analyzed after each round

EQUIP Participant Observation: Field Journal
– Primary method of capturing data from participant observation
– "If you didn't write it down in your field notes, then it didn't happen." (at least in terms of data analysis)
– Three kinds of notes:
– Records of events observed and information given
– Records of prolonged activities
– Chronological daily diary

EQUIP Quality Coordinator Logs
– Submitted monthly by the RN Quality Coordinator
– Record what percentage of time was spent on each aspect of the clinical intervention
– Will allow comparison across sites of variation in time spent on clinical activities, and a qualitative check on whether this relates to implementation at each site
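A minimal sketch of how such logs can be rolled up, assuming made-up sites, activity categories, and hours; the real EQUIP log categories may differ.

```python
# Roll monthly Quality Coordinator log entries up to percent of time per
# clinical activity, per site, so cross-site variation can be compared
# alongside the qualitative implementation data. Values are illustrative.
from collections import defaultdict

log_entries = [  # (site, activity, hours)
    ("Site A", "wellness group", 12.0),
    ("Site A", "supported employment", 6.0),
    ("Site A", "care coordination", 6.0),
    ("Site B", "wellness group", 4.0),
    ("Site B", "supported employment", 14.0),
    ("Site B", "care coordination", 2.0),
]

hours = defaultdict(lambda: defaultdict(float))
for site, activity, hrs in log_entries:
    hours[site][activity] += hrs

for site, acts in hours.items():
    total = sum(acts.values())
    pcts = {activity: round(100 * h / total, 1) for activity, h in acts.items()}
    print(site, pcts)
```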

Critical Measures of Implementation
Integrity of innovation
– Fidelity to the planned implementation strategy
– Dose of intervention delivery, when variability is possible
– Requires clear operational definitions of intervention components
Exposure to innovation
– Degree to which the intervention is experienced by targeted users
– Dose of exposure, when variability is possible
– Requires clear operational definitions for measuring intervention exposure
Intensity of implementation
– E.g., implementation or intensity scores for multifaceted interventions
– E.g., "goal attainment scaling" when the strategy allows local adaptation or a choice of alternative interventions across sites
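As one illustration of the last point, goal attainment scaling is often summarized with the Kiresuk-Sherman T-score; the sketch below implements that widely used formula under hypothetical ratings and weights, not an EQUIP-specific scoring rule.

```python
# Kiresuk-Sherman goal attainment scaling (GAS) T-score: a common summary
# when sites adapt or choose different goals. rho = 0.3 is the conventional
# default for the assumed inter-goal correlation.
from math import sqrt

def gas_t_score(ratings, weights, rho=0.3):
    """T = 50 + 10*sum(w*x) / sqrt((1 - rho)*sum(w^2) + rho*(sum(w))^2),
    where each rating x is on the -2 (much worse) .. +2 (much better) scale."""
    numerator = 10 * sum(w * x for w, x in zip(weights, ratings))
    denominator = sqrt((1 - rho) * sum(w * w for w in weights)
                       + rho * sum(weights) ** 2)
    return 50 + numerator / denominator

# Three goals: two at the expected level (0), one exceeded (+1, double weight).
print(round(gas_t_score(ratings=[0, 0, 1], weights=[1, 1, 2]), 1))  # 56.7
```

A score of 50 means goals were attained exactly as expected; scores above or below 50 indicate over- or under-attainment, which makes the metric comparable across sites that chose different goals.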

Triangulation
Critical to collect information about implementation from multiple sources
– Be prepared for disagreement
– Perspectives and opportunities for observation differ for managers and providers vs. patients
Recognize differences between the "exposed" sample and the practice population
– Does the "enrolled" group represent the practice?
– Did the intervention penetrate among all providers?
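A minimal sketch of one such check, assuming hypothetical per-provider counts: compare enrolled ("exposed") patients against the eligible practice population to flag low penetration.

```python
# Compare the enrolled sample against the eligible practice population,
# per provider, to check representativeness. Counts are illustrative.
eligible = {"Provider 1": 40, "Provider 2": 35, "Provider 3": 50}
enrolled = {"Provider 1": 30, "Provider 2": 5, "Provider 3": 25}

for provider, n_eligible in eligible.items():
    rate = enrolled.get(provider, 0) / n_eligible
    flag = "  <- low penetration" if rate < 0.25 else ""
    print(f"{provider}: {rate:.0%} of eligible patients enrolled{flag}")
```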

Telling the story of variable implementation
Examine the range of data sources as a team
– Throughout the course of data collection
– Discuss which data sources answer which questions
Examine which data sources are complementary
– Which data sources should be triangulated?
– What questions are raised, or what answers are provided?

Telling the story of variable implementation
Use qualitative data analysis software to facilitate mixed-methods analysis
– Multiple data sources
– Multiple grouping options (e.g., by site, by stakeholder, by data collection time point)
– Team-based analysis
– Ongoing, iterative analysis informing implementation efforts
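A minimal sketch of the grouping idea itself, independent of any particular QDA package; the records and codes below are illustrative, not study data.

```python
# Coded excerpts tagged with site, stakeholder, and time point can be
# regrouped along any of those dimensions during team-based, iterative analysis.
from collections import defaultdict

excerpts = [
    {"site": "A", "stakeholder": "provider", "time": "pre", "code": "leadership support"},
    {"site": "A", "stakeholder": "administrator", "time": "mid", "code": "staffing barriers"},
    {"site": "B", "stakeholder": "provider", "time": "pre", "code": "staffing barriers"},
]

def group_by(records, key):
    """Collect the codes applied within each value of the given tag."""
    groups = defaultdict(list)
    for record in records:
        groups[record[key]].append(record["code"])
    return dict(groups)

print(group_by(excerpts, "site"))         # by site
print(group_by(excerpts, "stakeholder"))  # by stakeholder group
print(group_by(excerpts, "time"))         # by data collection time point
```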

Software Support: ATLAS.ti

Telling the story of variable implementation
Audience considerations
– Throughout the course of data collection
– Which data sources answer which questions, and for whom
– The issue of providing feedback to sites
Product considerations
– Which data sources should be triangulated?
– What questions are raised, or what answers are provided?
– How much, and what, should go into which products?