Advanced Methods in Delivery System Research – Planning, Executing, Analyzing, and Reporting Research on Delivery System Improvement. Webinar #4: Formative Evaluation.

Presentation transcript:

Advanced Methods in Delivery System Research – Planning, Executing, Analyzing, and Reporting Research on Delivery System Improvement
Webinar #4: Formative Evaluation
Presenter: Jeffrey Smith, PhD(c)
Discussant: Cheryl McDonnell, PhD
Moderator: Michael I. Harrison, PhD
Sponsored by AHRQ’s Delivery System Initiative in partnership with the AHRQ PCMH program
July 15, 2013

Speaker Introductions
Jeffrey Smith, PhD candidate, is Implementation Research Coordinator (IRC) for the Department of Veterans Affairs’ Mental Health Quality Enhancement Research Initiative (QUERI). Jeff’s presentation today will draw on his paper with Kristin Geonnotti, Deborah Peikes, and Winnie Wang on formative evaluation. This AHRQ PCMH Research Methods Brief is posted on the AHRQ PCMH website.
Cheryl McDonnell, PhD, is an experimental psychologist with over 30 years’ experience in the evaluation and management of large-scale public health projects. Her presentation today draws on her work on an AHRQ grant entitled Accelerating Implementation of Comparative Effectiveness Findings on Clinical and Delivery System Interventions by Leveraging AHRQ Networks.

Jeffrey L. Smith, PhD(c) Implementation Research Coordinator VA Mental Health QUERI Formative Evaluation in Implementation Research: An Overview

Objectives
Describe goals of evaluation in implementation science
Offer perspectives on what constitutes ‘successful implementation’
Describe 4 stages of formative evaluation

Goals of Evaluation in Implementation Science
Conduct formative evaluation
–Rigorous assessment process designed to identify potential and actual influences on the progress and effectiveness of implementation efforts (Stetler et al., JGIM 2006; 21(Suppl 2):S1-8)
Conduct summative evaluation
–Systematic process of collecting and analyzing data on impacts, outputs, products, outcomes, and costs in an implementation study
Evaluate usefulness of selected theory, in terms of…
–Planning the implementation strategy
–Unanticipated elements critical to successful implementation, but unexplained by the selected theory
–Helping to understand findings and relationships between domains or constructs

What is Successful Implementation?
Implementation plan and its realization
Evidence-based practice (EBP) innovation uptake
–i.e., clinical interventions and/or delivery system interventions
Patient and organizational outcomes achievement

Adapted from: Lukas CV, Hall C. Challenges in Measuring Implementation Success. 3rd Annual NIH Conference on the Science of Implementation and Dissemination: Methods and Measurement. March 15-16, Bethesda, MD. Does the concept of implementation success apply to implementation strategy as well as to the innovation?

Four Stages of Formative Evaluation (FE)
Developmental
Implementation-Focused
Progress-Focused
Interpretive

Developmental Formative Evaluation
aka “needs assessment”, “organizational diagnosis”
Involves data collection on…
–Actual degree of less-than-best practice (need for improvement)
–Determinants of current practice (including context)
–Potential barriers / facilitators to practice change
–Feasibility of (initial) implementation strategy
Goals
–Identify determinants and potential problems and try to address them in the implementation strategy; refine the strategy as needed
–Avoid negative unintended consequences
–Engage stakeholders in defining the problem and potential solutions

Implementation-Focused Formative Evaluation
Occurs during implementation of the project plan
Focuses on assessing discrepancies between the implementation plan and its execution
Enables researchers to…
–Ensure fidelity (both to the implementation strategy and the clinical intervention)
–Understand the nature and implications of local adaptation
–Identify barriers
–Identify new intervention components or refine the original strategy to optimize potential for success
–Identify critical details necessary to replicate the implementation strategy in other settings

Progress-Focused Formative Evaluation
Occurs during implementation of the project plan
Focuses on monitoring indicators of progress toward implementation or clinical quality improvement (QI) goals
–Audit/feedback of clinical performance data
–Progress in relation to pre-determined timelines for implementing intervention components
Used to inform the need to modify or refine the original strategy
May also be used as positive reinforcement for high-performing sites and negative reinforcement for low performers

Interpretive Evaluation
Uses data from the other stages, plus data collected from stakeholders at the end of the project
Obtain stakeholder views on:
–Usefulness or value of the intervention
–Barriers and facilitators to implementation success or failure
–Satisfaction with the implementation strategy
–Recommendations for refinements to the implementation strategy
Can provide working hypotheses on implementation success / failure

Formative Evaluation Assessment Methods / Tools
Quantitative
–Structured surveys / tools: instruments assessing context (e.g., organizational culture, readiness to change), provider receptivity to evidence-based practices, and intervention fidelity measures
–Audit / feedback of clinical performance data
Qualitative
–Semi-structured interviews with clinical stakeholders (pre-/post-)
–Focus groups
–Direct (non-participant) observation of clinical structure and processes during site visits
–Document review
Mixed methods (i.e., quantitative + qualitative)

Stages of Formative Evaluation

Limitations
Requires additional time and resources
Methodological challenges
Necessitates care in interpreting results
–Intermediate vs. final results
Preserving objectivity
FE is part of the intervention

Advantages
Increase understanding of key barriers and facilitators to implementation
Facilitate mid-stream modifications
–Process for adapting tools and strategies to increase chances of implementation success
Refine complex interventions
–Patient-Centered Medical Home (PCMH) interventions: multiple components, new roles for clinical staff, variable local resources to support implementation

QUESTIONS?

Cheryl McDonnell, PhD James Bell Associates

Grant Overview
Accelerating Implementation of Comparative Effectiveness Findings on Clinical and Delivery System Interventions by Leveraging AHRQ Networks (R18)
Project Officer: Dina Moss
Purpose: Spread CER findings by leveraging the capacities of multi-stakeholder or multi-site networks
Goal: Implement existing evidence

Evaluation Objectives
Identify effective methods of dissemination and diffusion of evidence-based practices, and barriers and facilitators to diffusion
The evidence-based practices included activities intended to assist clinical providers and/or patients to:
–Choose a course of treatment
–Identify the most effective method of screening for a disease within a population
–Change the process of care delivery
–Promote self-management of chronic diseases

Grantee Projects
Leveraging PBRNs for Chronic Kidney Disease Guideline Dissemination: James Mold, MD
Comparative Effectiveness of Asthma Interventions Within an AHRQ PBRN: Michael Dulin, MD
The Teen Mental Health Project: Dissemination of a Model for Adolescent Depression Screening & Management in Primary Care: Ardis Olson, MD
Partners in Integrated Care (PIC): Keith Kanel, MD
Accelerating Utilization of Comparative Effectiveness Findings in Medicaid Mental Health: Stephen Crystal, PhD
Cardiac Surgery Outcomes – Comparing CUSP and TRiP to Passive Reporting: Peter Pronovost, MD, and David Thompson, DNSc

CER Dissemination Grants
Examples of ‘T3’ phase translational research incorporating:
–Effectiveness
–Dissemination
–Implementation
Applied knowledge about interventions in a real-world setting
At the ‘make it happen’ end of the continuum

Formative Evaluation Approach
Mixed methods
–Qualitative
–Quantitative
Four areas of focus
–Needs assessment
–Evaluability assessment
–Implementation evaluation
–Process evaluation

Greenhalgh et al 2004

Formative Evaluation Tasks
Ensure an evaluation is feasible
Determine the extent to which the program is being implemented according to plan, on an ongoing basis
Assess and document the degree of fidelity and variability in program implementation, whether expected or unexpected, planned or unplanned
Provide information on which components of the intervention are potentially responsible for outcomes

Formative Evaluation Tasks (cont.)
Describe the relationship between program context (i.e., setting characteristics) and program processes (i.e., levels of implementation)
Provide feedback on the status of implementation
Identify barriers and facilitators to implementation
Refine the delivery components
Provide program accountability to stakeholders and other funders

Initial Site Visit Focus
Evidence base
Value and relevance of the evaluation process to the implementation team
Identified outputs
Role of the research team in the implementation process
Perceived degree of influence of the PI
Scalability of the intervention
Organizational variables

Ongoing Monitoring
Measurement of outputs
–Number of clinics enrolled / services delivered / training sessions completed / meetings held
–Frequency / duration / dosage
Measurement of study characteristics
–Clinics participating
–Clients served
–Staff involved
Integration of tracking and reporting
Site visits
Input from external expert advisors

Follow-up Site Visits
Current status
–Evidence base
–External context
–Partnerships / collaborations
–Study design
Progress to date
–Identified barriers / proposed solutions
–Identified facilitators

Common Challenges
Resource constraints
IT integration challenges
Infrastructure limitations
Communication and collaboration
Intervention fidelity vs. flexibility
Practice site engagement and sustainability

Thank you for attending!
For more information about the AHRQ PCMH Research Methods briefs, please visit:
/community/pcmh__home/1483/pcmh_evidence___evaluation_v2