Julie C. Lowery, PhD, MHSA, Associate Director, VA CCMR; Co-Implementation Research Coordinator, VA Diabetes QUERI; HSR&D Center for Clinical Management Research.

Julie C. Lowery, PhD, MHSA. Associate Director, VA CCMR; Co-Implementation Research Coordinator, VA Diabetes QUERI. HSR&D Center for Clinical Management Research, VA Ann Arbor Healthcare System.

- A comprehensive framework to promote consistent use of constructs, terminology, and definitions
- Consolidates existing models and frameworks
- Comprehensive in scope
- Can tailor use to each project
Damschroder L, Aron D, Keith R, Kirsh S, Alexander J, Lowery J: Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implementation Science 2009, 4:50.

- Intervention: 8 constructs (e.g., evidence strength & quality, complexity)
- Outer Setting: 4 constructs (e.g., patient needs & resources)
- Inner Setting: 14 constructs (e.g., leadership engagement, available resources)
- Individuals Involved: 5 constructs (e.g., knowledge, self-efficacy)
- Process: 8 constructs (e.g., plan, engage, champions)

- Embraces, consolidates, and standardizes key constructs from other models
- Agnostic to specific models and theories
- Provides a pragmatic structure for evaluating complex implementations
- Helps to organize findings across disparate implementations
- Paves the way for cross-study synthesis

- Consists of 39 individual constructs
- Cannot use them all in every study; not all will apply
- Conduct an a priori assessment of which constructs to include
- Interviews
- Survey
- Develop an interview guide based on the key constructs (see the sketch below)
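
As an illustration, here is a minimal Python sketch of one way to represent an a priori selection of CFIR constructs and turn it into a skeleton interview guide. The construct names come from CFIR, but the particular selection, the function name, and the prompt wording are hypothetical.

```python
# Hypothetical a priori selection of CFIR constructs, grouped by domain.
SELECTED_CONSTRUCTS = {
    "Intervention Characteristics": ["Evidence Strength & Quality", "Relative Advantage"],
    "Outer Setting": ["Patient Needs & Resources"],
    "Inner Setting": ["Leadership Engagement", "Available Resources"],
    "Process": ["Planning", "Reflecting & Evaluating"],
}

def build_interview_guide(selection):
    """Return one open-ended prompt per selected construct (illustrative wording only)."""
    guide = []
    for domain, constructs in selection.items():
        for construct in constructs:
            guide.append(f"[{domain}] Tell me about {construct.lower()} "
                         "as it relates to this program at your site.")
    return guide

if __name__ == "__main__":
    for prompt in build_interview_guide(SELECTED_CONSTRUCTS):
        print(prompt)
```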

- Intervention Source (perception of key stakeholders about whether the Mini-Residency program was developed externally or internally): {Choose one} ( ) Very Unimportant ( ) Unimportant ( ) Neutral ( ) Important ( ) Very Important
- Evidence Strength & Quality (stakeholders' perceptions of the quality and validity of evidence, whether from the published literature, clinical experience, or other local evidence or experience, supporting the belief that the Mini-Residency program will have the desired outcomes)

- Purposive sample of low- and high-uptake sites
- Semi-structured interviews with key stakeholders
- Rapid analysis (see the sketch below)
  - Analyze notes instead of verbatim transcripts
  - Use a prescribed coding template: CFIR
  - Analyze data on an ongoing basis
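
As an illustration, a minimal Python sketch of what a CFIR-based rapid-analysis coding template might look like, with one note slot per construct per respondent. The construct subset, respondent ID, and note text are hypothetical.

```python
# Hypothetical subset of CFIR constructs used as the coding template.
TEMPLATE_CONSTRUCTS = ["Leadership Engagement", "Available Resources", "Patient Needs & Resources"]

def new_template_row(respondent_id):
    """Blank rapid-analysis row for one respondent: one empty note slot per construct."""
    return {"respondent": respondent_id, **{c: "" for c in TEMPLATE_CONSTRUCTS}}

# Interview notes are summarized straight into the template,
# rather than waiting for verbatim transcripts.
row = new_template_row("Site01-Respondent02")
row["Leadership Engagement"] = "Chief of Staff raises the program at executive meetings."
print(row)
```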

Two pairs of analysts do the coding and analysis. For each respondent:
- Each analyst independently codes
- The pair meets to compare and achieve consensus on coding

For each respondent and each construct:
- Two analysts independently rate
- Meet to compare and achieve consensus on the rating (a sketch of the disagreement check follows below)
- Create a summary memo with supporting quotes and recommendations
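
As an illustration, a minimal Python sketch of the comparison step: two analysts rate each construct independently, and any constructs they rate differently are flagged for the consensus discussion. The analyst ratings and construct names are hypothetical.

```python
# Each analyst's independent ratings for one respondent (hypothetical values on the -2..+2 scale).
analyst_a = {"Leadership Engagement": 2, "Available Resources": -1, "Planning": 1}
analyst_b = {"Leadership Engagement": 2, "Available Resources": 1, "Planning": 1}

def disagreements(ratings_a, ratings_b):
    """Constructs the two analysts rated differently; these go to the consensus meeting."""
    constructs = set(ratings_a) | set(ratings_b)
    return {c: (ratings_a.get(c), ratings_b.get(c))
            for c in constructs if ratings_a.get(c) != ratings_b.get(c)}

print(disagreements(analyst_a, analyst_b))
# {'Available Resources': (-1, 1)} -> discuss and record one consensus rating
```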

- Think of CFIR constructs as independent variables
- Implementation effectiveness = f(construct 1, construct 2, …)
- Ratings provide the ordinal values of the independent variables (see the sketch below)
  - Is the construct a positive or negative force in the organization?
  - Does it manifest strongly or weakly?
  - Is the construct present but neutral?
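
As an illustration, a minimal Python sketch of treating facility-level construct ratings as ordinal independent variables feeding some function f of implementation effectiveness. The additive score below is only one possible, deliberately naive, choice of f, and the construct names and values are hypothetical.

```python
# Hypothetical facility-level ratings, each on the -2..+2 ordinal scale.
site_ratings = {"Leadership Engagement": 2, "Available Resources": -1, "Tension for Change": 0}

def naive_effectiveness_score(ratings):
    """One deliberately simple f: sum the ordinal ratings across constructs."""
    return sum(ratings.values())

print(naive_effectiveness_score(site_ratings))  # 1
```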

- Positive: facilitating influence on implementation; Negative: impeding influence on implementation (rubric sketched in code below)
- Strong: +2 or -2 (specific statements, concrete examples)
- Weak: +1 or -1 (general statements without specific examples)
- Neutral: present but no influence (0); mixed remarks (*: these are important at the organizational level)
- Missing: no information obtained, or respondent not knowledgeable
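
As an illustration, a minimal Python sketch of the rating rubric above, mapping valence and strength of evidence onto the -2..+2 scale. The function and parameter names are hypothetical.

```python
def assign_rating(valence, specific_examples=False, present=True):
    """Map rubric judgments to a rating; valence is 'positive', 'negative', or 'neutral'."""
    if not present:
        return None      # missing: no information obtained or respondent not knowledgeable
    if valence == "neutral":
        return 0         # present but no influence, or mixed remarks
    magnitude = 2 if specific_examples else 1   # strong (concrete examples) vs. weak (general statements)
    return magnitude if valence == "positive" else -magnitude

print(assign_rating("positive", specific_examples=True))   # +2
print(assign_rating("negative", specific_examples=False))  # -1
```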

CFIR Construct | Rating (-2, -1, 0, +1, +2) plus Summary of Rationale* | Recommendation(s)**
- Reflecting & Evaluating: "No, no, I really haven't received any data. They haven't included e-consults on any reports. Would be good to have a website to go for any info but I am not aware of anything at this point. Certainly resource tool available would be helpful."
- Patient Needs & Resources: "How are patients involved? At this point, not that involved, to be frank. I think they should be more involved. They are not used to that when Patient asks to see spec, has to explain, they get electronic apt. I would think that specialist would give Patient a call but they give back to PC and give advice. Patient should be. We should get a Patient satisfaction survey, are they satisfied are not involved as they, I…"

For each construct:
- Review ratings across respondents, considering each respondent's:
  - Involvement
  - Role
  - Knowledge
- Decide on a facility-level rating (a "weighted average" across respondents; see the sketch below)
- Pull the most important quotes from the respondent memos to justify each rating at the facility level
- Add recommendations
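
As an illustration, a minimal Python sketch of rolling respondent-level ratings up to a facility-level rating for one construct, weighting respondents by their involvement, role, and knowledge. The weights and ratings are hypothetical; in practice the final rating is set by team consensus rather than by a formula.

```python
# Hypothetical respondent-level ratings for one construct at one facility.
respondent_ratings = [
    {"rating": 2, "weight": 3},   # highly involved, knowledgeable implementation lead
    {"rating": 1, "weight": 2},   # moderately involved clinician
    {"rating": -1, "weight": 1},  # peripherally involved respondent
]

def facility_rating(ratings):
    """Weighted average of respondent ratings, rounded back to the ordinal scale."""
    total_weight = sum(r["weight"] for r in ratings)
    weighted_sum = sum(r["rating"] * r["weight"] for r in ratings)
    return round(weighted_sum / total_weight)

print(facility_rating(respondent_ratings))  # 1
```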

- Identify constructs that appear to correlate with implementation success/uptake (see the sketch below)
- Returning to the facility-level memos (or the respondent memos, if necessary), describe how these constructs are manifested at high-implementation sites (facilitators) and at low-implementation sites (barriers)
- Develop recommendations for dissemination based on these findings
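
As an illustration, a minimal Python sketch of one way to flag constructs whose facility-level ratings differ between high- and low-uptake sites. The sites, ratings, and threshold are hypothetical; the analysis described above rests on qualitative comparison of the facility-level memos, not on this calculation.

```python
# Hypothetical facility-level ratings, grouped by uptake.
low_sites = {"Leadership Engagement": [-2, -1], "Tension for Change": [0, 0]}
high_sites = {"Leadership Engagement": [2, 2], "Tension for Change": [1, 1]}

def distinguishing_constructs(low, high, threshold=2.0):
    """Constructs whose mean rating differs between high- and low-uptake sites by >= threshold."""
    flagged = {}
    for construct in low:
        gap = (sum(high[construct]) / len(high[construct])
               - sum(low[construct]) / len(low[construct]))
        if abs(gap) >= threshold:
            flagged[construct] = gap
    return flagged

print(distinguishing_constructs(low_sites, high_sites))  # {'Leadership Engagement': 3.5}
```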

CFIR Constructs: facility-level ratings at Low- vs. High-uptake sites (Low | High)
I. INTERVENTION CHARACTERISTICS
- Relative advantage: -2, 1 | 2, 2
II. OUTER SETTING
- Patient needs & resources: -2, 0 | 2, 2
- External Policy & Incentives: -2, 0 | 1
III. INNER SETTING
- Networks and communications: -2 | 2, 2
- Implementation Climate
  - Tension for change: 0, 0 | 1, 1
  - Relative priority: -2 | 1, 2
  - Goals and feedback: -2 | 1, 2
  - Learning climate: N/A | 1, 2
- Readiness for Implementation
  - Leadership Engagement: -2 | 2, 2
  - Available resources: -2 | 1
V. PROCESS
- Planning: N/A | 1, 1
- Executing: -2, 1 | 2, 2
- Reflecting & Evaluating

- "Chief of Staff as well has recognized the value. Not cracking the whip, but making sure there's motivation…Especially at executive meetings showing his wide support."
- "It's exciting to see when people are excited about an initiative and embrace it. Had all the chiefs sitting in the room to sell the system…Leadership provided FTE after they saw the additional workload generated."
- "The support of leadership/administration moving forward and encouraging specialists to go forward implementing E-consults. Primarily this is the Chief of Staff…The medical director is always looking at improving access, so this is a natural involvement for him."

Sites could not identify a local champion and/or perceived little support from top leadership. Another common negative comment was that leadership was too focused on the numbers, not on the quality of care.
- "More focused on meeting targets than quality."
- In response to a question about leadership involvement: "Not really. Their involvement is more on the side of the statistics of what we are using."

- Leadership's focus on achieving a certain number of E-Consults is not viewed as supportive. If leadership wants to demonstrate their interest, they need to derive better metrics for tracking implementation, preferably ones that focus on the appropriateness and quality of the consults and on patient and provider satisfaction.
- Leadership needs to recognize the additional time that may be required for PCPs to collect the additional data that specialists need and to implement specialists' recommendations.
- Leadership should keep the program in the forefront of meetings/discussions with clinicians and clinical leadership, linking the success of the program to VHA and medical center goals of improving access to quality care for our Veterans.

- Continue to refine CFIR constructs & the rating process
- Link to quantitative measures to increase the efficiency of the process
- Build a repository of findings and continue to add study findings
  - Tool for researchers & practitioners
- Use the repository to conduct syntheses across studies
- Use more sophisticated and rigorous analyses, e.g., qualitative comparative analysis (QCA), to develop higher-order theories (see the sketch below)
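
As an illustration, a minimal Python sketch of the kind of data structure a crisp-set, QCA-style analysis might start from: facility-level construct ratings dichotomized into present/absent conditions and grouped into a truth table. The sites and values are hypothetical, and a real QCA would involve dedicated software and careful calibration.

```python
# Hypothetical facility-level ratings plus an implementation-success flag per site.
sites = {
    "Site A": {"Leadership Engagement": 2, "Available Resources": 1, "success": True},
    "Site B": {"Leadership Engagement": -2, "Available Resources": 1, "success": False},
    "Site C": {"Leadership Engagement": 2, "Available Resources": -1, "success": True},
}

def truth_table(site_data, conditions):
    """Group sites by their combination of present (rating > 0) vs. absent conditions."""
    table = {}
    for name, data in site_data.items():
        row = tuple(data[c] > 0 for c in conditions)
        table.setdefault(row, []).append((name, data["success"]))
    return table

for row, members in truth_table(sites, ["Leadership Engagement", "Available Resources"]).items():
    print(row, members)
```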

Linking CFIR constructs to quantitative measures

Program | Mode | Dose | Content
MOVE!® Weight Management | On-site | Weekly hr in-person group sessions; 6-14 weeks | Weight loss; DPP-inspired; multi-disciplinary team
TeleMOVE | In-home devices | 1 message/day for 82 days; daily workbook lesson; 3 x min monthly calls | Daily psycho-educational content; safety checks; motivational and problem-solving support
Telephone Lifestyle Coaching (TLC) | Telephone | 10 x 20-min sessions; 6 months; unlimited inbound calls | 6 topics; MI coaching; coach continuity

Study: MOVE! | TeleMOVE | TLC. Constructs compared: Structural Characteristics; Networks & Communications; Tension for Change; Compatibility; Relative Priority; Goals & Feedback; Learning Climate; Leadership Engagement; Available Resources. Legend: Strongly Distinguishes / Weakly Distinguishes / Not assessed.

- The CFIR Wiki will promote:
  - Shared definitions
  - Operationalization of definitions
  - Repository of findings
  - Predictive modeling
  - Site-specific "System-change likelihood Indices" [1]
- Which will result in…
  - …more reliable implementation strategies
  - …more generalized knowledge about what works where and why
1. Davidoff F: Heterogeneity is not always noise: lessons from improvement. JAMA 2009, 302.

Visit the CFIR Wiki: