Session 4: Frameworks Used in Clinical Settings, Part 2. Janet Myers, PhD, MPH ● September 27, 2012



Session 4 Overview
- Review the rationale for frameworks
- Fidelity frameworks: Fidelity of Implementation (FOI); Framework for Implementation Fidelity
- PARIHS framework: Promoting Action on Research Implementation in Health Services
- ORCA framework: Organizational Readiness to Change Assessment
- Frameworks exercise

Review – Why frameworks?
1. Planning: to guide the selection and tailoring of programs or interventions.
2. Implementation: to understand program "theory," which can improve implementation and guide the timing/stages of implementation.
3. Evaluation: as a guide to evaluation:
   - Suggests formative evaluation/diagnostic analysis
   - Guides the development of hypotheses to test
   - Facilitates interpretation of process and outcomes, and of the relationship between the two

FOI: Fidelity of Implementation
- Aims to explain the degree to which evidence-based interventions succeed or fail
- FOI operates between context and program effectiveness

Framework for Implementation Fidelity
Source: Carroll et al., “A conceptual framework for implementation fidelity,” Implementation Science 2007, 2:40.
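Carroll et al. measure fidelity primarily as adherence to the intervention's content, coverage, frequency, and duration. That idea can be sketched as a simple planned-versus-delivered measurement; the dimension names, counts, and capped-ratio score below are illustrative assumptions, not part of the published framework:

```python
# Illustrative sketch only. The dimension names (content components,
# sessions, minutes) and the capped planned/delivered ratio are
# hypothetical stand-ins for Carroll et al.'s adherence dimensions.

PLANNED = {"content_components": 6, "sessions": 12, "minutes_per_session": 45}

def adherence(delivered, planned=PLANNED):
    """Per-dimension ratio of delivered to planned, capped at 1.0."""
    return {k: min(delivered[k] / planned[k], 1.0) for k in planned}

delivered = {"content_components": 5, "sessions": 9, "minutes_per_session": 45}
result = adherence(delivered)  # sessions delivered at 75% of plan
```

A profile like this (rather than a single number) keeps visible *which* dimension of delivery fell short, which is the diagnostic point of a fidelity framework.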

SmartSTEPS ATSM Model

Conceptual Framework for Evaluating Intervention Fidelity of SMARTsteps

PARIHS

PARiHS Framework Elements
- Evidence
- Context
- Facilitation
Each element is rated along a continuum from weak to strong support for implementation.

Evidence Sub-elements
- Research evidence. Weak: anecdotal, descriptive evidence. Strong: RCTs, evidence-based guidelines.
- Clinical experience. Weak: expert opinion divided. Strong: consensus.
- Patient preferences and experiences. Weak: patients not involved. Strong: partnership with patients.
- Local information.

Context Sub-elements
- Culture. Weak: task-driven, low morale. Strong: learning organization, patient-centered.
- Leadership. Weak: poor organization, diffuse roles. Strong: clear roles, effective organization.
- Evaluation. Weak: absence of audit and feedback. Strong: routine audit and feedback.

Facilitation Sub-elements
- Characteristics (of the facilitator). Weak: low respect, credibility, empathy. Strong: high respect, credibility, empathy.
- Role. Weak: lack of role clarity. Strong: clear roles.
- Style. Weak: inflexible, sporadic. Strong: flexible, consistent.

PARiHS Framework: Elements and Sub-elements
- Evidence: research, clinical experience, patient experience, local knowledge
- Context: culture, leadership, evaluation
- Facilitation: characteristics, role, style
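As a rough illustration, the elements and sub-elements above can be held in a small data structure and scored on a weak-to-strong continuum. The 1–5 rating scale, the threshold, and the example ratings here are hypothetical assumptions for the sketch, not part of PARiHS itself:

```python
# Hypothetical sketch: PARiHS elements and sub-elements as a data structure.
# The 1-5 scale, the 3.0 weak/strong threshold, and the example ratings
# are assumptions for illustration only.

PARIHS_SUBELEMENTS = {
    "evidence": ["research", "clinical_experience", "patient_experience", "local_knowledge"],
    "context": ["culture", "leadership", "evaluation"],
    "facilitation": ["characteristics", "role", "style"],
}

def element_strength(ratings, element, threshold=3.0):
    """Mean sub-element rating for one element, labelled weak or strong
    against an assumed threshold."""
    subs = PARIHS_SUBELEMENTS[element]
    mean = sum(ratings[s] for s in subs) / len(subs)
    return mean, ("strong" if mean >= threshold else "weak")

ratings = {
    "research": 4, "clinical_experience": 3, "patient_experience": 2, "local_knowledge": 3,
    "culture": 2, "leadership": 2, "evaluation": 1,
    "characteristics": 4, "role": 4, "style": 3,
}
```

A diagnostic profile like this mirrors how PARiHS is used: implementation is expected to go best when all three elements sit toward the strong end of the continuum.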

PARIHS Causal Pathways (from Kitson et al.)

Limitations
- Applied only post hoc
- Study designs: only cross-sectional or retrospective
- Lack of conceptual clarity

ORCA
- Assesses three major scales corresponding to the core elements of PARIHS
- Evidence: published research, professional knowledge/competence, patient preferences, local context
- Context: organizational culture, leadership, evaluation/feedback
- Facilitation: internal and external factors
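A minimal sketch of ORCA-style scoring, assuming Likert-type item responses grouped into subscales and averaged up to the three scales. The item groupings, example responses, and averaging rule are illustrative assumptions, not the published ORCA scoring procedure:

```python
# Hypothetical sketch of scale scoring for an ORCA-style instrument.
# Subscale names, responses, and the mean-of-means rule are assumptions.

def subscale_mean(responses):
    """Mean of the item responses in one subscale."""
    return sum(responses) / len(responses)

def score_orca(scales):
    """scales: {scale: {subscale: [item responses]}}
    -> {scale: mean of its subscale means}."""
    return {
        scale: sum(subscale_mean(r) for r in subs.values()) / len(subs)
        for scale, subs in scales.items()
    }

example = {
    "evidence": {"research": [4, 5], "patient_preferences": [3, 3]},
    "context": {"culture": [2, 4], "leadership": [3, 3], "evaluation": [1, 3]},
    "facilitation": {"internal": [4, 4], "external": [2, 4]},
}
scores = score_orca(example)  # one readiness score per PARIHS core element
```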

Table 2: Planned Data Collection for Each Element of the Conceptual Framework (PARIHS) and Fidelity of Implementation (FOI)

Data sources: patient exit interviews, health care worker (HCW) interviews, and data abstraction from the EMR.

Predictor variables (PARIHS core elements):
- Evidence: the strength and nature of the evidence as perceived by multiple stakeholders.
  - Patient exit interviews: ORCA-adapted measures of patient preferences.
  - HCW interviews: ORCA measures of clinical experience, professional knowledge, and perception of PP evidence.
  - EMR abstraction: routine information derived from the local practice context.
- Context: the quality of the environment in which the research is implemented.
  - Patient exit interviews: patient ratings of the types of information collected (e.g., risk assessment) and how it is used in the setting.
  - HCW interviews: ORCA measures of HCW perceptions of organizational culture and leadership (requires input from multiple levels of HCWs); ratings of the types of data collected and how they are used in the setting.
  - EMR abstraction: use of risk-assessment fields in the EMR; feedback to clinical team members regarding data completeness and clinical progress.
- Facilitation: processes by which implementation is facilitated.
  - ORCA measures of the enabling and interactive features of internal and external facilitation.

Outcome variables (composite FOI score):
- Fidelity of Implementation: HCW satisfaction with the intervention; intervention delivery quality and consistency.
  - Quality: reports of the quality of intervention delivery.
  - Satisfaction: HCW reports of satisfaction with the intervention.
  - Consistency: proportion of patients seen who received intervention messages.
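Table 2's outcome is a composite FOI score built from quality, satisfaction, and consistency. One way such a composite could be computed is sketched below; the 1–5 rating scale, the 0–1 rescaling, and the equal weights are assumptions for illustration, not taken from the study protocol:

```python
# Hypothetical sketch of a composite FOI score. The rescaling of 1-5
# ratings to 0-1 and the equal weighting are illustrative assumptions.

def composite_foi(quality, satisfaction, consistency, weights=(1/3, 1/3, 1/3)):
    """quality, satisfaction: mean ratings on an assumed 1-5 scale;
    consistency: proportion of patients who received intervention messages.
    Returns a weighted composite on a 0-1 scale."""
    q = (quality - 1) / 4       # rescale 1-5 rating to 0-1
    s = (satisfaction - 1) / 4
    components = (q, s, consistency)
    return sum(w * c for w, c in zip(weights, components))

score = composite_foi(quality=4.0, satisfaction=4.5, consistency=0.80)
```

Keeping the three components separate before combining them preserves the distinction the table draws between delivery quality, HCW satisfaction, and reach.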