Implementing Evidence-Based Practices


Implementing Evidence-Based Practices Our Obligation to Program Fidelity Kimberly Gentry Sperber, Ph.D.

Efforts To Date
“What Works” literature and the Principles of Effective Intervention
Growing evidence base from individual program evaluations and meta-analyses
A continuing gap between science and practice: few programs score as satisfactory on the Correctional Program Assessment Inventory (CPAI)

CPAI Data
Hoge, Leschied, and Andrews (1993) reviewed 135 programs assessed with the CPAI: 35% received a failing score; only 10% received a score of satisfactory or better.
Holsinger and Latessa (1999) reviewed 51 programs assessed with the CPAI: 60% scored as “satisfactory but needs improvement” or “unsatisfactory”; only 12% scored as “very satisfactory.”

CPAI Data Continued
Gendreau and Goggin (2000) reviewed 101 programs assessed with the CPAI: mean score of 25%; only 10% received a satisfactory score.
Matthews, Hubbard, and Latessa (2001) reviewed 86 programs assessed with the CPAI: 54% scored as “satisfactory” or “satisfactory but needs improvement”; only 10% scored as “very satisfactory.”

Fidelity Research
Landenberger and Lipsey (2005): the brand of CBT did not matter, but the quality of implementation did. Implementation was defined as a low dropout rate, close monitoring of quality and fidelity, and adequate training for providers.
Schoenwald et al. (2003): therapist adherence to the model predicted post-treatment reductions in clients’ problem behaviors.
Henggeler et al. (2002): supervisors’ expertise in the model predicted therapist adherence to the model.
Sexton (2001): a direct linear relationship between staff competence and recidivism reductions.

More Fidelity Research
Schoenwald and Chapman (2007): a 1-unit increase in therapist adherence score predicted a 38% lower rate of criminal charges 2 years post-treatment; a 1-unit increase in supervisor adherence score predicted a 53% lower rate.
Schoenwald et al. (2007): when therapist adherence was low, criminal outcomes for substance-abusing youth were worse relative to the outcomes of non-substance-abusing youth.

Washington State Example (Barnoski, 2004)
For each program (FFT and ART), an equivalent comparison/control group was created.
Felony recidivism rates were calculated for each of three groups, for each of the programs:
Youth who received services from therapists deemed “competent”
Youth who received services from therapists deemed “not competent”
Youth who did not receive any services (control group)

Functional Family Therapy Results: % New Felony
Results were calculated using multivariate models in order to control for potential differences between groups.

UC Halfway House/CBCF Study in Ohio: A Look at Fidelity
The statewide average treatment effect was a 4% reduction in recidivism.
The lowest-performing program showed a 41% increase in recidivism; the highest showed a 43% reduction.
Programs that had acceptable termination rates, had been in operation for 3 years or more, used a cognitive-behavioral program, targeted criminogenic needs, used role-playing in almost every session, and varied treatment and length of supervision by risk showed a 39% reduction in recidivism.

What Do We Know About Fidelity?
Fidelity is related to successful outcomes (i.e., reductions in recidivism, relapse, and mental health instability).
Poor fidelity can lead to null effects or even iatrogenic effects.
Fidelity can be measured and monitored.
Fidelity cannot be assumed.

Why Isn’t It Working?
Latessa, Cullen, and Gendreau (2002) note 4 common failures of correctional programs:
Failure to use research in designing programs
Failure to follow appropriate assessment and classification practices
Failure to use effective treatment models
Failure to evaluate what we do

Ways to Monitor Fidelity
Training post-tests
Structured staff supervision for use of evidence-based techniques
Self-assessment of adherence to evidence-based practices
Program audits for adherence to specific models/curricula
Focused reviews of assessment instruments
A formalized CQI process

Ensuring Training Transfer
Use of knowledge-based pre/post-tests
Use of knowledge-based proficiency tests
Use of skill-based ratings upon completion of training
A mechanism for using the data: staff must meet certain criteria or scores to be deemed competent; failure to meet the criteria results in additional training, supervision, etc.
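The data-use mechanism above can be sketched as a simple gate. This is an illustrative sketch only: the 0.80 cutoff, the two inputs, and the follow-up labels are assumptions; the slide says only that staff must meet a criterion score and that failure triggers further training and supervision.

```python
# Illustrative sketch: gating training transfer on proficiency scores.
# The 0.80 threshold and input names are assumptions, not the
# presenter's actual criteria.

PASSING_SCORE = 0.80  # assumed competency threshold


def competency_decision(post_test: float, skill_rating: float) -> str:
    """Return the follow-up implied by the slide: staff who miss the
    criterion receive additional training and supervision."""
    if post_test >= PASSING_SCORE and skill_rating >= PASSING_SCORE:
        return "deemed competent"
    return "additional training and supervision"


decisions = {
    "staff_a": competency_decision(post_test=0.92, skill_rating=0.85),
    "staff_b": competency_decision(post_test=0.70, skill_rating=0.95),
}
```

The point of the gate is that the data feed a decision: the test score is not filed away but routes each staff member to a concrete next step.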

Staff Supervision Staff supervision is a “formal process of professional support and learning which enables individual practitioners to develop knowledge and competence, assume responsibility for their own practice and enhance [client]… care in complex … situations.” Modified from Department of Health, 1993

Performance Measurement for Staff
Standardized measurement ensures consistency: everyone is measured on the same items, the same way, each time.
Consistent meaning of what is being measured: everyone has the same understanding and speaks the same language.

Sample Measures
Uses CBT language during encounters with clients.
Models appropriate language and behaviors to clients.
Avoids power struggles with clients.
Consistently applies appropriate consequences for behaviors.
Identifies thinking errors in clients in a value-neutral way.

Agency Self-Assessment: Assessing Best Practices at 17 Sites
Use the ICCA Treatment Survey to establish a baseline.
Complete the survey again, scored against best practice.
Perform a gap analysis.
Create an action plan.

ICCA Treatment Survey
The CQI Manager and Clinical Director met with key staff from each program to conduct a self-assessment of current practices.
Performance was evaluated in 6 key areas: staff; assessment/classification; programming; aftercare; organizational responsivity; and evaluation.

Agency Response: Strategic Plan
FY2006: Each program was required to submit at least 1 action plan to “fix” an identified gap; gaps in the areas of risk and need were given priority.
FY2007: Required to submit 2 action plans: one on the use of role-plays and one on the appropriate use of reinforcements.
FY2008: Proposed focus on fidelity measurement at all sites, with the creation of checklists and thresholds.

Program Audits: CBIT Site Assessments
Cognitive Behavioral Implementation Team
Site visits for observation and rating
Standardized assessment process
Standardized reports back to sites
A combination of quantitative and qualitative data

Individual LSI Reviews
A schedule of videotaped interviews submitted for review
Use of a standardized audit sheet
A feedback loop for staff development
Aggregate results to inform training efforts

Formal CQI Model
Data collection/review requirements:
Peer review (documentation)
MUIs (major unusual incidents)/incidents
Complaints/grievances
Environmental review
Client satisfaction
Process indicators
Outcome indicators

Formal CQI Model
Programs are required to review their data monthly and to develop action plans accordingly.
Each program’s data and action plans are reviewed once per quarter by the agency’s Executive CQI Committee.

CQI Committee Infrastructure

The Talbert House Strategic Plan: Focus on Fidelity
FY2008 – FY2010 Objective: Improve the quality of client services.
FY2008 – FY2010 Goal: Exceed 90% of quality improvement measures annually.
FY2008 – FY2010 Strategy: Talbert House programs demonstrate fidelity to best-practice service/treatment models, as evidenced by a site-specific best-practice fidelity check sheet.
100% of programs create a fidelity measurement tool by 12/31/07.
100% of programs establish and measure their site-specific fidelity thresholds by 1/30/08.
Programs are expected to meet/exceed established fidelity thresholds by 6/30/09 and again by 6/30/10.

Getting Started
What services do you say or promise that you deliver? What does your contract say? What do referral sources expect?
List all programming components: What is the model (e.g., CBT, MI, IDDT, IMR, TFM)? What curricula are in use?
Identify which component is most important and select it for measurement.

Creating a Tool for Measurement
The scale should adequately sample the critical ingredients of the EBP.
It must differentiate between programs/staff that follow the model and those that do not.
The scale should be sensitive enough to detect progress over time.
Investigate what measurement tools may already exist.

Sample Measures
CBT Group Observation Form
TFM Fidelity Review Sheets and Database
IMR Fidelity Rating Scale
IDDT Fidelity Scale
Motivational Interviewing Treatment Integrity (MITI) Code

Sample Project - TFM
4 residential adolescent programs implemented the Teaching-Family Model.
Staff were required to record all teaching interactions with all clients, to record the data on a standardized form, and to enter it into the fidelity database.
CQI indicator = percentage of staff achieving a 4:1 ratio.
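As a sketch of how that indicator might be computed, assuming (as in the Teaching-Family Model literature) that 4:1 refers to positive-to-corrective teaching interactions; the staff names and counts below are invented:

```python
# Hypothetical interaction counts per staff member: (positive, corrective).
interactions = {
    "staff_a": (40, 8),   # 5:1 -> meets the 4:1 target
    "staff_b": (30, 10),  # 3:1 -> does not meet
    "staff_c": (16, 4),   # 4:1 -> meets
}


def meets_ratio(positive: int, corrective: int, target: float = 4.0) -> bool:
    # Treat a staff member with no corrective interactions as meeting
    # the target, which also avoids division by zero.
    return corrective == 0 or positive / corrective >= target


# CQI indicator: percentage of staff achieving the 4:1 ratio.
pct_meeting = 100 * sum(
    meets_ratio(p, c) for p, c in interactions.values()
) / len(interactions)
```

With the invented data, two of three staff meet the target, so the indicator is about 66.7%.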

Sample Project – CBT Groups
Several programs conduct group observations using a standardized rating form.
Needed to operationalize who would do the observations and how frequently.
Needed to operationalize how the data would be collected, stored, analyzed, and reported.
CQI indicator = percentage of staff achieving a rating of 3.0 (on a scale of 0-3).
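A minimal sketch of that group-observation indicator, assuming each staff member's rating is the average of their item scores; the leader names and ratings below are invented:

```python
# Hypothetical observation ratings per group leader on the 0-3 scale.
observations = {
    "leader_a": [3, 3, 3, 3, 3],  # average 3.0 -> achieves the indicator
    "leader_b": [3, 2, 3, 3, 2],  # average 2.6 -> does not
}


def average(ratings: list) -> float:
    return sum(ratings) / len(ratings)


# Staff whose average rating reaches the 3.0 threshold.
achieved = [staff for staff, r in observations.items() if average(r) >= 3.0]

# CQI indicator: percentage of staff achieving a rating of 3.0.
pct_achieving = 100 * len(achieved) / len(observations)
```

Operationalizing the averaging rule in code like this forces the questions the slide raises: who rates, how often, and how the scores roll up into one number per staff member.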

Measuring CBT in Groups: Year One
Chose 5 items from the observation tool:
Use of role-plays or other rehearsal techniques
Ability of the group leader to keep participants on task
Use of peer interaction to promote prosocial behavior
Use of modeling
Use of behavioral reinforcements

Measuring CBT in Groups: Year Two
Refinement of the role-play indicators:
Percentage of groups observed in which staff modeled the skill before having clients engage in the role-play
Percentage of role-plays containing practice of the correctives
Percentage of role-plays that required observers to identify skill steps and report back to the group

Sample Project – Dosage by Risk and Need
The program created a dosage grid by LSI-R risk category and criminogenic need domains, requiring a prescribed set of treatment hours by risk level.
The program created a dosage report out of its automated clinical documentation system, reviewed monthly to ensure clients are receiving the prescribed dosage. Individual client data are also reviewed at monthly staffings.
CQI indicator = percentage of successful completers receiving the prescribed dosage (measured monthly).
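The dosage check might be sketched as below. The hour targets in the grid and the client records are invented for illustration; the slide says only that hours are prescribed by LSI-R risk category.

```python
# Hypothetical dosage grid: LSI-R risk category -> prescribed treatment hours.
DOSAGE_GRID = {
    "moderate": 100,
    "high": 200,
    "very_high": 300,
}


def meets_dosage(risk_category: str, hours_delivered: float) -> bool:
    """Check whether a completer received the dosage prescribed for
    their risk category."""
    return hours_delivered >= DOSAGE_GRID[risk_category]


# CQI indicator: percentage of successful completers at prescribed dosage,
# computed over (risk category, hours delivered) records for one month.
completers = [("high", 210), ("high", 150), ("moderate", 120)]
pct_at_dosage = 100 * sum(
    meets_dosage(r, h) for r, h in completers
) / len(completers)
```

Keeping the grid as data rather than hard-coded logic mirrors the slide's design: the prescription can change by risk category without changing the monthly report.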

Sample Dosage Protocol

Sample Dosage Protocol

Relationship Between Evaluation and Treatment Effect (based on UC Halfway House and CBCF study)

NPC Research on Drug Courts Significant at p<.05

Conclusions
Many agencies are allocating resources to the selection and implementation of EBPs with no evidence that staff are adhering to the model.
There is evidence that fidelity directly affects client outcomes.
There is evidence that internal CQI processes directly affect client outcomes.
Therefore, agencies have an obligation to routinely assess and assure fidelity to EBPs.
This requires a formal infrastructure to routinely monitor fidelity performance.