Introduction to the Revised Standard CTS.03.01.09


Introduction to the Revised Standard CTS.03.01.09
Webinar presentation for: National Association for Children's Behavioral Health, Standards Committee Conference Call, September 19, 2017
Scott Williams, PsyD, Director, Department of Health Services Research, The Joint Commission

Objectives
- Review the revised requirements and rationale for Standard CTS.03.01.09
- Highlight resources that can help with identifying tools/instruments
- Discuss the criteria an organization should consider to identify metrics that best fit the population it serves
- Explain how the change is expected to improve the provision of care, treatment, and services

The Revised Requirement
Standard CTS.03.01.09 – The organization assesses the outcomes of care, treatment, or services provided to the individual served.
EP 1 – The organization uses a standardized tool or instrument to monitor the individual's progress in achieving his or her care, treatment, or service goals.

The Revised Requirement (continued)
EP 2 – The organization gathers and analyzes the data generated through standardized monitoring, and the results are used to inform the goals and objectives of the individual's plan for care, treatment, or services as needed.
EP 3 – The organization evaluates the outcomes of care, treatment, or services provided to the population(s) it serves by aggregating and analyzing the data gathered through the standardized monitoring effort.

Rationale for the Revised Requirement
- Monitoring individual progress is a long-standing requirement, but the original standard provided little direction for how to do this.
- Increasing emphasis on measurement-based care in the behavioral health care field helped to provide direction for the change.

Rationale for the Revised Requirement (continued)
- SAMHSA has recognized measurement-based care as an evidence-based practice.
- Outcomes were significantly improved across a wide variety of behavioral health settings and populations when measurement-based care was used.
- Without an objective and valid measure of progress, it is difficult to reliably determine which individuals are improving and which are not.

Rationale for the Revised Requirement (continued)
- The revision focuses on the use of data to help organizations know whether what they're doing is working.
- It emphasizes continuous quality improvement at both the individual and organizational level.

How did we make this change?
- Conducted a literature review to ensure that the revision is based on evidence and to understand the value of using standardized tools to monitor progress and outcomes.
- Convened a Technical Advisory Panel (TAP) consisting of experts in the field; the TAP members were very supportive of this effort.

How did we make this change? (continued)
- Conducted a "field review" of the proposed revisions:
  - Six-week open review by the behavioral health field
  - Respondents supported the use of a tool or instrument
  - Some concerns were raised, however, about how to implement and comply
- Held five focus group calls with field review respondents to discuss concerns about implementation and compliance

How did we make this change? (continued)
For more information: www.jointcommission.org... Accreditation... Behavioral Health Care... "New outcome measures standard" (in the left sidebar)

How did we make this change? (continued)
- Using all of this information, we made final edits to the revised standard.
- The standard was approved in November 2016.
- The field was given one year to prepare for implementation, which begins on January 1, 2018.

Selecting a Standardized Instrument
- In June 2017, The Joint Commission posted a list of instruments that could be used to meet the new standard: https://manual.jointcommission.org/BHCInstruments/WebHome
- There are multiple pathways to access the list from The Joint Commission website.
- A couple of important points about the list:
  - We do NOT endorse any instrument
  - The list is NOT intended to be exclusive

Selecting a Standardized Instrument (continued)
There are multiple ways to access the instrument list:
- Use the link on the previous slide, or
- Visit the BHC accreditation home page

Selecting a Standardized Instrument (continued)
[Screenshots: navigating from the BHC accreditation home page to the instrument list]

Selecting a Standardized Instrument (continued)
Other resources:
- The journal Integrating Science and Practice summarizes 10 well-established and frequently used instruments (or suites of instruments): http://mpprg.mcgill.ca/Articles/10%20tools%20PDF.pdf
- The Kennedy Forum provides a list of dozens of instruments appropriate for measurement-based care, categorized by type, setting, and other factors: http://thekennedyforum-dot-org.s3.amazonaws.com/documents/MBC_supplement.pdf

Selecting a Standardized Instrument (continued)
An instrument should:
- Have well-established reliability and validity
- Be sensitive to change
- Be appropriate for use as a repeated measure
- Be capable of discriminating between populations that may or may not benefit from services (if appropriate), e.g., clinical/non-clinical, healthy/non-healthy functioning, typical/non-typical

Other Metrics
The following may complement measurement-based care but do not meet the requirements of the standard:
- A measure that assesses the use of evidence-based care or clinical practice guidelines
- A perception-of-care questionnaire or patient satisfaction survey
- A measure of medication/treatment compliance
- An assessment of outcome after the completion of service, even if it compares a baseline score to a subsequent point of measurement (e.g., intake/termination, admission/discharge)

Complying with Standard CTS.03.01.09
- The revised standard is intended to promote the use of "measurement-based care."
- Measurement-based care is a process of using objective data as feedback during the course of services in order to monitor progress toward the desired outcome.
- Data are used to:
  - Inform care for the individual served, and
  - Support quality improvement efforts for the organization

Using Your Selected Instrument
- Data are routinely collected at multiple points in time over the course of service.
- Data are typically collected at first contact and then at regular intervals (e.g., each subsequent point of contact, every "nth" contact, weekly, monthly).
- Each point of measurement provides an opportunity to assess progress by:
  - Comparing current scores with past scores
  - Comparing individual scores with instrument norms
  - Comparing actual progress to expected progress using a statistical model (one common approach is sketched below)
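One widely used statistical model for the "actual versus expected" comparison is the Jacobson-Truax reliable change index (RCI), which scales an individual's change against the instrument's measurement error. The minimal sketch below is illustrative only: the scores, baseline standard deviation, and reliability value are hypothetical and would in practice come from your instrument's published norms.

```python
import math

def reliable_change_index(score_now, score_baseline, sd_baseline, reliability):
    """Jacobson-Truax RCI: observed change divided by the standard error of
    the difference, so change is judged against measurement error."""
    s_diff = sd_baseline * math.sqrt(2 * (1 - reliability))
    return (score_now - score_baseline) / s_diff

# |RCI| > 1.96 suggests change larger than measurement error alone would explain.
rci = reliable_change_index(score_now=12, score_baseline=20,
                            sd_baseline=6.5, reliability=0.85)
print(f"RCI = {rci:.2f}")  # prints RCI = -2.25 (reliable improvement if lower = fewer symptoms)
```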

Using Your Selected Instrument (continued)
After data are collected:
- They are analyzed and delivered to the service provider as objective feedback.
- The analysis can be used to inform goals and objectives, monitor individual progress, and inform decisions related to changes in individual plans for care, treatment, or services.
- It can also be used to identify individual cases that may benefit from treatment team discussion and supervision (see the sketch below).
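As a minimal illustration of turning collected scores into case-level feedback, the sketch below flags clients with no measured improvement for possible team review. Everything here is hypothetical: the client IDs, the scores, the scoring direction (lower = fewer symptoms), and the simple flagging rule, which a real measurement feedback system would replace with more sophisticated criteria (e.g., the failure-risk algorithms in Hannan et al., 2005, cited in the references).

```python
# Hypothetical feedback report: scores per client, ordered by session date.
clients = {
    "A-101": [22, 20, 15, 11],   # improving
    "A-102": [18, 19, 21, 20],   # not improving -> flag
}

def flag_for_review(scores, min_sessions=3):
    """Flag a case when, after a minimum number of sessions,
    the latest score is no better than the baseline score."""
    return len(scores) >= min_sessions and scores[-1] >= scores[0]

for client_id, scores in clients.items():
    if flag_for_review(scores):
        print(f"{client_id}: no measured improvement -- consider team discussion/supervision")
```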

Using Your Data – An Example
[Chart: an individual's scores plotted over time against a projected rate of improvement and the instrument's clinical cut-off, with reference distributions for individuals seeking and not seeking services. The client's score at initial contact falls below the clinical cut-off, making them a good candidate for service.]

Using Your Data – Aggregate Level
By rolling up data to the clinician or organization level, data collected through standardized instruments can be used to:
- Inform quality improvement priorities
- Evaluate progress on organizational performance improvement efforts
A minimal roll-up sketch follows.
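A minimal sketch of this kind of roll-up, assuming per-episode scores have been exported to a table. The column names and the pandas approach are illustrative, not a prescribed method:

```python
import pandas as pd

# Hypothetical export of intake and most recent scores per episode of care.
df = pd.DataFrame({
    "clinician":    ["Lee", "Lee", "Ortiz", "Ortiz"],
    "intake_score": [24, 19, 22, 27],
    "latest_score": [14, 17, 12, 25],
})
df["change"] = df["intake_score"] - df["latest_score"]  # positive = improvement

# Roll up to the clinician level to inform quality improvement priorities.
print(df.groupby("clinician")["change"].agg(["mean", "count"]))
```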

Using Your Data – Aggregate Level (continued)
- "Effect size" is a standardized estimate of the magnitude of a treatment effect.
- It represents the difference between two averages (the pre-treatment score and the post-treatment score) as a function of the variability in an instrument's scores within a particular population of interest (a sketch follows below).
- Effect sizes can demonstrate the effectiveness of the organization's services to:
  - Stakeholders in the community
  - Prospective clients and families
  - Payers/insurers/employers
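For illustration, a minimal sketch of one common effect size, Cohen's d, computed from hypothetical pre- and post-treatment scores. This is the simplified pooled-SD variant; see Seidel et al. (2014) in the references for clinically oriented alternatives and comparability caveats.

```python
import statistics

def cohens_d(pre_scores, post_scores):
    """Cohen's d: mean pre-to-post change divided by the pooled standard deviation."""
    mean_change = statistics.mean(pre_scores) - statistics.mean(post_scores)
    pooled_sd = ((statistics.stdev(pre_scores) ** 2 +
                  statistics.stdev(post_scores) ** 2) / 2) ** 0.5
    return mean_change / pooled_sd  # positive = improvement when lower scores are better

pre = [24, 19, 22, 27, 21]
post = [14, 17, 12, 25, 15]
print(f"d = {cohens_d(pre, post):.2f}")  # prints d = 1.44
```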

In Summary
- The revision of the standard seeks to improve care, treatment, and services by promoting the use of data as a routine part of care.
- At the level of the individual served, it is about helping organizations know how well this individual is responding to these services at this time.
- The standard emphasizes continuous quality improvement at both the individual and organizational level.
- This emphasis on continuous quality improvement is a distinguishing feature of Joint Commission accreditation, and of the BHC organizations that seek and achieve it.

References
- Boswell JF, Kraus DR, Miller SD, Lambert MJ. Implementing routine outcome monitoring in clinical practice: Benefits, challenges, and solutions. Psychotherapy Research. 2015;25(1):6-19.
- Goodman JD, McKay JR, DePhilippis D. Progress monitoring in mental health and addiction treatment: A means of improving care. Professional Psychology: Research and Practice. 2013;44(4):231-246.
- Scott K, Lewis CC. Using measurement-based care to enhance any treatment. Cognitive and Behavioral Practice. 2015;22(1):49-59.
- Reese RJ, Duncan BL, Bohanske RT, Owen JJ, Minami T. Benchmarking outcomes in a public behavioral health setting: Feedback as a quality improvement strategy. Journal of Consulting and Clinical Psychology. 2014.
- Seidel JA, Miller SD, Chow DL. Effect size calculations for the clinician: Methods and comparability. Psychotherapy Research. 2014;24(4):470-484.
- Gondek D, Edbrooke-Childs J, Fink E, Deighton D, Wolpert M. Feedback from outcome measures and treatment effectiveness, treatment efficiency, and collaborative practice: A systematic review. Administration and Policy in Mental Health. 2016;43:325-343.
- Shimokawa K, Lambert MJ, Smart DW. Enhancing treatment outcome of patients at risk of treatment failure: Meta-analytic and mega-analytic review of a psychotherapy quality assurance system. Journal of Consulting and Clinical Psychology. 2010;78(3):298-311.
- Chow DL, Miller SD, Seidel JA, Kane RT, Thornton JA, Andrews WP. The role of deliberate practice in the development of highly effective psychotherapists. Psychotherapy. 2015;52(3):337-345.
- De Jong K. Challenges in the implementation of measurement feedback systems. Administration and Policy in Mental Health. 2016;43:467-470.
- Hannan C, Lambert MJ, Harmon C, et al. A lab test and algorithms for identifying clients at risk for treatment failure. Journal of Clinical Psychology. 2005;61(2):155-163.
- Brown GS, Jones ER. Implementation of a feedback system in a managed care environment: What are patients teaching us? Journal of Clinical Psychology. 2005;61(2):187-198.
- The Kennedy Forum. Fixing Behavioral Health Care in America. www.thekennedyforum.org