Developing a Resource Guide for CANS Data Analysis and Reporting Vicki Sprague Effland, Ph.D.

Youth Improved!

Did Youth Improve Enough?

Need for Resource Guide
– Standardize methodology for CANS data analysis
– Establish benchmarks for various data analysis methods
– Develop guidelines for reporting CANS results

Introduction to Choices

Choices, Inc.
– Nonprofit care management entity created in 1997
– Developed around a community need: “high-cost youth”
– Blended system of care principles with wraparound values and managed care technology

Choices Care Management
– More than 220 employees
– $35 million annual budget
– More than 1,300 youth served in child and family teams daily
– Working across ALL child-serving systems; 60% child welfare
– Indiana Choices: since 1997
– Maryland Choices: since 2005
– DC Choices: since 2008
– Louisiana Choices: since 2012

Choices, Inc.
– Adopted the CANS in 2006 (Comprehensive version, 12 life domains)
– Outcomes Champion Agency in 2007

Outcomes Monitoring
Internal
– Program effectiveness
– Quality improvement
External
– Adherence to contract requirements
– Marketing to new partners and communities

Successes
– Lots of CANS data
– Multiple resources to analyze and report data: outcomes and evaluation, software development, communications
– Ability to look at trends over time

Challenges
– Difficult to compare our performance to others: multiple versions of the CANS, variation in how the CANS is analyzed, multiple tools used across communities
– Need to establish meaningful performance expectations: minimum levels of change, % of youth expected to improve

Important Points about the CANS

Critical Elements of Communimetric Measures
1. Partner involvement
2. Malleable to the organization
3. “Just enough information” philosophy
4. Meaningfulness to the decision process
5. Reliability at the item level
6. Utility of the measure based on its communication value

“Unlike psychometric measures in which clinical significance is a more rigorous standard than statistical significance, any change on the CANS is clinically significant.” - Lyons (2009), Communimetrics: A Communication Theory of Measurement in Human Service Settings

Total Clinical Outcomes Management
– Decision Support: Service Planning (family & youth); Eligibility (program); Resource Management (system)
– Quality Improvement: Case Management & Supervision (family & youth); Accreditation (program); Transformation (system)
– Outcome Monitoring: Service Planning & Celebrations (family & youth); Evaluation (program); Performance Contracting (system)

Methods for Analyzing the CANS Dimension-Level Analyses Item-Level Analyses

Dimension-Level Analyses

Change in Dimension Scores
Analysis Steps
1. Sum the items in a specified dimension
2. Divide by the number of valid responses
3. Multiply by 10
4. Conduct statistical analysis
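A minimal Python sketch of scoring steps 1–3 above; the item ratings are hypothetical, and None marks an item without a valid response:

```python
def dimension_score(item_ratings):
    """Score one CANS dimension: average the valid (non-None) item
    ratings, then multiply by 10 (steps 1-3 above)."""
    valid = [r for r in item_ratings if r is not None]
    return sum(valid) / len(valid) * 10

# Hypothetical 0-3 item ratings for one youth, intake vs. discharge
intake = [2, 3, 1, None, 2]
discharge = [1, 1, 0, None, 1]

print(dimension_score(intake))     # 20.0
print(dimension_score(discharge))  # 7.5
```

Step 4 (a paired statistical test on the resulting intake and discharge scores across youth) would follow with any standard statistics package.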

Change in Dimension Scores
Reporting Results
– Intake and discharge means
– Results of statistical analysis
– Statistically significant change in scores between intake and discharge
Benchmarks
– Accepted statistical criteria
– None available for clinical significance

Change in Dimension Scores
Advantages
– Uses well-known statistical methods
– Statistical significance has a commonly understood meaning
Disadvantages
– Statistical significance is not always indicative of clinical significance
– Does not communicate results in terms of the number of youth showing improvement

Any Improvement in Functioning
Analysis Steps
1. Calculate intake and discharge mean scores
2. Identify youth with lower scores at discharge (Intake Mean Score > Discharge Mean Score)
3. Divide by the number of youth in the sample
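The steps above reduce to a simple proportion; here is a sketch with hypothetical (intake, discharge) dimension mean scores, where lower scores indicate fewer needs:

```python
def pct_any_improvement(score_pairs):
    """Percent of youth whose dimension mean score is lower at
    discharge than at intake (Intake Mean > Discharge Mean)."""
    improved = sum(1 for intake, discharge in score_pairs
                   if discharge < intake)
    return 100 * improved / len(score_pairs)

# Hypothetical (intake, discharge) scores for four youth
scores = [(20.0, 7.5), (15.0, 15.0), (10.0, 12.5), (25.0, 10.0)]
print(pct_any_improvement(scores))  # 50.0
```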

Any Improvement in Functioning
Reporting Results
– % of youth with any improvement in functioning
Benchmarks
– N/A

Any Improvement in Functioning
Advantages
– Simple to analyze
– Easy to explain methodology
Challenges
– Lack of established benchmarks
– Difficult to communicate that change is clinically meaningful

Reliable Change
Equation
– RCI = 1.28 * SD * SQRT(1 – Reliability)
Analysis Steps
1. Compute the RCI
2. Calculate the change between intake and discharge mean scores
3. Identify youth with a change in scores >= RCI
4. Divide by the number of youth in the sample
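The equation and steps above can be sketched as follows. The SD, reliability, and score pairs are hypothetical values; 1.28 is the slide's constant (roughly a one-sided 90% z value):

```python
from math import sqrt

def reliable_change_index(sd, reliability):
    """RCI = 1.28 * SD * SQRT(1 - reliability), per the slide."""
    return 1.28 * sd * sqrt(1 - reliability)

def pct_reliable_improvement(score_pairs, sd, reliability):
    """Percent of youth whose intake-to-discharge improvement
    meets or exceeds the RCI."""
    rci = reliable_change_index(sd, reliability)
    improved = sum(1 for intake, discharge in score_pairs
                   if (intake - discharge) >= rci)
    return 100 * improved / len(score_pairs)

# Hypothetical values: SD = 5.0, reliability = 0.80
scores = [(20.0, 7.5), (15.0, 14.0), (10.0, 12.5), (25.0, 10.0)]
print(round(reliable_change_index(5.0, 0.80), 2))   # 2.86
print(pct_reliable_improvement(scores, 5.0, 0.80))  # 50.0
```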

Reliable Change
Reporting Results
– % of youth with a reliable improvement in functioning
Benchmarks
– 60-80% of youth expected to improve in at least one of the dimensions measured
– 20-40% of youth expected to improve in a specific dimension

Reliable Change
Advantages
– Clearly defined method
– Available benchmarks
Challenges
– Difficult for program staff to interpret and communicate results
– Results include youth with no needs at intake

Actionable Needs
Analysis Steps
1. Count the number of needs rated as a 2 or 3 within each dimension
2. Compare needs at intake and discharge
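A sketch of the counting step above, with hypothetical item ratings for one dimension (a rating of 2 or 3 marks an actionable need):

```python
def actionable_needs(item_ratings):
    """Count items rated 2 or 3 (actionable needs) in a dimension."""
    return sum(1 for r in item_ratings if r in (2, 3))

# Hypothetical ratings for one youth, intake vs. discharge
intake = [2, 3, 1, 0, 2]
discharge = [1, 2, 0, 0, 1]
print(actionable_needs(intake), actionable_needs(discharge))  # 3 1
```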

Actionable Needs
Reporting Results
– Average number of needs at intake and discharge across dimensions
Benchmarks
– N/A

Actionable Needs
Advantages
– Easy to display graphically
– Simple for audiences familiar with the CANS to understand
Challenges
– Requires additional explanation if the audience includes individuals not familiar with the CANS
– Lack of established benchmarks

Met Needs
Analysis Steps
1. Identify youth with ratings of 2 or 3 on individual items at intake
2. Determine whether item ratings decreased to a 0 or 1 by discharge
3. Compute the number and percent of items met within each dimension
4. Calculate the percent of youth who met at least one (or more) needs within the dimension
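Steps 1–3 can be sketched per youth; the item ratings below are hypothetical:

```python
def met_needs(intake, discharge):
    """For one youth and one dimension, return (met, total): items
    rated 2 or 3 at intake, and how many dropped to 0 or 1 by discharge."""
    need_idx = [i for i, r in enumerate(intake) if r in (2, 3)]
    met = sum(1 for i in need_idx if discharge[i] in (0, 1))
    return met, len(need_idx)

# Hypothetical item ratings for one youth within one dimension
met, total = met_needs([2, 3, 1, 0, 2], [1, 2, 0, 0, 1])
print(f"{met} of {total} needs met")  # 2 of 3 needs met
```

Step 4 then aggregates across youth: the share of youth for whom met is at least 1.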

Met Needs
Reporting Results
– Average number of needs met by dimension
– Percent of needs met
– Percent of youth who met at least one need
Benchmarks
– N/A

Met Needs
Advantages
– Effective way to communicate improvement
– Simple for audiences familiar with the CANS to understand
– Several options for reporting
Challenges
– Requires additional explanation if the audience includes individuals not familiar with the CANS
– Lack of established benchmarks

Dimension-Level Analyses Questions? Additional Methods? Thoughts?

Item-Level Analyses

Item Score
Analysis Steps
1. Calculate the mean item score for all youth at intake and at discharge
2. Multiply by 10

Item Score
Reporting Results
– Graph of intake and discharge scores
Benchmarks
– N/A

Item Score
Advantages
– Easy to present graphically
Disadvantages
– Does not communicate results in terms of the number of youth showing improvement

Any Improvement
Analysis Steps
1. Identify youth with ratings of 2 or 3 at intake
2. Identify youth with lower scores at discharge (Intake Rating > Discharge Rating)
3. Compute the number and percent of youth showing improvement
Note: the need does not have to be met to count in this analysis
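At the item level the steps above reduce to one proportion; here is a sketch with hypothetical (intake, discharge) ratings on a single item across several youth:

```python
def pct_item_improvement(rating_pairs):
    """Among youth rated 2 or 3 on the item at intake, the percent
    whose rating is lower at discharge (the need does not have to
    be fully met to count)."""
    at_need = [(i, d) for i, d in rating_pairs if i in (2, 3)]
    improved = sum(1 for i, d in at_need if d < i)
    return 100 * improved / len(at_need)

# Hypothetical ratings; (1, 0) is excluded -- no actionable need at intake
ratings = [(3, 2), (2, 2), (1, 0), (2, 1)]
print(round(pct_item_improvement(ratings), 1))  # 66.7
```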

Any Improvement
Reporting Results
– % of youth with any improvement in functioning
Benchmarks
– N/A

Any Improvement
Advantages
– Simple to analyze
– Allows any improvement in functioning to be reflected
Challenges
– Lack of established benchmarks

Met Needs
Analysis Steps
1. Identify youth with ratings of 2 or 3 on individual items at intake
2. Determine whether item ratings decreased to a 0 or 1 by discharge
3. Calculate the percent of youth who met the item

Met Needs
Reporting Results
– Percent of youth who met individual needs
– Results for individual needs within a dimension
Benchmarks
– N/A

Met Needs
Advantages
– Effective way to communicate improvement
– Simple for audiences familiar with the CANS to understand
Challenges
– Requires additional explanation if the audience includes individuals not familiar with the CANS
– Lack of established benchmarks

Item-Level Analyses Questions? Additional Methods? Thoughts?

Establishing Benchmarks Grab those pens and pencils!

Establishing Benchmarks
1. Youth
2. Service models
3. CANS versions
4. Availability of data
5. Analysis methods
6. Reporting results

Establishing Benchmarks
1. Youth
   a. Age
   b. Race/ethnicity
   c. Strengths and needs prior to intervention

Establishing Benchmarks
2. Service models
   a. Wraparound
   b. Residential treatment
   c. Crisis intervention
   d. Outpatient therapy
   e. Detention

Establishing Benchmarks
3. CANS versions
   a. Comprehensive
   b. Mental health
   c. Juvenile justice
   d. Child welfare
   e. Education
   f. Crisis

Establishing Benchmarks
4. Availability of data
   a. Number of youth served annually
   b. Method for completing the CANS
   c. Data management
   d. Willingness/ability to share data

Establishing Benchmarks
5. Analysis methods: Dimension-Level
   a. Dimension scores
   b. Any improvement
   c. Actionable needs
   d. Met needs

Establishing Benchmarks
5. Analysis methods: Item-Level
   a. Item scores
   b. Any improvement
   c. Met needs

Establishing Benchmarks
6. Reporting results
   a. Youth demographics
   b. Service context
   c. Amount, frequency and/or duration of services
   d. Sample size
   e. Length of stay
   f. CANS version used
   g. Data analysis method used

Next Steps
1. Compile your survey responses
2. Share the survey with other CANS users
3. Form a CANS Benchmarking Workgroup
   a. John Lyons
   b. Volunteers?
   c. Nominations
4. Develop an action plan
5. Provide updates on progress

Thank You!
Vicki Sprague Effland, Ph.D.
Director, Outcomes and Evaluation
Choices, Inc.
N. Keystone Ave., #150
Indianapolis, IN