Putting Your Data to Work


Putting Your Data to Work: Using Data to Learn from Successes, Problem Solve, and Improve Services
Kate Lyon, MA, Tribal Evaluation Institute, January 2013

Now that you have all this information, what do you do with it?
- Service utilization data
- Benchmark data
- Demographic data
- Model-specific data
- Satisfaction data
- Agency-specific data
- Qualitative data

As you know, you are collecting a lot of data! When you were developing your benchmark plans, you were encouraged to select performance measures and data elements that would be useful for your CQI process. Well, now is the time to use these data. These are YOUR data, and it's time to put them to work: to learn from your data and apply what you learn, both to problem solve when things aren't going the way you want them to and to build on the things that are going well.

Using Data in a Continuous Quality Improvement Process

Reference the PDSA cycle that Valerie reviewed. Everyone has heard of the circle of life; well, this is the circle of data. You collect data, analyze the data and review your findings, and then apply what you learned from the data to better serve families. By systematically reviewing data, you can improve services and highlight successes. I'm going to talk about some tools and provide examples of how you can use data to understand and improve your program.

Tools for Using Data
- Highlighting findings that need attention using a "stoplight" approach
- Examining data over time with graphs
- Tools for exploring what is contributing to successes and challenges: the fishbone diagram and the key driver diagram

I'm going to walk you through some tools that you can use to put your data to work. First I'll talk about how to present data for CQI so that everyone can understand what they are looking at. I'll focus on trend charts and go over some meaningful comparisons you can make with these charts. Stoplight analysis is a very simple and straightforward way of highlighting findings that need attention. It's easy to feel overwhelmed by a lot of data, but we'll talk about ways to begin to make sense of it. The next three tools, key driver analysis, fishbone diagrams, and the "5 whys," are designed to help you interpret data, identify the root cause of a challenge, and figure out what to do about it.

Using Data in a PDSA Cycle

The processes of presenting and reviewing data fit mostly into the "study" part of the PDSA cycle, but you will use your data throughout the entire cycle. Reviewing data will help you plan any changes that you want to make to improve your program. Your data will also help you plan your CQI efforts, for example when selecting a topic or a target. And of course your data are front and center in the "do" part of the cycle, when you are collecting the data. For most of you, the data you are already collecting as part of the Parents as Teachers model, for the benchmarks, for your evaluation, or on Form 1 will be a great source of data for CQI. Some of you may decide to collect additional information if your current data collection efforts aren't getting at something you want to understand.

Presenting Data for CQI
- Multiple data points reflecting performance over time are presented
- Short intervals are used
- A target for performance is clearly indicated
- Data are presented in graph and table form

Because CQI involves reviewing data and making decisions based on those data, it's important to present the data in a way that will help people interpret and make sense of them.

Setting a Target
- A target should be realistic
- Gather information from multiple sources: past program performance, model objectives, agency objectives, and the home visiting literature
- Get input from stakeholders and your advisory board
- Targets can be adjusted over time

Setting Targets in a PDSA Cycle

Setting targets is part of the planning process. Now let's transition to talking about presenting data and some tools that you can use in the study phase of the PDSA cycle.

Highlighting Findings with the Stoplight Approach

Indicator                        | Target          | Community A | Community B | Program Total
Time to first visit              | 70% in 5 days   | 61%         | 66%         | 64%
Time to completed assessment     | 90% in 2 months | 86%         | 77%         | 82%
% of visits completed:
  All families                   | 70%             | 67%         | 55%         | 60%
  Families enrolled < 6 months   | 75%             | 71%         | 45%         | 58%
  Families enrolled > 6 months   | 65%             | 59%         | 63%         | (missing)
% attending group connections    | 80%             | 68%         | 74%         | 72%

Color key: green = 0-5 percentage points below target; yellow = 6-9 points below target; red = 10 or more points below target.

Once the data have been tabulated and analyzed, the next step is to identify those findings that appear unusual, unexpected, overly low, or overly high. This applies whether the data are compared with data for previous reporting periods, with targets, with other similar programs, among client demographic groups, or among clients served in different ways. This step is primarily judgmental. For example, how much worse than the previous reporting period, or worse than the target, does the latest outcome value have to be before it warrants significant attention? Here are some rules of thumb for outcome indicators expressed as percentages: If the difference is 10 percentage points or more, it warrants attention ("red light" treatment). If the difference is between 6 and 9 percentage points, it warrants modest attention, or at least flagging, to examine what happens in the next reporting period ("yellow light" treatment). If the difference is 5 percentage points or less, it is likely too small to be of concern ("green light" treatment). However, how well these rules of thumb work will depend on how much variability you expect to see in your data.
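These rules of thumb are easy to automate when preparing CQI reports. Here is a minimal sketch in Python, assuming percentage-valued indicators; the function name and the example values (taken from the Program Total column above) are illustrative, not part of any grantee data system.

```python
def stoplight(value_pct, target_pct):
    """Classify an indicator using the rule-of-thumb thresholds above."""
    gap = target_pct - value_pct  # percentage points below target
    if gap >= 10:
        return "red"      # warrants attention
    elif gap >= 6:
        return "yellow"   # flag and watch the next reporting period
    else:
        return "green"    # likely too small to be of concern

# Program Total values from the table above: (observed, target)
indicators = {
    "Time to first visit": (64, 70),
    "Time to completed assessment": (82, 90),
    "% of visits completed (all families)": (60, 70),
    "% attending group connections": (72, 80),
}
for name, (value, target) in indicators.items():
    print(f"{name}: {stoplight(value, target)}")
```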

Examining Data Over Time

In this example, a line graph shows initiation of breastfeeding at the program level over 16 months. Performance has been steady and consistent over this interval but has remained below the target. Barring any changes in external influences, we can predict that future performance will look the same unless a change in the process is introduced. Efforts should look at where the process can be altered to drive an improved outcome. Tracking data at the smallest interval of time will help show whether a change appears to be making a positive impact on the results and moving them closer to the target.
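A run chart like the one described here takes only a few lines to produce. Below is a minimal sketch using matplotlib; the 60% target comes from these examples, but the monthly values are hypothetical.

```python
import matplotlib.pyplot as plt
import pandas as pd

# Hypothetical program-level breastfeeding-initiation rates over 16 months
months = pd.date_range("2011-01-01", periods=16, freq="MS").strftime("%b %Y")
rates = [52, 54, 51, 53, 52, 50, 53, 52, 51, 53, 52, 54, 53, 52, 51, 53]
target = 60  # target level used in the example slides

plt.plot(months, rates, marker="o", label="% of mothers initiating breastfeeding")
plt.axhline(target, linestyle="--", color="red", label=f"Target ({target}%)")
plt.xticks(rotation=45)
plt.ylim(0, 100)
plt.title("Initiation of Breastfeeding, Program Level")
plt.legend()
plt.tight_layout()
plt.show()
```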

Examining Data Over Time

This example also shows initiation of breastfeeding at the program level over 16 months. In this graph, there is an improvement in performance starting in November 2011 that leads to the most recent data point, which is almost at the target level of 60%. The steady improvement in the last half of the interval suggests that something has happened to facilitate initiation of breastfeeding. It could be the result of a planned change in the process through an improvement project, or of something occurring outside the system (perhaps there has been an increase in the availability of lactation consultants in the site communities) that warrants follow-up and further examination.

Examining Data Over Time

This graph shows deteriorating performance. Because there has been a downward trend away from the target for the past 7 straight months, we would expect the next data point to show a further drop in the number of mothers who initiate breastfeeding. The cause of the deterioration is not revealed by the graph in this example, but the multiple, steady drops in performance over 7 months suggest that something important is happening that requires further investigation. Efforts should focus on identifying what internal and external influences have occurred since October 2011 that could be addressed by implementing a focused improvement project.
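The "seven straight drops" signal is a common run-chart rule and is simple to check automatically. A small helper, sketched here with hypothetical data:

```python
def trailing_declines(values):
    """Count consecutive month-over-month drops at the end of the series."""
    count = 0
    for i in range(len(values) - 1, 0, -1):
        if values[i] < values[i - 1]:
            count += 1
        else:
            break
    return count

# Hypothetical rates with a sustained drop over the most recent months
rates = [58, 57, 59, 58, 59, 58, 59, 57, 55, 54, 52, 50, 48, 46, 44, 42]
if trailing_declines(rates) >= 6:  # 7 successively lower points = 6 drops
    print("Sustained downward trend: investigate what changed.")
```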

Examining Data by Home Visitor

Example 4: Initiation of Breastfeeding, Home Visitor Trends

This graph shows the performance of three home visitors working within a single program. Both line and bar graphs are acceptable for data presentation; a bar graph is used in this example to clearly illustrate the multiple variables, while the key elements for CQI (change over time, the target, etc.) are maintained. There is considerable variability between the home visitors that would otherwise be masked if their data were combined. Although all three home visitors start at the same performance level in January, their performance begins to vary over time. Home Visitor 1 begins to improve after four months, eventually reaching the target and consistently maintaining this performance over time. Home Visitor 2 shows fluctuating performance, followed by a steady deterioration over the most recent 7 months, beginning with September 2011. Home Visitor 3 slips in the first few months and then steadily improves, approaching, but not reaching, the target by the end of the interval.

This graph suggests that there is much to be learned through closer examination of these three home visitors. For example, Home Visitor 1 may have identified ways to implement the curriculum effectively, and this might be informative for other home visitors. Home Visitor 2 is struggling to meet expectations. Efforts should focus on mapping the process being used to achieve the outcome for all three home visitors; this will identify variation in the process that can be addressed. Perhaps there are elements within the home visitor's control that could lead to improved performance. Additionally, external factors specific to the families or community resources should be considered for positive and negative impact, or personal factors may be making it difficult for the home visitor to perform effectively. A deeper understanding of how each home visitor is implementing the model and supporting mothers in initiating breastfeeding may also reveal effective practices that have implications for the program as a whole. Reiterate that CQI is non-punitive.
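Disaggregating a program-level measure by home visitor is a simple grouping operation in most data tools. A sketch using pandas with a hypothetical visit-level dataset (the column names are assumptions, not from any particular data system):

```python
import pandas as pd

# Hypothetical family-level records; column names are illustrative only
df = pd.DataFrame({
    "home_visitor": ["HV1", "HV1", "HV2", "HV2", "HV3", "HV3"],
    "month": ["2011-09", "2011-10"] * 3,
    "initiated_breastfeeding": [1, 1, 1, 0, 0, 1],
})

# Initiation rate per home visitor per month; these numbers would feed
# the grouped bar chart described above
rates = (df.groupby(["home_visitor", "month"])["initiated_breastfeeding"]
           .mean()
           .mul(100)
           .round(1))
print(rates)
```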

Fishbone Diagram

[Template diagram: a central spine labeled with the topic name, with category branches (Category 1 through Category 4), each listing its contributing factors.]

Fishbone Diagram: Example

[Example diagram: a fishbone with category branches including Supervision, Training, Service Linkage, and Family Characteristics. Contributing factors include training frequency, training attendance, workload, curriculum content, prenatal care, partnerships, belief systems, service availability, and cultural competency.]

Key Driver Diagram

[Template diagram: three columns, Interventions, Key Drivers, and Goal. Goal: Increase the number of families who receive 80% of the recommended number of home visits from 53.2% to 65%.]

Key Driver Diagram

Goal: Increase the number of families who receive 80% of the recommended number of home visits from 53.2% to 65%.

Key Drivers:
- Supervision and support on family engagement
- Program model contact protocols
- Home visitor training on family engagement
- Home visitor understanding of contact protocols
- Mother's prior experience with home visiting/other service providers
- Mother's home situation/stability
- Turnover in home visitors
- Families are hard to reach

Key Driver Diagram

Goal: Increase the number of families who receive 80% of the recommended number of home visits from 53.2% to 65%.

Key Drivers:
- Supervision and support on family engagement
- Program model contact protocols
- Home visitor training on family engagement
- Home visitor understanding of contact protocols
- Mother's prior experience with home visiting/other service providers
- Mother's home situation/stability
- Turnover in home visitors
- Families are hard to reach

Interventions:
- Revise training on family engagement
- Implement ongoing training and coaching on family engagement (FE)
- Training for supervisors in supporting home visitors (HVs) in FE during reflective supervision
- Strengthen contact protocols
- Retrain on contact protocols
- Provide cell phones to HVs
- Identify families' barriers to completing home visits
- Revise the presentation to families on expectations for participation
- Social marketing of the HV program
- Allow a one-week window for completing home visits
- Allow HVs to work non-traditional hours and give flex time
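Teams that track their key driver diagram electronically can represent it as a simple nested structure. Below is a minimal sketch using the goal, drivers, and interventions from this slide; which interventions map to which drivers is an illustrative assumption, since the diagram links them visually.

```python
# Key driver diagram as a nested structure. Content is from the slide above;
# the grouping of interventions under drivers is an illustrative assumption.
key_driver_diagram = {
    "goal": ("Increase the number of families who receive 80% of the "
             "recommended number of home visits from 53.2% to 65%"),
    "drivers": {
        "Home visitor training on family engagement": [
            "Revise training on family engagement",
            "Implement ongoing training and coaching on FE",
        ],
        "Supervision and support on family engagement": [
            "Training for supervisors in supporting HVs in FE during "
            "reflective supervision",
        ],
        "Home visitor understanding of contact protocols": [
            "Strengthen contact protocols",
            "Retrain on contact protocols",
        ],
        "Families are hard to reach": [
            "Provide cell phones to HVs",
            "Allow HVs to work non-traditional hours and give flex time",
        ],
    },
}

for driver, interventions in key_driver_diagram["drivers"].items():
    print(f"{driver}:")
    for intervention in interventions:
        print(f"  - {intervention}")
```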


Key Driver Analysis in a PDSA Cycle

What do you need to do to get started?
- Form a CQI team
- Select a CQI topic and target
- A data system and data to analyze
- Someone to run reports and develop tables and graphs
- An "all hands on deck," non-punitive environment

In time, you'll be able to look at multiple data points over time.

CQI Technical Assistance

Want some assistance improving the quality of your program? Would you like to find out more about using data in a CQI process? Contact your FPO, who will connect you with your VisTA and TEI liaisons.

Questions?

The Tribal Evaluation Institute is funded by the Office of Planning, Research and Evaluation within the Administration for Children and Families. TEI was awarded to James Bell Associates in partnership with the University of Colorado's Centers for American Indian and Alaska Native Health and the Michigan Public Health Institute. For more information, contact the individuals on this slide.

The Tribal Home Visiting Evaluation Institute (TEI) is funded by the Office of Planning, Research and Evaluation, Administration for Children and Families, Department of Health and Human Services, under contract number HHSP23320095644WC. TEI is funded to provide technical assistance to Tribal Home Visiting grantees on rigorous evaluation, performance measurement, continuous quality improvement, data systems, and ethical dissemination and translation of evaluation findings. TEI1 was awarded to MDRC; James Bell Associates, Inc.; the Johns Hopkins Bloomberg School of Public Health, Center for American Indian Health; and the University of Colorado School of Public Health, Centers for American Indian and Alaska Native Health.

For more information on TEI, contact:
Nicole Denmark, Federal Project Officer, Office of Planning, Research and Evaluation, nicole.denmark@acf.hhs.gov
Kate Lyon, Project Director, James Bell Associates, Inc., lyon@jbassoc.com