
Putting Your Data to Work

Presentation on theme: "Putting Your Data to Work"— Presentation transcript:

1 Putting Your Data to Work
Using Data to Learn from Successes, Problem Solve, and Improve Services
Kate Lyon, MA, Tribal Evaluation Institute
January 2013

2 Now that you have all this information, what do you do with it?
Service utilization data
Benchmark data
Demographic data
Model-specific data
Satisfaction data
Agency-specific data
Qualitative data
As you know, you are collecting a lot of data! When you were developing your benchmark plans, you were encouraged to select performance measures and data elements that would be useful for your CQI process. Now is the time to use these data! These data are YOUR data; it's time to put them to work: to learn from your data, to problem solve when things aren't going the way you want them to, and to build on what is going well.

3 Using Data in a Continuous Quality Improvement Process
Reference the PDSA cycle that Valerie reviewed. Everyone has heard of the circle of life; well, this is the circle of data. You collect data, analyze the data and review your findings, and then apply what you learned from the data to better serve families. By systematically reviewing data, you can improve services and highlight successes. I'm going to talk about some tools and provide examples of how you can use data to understand and improve your program.

4 Tools for Using Data
Highlighting findings that need attention using a "stoplight" approach
Examining data over time with graphs
Tools for exploring what is contributing to successes and challenges:
Fishbone Diagram
Key Driver Diagram
I'm going to walk you through some tools that you can use to put your data to work. First I'll talk about how you want to present data for CQI, so that everyone can understand what they are looking at. I'll focus on trend charts and go over some meaningful comparisons you can make with these charts. Stoplight analysis is a very simple and straightforward way of highlighting findings that need attention. It's easy to feel overwhelmed by a lot of data, but we'll talk about ways to begin to make sense of it. The next three tools, key driver analysis, fishbone diagrams, and the 5 Whys, are designed to help you interpret data, identify the root cause of a challenge, and figure out what to do about it.

5 Using Data in a PDSA Cycle
The processes of presenting and reviewing data mostly fit into the "study" part of the PDSA cycle, but you will use your data throughout the entire PDSA cycle. Reviewing data will help you plan any changes that you want to make to improve your program. Your data will also help you plan your CQI efforts, for example when selecting a topic or a target. And of course your data are front and center in the "do" part of the cycle, when you are collecting the data. For most of you, the data you are already collecting as part of the Parents as Teachers model, for the benchmarks, for evaluation, or for Form 1, will be a great source of data for CQI. Some of you may decide to collect additional information if your current data collection efforts aren't getting at something you want to understand.

6 Presenting Data for CQI
Multiple data points reflecting performance over time are presented
Short intervals are used
A target for performance is clearly indicated
Data are presented in graph and table form
Because CQI involves reviewing data and making decisions based on those data, it's important to present the data in a way that will help people interpret and make sense of the data.

7 Setting a Target
A target should be realistic
Gather information from multiple sources: past program performance, model objectives, agency objectives, HV literature
Get input from stakeholders & advisory board
Targets can be adjusted over time

8 Setting Targets in a PDSA Cycle
Setting targets is part of the planning process. Now let’s transition to talking about presenting data and some tools that you can use in the study phase of the PDSA cycle.

9 Highlighting Findings with the Stoplight Approach
Indicator                         Target           Community A   Community B   Program Total
Time to first visit               70% in 5 days    61%           66%           64%
Time to completed assessment      90% in 2 months  86%           77%           82%
% of visits completed:
  All families                    70%              67%           55%           60%
  Families enrolled < 6 months    75%              71%           45%           58%
  Families enrolled > 6 months                     65%           59%           63%
% attending group connections     80%              68%           74%           72%
Color key: green = 0-5 percentage points below target; yellow = 6-9 points below target; red = 10% or more below target.
Once the data have been tabulated and analyzed, the next step is to identify those findings that appear unusual, unexpected, overly low, or overly high. This applies whether the data are compared with data for previous reporting periods, with targets, with other similar programs, among client demographic groups, or among clients served in different ways. This step is primarily judgmental. For example, how much worse than the previous reporting period, or worse than the target, does the latest outcome value have to be before it warrants significant attention? Here we offer some rules of thumb for outcome indicators expressed as percentages: If the difference is 10 percentage points or more, the difference warrants attention ("red light" treatment). If the difference is between 5 and 10 percentage points, this warrants modest attention, or at least flagging, to examine what happens in the next reporting period ("yellow light" treatment). If the difference is 5 percentage points or less, the difference is likely too small to be of concern ("green light" treatment). However, these rules of thumb depend on how much variability you expect to see in your data.
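The rules of thumb above are mechanical enough to automate. A minimal sketch of the stoplight classification, with thresholds taken from the slide (the function name and the handling of values exactly at the boundaries are my assumptions):

```python
def stoplight(actual: float, target: float) -> str:
    """Classify a percentage indicator against its target using the
    slide's rules of thumb (gap measured in percentage points)."""
    gap = target - actual          # positive when performance is below target
    if gap >= 10:
        return "red"               # 10+ points below target: warrants attention
    if gap > 5:
        return "yellow"            # 6-9 points below: flag and watch next period
    return "green"                 # within 5 points (or above target)

# Rows from the slide's table, e.g. "time to first visit" with a 70% target:
print(stoplight(61, 70))  # Community A, 9 points below -> "yellow"
print(stoplight(55, 70))  # Community B's "% of visits completed" -> "red"
```

As the notes caution, these cutoffs should be tuned to how much variability you expect in your own data.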

10 Examining Data Over Time
In this example, a line graph shows initiation of breastfeeding at the program level over 16 months. Performance has been steady and consistent over this interval. In Example 1, the program has performed below the target over the 16 months, with a high degree of consistency. Barring any changes in external influences, we can predict that future performance will look the same unless a change in the process is introduced. Efforts should look at where the process can be altered to drive an improved outcome. Tracking data at the smallest interval of time will help show whether a change appears to be making a positive impact on the results and moving them closer to the target.

11 Examining Data Over Time
This example also shows initiation of breastfeeding at the program level over 16 months. In this graph, there is an improvement in performance starting in November 2011 that leads to the most recent data point, which is almost at the target level of 60%. In Example 2, the steady improvement in the last half of the time interval suggests that something has happened to facilitate initiation of breastfeeding. It could be the result of a planned change in the process through an improvement project, or of something occurring outside the system (perhaps there has been an increase in the availability of lactation consultants in the site communities) that warrants follow-up and further examination.

12 Examining Data Over Time
This graph shows deteriorating performance. Because there has been a downward trend away from the target for the past 7 straight months, we would expect that the next data point will likely show a further drop in the number of mothers who initiate breastfeeding. The cause of the deterioration is not revealed by the graph in this example, although the multiple and steady drops in performance over 7 months suggest that something important is happening that requires further investigation. Efforts should focus on identifying what internal and external influences have occurred since October 2011 and whether they could be addressed through a focused improvement project.
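Spotting a run like this does not require eyeballing the chart; a streak of month-over-month declines can be counted directly. A minimal sketch (the function name and the sample numbers are illustrative, not from the deck):

```python
def run_of_declines(series):
    """Length of the streak of month-over-month declines ending at the
    most recent data point. A long streak (the slide's example runs
    7 months) signals a trend worth investigating, not random variation."""
    streak = 0
    for earlier, later in zip(series, series[1:]):
        # Extend the streak on a drop; any flat or rising month resets it.
        streak = streak + 1 if later < earlier else 0
    return streak

# Illustrative (made-up) monthly % of mothers initiating breastfeeding:
monthly_rates = [60, 58, 57, 59, 56, 54, 52, 50, 49, 48, 47]
print(run_of_declines(monthly_rates))  # prints 7: seven straight monthly drops
```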

13 Examining Data by Home Visitor
Example 4: Initiation of Breastfeeding Home Visitor Trends. This graph shows the performance of the three home visitors who are working within a single program. Both line and bar graphs are acceptable for data presentation. While a bar graph is used in this example to clearly illustrate the multiple variables, the key elements for CQI are maintained (change over time, target, etc.). There is considerable variability between the home visitors, which is otherwise masked if their data are combined. Although all three home visitors start at the same performance level in January, their performance begins to vary over time. Home Visitor 1 begins to improve after four months, eventually reaching the target and consistently maintaining this performance over time. Home Visitor 2 shows fluctuating performance, followed by a steady deterioration over the most recent 7 months beginning in September. Home Visitor 3 slips in the first few months, and then steadily improves, approaching, but not reaching, the target by the end of this interval. This graph suggests that there is much to be learned through closer examination of these three home visitors. For example: Home Visitor 1 may have identified ways to implement the curriculum effectively, and this might be informative for the other home visitors. Home Visitor 2 is struggling to meet expectations. Efforts should focus on mapping the process being used to achieve the outcome for all three home visitors. This will identify variation in the process that can be addressed. Perhaps there are elements within the home visitor's control that could lead to improved performance. Additionally, external factors specific to the families or community resources should be considered for positive and negative impact. Or, personal factors may be making it difficult for the home visitor to perform effectively.
A deeper understanding of how each of the home visitors is implementing the model and supporting mothers in initiating breastfeeding may also reveal effective practices that have implications for the program as a whole. Reiterate that CQI is non-punitive.
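Disaggregating an indicator by home visitor, as this slide recommends, is a simple group-by on your visit records. A minimal standard-library sketch (the records, visitor IDs, and function name are made up for illustration):

```python
from collections import defaultdict

# Hypothetical visit records: (home_visitor, month, mother_initiated_breastfeeding)
records = [
    ("HV1", "2012-01", True), ("HV1", "2012-01", False),
    ("HV2", "2012-01", True), ("HV2", "2012-01", True),
    ("HV3", "2012-01", False), ("HV3", "2012-01", True),
]

def rates_by_visitor(records):
    """Percent of mothers initiating breastfeeding, per home visitor per month.
    Disaggregating this way exposes variation between home visitors that a
    program-level average would mask."""
    totals = defaultdict(lambda: [0, 0])       # (visitor, month) -> [yes, n]
    for visitor, month, initiated in records:
        totals[(visitor, month)][0] += int(initiated)
        totals[(visitor, month)][1] += 1
    return {key: round(100 * yes / n, 1) for key, (yes, n) in totals.items()}

print(rates_by_visitor(records))
```

The same per-visitor series can then be plotted against the target, as in the slide's bar graph.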

14 Fishbone Diagram
[Template diagram: a central spine pointing to "Topic Name," with branches for Category 1 through Category 4, each branch listing contributing factors.]

15 Fishbone Diagram (example)
[Example diagram: branches include Supervision, Training Frequency, Training Attendance, Workload, Curriculum Content, Prenatal Care, Partnerships, Belief Systems, Service Availability, Cultural Competency, Service Linkage, and Family Characteristics.]

16 Key Driver Diagram
[Diagram layout: Interventions feed Key Drivers, which feed the Goal.]
Goal: Increase the number of families who receive 80% of the recommended number of home visits from 53.2% to 65%

17 Key Driver Diagram
Goal: Increase the number of families who receive 80% of the recommended number of home visits from 53.2% to 65%
Key Drivers:
Home visitor training on family engagement
Supervision and support on family engagement
Program model contact protocols
Home visitor understanding of contact protocols
Mother's prior experience with home visiting/other service providers
Mother's home situation/stability
Turnover in home visitors
Families are hard to reach

18 Key Driver Diagram
Goal: Increase the number of families who receive 80% of the recommended number of home visits from 53.2% to 65%
Key Drivers:
Home visitor training on family engagement
Supervision and support on family engagement
Program model contact protocols
Home visitor understanding of contact protocols
Mother's prior experience with home visiting/other service providers
Mother's home situation/stability
Turnover in home visitors
Families are hard to reach
Interventions:
Revise training on family engagement
Implement ongoing training and coaching on FE
Training for supervisors in supporting HVs in FE during reflective supervision
Strengthen contact protocols
Retrain on contact protocols
Provide cell phones to HVs
Identify family's barriers to completing home visits
Revise presentation to families on expectations for participation
Social marketing of HV program
Allow a one-week window for completing home visits
Allow HVs to work non-traditional hours and give flex time
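A key driver diagram is just structured data, which makes it easy to keep alongside your CQI records. A minimal sketch using names from the slide; the slide does not specify which intervention links to which driver, so the pairings below are assumed for illustration:

```python
# Goal and driver names taken from the slide; the driver-to-intervention
# grouping is illustrative, not specified in the deck.
goal = ("Increase the number of families who receive 80% of the "
        "recommended number of home visits from 53.2% to 65%")

drivers_to_interventions = {
    "Home visitor training on family engagement": [
        "Revise training on family engagement",
        "Implement ongoing training and coaching on FE",
    ],
    "Supervision and support on family engagement": [
        "Training for supervisors in supporting HVs in FE during reflective supervision",
    ],
    "Program model contact protocols": [
        "Strengthen contact protocols",
        "Retrain on contact protocols",
    ],
    "Families are hard to reach": [
        "Provide cell phones to HVs",
        "Allow a one-week window for completing home visits",
        "Allow HVs to work non-traditional hours and give flex time",
    ],
}

# Walking the structure yields a per-driver checklist of interventions:
for driver, interventions in drivers_to_interventions.items():
    print(driver)
    for item in interventions:
        print("  -", item)
```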

19 Key Driver Diagram

20 Key Driver Analysis in a PDSA Cycle

21 What do you need to do to get started?
Form a CQI team
Select a CQI topic and target
A data system and data to analyze
Someone to run reports and develop tables and graphs
An "all hands on deck," non-punitive environment
In time, you'll be able to look at multiple data points over time

22 CQI Technical Assistance
Want some assistance improving the quality of your program? Would you like to find out more about using data in a CQI process? Contact your FPO, who will connect you with your VisTA and TEI liaisons.

23 Questions?

24 The Tribal Evaluation Institute is funded by the Office of Planning, Research and Evaluation within the Administration for Children and Families. TEI was awarded to James Bell Associates in partnership with the University of Colorado's Centers for American Indian and Alaska Native Health and the Michigan Public Health Institute. For more information, contact the individuals on this slide.
The Tribal Home Visiting Evaluation Institute (TEI) is funded by the Office of Planning, Research and Evaluation, Administration for Children and Families, Department of Health and Human Services under contract number HHSP WC. TEI is funded to provide technical assistance to Tribal Home Visiting grantees on rigorous evaluation, performance measurement, continuous quality improvement, data systems, and ethical dissemination and translation of evaluation findings. TEI was awarded to MDRC; James Bell Associates, Inc.; Johns Hopkins Bloomberg School of Public Health, Center for American Indian Health; and University of Colorado School of Public Health, Centers for American Indian and Alaska Native Health.
For more information on TEI, contact:
Nicole Denmark, Federal Project Officer, Office of Planning, Research and Evaluation
Kate Lyon, Project Director, James Bell Associates, Inc.

