1
Data Data Data
2
Learning Objective: We want you all to be comfortable using SBIRT data to improve clinical processes.
3
Some questions when reviewing data
Is this number correct? If not, did we submit incorrect data? Did Aaron mess up our data?
Are we happy with this number? Why or why not? How do we know whether to be happy?
Do we need more information? Can we look at the drivers of the metric?
What went well? What can be improved?
What is our plan for reviewing this information in the future? What is the timeline for revisiting the metric?
4
Are we happy with this number? Do we need more information?
Is this number correct? Are we happy with this number? Do we need more information? What went well? What can be improved? What is our plan for reviewing this information in the future?
We don’t really know if we’re happy because we never set a benchmark. The following two slides show how additional information can give insight into whether you are happy with the number.
5
Are we happy with the number of patients screened?
Are we happy with the number of patients screened? Maybe more information will help. This: looks better than this: (This is fake data, shown to prove a point.) The first chart shows that we are screening more and more people each month. That’s good! We want to screen people. The second shows that we are barely screening anyone in the most recent months. Maybe screening became less of a priority in this (fake) scenario?
6
And this: looks better than this:
The first graph shows that we are beating our target number of people screened. The second shows that, even though we are screening more people each month, we still have a way to go before reaching our target.
7
Prevalence information is helpful for allocating resources
Prevalence information is helpful for allocating resources. If you know the prevalence of positive screens at one site, you can plan for the need that screening will generate at additional sites.
8
12 people had scores that warranted a brief intervention
12 people had scores that warranted a brief intervention. Of those, 7 received a brief intervention. Why didn’t everyone receive a brief intervention?
9
This shows the services received by people whose scores indicated the need for a brief intervention (BI) and a referral to treatment (RT). Why would some people not receive any BI or RT? Discuss why some only receive a BI. Is it because of a lack of referral opportunities? Do clinicians not know how to make a referral?
10
How can we increase the number of people who attend treatment?
How can we increase the number of people who attend treatment? More information would be nice: something like a brief questionnaire to determine the reasons why people don’t attend treatment. Maybe it’s financial. Maybe it’s transportation. Maybe the patient doesn’t feel comfortable.
11
For this exercise, get a volunteer from the audience to sit in front of the group with one of the dashboard items they identified as noteworthy.
12
Which metric would you like to discuss?
Which metric would you like to discuss? Who is involved in the success of this metric? How often would you like to review the metric with its stakeholders? Bring up everyone else on the team who is involved in the success of the metric. It’s important that all dashboard information is reviewed by everyone whose actions affect the metric.
13
Is the number accurate? Are you happy with the metric? What went well?
Is the number accurate? Are you happy with the metric? What went well? What can be improved?
14
Would you like to track additional information related to this metric?
Would you like to track additional information related to this metric? When do you expect to see the results of your proposed changes?
15
Once they’re done, they’ve “won” the data-ing game. Good for them.
16
Any Questions/Comments?
Please remember that I’m available to help you have these conversations.