Boards, Dashboards, and Data From the Top: Getting the Board on Board 1-3 p.m., June 11, 2007 Boston, Massachusetts James L. Reinertsen, M.D.
Boards ask two types of questions about quality and safety:
1. How good is our care?
─ How do we compare to others like us?
2. Is our care getting better?
─ Are we on track to achieve our key quality and safety objectives?
─ If not, why not? Is the strategy wrong, or is it not being executed effectively?
For all of these questions… In God we trust. All others bring data. Yes, but what data?
Purpose of Measurement (adapted from Solberg, Mosser, McDonald, Jt Comm J Qual Improv. 1997 Mar;23(3):135-47)

Research
─ Key question: "What is the truth?"
─ Penalty for being wrong: misdirection for the profession
─ Measurement requirements and characteristics: complete, accurate, controlled, glacial pace, expensive
─ Typical displays: comparison of control and experimental populations

Comparison or Accountability
─ Key question: "Are we better or worse than…?"
─ Penalty for being wrong: misdirected reward or punishment
─ Measurement requirements and characteristics: risk adjusted, with denominators, attributable to individuals or organizations, validity
─ Typical displays: performance relative to benchmarks and standards

Improvement
─ Key question: "Are we getting better?"
─ Penalty for being wrong: misdirection for an initiative
─ Measurement requirements and characteristics: real time, raw counts, consistent operational definitions, utility
─ Typical displays: run charts, control charts, time between events
Example of an answer to "How good is our care?" (compared to others)
─ Date of this report: October 24, 2006
─ Note that a hospital could be "green" and still be worse than the median of its comparison group
Another example of "How do we compare?": Hospital Adverse Events per 1,000 Patient Days

Adverse events include (but are not limited to):
─ Allergic rash
─ Excessive bleeding; unintentional trauma of a blood vessel
─ Respiratory depression requiring intubation due to pain medications
─ Hyperkalemia as the result of overdose of potassium
─ Lethargy/shakiness associated with low serum glucose
─ Drug-induced renal failure
─ Surgical site infection, sepsis, infected lines, other hospital-acquired infections
─ Internal bleeding following the first surgery and requiring a second surgery to stop the bleeding
─ Atelectasis, skin breakdown, pressure sores
─ DVT or pulmonary embolism during a hospital stay

[Chart: Number of adverse events per 1,000 patient days using the IHI Global Trigger Tool; IHI best: 5; IHI average: 40; "Our Hospital, May 2007" plotted for comparison]

Source: Roger Resar, John Whittington, IHI Collaborative
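The per-1,000-patient-days rate behind a chart like this is a simple normalization: events found (e.g., by trigger-tool chart review) divided by patient days in the sample, scaled to 1,000. A minimal sketch; the function name and the sample numbers are illustrative, not from the presentation:

```python
def adverse_event_rate(events: int, patient_days: int) -> float:
    """Adverse events per 1,000 patient days (the normalization used
    in IHI Global Trigger Tool reporting)."""
    if patient_days <= 0:
        raise ValueError("patient_days must be positive")
    return 1000 * events / patient_days

# Hypothetical example: 45 adverse events found in a chart-review
# sample covering 500 patient days.
print(adverse_event_rate(45, 500))  # 90.0
```

Normalizing to patient days (rather than admissions) keeps the measure comparable across hospitals with different lengths of stay.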
What Boards should know about data on "How good are we and how do we compare to others?"

Upside:
─ Often risk adjusted ("apples to apples")
─ Source of pride
─ Source of energy for improvement

Downside:
─ Time lag (months)
─ Static (no data over time)
─ If you look bad, energy is wasted on "the data must be wrong"
─ If you look good, you become complacent
─ How you look depends on how others perform
─ Standards and benchmarks are full of defects ("the cream of the crap")
Recommendations for Board use of "How do we compare to others?"
1. Ask this question to help you set aims, and perhaps annually thereafter, but don't use these sorts of reports to oversee and guide improvement at each meeting.
2. Compare to the best, not to the 50th percentile (e.g., Toyota specs).
3. Always make sure you know how "green" is determined.
Boards ask two types of questions about quality and safety:
1. How good is our care?
─ How do we compare to others like us?
2. Is our care getting better?
─ Are we on track to achieve our key quality and safety objectives?
─ If not, why not? Is the strategy wrong, or is it not being executed effectively?
It is on this second question that dashboards and scorecards can be helpful to boards.
Example: Immanuel St. Joseph's (Mayo Health System) Board's answer to the question "Is our mortality rate getting better?"
[Dashboard excerpt, pillar "1.1 Satisfy Our Patients"; available in January 2007!]
Is our quality and safety getting better? Are we going to achieve our aims? To answer these questions for Boards:
─ The aims should be clearly displayed and understood
─ A few system-level measures should be graphically displayed over time
─ The measures should be displayed monthly, at worst, and should be close to "real time"
─ Measures do not necessarily need to be risk adjusted
─ Measures of critical initiatives (projects that must be executed to achieve the aim) should be available if needed to answer the Board's questions
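The run charts recommended for displaying such measures over time can also be screened programmatically. The sketch below implements one common run-chart rule, a "shift": six or more consecutive points all on the same side of the median centerline signal non-random change. The function name and the six-point threshold are illustrative assumptions, not part of the presentation:

```python
import statistics

def run_chart_shift(values, shift_len=6):
    """Return True if the series contains a 'shift': shift_len or more
    consecutive points all above (or all below) the median centerline.
    Points exactly on the median are skipped, per common run-chart
    convention (they neither make nor break a run)."""
    median = statistics.median(values)
    run, last_side = 0, 0
    for v in values:
        side = (v > median) - (v < median)  # +1 above, -1 below, 0 on the line
        if side == 0:
            continue
        run = run + 1 if side == last_side else 1
        last_side = side
        if run >= shift_len:
            return True
    return False

# A steadily falling monthly measure triggers the rule; random
# bouncing around the median does not.
print(run_chart_shift([12, 11, 10, 9, 8, 7, 5, 4, 3, 2, 1, 0]))  # True
print(run_chart_shift([1, 2, 1, 2, 1, 2, 1, 2]))                 # False
```

Rules like this let a board distinguish real improvement from month-to-month noise without waiting for risk-adjusted comparisons.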
The Board question "Are we going to achieve our aims?" requires management to have a strategic theory:

Big Dots (Pillars, BSC…)
─ What are your key strategic aims? How good must we be, by when?
─ What are the system-level measures of those aims?

Drivers (Core Theory of Strategy)
─ Down deep, what really has to be changed, or put in place, in order to achieve each of these goals?
─ What are you tracking to know whether these drivers are changing?

Projects (Ops Plan)
─ What set of projects will move the Drivers far enough, fast enough, to achieve your aims?
─ How will we know if the projects are being executed?
The ideal dashboard will display a cascaded set of measures that reflect the “theory of the strategy.”
Example Board dashboard for harm (for the 5 Million Lives Campaign):
─ System-level measure: Global Harm Trigger Tool
─ Drivers: handwashing, culture of discipline, and teamwork
─ Projects: high-alert meds, surgical complications, pressure ulcers, CHF, MRSA
The full Board should review the System-level Measures (Big Dots). The Board Quality Committee should review both the System-level Measures and the Key Drivers of those Measures. Occasionally, but not often, the Board will need to see measures of Key Projects, but these are generally the responsibility of management to oversee and execute.
Common Flaws in Dashboards
─ No system-level measures or aims (so it's possible for quality and safety to get worse, and yet to achieve "green" on all the measures the Board sees!)
─ Hodge-podge of system, driver, and project measures (so the Board doesn't know what's important)
─ Static measures (so the Board has to take management's word that "we're on track to achieve our aims")
─ Too many measures (so the Board doesn't understand any of them)
─ Mixture of "How do we compare to others?" and "Are we getting better?" measures (so the Board doesn't know what questions to ask)
─ Low, unclear standards for "green" (so the Board becomes complacent despite significant opportunities for improvement!)
Can you identify the flaws in the following "dashboard"?
Flaws in the example "dashboard":
─ No display over time
─ Low standards for "green"
─ Mix of system and project measures
─ Mostly comparison measures
Summary of Best Practices for Quality and Safety Dashboards for Boards
─ Separate the two types of oversight questions:
1. How good is our quality? How do we compare to others?
2. Are we getting better? Are we on track to achieve our aims?
─ Ask the comparison question annually, when setting quality and safety aims. Avoid using comparative data to track improvement.
─ Frame your aims with reference to the theoretical ideal and to the "best in the world," not to benchmarks.
─ Ask the improvement question at every meeting, and track it with a dashboard that shows near-real-time data on system-level and driver measures, displayed on run charts.
─ Demand that management develop a "theory of the strategy" to achieve the annual quality and safety aims.
─ Do not put project-level measures (often about one unit, disease, or department) on the Board's dashboard.