
1 Using Dashboards to present data to your Board: Quality and Patient Safety
Aunyika Moonan, PhD, MSPH, CPHQ
SCHA's Director of Quality Measurement Services
SC AHQ, May 9, 2008

2 Objectives
What is a dashboard?
Making the case: why do boards need to be on board?
How do you get your board to improve quality and patient safety?
What data do you include in dashboards?
Which performance improvement tools do you use with the board?
How do you present your data to the board?

3 Purpose of a dashboard? A dashboard is a powerful tool to keep leaders focused on the organization's key issues and strategies. Well-chosen performance indicators, displayed in an at-a-glance format, help identify areas that are doing well and areas that need improvement. A dashboard can include indicators such as financial viability, clinical outcomes, patient safety, quality of care, or satisfaction rates.

4 Use of a Dashboard
Focus senior executives' attention
Link to the organization's aims/goals and strategic plan
Few pages
Show improvement

5 Board Leadership is a critical ingredient to achieving better, safer care: a survey found that better outcomes are associated with hospitals where...
1. The board spends more than 25% of its time on quality issues (p = 0.009)
2. The board receives a formal quality performance measurement report (p = 0.005)
3. There is a high level of interaction between the board and the medical staff on quality strategy (p = 0.021)
4. The senior executives' compensation is based in part on QI performance (p = 0.008)
5. The CEO is identified as the person with the greatest impact on QI (p = 0.01)
Kroch et al. Hospital Boards and Quality Dashboards. J Patient Safety. Volume 2, Number 1. March 2006.

6 So… how do you get your Board to improve quality and patient safety?
1. Board Recruitment: Choose board members with the right stuff
2. Education: Educate the board
Bader and Associates Governance Consultants. Great Boards, Spring 2006, Volume VI, No. 1.

7 How do you get your Board to improve quality and patient safety?
3. Measurement: Use measures to focus board work on what's important
4. High Expectations: Pursue perfection

8 How do you get your Board to improve quality and patient safety?
5. Culture Promotion: Pay more attention to culture
6. Board Time: Exercise leaders' powerful influence
7. Recognition and Rewards: Recognize and reward excellence

9 What type of data do you include?
Boards ask two types of questions about quality and safety:
1. How good is our care? How do we compare to others like us?
2. Is our care getting better? Are we on an acceptable track to achieve our key quality and safety objectives, or do we need to change direction? If not, why not? Is the strategy wrong, or is it not being executed effectively?
The slides that follow are adapted from James L. Reinertsen, MD: Boards, Dashboards and Data (IHI).

10 Purpose of Measurement
Research: key question "What is the truth?"; measurement is complete, accurate, controlled, glacial pace, expensive; typical displays compare control and experimental populations.
Comparison or Accountability: key question "Are we better or worse than…?"; measurement is risk adjusted, with denominators, attributable to individuals or organizations, with validity; typical displays show performance relative to benchmarks and standards.
Improvement: key question "Are we getting better?"; measurement is real time, raw counts, with consistent operational definitions and utility; typical displays are run charts, control charts, and time between events.
Adapted from Solberg, Mosser, McDonald. Jt Comm J Qual Improv. 1997 Mar;23(3).

11 Example of an answer to "How good is our care?"
Compared to others: a hospital could be "green" but still be worse than the median of its comparison group. The date of this report is October 24, 2006.

12 Another example of "How do we compare?"
[Chart: Hospital adverse events per 1,000 patient days using the IHI Global Trigger Tool, showing Our Hospital (May 2007) against the current IHI best and the IHI average.]
Adverse events include (but are not limited to):
Allergic rash
Excessive bleeding, unintentional trauma of a blood vessel
Respiratory depression requiring intubation due to pain medications
Hyperkalemia as the result of an overdose of potassium
Lethargy/shakiness associated with low serum glucose
Drug-induced renal failure
Surgical site infection, sepsis, infected lines, other hospital-acquired infections
Internal bleeding following the first surgery and requiring a second surgery to stop the bleeding
Atelectasis, skin breakdown, pressure sores
DVT or pulmonary embolism during a hospital stay
Source: Roger Resar, John Whittington, IHI Collaborative
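
For readers assembling this kind of chart, here is a minimal sketch of the rate calculation behind "adverse events per 1,000 patient days," the unit used above. The event count and patient-day total are invented for illustration and are not part of the Trigger Tool itself.

```python
def adverse_event_rate(events: int, patient_days: int) -> float:
    """Adverse events per 1,000 patient days."""
    return events / patient_days * 1000

# e.g. 38 adverse events found in chart review across 950 patient days
print(round(adverse_event_rate(38, 950), 1))   # 40.0
```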

13 What Boards should know about data on "How good are we and how do we compare to others?"
Upside:
Often risk adjusted
Apples-to-apples comparisons
Source of pride
Source of energy for improvement
Downside:
Time lag
Static
"The data must be wrong"
You become complacent
How you look depends on how others perform
Standards and benchmarks are full of defects

14 Recommendations for Board use of "How do we compare to others?"
Ask this question to help you set aims, but don't use these sorts of reports to oversee and guide improvement at each meeting.
Compare to the best, not to the 50th percentile.
Always make sure you know how "green" is determined.

15 Boards ask two types of questions about quality and safety
1. How good is our care? How do we compare to others like us?
2. Is our care getting better? Are we on an acceptable track to achieve our key quality and safety objectives, or do we need to change direction? If not, why not? Is the strategy wrong, or is it not being executed effectively?
This second question is where dashboards and scorecards can be helpful to boards.

16 Big Dots, Drivers, Projects
What data should you include for your board? The Board question "Are we going to achieve our aims?" requires management to have a strategic theory.
Big Dots (Pillars, BSC…): What are your key strategic aims? How good must we be, by when? What are the system-level measures of those aims?
Drivers: Down deep, what really has to be changed, or put in place, in order to achieve each of these goals? What are you tracking to know whether these drivers are changing?
Projects (Ops Plan): What set of projects will move the drivers far enough, fast enough, to achieve your aims? How will we know if the projects are being executed?

17 Example Dashboard for Harm (for the 5 Million Lives Campaign)
System-level measure: Global Harm Trigger Tool
Drivers: Handwashing, culture of discipline, and teamwork
Projects: High-alert meds, surgical complications, pressure ulcers, CHF, MRSA
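
To make the Big Dots / Drivers / Projects structure concrete, here is a minimal sketch of how the harm example above could be captured as a simple data structure behind a dashboard. The class and field names are illustrative assumptions, not part of the campaign materials. A board dashboard would then chart the big-dot measure and driver measures over time and keep the project list available on request.

```python
from dataclasses import dataclass

@dataclass
class DashboardAim:
    big_dot: str          # system-level aim and its measure
    drivers: list[str]    # what has to change to move the big dot
    projects: list[str]   # operational work intended to move the drivers

harm = DashboardAim(
    big_dot="Harm: adverse events per 1,000 patient days (Global Trigger Tool)",
    drivers=["Handwashing", "Culture of discipline", "Teamwork"],
    projects=["High-alert meds", "Surgical complications", "Pressure ulcers", "CHF", "MRSA"],
)

print(harm.big_dot)
print("Drivers:", ", ".join(harm.drivers))
print("Projects:", ", ".join(harm.projects))
```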

18 Performance Improvement Tools to use with the Board
Run or trend charts
Control charts
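
As a concrete illustration of the first tool, here is a minimal run-chart sketch using Python and matplotlib (a tooling assumption; any charting package works). It plots a monthly measure over time with the median as the center line, which is how a board typically sees "are we getting better?". The monthly values are invented.

```python
import statistics
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug"]
values = [42, 39, 41, 37, 35, 33, 34, 30]   # invented monthly rates
center = statistics.median(values)          # run charts use the median as the center line

plt.plot(months, values, marker="o", label="Monthly rate")
plt.axhline(center, linestyle="--", label=f"Median = {center}")
plt.title("Run chart: adverse events per 1,000 patient days")
plt.ylabel("Events per 1,000 patient days")
plt.legend()
plt.savefig("run_chart.png")   # or plt.show() when working interactively
```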

19 [Image-only slide; no transcript text]

20 [Image-only slide; no transcript text]

21 Control Chart: Statistical Process Control (a dynamic view)
Types of variation:
Common cause variation: points between the control limits in no particular pattern; normally expected from the process.
Special cause variation: arises from sources not inherent in the process; points fall outside the limits or exhibit special patterns.
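
For readers who want to see how control limits are set, here is a minimal sketch of an individuals (XmR) chart calculation, one common control-chart form: the limits are the mean plus or minus 2.66 times the average moving range, and points outside them are flagged as possible special cause variation. The data are invented, and real SPC software also applies run and trend rules not shown here.

```python
# Individuals (XmR) control chart limits: mean +/- 2.66 * average moving range.
# Monthly values are invented; month 9 is a deliberate spike.
values = [12, 14, 11, 13, 12, 15, 13, 12, 25, 13, 12, 14]

mean = sum(values) / len(values)
moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

ucl = mean + 2.66 * avg_mr   # upper control limit
lcl = mean - 2.66 * avg_mr   # lower control limit

print(f"center line = {mean:.1f}, UCL = {ucl:.1f}, LCL = {lcl:.1f}")
for month, value in enumerate(values, start=1):
    if value > ucl or value < lcl:
        print(f"month {month}: {value} is outside the limits -> look for a special cause")
```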

22 Control Charts [image-only slide; no transcript text]

23 [Example dashboard with callouts noting its flaws]
No display over time
Mix of system and project measures
Mostly comparison measures
Low standards for "Green"

24 Is our quality and safety getting better? Are we going to achieve our aims?
To answer these questions for Boards:
The aims should be clearly displayed and understood.
A few system-level measures and drivers should be graphically displayed over time.
The measures should be displayed monthly, at worst, and should be close to "real time."
Measures of critical initiatives (projects that must be executed to achieve the aim) should be available if needed to answer the Board's questions.

25 Data to include: The full Board should review the system-level measures (Big Dots). The Board, and mainly the Board Quality Committee, should review both the system-level measures and the key drivers of those measures. Occasionally, but not often, the Board will need to see measures of key projects, but these are generally the responsibility of management to oversee and execute.

26 Common Flaws in Dashboards
No system-level measures or aims
Hodge-podge of system, driver, and project measures
Static measures
Too many measures
Mixture of "How do we compare to others?" and "Are we getting better?" measures
Low, unclear standards for "green"

27 Summary of Best Practices for Quality and Safety Dashboards for Boards
Separate the two types of oversight questions: "How good is our quality? How do we compare to others?" and "Are we getting better? Are we on track to achieve our aims?"
Ask the comparison question annually, when setting quality and safety aims.
Avoid use of comparative data to track improvement.
Frame your aims with reference to the theoretical ideal and to the "best in the world," not to benchmarks.

28 Summary of Best Practices for Quality and Safety Dashboards for Boards
Ask the improvement question at every meeting, and track it with a dashboard that shows near-real-time data on system-level and driver measures, displayed on run charts.
Demand that management develop annual quality and safety aims.
Do not put project-level measures (often about one unit, disease, or department) on the Board's dashboard, but have them prepared in case the Board asks.

29 Data Presentation to the Board
Great data presented poorly will not achieve your goals!
Include magnitude, direction, variability, and rate.
Use a quick, easy-to-read format: callouts and annotations.
Provide conclusions with your data.
Connect the data to organizational strategy.
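
As one way of putting the "annotate and provide conclusions" advice into practice, here is a minimal matplotlib sketch (a tooling assumption) that adds a callout for the change that explains the trend and states the conclusion in the chart title. The rates and the intervention are invented for illustration.

```python
import matplotlib.pyplot as plt

quarters = ["Q1", "Q2", "Q3", "Q4"]
rates = [3.2, 2.9, 2.1, 1.8]   # invented, e.g. infections per 1,000 device days

x = range(len(quarters))
plt.plot(x, rates, marker="o")
plt.xticks(x, quarters)
# Callout annotating the change that explains the drop
plt.annotate("Prevention bundle rolled out", xy=(2, 2.1), xytext=(0.8, 2.5),
             arrowprops={"arrowstyle": "->"})
# State the conclusion, not just the data
plt.title("Conclusion: infection rate fell 44% after the bundle rollout")
plt.ylabel("Infections per 1,000 device days")
plt.savefig("board_chart.png")
```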

30 Aunyika Moonan, SCHA's Director of Quality Measurement Services

