2 LEARNING BY DOING: An improvement simulation exercise. Brant Oliver and Suzie Miltner. QSEN National Forum, May 2016.

3 Learning Objectives
After completing this simulation exercise, participants will be able to:
(1) describe the IHI Model for Improvement, including the Plan-Do-Study-Act (PDSA) cycle;
(2) conduct simple PDSA cycles in a simulated environment;
(3) create simple data displays for performance measurement; and
(4) describe and interpret run charts.

4 In this exercise we will simulate the Model for Improvement (IHI, 2004)…

5 The PDSA Cycle
1. PLAN: objective (goal); outcome predictions; implementation plan (who, what, where, when, how); measurement plan.
2. DO: carry out the plan; document problems and unexpected observations; begin data analysis.
3. STUDY: complete the data analysis; compare results to predictions; summarize what was learned.
4. ACT: decide what changes are to be made and what the next cycle will be, based on prior results.
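The cycle above maps naturally onto a simple record that each team can fill in as it works. Below is a minimal Python sketch of such a record; the class and field names are illustrative assumptions, not part of the original exercise materials.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PDSACycle:
    """One PDSA cycle record for the simulation (illustrative field names)."""
    objective: str            # PLAN: goal for this cycle
    prediction: str           # PLAN: expected outcome
    implementation_plan: str  # PLAN: who, what, where, when, how
    measurement_plan: str     # PLAN: what will be measured and how
    observations: List[str] = field(default_factory=list)  # DO: problems, surprises
    results_summary: str = ""  # STUDY: analysis compared to the prediction
    next_action: str = ""      # ACT: change to make, or plan for the next cycle

# Example usage (hypothetical values)
cycle1 = PDSACycle(
    objective="Build Mr. Potato Head correctly in under 60 seconds",
    prediction="Assigning fixed roles will cut build time by 20%",
    implementation_plan="Surgeon builds, Timer times, Recorder logs, Observer watches",
    measurement_plan="Record build time (seconds) and accuracy score each cycle",
)
cycle1.observations.append("Unsorted parts slowed the start of the build")
cycle1.results_summary = "62 s, accuracy 9/10; slower than predicted"
cycle1.next_action = "Pre-sort parts before the next cycle"
```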

6 Simulation Exercise: Mr. Potato Head (a scene from "Toy Story", Pixar Studios). Credits: original program by the Institute for Healthcare Improvement (IHI), Cambridge, MA (2004); adapted by Steve Harrison, Sheffield MCA, Sheffield, UK (2013); adapted for collaborative simulation with a real-time measurement dashboard and registry (B. Oliver, 2015, 2016) and playbook (M. Godfrey, 2015).

7 Imagine that building Mr. Potato Head represents improving the quality of diabetes care in a primary care setting...

8 Your evidence-based practice guideline: Mr. Potato Head.

9 What we aim to achieve…
"Build it right" (adhere to the evidence-based practice guideline)
"Build it fast" (optimize access to care)
"Do it consistently" (optimize reliability)
"Continuously improve" (optimize value)

10 A Potato Value Compass (after Nelson et al., 2004): Value = Quality/Cost (V = Q/C), i.e., quality indicators relative to resource utilization. Compass quadrants: Biologic/Clinical, Functional Status, Experience of Care, and Cost.
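As a worked illustration of V = Q/C in the simulation's own terms, the sketch below treats the accuracy score as a quality proxy and build time as a cost proxy; the function name and the choice of proxies are assumptions for illustration, not part of the original exercise.

```python
def potato_value(accuracy_score: float, build_time_seconds: float) -> float:
    """Illustrative value ratio: quality proxy divided by cost proxy (V = Q/C)."""
    if build_time_seconds <= 0:
        raise ValueError("build time must be positive")
    return accuracy_score / build_time_seconds

# Example: accuracy 8/10 in 80 seconds vs. accuracy 9/10 in 60 seconds
print(potato_value(8, 80))   # 0.10
print(potato_value(9, 60))   # 0.15 -> higher value: better quality at lower cost
```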

11 A Potato Value Compass: Conceptual Definitions. Biologic/Clinical: access to appropriate monitoring and management to achieve optimal Hgb A1C status. Functional Status: wellness capability (adherence to the evidence-based regimen). Experience of Care: quarterly CG-CAHPS surveys. Cost: ED visits and hospitalizations secondary to poor diabetes control.

12 Short-term process measures: Biologic/Clinical (access to appropriate monitoring and management to achieve optimal Hgb A1C status), Functional Status, and Cost (ED visits and hospitalizations secondary to poor diabetes control).

13 Long-term outcome measures: Cost (ED visits and hospitalizations secondary to poor diabetes control) and Experience of Care (quarterly CG-CAHPS surveys).

14 Process measures lead to outcome measures. Biologic/Clinical: access to appropriate monitoring and management to achieve optimal Hgb A1C status. Functional Status: wellness capability (adherence to the evidence-based regimen). Experience of Care: quarterly CG-CAHPS surveys. Cost: ED visits and hospitalizations secondary to poor diabetes control.

15 A Potato Value Compass: Operational Definitions. The conceptual aims (access to appropriate monitoring and management to achieve optimal Hgb A1C status; wellness capability, adherence to the evidence-based regimen) are operationalized in the simulation as "Build it right" (accuracy score: adherence to the evidence-based guideline) and "Build it fast" (access: building speed in seconds).

16 Balancing Measures: "Build it fast" (building speed in seconds) and "Build it right" (accuracy score). Each serves as a balancing measure for the other, so a gain in speed should not come at the cost of accuracy, and vice versa.

17 Linking Short-Term Measures and Long-Term Measures: a ramp of successive PDSA cycles (1, 2, 3) connects the short-term measures (speed and accuracy) to the long-term outcome (hospital admissions).

18 Microsystem teams for the PDSA simulation: Surgeon, Timer, Recorder, and Observer.

19 We will simulate a microsystem-level improvement collaborative…
One baseline cycle and successive PDSA cycles
Simulate rapid-cycle improvement in separate microsystems
Track performance (building speed and accuracy score) using run charts and descriptive displays (see the registry sketch after this list)
Cascade measures and simulate an improvement collaborative (compare gender, balancing measures)
Benchmarking
Playbooks
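As a sketch of the kind of real-time registry the collaborative simulation relies on, the snippet below records each team's cycle measurements and pulls a simple benchmark (the fastest build that meets an accuracy bar); the data layout, function names, and accuracy threshold are assumptions for illustration, not the workshop's actual dashboard.

```python
from collections import defaultdict

# registry[team] is a list of (cycle, build_time_seconds, accuracy_score) tuples
registry = defaultdict(list)

def record_cycle(team: str, cycle: int, build_time_s: float, accuracy: int) -> None:
    """Log one PDSA cycle's measurements for a team."""
    registry[team].append((cycle, build_time_s, accuracy))

def benchmark(min_accuracy: int = 9):
    """Return (team, cycle, time) of the fastest build meeting the accuracy bar."""
    best = None
    for team, cycles in registry.items():
        for cycle, t, acc in cycles:
            if acc >= min_accuracy and (best is None or t < best[2]):
                best = (team, cycle, t)
    return best

# Example usage (hypothetical teams and scores)
record_cycle("Team A", 1, 95.0, 8)
record_cycle("Team A", 2, 70.0, 9)
record_cycle("Team B", 1, 88.0, 10)
print(benchmark())  # ('Team A', 2, 70.0)
```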

20 Successive PDSA cycles for improvement: the same ramp of PDSA cycles (1, 2, 3) links speed and accuracy to hospital admissions.

21 Measuring performance over time using run charts: (1) median performance level; (2) range (precision); (3) type of variation (common or special).
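A minimal sketch of the first two run-chart ingredients, the median and the range, using only Python's standard library; the sample build times are made up for illustration.

```python
from statistics import median

# Hypothetical build times (seconds) across successive cycles
build_times = [95, 88, 82, 90, 76, 71, 74, 68]

center_line = median(build_times)              # median performance level
spread = max(build_times) - min(build_times)   # range, a simple measure of precision

print(f"median = {center_line}, range = {spread}")
```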

22 Shifts A SHIFT is eight (8) or more consecutive points above or below the median.
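A small sketch of how the shift rule could be checked programmatically, using the eight-point threshold stated above; skipping points that fall exactly on the median is a common convention but an assumption here.

```python
from statistics import median

def has_shift(points, threshold=8):
    """Return True if `threshold` or more consecutive points fall on the
    same side of the median (points on the median are skipped)."""
    center = median(points)
    run = 0
    side = 0  # +1 above the median, -1 below
    for p in points:
        if p == center:
            continue  # points on the median neither break nor extend a run
        s = 1 if p > center else -1
        run = run + 1 if s == side else 1
        side = s
        if run >= threshold:
            return True
    return False

# Example: build times drop and stay on one side of the median for eight cycles
data = [90, 92, 88, 91, 89, 93, 90, 94, 70, 71, 69, 72, 68, 70, 71, 69]
print(has_shift(data))  # True
```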

23 Trends A TREND is seven (7) or more consecutively increasing or decreasing points.
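A companion sketch for the trend rule, using the seven-point threshold stated above; treating a tie as breaking the run is a simplifying assumption.

```python
def has_trend(points, threshold=7):
    """Return True if `threshold` or more consecutive points are strictly
    increasing or strictly decreasing."""
    if len(points) < threshold:
        return False
    run = 1
    direction = 0  # +1 increasing, -1 decreasing
    for prev, cur in zip(points, points[1:]):
        if cur > prev:
            d = 1
        elif cur < prev:
            d = -1
        else:
            run, direction = 1, 0  # a tie breaks the run (simplifying assumption)
            continue
        run = run + 1 if d == direction else 2  # a new direction starts a 2-point run
        direction = d
        if run >= threshold:
            return True
    return False

# Example: build times fall for seven consecutive cycles
times = [90, 92, 89, 85, 82, 80, 77, 74, 70]
print(has_trend(times))  # True
```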

24 Types of Variation. Common cause: variation caused by chance, i.e., random variation in the system resulting from many small factors (example: variation in the work commute due to traffic lights, pedestrian traffic, or parking issues). Special cause: variation caused by special circumstances or an assignable cause not inherent to the system; this variation is statistically significant (example: variation in the work commute due to a flat tyre, a road closure, or heavy frost/ice).

25 Application: Responding to Variation. Common cause variation: reduce variation (increase precision) to make the process even more reliable; if average performance is sub-optimal, redesign the process to get a better result. Special cause variation: identify the cause; if positive, "maximize, optimize, replicate, or standardize"; if negative, "minimize or eliminate."

26 PDSA worksheet (one row per cycle):
PDSA | Plan | Time | Accuracy
1    |      |      |
2    |      |      |
3    |      |      |
4    |      |      |

27 Benchmarking helps to empower improvement collaboratives…

28 Potato Head "Best Practice" flow diagram to standardize for the playbook.

29 “Potato Head Playbook”

