WHA Improvement Forum for July: “Data Driven Improvement.” Presented by Stephanie Sobczak. Courtesy Reminders: Please place your phones on MUTE unless you are speaking (or use *6 on your keypad). Please do not take calls or place the phone on HOLD during the presentation.

Today’s Webinar Agenda: Data Driven Decision Making; Data Mining Your Processes; From Data to Information; “Real Time” Improvement

Why measure? The main reason for conducting an improvement project is to achieve results, no matter the issue or topic. And how do we know we have achieved a desired result that can be proven to others? We must demonstrate change from a baseline, or initial measurement, and assess the degree of change after an intervention.
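To make “degree of change from a baseline” concrete, here is a minimal Python sketch; the monthly counts and the split between baseline and post-intervention periods are purely illustrative, not data from any project.

```python
# Minimal sketch: quantify change from a baseline after an intervention.
# All numbers below are illustrative, not real project data.

baseline = [12, 14, 13, 15, 14, 13]          # e.g., monthly readmissions before the change
post_intervention = [11, 10, 9, 10, 8, 9]    # monthly readmissions after the change

baseline_avg = sum(baseline) / len(baseline)
post_avg = sum(post_intervention) / len(post_intervention)

absolute_change = post_avg - baseline_avg
percent_change = 100 * absolute_change / baseline_avg

print(f"Baseline average: {baseline_avg:.1f}")
print(f"Post-intervention average: {post_avg:.1f}")
print(f"Change: {absolute_change:+.1f} ({percent_change:+.0f}%)")
```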

Linking Measures to Small Tests of Change. AIM: Improve the M/S Unit HCAHPS score for “Patients always received requested help” by 10 points by Nov. Small Test of Change: Anyone within 6 feet of a room will answer that patient’s need. Possible Process Measures: (1) Track how many call lights occur between the hours of 10am-11am and 1pm-2pm; (2) Hallway observation of day-shift staff response to call lights by a volunteer for 1 hour on 3 different days each week. How will we prove this change is effective?
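As a hedged illustration of the first process measure, a tally of call lights in the two sampling windows could be scripted along these lines; the timestamps, dates, and window boundaries below are assumptions for the example only.

```python
# Sketch: count call lights that fall in the two sampling windows
# (10:00-11:00 and 13:00-14:00). Timestamps are made-up examples.
from datetime import datetime

call_lights = [
    datetime(2012, 7, 2, 10, 15),
    datetime(2012, 7, 2, 10, 42),
    datetime(2012, 7, 2, 13, 5),
    datetime(2012, 7, 2, 15, 30),   # outside both windows, not counted
]

WINDOWS = [(10, 11), (13, 14)]      # start hour (inclusive) to end hour (exclusive)

def in_sampling_window(ts):
    return any(start <= ts.hour < end for start, end in WINDOWS)

count = sum(1 for ts in call_lights if in_sampling_window(ts))
print(f"Call lights in sampling windows: {count}")
```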

Measurement Best Practices: Measures are the proof of improvement. Measures guide improvement by informing decisions about which changes to test. Outcome and process data should be plotted over time on annotated graphs. Process measurement should be integrated into the team’s daily routine and shared with staff in “real time.” This is a really important skill set for staff to understand.

Graph Your Data Over Time

Time Series Charts: Like the EKG of a process!
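A minimal run-chart sketch, assuming the process measure is already collected as a simple list of monthly values; the numbers and labels are illustrative, and the median centerline is a common run-chart convention rather than something specified in the slides.

```python
# Sketch: plot a process measure over time as a run chart with a median centerline.
# Data are illustrative.
import statistics
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun", "Jul", "Aug"]
values = [62, 58, 65, 61, 70, 72, 75, 74]   # e.g., % of patients receiving follow-up calls

median = statistics.median(values)

plt.plot(months, values, marker="o")
plt.axhline(median, linestyle="--", label=f"Median = {median:.0f}")
plt.ylabel("% patients receiving follow-up calls")
plt.title("Follow-up Calls - Run Chart")
plt.legend()
plt.show()
```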

Data on the Surface: No improvement over time… WHY?

Taking a Deeper Dive – Looking for Drivers. What are the drivers of your outcomes? What does the evidence say? Do you have data for these processes? If not, can you collect some? If yes, what does it tell you?

Connecting Outcome & Process Measure Data (Key: Up is Better)

Data Mining Your Processes

Six months, no improvement – WHY? “We do follow-up calls!”

Follow-up Calls – quarterly data: Need to work on improving the number of calls made.

Dive deeper into each STOC’s (small test of change) results. How soon do you see relationships between outcome, process, and small test of change? Ex: Follow-up calls – Was the plan executed well? Was it the wrong plan? Were the results not sustained over time? Is more effort required?

How to segment data: Start with theories on “why.” Dive deeper into demographics such as day/night, age, Dx, unit, etc. Split one graph into multiple graphs to see the driving forces. Example: Patients discharged with appointments.
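A small pandas sketch of that kind of segmentation, assuming discharge data live in a table with a unit column; the column names, units, and percentages are hypothetical.

```python
# Sketch: segment an outcome measure by a demographic (here, hospital unit)
# and plot one small graph per segment. Data and column names are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.DataFrame({
    "month": ["Jan", "Feb", "Mar", "Apr"] * 2,
    "unit":  ["Med/Surg"] * 4 + ["ICU"] * 4,
    "pct_discharged_with_appt": [55, 60, 70, 74, 80, 78, 82, 83],
})

# One plot per unit reveals whether a single segment is driving the overall trend.
fig, axes = plt.subplots(1, 2, sharey=True, figsize=(8, 3))
for ax, (unit, grp) in zip(axes, df.groupby("unit", sort=False)):
    ax.plot(grp["month"], grp["pct_discharged_with_appt"], marker="o")
    ax.set_title(unit)
axes[0].set_ylabel("% discharged with appointment")
plt.tight_layout()
plt.show()
```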

Evidence: INTERACT adoption in LTC and/or Care Transitions Coaches

Some positive trend – is it related? Adopt the INTERACT Toolkit in LTC?

Improvement Plan. AIM: Reduce readmissions by 50% for those over 65 through implementing Care Transitions Coaches. Process Measure: Patients receiving coaching. Plan: Implement in March; gather monthly data.
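One plausible way the monthly process measure could be computed, assuming a simple discharge table with an age field and a coaching flag; the table, field names, and values are assumptions for illustration.

```python
# Sketch: compute the monthly process measure "% of eligible patients receiving
# coaching". Records and field names are hypothetical.
import pandas as pd

discharges = pd.DataFrame({
    "month":             ["Mar", "Mar", "Apr", "Apr", "Apr", "May"],
    "age":               [72, 68, 81, 59, 77, 70],
    "received_coaching": [True, False, True, False, True, True],
})

eligible = discharges[discharges["age"] > 65]    # AIM targets patients over 65
monthly_rate = (eligible.groupby("month", sort=False)["received_coaching"]
                        .mean() * 100)
print(monthly_rate.round(0))
```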

Decisions and their Purpose: Improve follow-up call process (prevent readmissions); INTERACT Toolkit adoption (decrease transfers to the ED); Care Transitions Coaching model (assistance for older adults to prevent readmissions).

New F/U Call Process; INTERACT Initiative in LTC; Start Care Transitions Coaching

“Mining” your data – other analyses: Does our performance go down with a higher census? Is performance the same on every shift? Are there variations in the practices of individuals?
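Each of those mining questions maps onto a grouped summary; here is a sketch assuming an event-level table with census, shift, and staff columns (all column names and numbers are hypothetical).

```python
# Sketch: three quick "mining" cuts of the same data - by census level, by shift,
# and by individual. Column names and values are hypothetical.
import pandas as pd

events = pd.DataFrame({
    "census":    [18, 25, 30, 22, 28, 19, 31, 24],
    "shift":     ["Day", "Night", "Day", "Night", "Day", "Night", "Day", "Night"],
    "staff":     ["A", "B", "A", "C", "B", "A", "C", "B"],
    "task_done": [1, 0, 0, 1, 1, 1, 0, 1],   # 1 = process step completed
})

# Does performance drop when the census is high?
events["census_band"] = pd.cut(events["census"], bins=[0, 24, 40], labels=["Low", "High"])
print(events.groupby("census_band", observed=True)["task_done"].mean())

# Is performance the same on every shift?
print(events.groupby("shift")["task_done"].mean())

# Are there variations among individuals?
print(events.groupby("staff")["task_done"].mean())
```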

Common Mistakes: Only one person looks at the process measures. Staff aren’t aware of the process measures and how they are directly linked to improvement. Processes are only measured for a short period of time. Processes aren’t measured at all.

Turn Your Data into Information. Visual displays of data can provide greater insights into the systemic knowledge that lives in the data. Turn “data” into “information.” Data: raw facts. Information: data that has been processed and analyzed so that it is directly useful. Visual displays of data highlight variation in the system. Systems thinking and an understanding of variation are essential for improvement.

Telling a Strong Story

The Importance of Data Display. If data are not displayed or interpreted correctly, incorrect assumptions can be made, leading to poor decisions. Aggregated data, data presented in tabular formats, or summary statistics will not help you measure the impact of process improvement/redesign efforts. Aggregated data can only lead to judgment, not to improvement. © 2012 IHI/R. Lloyd
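A tiny numeric illustration of why aggregation alone misleads: the two made-up processes below have identical averages, yet only one of them is improving; only a time-ordered view reveals the difference.

```python
# Sketch: two processes with the same overall average look identical when
# aggregated, but a time-ordered view shows only one of them improved.
# Data are illustrative.
process_a = [50, 50, 50, 50, 50, 50]        # flat - no improvement
process_b = [65, 60, 55, 45, 40, 35]        # steadily improving (lower is better)

avg_a = sum(process_a) / len(process_a)
avg_b = sum(process_b) / len(process_b)
print(avg_a, avg_b)   # both 50.0 - the aggregate hides the trend entirely
```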

Robust Data Displays Tell a Better Story (Identical Improvement)

Using Data to Develop an Improvement Plan: 1. Which process do you want to improve or redesign? 2. Does the process contain non-random patterns or special causes? 3. How do you plan on actually making improvements? What strategies do you plan to follow to make things better? 4. What effect (if any) did your plan have on the process performance? Run & control charts will help you answer questions 2 and 4; YOU need to figure out the answers to questions 1 and 3.
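For question 2, one commonly taught run-chart signal is a “shift”: six or more consecutive points on the same side of the median. The sketch below applies that single rule; the data and the choice to check only this one rule are illustrative.

```python
# Sketch: detect one common run-chart signal - a "shift" of 6+ consecutive points
# on the same side of the median. Rule threshold and data are illustrative.
import statistics

values = [12, 14, 13, 15, 14, 13, 10, 9, 8, 9, 7, 8]
median = statistics.median(values)

run_length = 0
last_side = 0          # +1 above median, -1 below, 0 on the median
shift_found = False
for v in values:
    side = (v > median) - (v < median)
    if side == 0:
        continue       # points on the median neither break nor extend a run
    run_length = run_length + 1 if side == last_side else 1
    last_side = side
    if run_length >= 6:
        shift_found = True

print(f"Median: {median}, non-random shift detected: {shift_found}")
```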

Annotate Your Run Charts
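A minimal matplotlib sketch of an annotated run chart; the measure, values, and annotation text are invented for illustration.

```python
# Sketch: annotate a run chart with the date a change was tested, so viewers can
# see WHY the line moves. Data and annotation text are illustrative.
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
values = [40, 42, 41, 55, 60, 63]   # process measure, higher is better
x = range(len(months))

plt.plot(x, values, marker="o")
plt.xticks(x, months)
plt.annotate("Started new follow-up call script",
             xy=(3, 55),                   # April's point, where the change showed up
             xytext=(0.5, 60),             # where the label sits
             arrowprops={"arrowstyle": "->"})
plt.ylabel("% calls completed within 48 hrs")
plt.title("Annotated Run Chart")
plt.show()
```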

Why Annotate? Tells WHY something is changing. Helps show your improvements work. Improves support/‘buy-in.’ Documents what you have learned when revisited in the future.

Keep visuals simple – and visible!

Provide Users with Real-Time Feedback. Don’t save the reports for meetings. Staff will better understand improvement work. It enables faster improvement, empowers staff to make their own improvements, and better shows where the opportunities for improvement live.

Summary: Look further than outcome measures – measure what you DO / what you TEST. Create data displays that are simple but informative – measure over time. Provide real-time feedback – easy to access, meaningful measurement.

Next Month: Establishing an Accountable Culture, August 22 at Noon. The “Two Jobs” of Work. Whose job is “accountability”? Strategies to build engagement.

Thank You! Questions? Please complete the 3-question survey when closing the webinar window.