Improving Performance in Practice
On the road to a large-scale system to improve outcomes for populations of patients

DeWalt DA, McNeill J, Stanford K, Rome M, Margolis P
http://creativecommons.org/licenses/by-nc-sa/3.0/
Funded by the Robert Wood Johnson Foundation
Outline
- IPIP purpose and design
- Intervention and evolution
- Data source and evaluation methods
- Results
- Interpretation
IPIP Program Purpose
- Align efforts and motivate action across primary care specialties and all levels of the health care system to transform care delivery
- Assist practices in re-designing care
- Initial focus on diabetes and asthma
- Spread to preventive services and other conditions
- Improve outcomes for populations of patients
IPIP: A Multi-Level Improvement Effort
Levels: National, State, Network, Practice, Patient
IPIP Design: Practice Level
- Local improvement networks
- Data sharing for learning
- QI support for improvement networks and individual practices through quality improvement coaches
- State leaders, network leaders, and IPIP experience led to evolution in how these were operationalized
- This evolution and variation gives us the opportunity to learn about different effects
Practice Coaching
- Onsite assessment of current systems
- Onsite teaching and technical assistance
- Team formation and practice engagement
- QI methods (Model for Improvement)
- Information systems advice
- Measures and reporting
- Interpretation of performance data
- Recommended changes in care delivery
- Improvement ideas
- Linkage with collaborative improvement programs
Other Elements of Practice Support (Context)
The IPIP intervention is multifaceted; other factors could affect improvement as much as or more than coaching style or content:
- Collaborative improvement efforts
- Practice selection
- External motivators and incentives
- External leadership
- Recommended practice design changes
Objective
To evaluate outcomes of the IPIP improvement effort for three states in their first year.
Comparison

|                      | State A (Prototype Year)                | State B (Prototype Year)                        | State C (Second Year)                                                  |
| Practice Selection   | Practices signed up via media campaign  | Practices were "hand picked" (buddies, cooperative….) | Practices were recruited for PCMH pilot                          |
| Collaboration        | No group work                           | Evening meeting 3x/yr                           | Breakthrough Series                                                    |
| Financial Incentives | None                                    | $2000 to report data                            | Dramatic incentives (e.g., $28-95K/FTE and payer mix for PCMH)         |
| Prepared Registry    | No                                      | No                                              | Yes                                                                    |
| Consulting           | QIC support (all states)                |                                                 |                                                                        |
| Topic                | Focus on diabetes or asthma (all states)|                                                 |                                                                        |
Measures
- Process measures (e.g., % with DM with eye exam)
- Outcome measures (e.g., % with DM with BP < 130/80)
- Implementation, rated on a scale of 0-5:
  - Registries
  - Protocols
  - Templates
  - Self-management support
  - Overall
Example Rating System

| 0 - No activity        | No activity on registry adoption or use. |
| 1 - Selected           | Practice has chosen a registry, but not yet begun using it. |
| 2 - Installed          | Practice has the registry installed on a computer, has set up a template, has entered demographic data on patients of interest (e.g., diabetes), or has a process outlined to systematically enter the data. |
| 3 - Testing workflow   | Practice is testing the process for entering clinical data into the registry; not yet using the registry to help with daily care of patients. |
| 4 - Patient management | All clinical data is entered into the registry; the practice uses the registry daily to plan care for patients and can produce consistent reports on population performance. |
| 5 - Full integration   | Registry is kept up to date with consistent, reliable processes. Practice has checks and monitors registry processes. Practice uses the registry to manage the entire patient panel (population). |
Data Source
- Practices report their own performance
- Establishing a baseline:
  - It takes a number of months to stabilize data quality
  - Baseline is taken once data are stable (biases toward null)
  - No improvement is assumed if a baseline is never achieved (biases toward null)
- States A and B started February 2007
- State C started June 2008
Analysis
- Compare the percent of practices with a specified absolute improvement:
  - >10% improvement in process measures
  - >5% improvement in outcome measures
- Calculate the average change in performance per month:
  - Allows us to take into account the different amount of time per practice
  - Based on the difference between the first stable month and the final month
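The two calculations above can be sketched in a few lines of code. This is a minimal illustration of the stated method (final minus first stable value, divided by months observed, plus the share of practices clearing an absolute-improvement threshold); the practice records below are hypothetical, not IPIP data.

```python
def improvement_per_month(first_stable, final, months):
    """Average absolute change in a measure per month of observation."""
    if months <= 0:
        raise ValueError("need at least one month between baseline and final")
    return (final - first_stable) / months

def share_improved(practices, threshold):
    """Fraction of practices whose absolute improvement exceeds `threshold`."""
    improved = sum(1 for p in practices if (p["final"] - p["baseline"]) > threshold)
    return improved / len(practices)

# Hypothetical illustration: three practices' % values for one process measure.
practices = [
    {"baseline": 44.3, "final": 58.0},   # +13.7 points
    {"baseline": 61.1, "final": 63.0},   # +1.9 points
    {"baseline": 36.4, "final": 49.0},   # +12.6 points
]

# Process measures use a >10-point absolute improvement threshold.
print(share_improved(practices, 10.0))
# Per-month change for the first practice over an assumed 12 stable months.
print(improvement_per_month(44.3, 58.0, 12))
```

Because each practice is normalized by its own months of stable data, practices entering the analysis at different times remain comparable.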
Results: Description of Practices

|                            | State A (N=16) | State B (N=12) | State C (N=24) |
| EHR                        | 12%            | 58%            | 63%            |
| Improvement Experience     | 24%            | 25%            | 38%            |
| Median Number of Providers | 4 (range: 1-11)| 8 (range: 1-61)| 5 (range: 2-65)|
Time to Data Stability
[Chart: months to stable data for State A (16 practices), State B (12 practices), and State C (24 practices); maximum months of analysis: 16, 16, and 12]
Baseline Performance

| Measure               | State A (N=12) | State B (N=10) | State C (N=22) |
| % Attn to Nephropathy | 44.3           | 61.1           | 60.3           |
| % Foot Exam           | 36.4           | 55.4           | 46.1           |
| % LDL Test            | 66.1           | 74.3           | 68.0           |
| % Flu Vacc            | 23.1           | 43.4           | 44.9           |
| % Eye Exam            | 20.2           | 32.3           | 25.3           |
| % A1C < 9             | 72.5           | 88.9           | 70.6           |
| % BP < 130            | 38.1           | 40.2           | 41.5           |
| % BP < 140            | 64.2           | 60.2           | 65.0           |
| % LDL < 100           | 46.9           | 39.8           | 39.0           |
| % LDL < 130           | 59.8           | 61.3           | 55.7           |
Percent of Practices with >10% Improvement
[Chart; * denotes statistically significant difference. Preliminary.]
Percent of Practices with >5% Improvement
[Chart; * denotes statistically significant difference. Preliminary.]
Mean Percent Improvement Per Month
[Chart; * denotes statistically significant difference. Preliminary.]
Mean Percent Improvement Per Month
[Chart; * denotes statistically significant difference. Preliminary.]
A Look Under the Hood
[Charts: Self-Management Support; Registry Implementation and Use]
Limitations
- Uses data collected and reported by the practices themselves
- Coaches often spent a lot of time on data reporting
- Time to stable data led to an underestimate of improvement
- Statistical tests do not take advantage of repeated-measures analysis (sorting out those models now)
Interpretation
- The magnitude of improvement in process measures is similar to or greater than the improvement seen in Health Disparities Collaborative evaluations
- State C had more consistent improvement across measures, but the differences are not staggering at this point
- Design of the practice support may affect results:
  - Collaborative learning
  - Clear expectations
  - Payment
Where Does This Lead?
- IPIP is creating a system for improving improvement
- Variation provides an opportunity for significant learning about the systems required to drive improvement
- Move toward more controlled variation
- Now close to 250 practices nationwide:
  - Growth of the program will offer more statistical power
  - With stable ongoing reporting, the data analysis will become easier and more robust
- Any single intervention will have a modest effect; we need to combine elements of practice support
Acknowledgements
- American Board of Medical Specialties
- American Board of Pediatrics
- American Board of Family Medicine
- American Academy of Pediatrics
- American Academy of Family Physicians
- States of Colorado, Michigan, Minnesota, North Carolina, Pennsylvania, Washington, and Wisconsin

Funded by the Robert Wood Johnson Foundation
Comparison to Other Results
Self-Management Support Rating Scale

| 0 - No activity        | No activity on self-management support. |
| 1 - Materials on hand  | Practice has obtained patient education materials and handouts to support self-management. |
| 2 - Roles assigned     | Practice has completed a plan for providing self-management support that includes all of the elements indicated in the change package. Staff roles and responsibilities are clearly delineated. |
| 3 - Testing workflow   | Practice is actively testing its process for self-management support. All staff involved in self-management support have undergone appropriate training. Patient goal setting and systematic follow-up are being implemented in at least part of the practice. |
| 4 - Implementation 70% | Self-management support is consistently offered. Practice documents self-management goals for patients in the chart or registry, performed across the entire practice. Monitoring of reliability is occurring. |
| 5 - Implementation 90% | Patients consistently have self-management goals documented, the follow-up system is reliable, and staff are comfortable providing self-management support. Ongoing monitoring ensures the process is carried out consistently for all patients. |
Simplified Change Package
- Registry to identify patients prior to visit
- Templates for planned care (e.g., visit planner)
- Protocols to standardize care:
  - Standard protocols
  - Nursing standing orders
- Defined care team roles
- Self-management support strategies
IPIP National Key Driver Diagram

Goals (by January 1, 2010):
- 350 new practices participating
- 90,000 new patients in denominators
- Increase in clinical process measures
- Improvement in clinical outcome measures

Key drivers, with their interventions:

Accountable leadership focused on health outcomes:
1. Communicate high expectations at all levels
2. Use multiple communication methods
3. Use a structured participatory process for setting population-based goals and targets
4. Enumerate and describe the entire population of practices
5. Plan for sustainable leadership
6. Develop leaders' improvement skills

Partnerships that promote health care quality:
1. Partners assume responsibility for outcomes
2. Link to hospitals, public health organizations, quality organizations, and others for resources, expertise, and data
3. Access to administrative data (e.g., hospitalizations)

Attractive motivators and incentives:
1. Maintenance of Certification
2. CME
3. Engage payers in design of rewards (e.g., Pay for Performance)
4. NCQA recognition

Active participation in an organized quality improvement effort:
1. Create enduring collaborative improvement networks:
   - Promote practice teams that improve rapidly ("super improvers")
   - Share best practices in clinical and process improvements
   - Promote peer-to-peer communication
   - Ongoing cross-organizational and state learning
2. Provide tools and information that promote evidence-based best practices
3. Share knowledge and improve QI support

Measure performance and share data:
1. Routine performance measurement
2. Transparency of comparative data
3. Standardized measures and definitions
4. Promote and support the effective use of registries
Goals for IPIP

| Performance Measure | Goal |
| DMPctA1CAbove9      | 5    |
| DMPctBPBelow130     | 70   |
| DMPctBPBelow140     | 90   |
| DMPctEyeExam        | 80   |
| DMPctFluVacc        | 80   |
| DMPctFootExam       | 90   |
| DMPctLDLUnder100    | 70   |
| DMPctLDLUnder130    | 90   |
| DMPctMicroalb       | 90   |
| DMPctSmokCess       | 90   |
| DMPctWithLDL        | 90   |
IPIP Data Flow
Total Number of Diabetes Patients, July 2009