1
Results-Based Financing & Quality of Care
Measuring and Paying for Quality Improvement Dinesh Nair / World Bank
2
Session Outline: Measuring and Paying for Quality
1. Existing Instruments and Methods
2. Using Data for Decision Making
3. Verifying Data Accuracy
4. Innovations in Measuring and Paying for Quality
3
1. Existing Instruments and Methods
4
Measuring if the right inputs are in place
5
Measuring processes and results: (1) Continuous monitoring
[Diagram: Liberia Quality Assessment/Monitoring Tools. (a) Management and Structural Checklist, (b) Process of Care Quality Checklists, (c) Quantity Checklist, and (d) Impact Evaluation feed the PBF Bonus Calculation Tool, the Business/Operation Plan, and the Health Worker Bonus Allocation; weights of roughly 60%/20%/20% and a min 50%/max 50% split appear in the diagram.]
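The weights and thresholds above come from a diagram, so the sketch below is only an illustration of how a PBF bonus calculation tool might combine the instruments; the 60/20/20 weighting, the 50% bonus cap, and all function names are assumptions, not the actual tool.

```python
# Illustrative only: how checklist scores might feed a PBF bonus calculation.
# The weights and the 50% cap below are assumptions based on the figures in
# the diagram above, not the actual Liberia tool.

def composite_quality_score(structural, process, quantity_accuracy,
                            weights=(0.60, 0.20, 0.20)):
    """Weighted average of the three checklist scores (each 0..1)."""
    w_struct, w_process, w_quant = weights
    return w_struct * structural + w_process * process + w_quant * quantity_accuracy

def quality_adjusted_bonus(quantity_earnings, quality_score, max_bonus_share=0.50):
    """Quality bonus, capped at max_bonus_share of quantity earnings."""
    return quantity_earnings * max_bonus_share * quality_score

score = composite_quality_score(structural=0.80, process=0.55, quantity_accuracy=0.90)
print(round(score, 2))                                  # 0.77
print(round(quality_adjusted_bonus(10_000, score), 0))  # 3850.0
```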
6
Liberia: Standards for Management of Obstructed Labor, Illustrative Checklist (Distilling Essential Care Items: admission, labor)
Chart review elements (see the chart review guide for specific criteria); each element, if recorded, = 1 point. Scored across five patient charts (Charts 1-5).
1. Admission
   1. Cervical dilation recorded at admission (# of cm)
   2. Contraction frequency and duration charted at admission
   3. Fetal presentation charted at admission
   4. Partograph started when cervical dilation is 4 cm or greater
   Admission Score (x/4)
2. Labor Monitoring (partograph)
   1. Cervical dilation recorded at least every 4 hours
   2. Frequency and duration of contractions recorded at least every 30 minutes
   3. Fetal heart rate recorded at least every 30 minutes
   Labor Monitoring Score (x/3)
Five patient charts are reviewed; the average score (% adherence to best practices) links to the bonus. Each item has a chart review guide that defines its criteria.
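As a concrete illustration of the scoring rule described above (one point per recorded element, sub-scores out of 4 and 3, averaged over five charts), here is a minimal sketch; the field names and example data are hypothetical, not the Liberia tool.

```python
# Minimal sketch of the chart review scoring described above: each recorded
# element earns 1 point, and the average across five charts gives % adherence.
# Field names and example data are illustrative, not the actual Liberia tool.

ADMISSION_ITEMS = ["cervical_dilation", "contractions_charted",
                   "fetal_presentation", "partograph_started"]            # x/4
LABOR_ITEMS = ["dilation_q4h", "contractions_q30min", "fetal_hr_q30min"]  # x/3

def chart_score(chart):
    """Fraction of required elements recorded in one patient chart."""
    items = ADMISSION_ITEMS + LABOR_ITEMS
    return sum(chart.get(item, False) for item in items) / len(items)

def facility_adherence(charts):
    """Average score across the reviewed charts, as % adherence to best practices."""
    return 100 * sum(chart_score(c) for c in charts) / len(charts)

# Example: five charts, each a dict of element -> recorded (True/False);
# two charts are missing all labor monitoring elements.
charts = [dict.fromkeys(ADMISSION_ITEMS + LABOR_ITEMS, True)] * 3 + \
         [dict.fromkeys(ADMISSION_ITEMS, True)] * 2
print(round(facility_adherence(charts), 1))   # 82.9
```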
7
Kyrgyzstan: multiple approaches to measuring quality
Record reviews
Simulations of routine labor and delivery, postpartum hemorrhage, and eclampsia using Mama Natalie
Simulation of newborn resuscitation using Neo Natalie
Simulation of surgical safety checklist use
Patient interviews by phone, including basic quality tracers (access to sanitation facilities; recall of health education messages; informal payments; and general satisfaction on a Likert scale)
Peer-to-peer verification, with peers learning together
8
2. Using Data for Decision Making
9
Nigeria: Institutional deliveries increased from 20% to 44% during 2015 (a 120% relative increase)
After the statewide scale-up in three states in January 2015 and 12 months of PBF, there are clear differences in the increase. Ondo State, for instance, is flatlining; there are multiple reasons for this, related to health facility autonomy (facilities had very limited autonomy and had to ask the district for permission on their expenses, and earnings from PBF were cut to 25% instead of 50%). Adamawa and Nasarawa States, however, show very large improvements.
10
Large variability in institutional deliveries across health centers, Fufore District, Adamawa State, Nigeria, during 2013
Data include a mix of quantity data (service productivity across service packages) and quality data. This is an example of one specific service, institutional deliveries, which took off after the introduction of PBF in a pilot district in Adamawa State in December. There is large variability in the increase. The Mayo Ine HC case is well known: the health center went from 10% coverage for institutional deliveries to 100% coverage within six months.
11
Burundi: Average total quality score for health centers, by province and time
12
Quthing District: average quality in health centers is unchanged after 12 months of piloting PBF, due to autonomy problems
This is an iconic illustration of the impact of no autonomy on implementation. The same as the previous slides, but averaging quality scores across health centers: the before and after radar graphs form a perfect match.
13
3. Verifying Data Accuracy
14
Nigeria: Quality of Care at PHCs: Raising the Bar
Another area where data are used is in adjusting the bar by changing the content of the quality measures. Here is an example from Nigeria: at the end of 2013, the quality measures were adjusted to weight process measures of quality much more heavily, and there was a subsequent drop in quality results. Quality of care improved significantly with emphasis on structural and process-of-care indicators (the higher emphasis on process at the end of 2013 explains the drop). Overall patient perceptions of quality of care are relatively satisfactory. Counter-verification of quality shows relatively large discrepancies.
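To make the reweighting effect concrete: if process-of-care scores lag structural scores, shifting weight toward process mechanically lowers the composite, which is consistent with the drop described above. The numbers and weights below are purely illustrative assumptions, not the Nigeria checklist.

```python
# Purely illustrative: shifting weight from structural to process measures
# lowers the composite score when process scores lag, consistent with the
# post-2013 drop described above. All numbers are assumptions.

structural, process = 0.85, 0.55          # hypothetical facility sub-scores

old = 0.70 * structural + 0.30 * process  # structural-heavy weighting
new = 0.30 * structural + 0.70 * process  # process-heavy weighting after the change

print(round(old, 2), round(new, 2))       # 0.76 0.64
```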
15
Concordance in 2015 and 2016: the CCSS (quantity counter-verification) clearly seems to work. There is an attempt to introduce mobile phone technology (Android smartphones) with automatic uploads and dashboards, to increase the frequency of verification and to enhance analysis (in the works).
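Below is a minimal sketch of what a quantity concordance check could compute: the ratio of counter-verified to declared service volumes, flagging large discrepancies. The 10% threshold, field names, and figures are assumptions, not the CCSS methodology.

```python
# Minimal sketch of a quantity concordance check: compare declared service
# volumes with counter-verified volumes and flag large discrepancies.
# The 10% threshold, field names, and figures are assumptions.

def concordance(declared, verified):
    """Verified volume as a share of declared volume (1.0 = perfect concordance)."""
    return verified / declared if declared else 0.0

reports = {"institutional_deliveries": (120, 114),   # (declared, verified)
           "antenatal_visits": (300, 240)}

for service, (declared, verified) in reports.items():
    ratio = concordance(declared, verified)
    flag = " <-- discrepancy" if abs(1 - ratio) > 0.10 else ""
    print(f"{service}: {ratio:.0%}{flag}")
# institutional_deliveries: 95%
# antenatal_visits: 80% <-- discrepancy
```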
16
Challenges to Measuring and Rewarding Quality Performance
Ex-ante verification by the district health team may be too gentle and not accurate: too close for comfort, or still the old-fashioned 'filling in under the banana tree'?
Regular counter-verification with credible sanctions is an important requirement.
Specifying incentives for district supervisors also seems a promising route (share of earnings; accreditation status; carrots and sticks).
Introduction of modern ICT, such as tablet-based checklists that embed metadata (location; time; interviewer passcode), also seems a promising approach (a sketch of such a submission record follows below).
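The last point mentions tablet checklists that embed metadata; here is a minimal sketch of what such a submission record could look like. All field names are hypothetical, and a real system would add device identifiers, signatures, and server-side checks.

```python
# Minimal sketch of a tablet checklist submission embedding the metadata
# mentioned above (location, time, interviewer passcode). Field names are
# hypothetical; a real system would add device IDs, signatures, etc.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ChecklistSubmission:
    facility_id: str
    interviewer_passcode: str
    gps: tuple            # (latitude, longitude) captured by the tablet
    scores: dict          # checklist item -> score
    timestamp: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

submission = ChecklistSubmission(
    facility_id="HC-042",
    interviewer_passcode="sup-7731",
    gps=(9.3265, 12.3984),
    scores={"admission": 0.75, "labor_monitoring": 0.67},
)
print(submission.timestamp, submission.gps)
```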
17
4. Innovations in Measuring and Paying for Quality
18
Vignettes Provide a Standard Measure of Practice
A virtual patient presents with symptoms. The provider cares for a variety of clinical cases and goes through the same clinical domains as when seeing a real patient:
Take history
Conduct a physical exam
Order tests
Make a provisional diagnosis
Decide on treatment
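To illustrate how a vignette might be structured and scored across the five domains listed above, here is a hedged sketch; the case content, expected actions, and scoring rule are all hypothetical, not a validated instrument.

```python
# Hypothetical sketch of a clinical vignette: the provider's responses are
# scored against expected actions in each of the five domains listed above.
# Case content and scoring rule are illustrative only.

VIGNETTE = {
    "presenting_symptoms": "fever and cough for 3 days",
    "expected": {
        "history": {"duration", "breathing_difficulty"},
        "physical_exam": {"respiratory_rate", "temperature"},
        "tests": {"malaria_rdt"},
        "diagnosis": {"pneumonia"},
        "treatment": {"oral_amoxicillin"},
    },
}

def score_vignette(responses, vignette=VIGNETTE):
    """Share of expected actions the provider mentioned, averaged over domains."""
    expected = vignette["expected"]
    per_domain = [len(expected[d] & responses.get(d, set())) / len(expected[d])
                  for d in expected]
    return sum(per_domain) / len(per_domain)

responses = {"history": {"duration"},
             "physical_exam": {"respiratory_rate", "temperature"},
             "tests": set(),
             "diagnosis": {"pneumonia"},
             "treatment": {"oral_amoxicillin"}}
print(round(score_vignette(responses), 2))   # 0.7
```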
19
Technology Aids for Quality Measurements in PBF
Tablets for quantified quality checklists ('balanced score cards'), with automated uploads to a cloud-based database and public dashboard; offline data entry is possible
Tablet-based solution for vignettes (under development)
Smartphones for community client interviews, with offline data entry and automated uploads to a cloud-based database and public dashboard; results feed into performance payments
Web-based public dashboard for performance benchmarking
21
In Summary: Quality is poor and variable
Much improvement in access and in structural elements of care
Improving clinical processes remains the big immediate challenge
Innovations are happening in the space of measuring clinical processes
Data from measurement need to translate into decisions
22
THANK YOU