1
Healthcare Performance Measurement
Quality & Accreditation Directorate (Q&AD)

Quality is everybody's business. One of Q&AD's big goals is to improve the quality of healthcare services (do no harm to patients).

Q&AD's role is to provide technical support, through training and guidance, on how to achieve quality healthcare services and maintain continuous quality improvement, e.g.:
- Strategic and action plans
- Policies and procedures
- Problem solving
- How to measure performance (indicators, auditing)
- Accreditation (which is a tool for measuring quality)
- Other skills that help in implementing and maintaining high-quality healthcare services

One of the tools used is indicators (performance measures). To develop an indicator that is scientifically sound, certain steps should be followed:
- Training
- Formulate a team
- Review the literature
- Adopt or develop an indicator
- Make the necessary modifications: what should be included in or excluded from the numerator and denominator, definitions of terms, data elements, where data should be extracted from, and how frequently → the information set
- Develop a data collection form if needed
- Pilot, modify, and apply

Because this is a new concept, Q&AD conducted the training, shared in developing the information sets and data collection forms, collects the data, analyses it, and produces reports for the hospital to:
- Review
- Investigate why, and identify weaknesses in the process or system
- Recommend and implement corrective actions
- Follow up on the next report to monitor whether the corrective actions were effective or not

What is an indicator? It is a number reflecting a rate (numerator/denominator) that prompts you to ask yourself:
- How far am I from the standard figure?
- Am I satisfied with the number?
- Can I improve it by improving our performance?
- How can we improve our performance?

Feedback to Q&AD, and its importance. An everyday example of an indicator report: the bill from the MoE shows a figure for your electricity consumption in relation to cost. What would your reaction be if it were high? Investigate: is it due to a reading error, or is it real? And if it is real? You would take serious action to cut unnecessary consumption, which in turn reduces cost. The next MoE bills are your way of finding out whether those actions were effective, and whether the improvement was maintained or not. This example is analogous to Q&AD's reporting of indicators: the directorate gathers the data and provides you with the figures, for you to act upon, since you know your own practices and hospital best.
2
Outline: Introduction · Definitions · Understanding Quality Indicators · Exercise
3
Quality

- Accessibility: providing available, timely and equitable services.
- Client-centered: putting clients and families first.
- Safety: keeping people safe and preventing harm to users, providers and the environment.
- Work-life: supporting wellness in the work environment and promoting work-life balance.
- Continuity of service: experiencing coordinated and seamless services over time.
- Competence: the knowledge, skills and actions of the individuals providing a service are appropriate to the services provided.
- Population focus: working with communities to anticipate and meet needs.
- Effectiveness: doing the right thing to achieve the best possible results.
- Efficiency: making the best use of resources.
4
Measurement

Measurement is the assignment of numbers to objects or events. Managing quality requires measuring it. By monitoring specific objects or events, we are able to measure defined variables. These variables may either directly measure or indirectly reflect the quality of care provided; hence, they are referred to as quality indicators. Indicators are customised according to the local context and circumstances.

Speaker note: explain the concept of the indicator threshold (e.g. a thermometer or a speedometer).
5
Indicators

Indicators are measurable, defined variables that refer to the structures, processes, or outcomes of care. They indicate good quality of care or potential problems, and they are developed using review criteria and standards. Indicators are customised according to the local context and circumstances.
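The idea of an indicator as a rate (N/D) compared against a standard figure or threshold can be sketched in a few lines. The function name, figures, and threshold below are illustrative assumptions, not part of any Q&AD specification.

```python
# Hypothetical sketch: compute an indicator rate and compare it to a
# target threshold (the "thermometer" idea). All figures are invented.

def indicator_rate(numerator: int, denominator: int) -> float:
    """Return the indicator as a percentage (N/D * 100)."""
    if denominator == 0:
        raise ValueError("Denominator must be non-zero")
    return 100.0 * numerator / denominator

# Example: 18 of 240 eligible cases had the event of interest.
rate = indicator_rate(18, 240)   # 7.5 percent
THRESHOLD = 5.0                  # assumed standard figure

if rate > THRESHOLD:
    action = "investigate and plan corrective action"
else:
    action = "continue routine monitoring"
```

The comparison mirrors the questions on the slide: how far is the figure from the standard, and does it call for improvement?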
6
Dimensions of Care

- Structure: human resources, buildings, equipment, supplies, finance.
- Process: assessment, admission & discharge, procedures, medication, investigation.
- Outcome: test results, mortality, morbidity, complications, satisfaction.
7
Rate-based or Sentinel
Rate-based: uses data about events that are expected to occur with some frequency.
Sentinel: identifies individual events or phenomena that are serious, undesirable, and always trigger further analysis and investigation.
8
Generic or Disease-Specific
Generic: can be applied to most patients.
Disease-specific: specific to a particular clinical condition or disease.
9
Indicators Development
Steps: choose area to evaluate.
Criteria for choosing an area: high volume; high risk; high cost; problem prone; variability in practice; availability of guidelines/indicators; actionable. An area may be framed by diagnosis, specialty, or procedure.
10
Criteria for area selection
Blank area-selection grid:
Candidate area | High volume | Practice gap | Guidelines | Actionable | Total score | Rank
11
Criteria for area selection
Candidate area | High volume | Practice gap | Guidelines | Actionable | Total score | Rank
1. Lab test orders duplicated | | | | | |
2. Identification errors | | | | | |
3. Specimen identification, preparation, and transport | | | | | |
4. Analytical errors | | | | | |
5. Reporting delays | | | | | |
12
Criteria for area selection
Candidate area | High volume | Practice gap | Guidelines | Actionable | Total score | Rank
1. Lab test orders duplicated | 3 | | | | 12 |
2. Identification errors | 2 | | | | 11 |
3. Specimen identification, preparation, and transport | 1 | | | | 8 |
4. Analytical errors | | | | | 10 |
5. Reporting delays | | | | | 7 |
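The sum-and-rank logic of the area-selection matrix can be automated with a short script. The per-criterion scores below are invented so that the row totals match the slide's figures (12, 11, 8, 10, 7); only the scoring mechanics are the point.

```python
# Illustrative sketch of the area-selection matrix: each candidate area
# is scored on the four criteria and ranked by total score. Individual
# criterion scores are assumed for demonstration.

areas = {
    "Lab test orders duplicated": {"high_volume": 3, "practice_gap": 3, "guidelines": 3, "actionable": 3},
    "Identification errors":      {"high_volume": 2, "practice_gap": 3, "guidelines": 3, "actionable": 3},
    "Specimen handling":          {"high_volume": 1, "practice_gap": 2, "guidelines": 2, "actionable": 3},
    "Analytical errors":          {"high_volume": 2, "practice_gap": 3, "guidelines": 2, "actionable": 3},
    "Reporting delays":           {"high_volume": 2, "practice_gap": 2, "guidelines": 1, "actionable": 2},
}

# Total score per area.
totals = {name: sum(scores.values()) for name, scores in areas.items()}

# Rank 1 = highest total score.
ranking = sorted(totals, key=totals.get, reverse=True)
for rank, name in enumerate(ranking, start=1):
    print(f"{rank}. {name} (total {totals[name]})")
```

With these assumed scores, duplicated lab test orders rank first, consistent with the worked slides.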
13
Criteria for area selection
Candidate area | High volume | Practice gap | Guidelines | Actionable | Total score | Rank
1. Lab test orders duplicated | 3 | | | | 12 | 1
2. Identification errors | 2 | | | | 11 |
3. Specimen identification, preparation, and transport | | | | | 8 | 4
4. Analytical errors | | | | | 10 |
5. Reporting delays | | | | | 7 | 5
15
Indicators Development
Steps: choose area to evaluate → organize indicator team.
Organizing the team: relevant field specialists or groups; readiness and commitment; a reasonable size (8-12 members); planned meetings; time management.
16
Indicators Development
Steps: choose area to evaluate → organize indicator team → literature & evidence review.
Sources of scientific literature: PubMed, BMJ, Google Scholar. The aim is consensus about existing evidence and practice.
18
Indicators Development
Steps: choose area to evaluate → organize indicator team → literature & evidence review → select indicators.
See qualitymeasures.ahrq.gov/
19
Indicators Development
Steps: choose area to evaluate → organize indicator team → literature & evidence review → select indicators.
Selection criteria (see qualitymeasures.ahrq.gov/):
- Feasibility: an easily available data collection process.
- Relevance: relevant to important aspects of quality.
- Reliability: available and reliable data sources.
- Validity: strong correlation with the actual quality of care.
- Robustness: a statistically adequate number of events to be monitored.
20
Criteria for Indicator selection
Blank indicator-selection grid:
Candidate indicator | Feasible | Reliable | Valid | Robust | Relevant | Total score | Rank
1. | | | | | | |
2. | | | | | | |
3. | | | | | | |
4. | | | | | | |
5. | | | | | | |
6. | | | | | | |
7. | | | | | | |
8. | | | | | | |
9. | | | | | | |
10. | | | | | | |
21
Criteria for Indicator selection
Candidate indicator | Feasible | Reliable | Valid | Robust | Relevant | Total score | Rank
1. Indicator 1 | 2 | 1 | 3 | | | 12 |
2. Indicator 2 | | | | | | 10 |
3. Indicator 3 | | | | | | 11 |
4. Indicator 4 | | | | | | |
5. Indicator 5 | | | | | | |
6. Indicator 6 | | | | | | 7 | 4
7. | | | | | | |
8. | | | | | | |
9. | | | | | | |
10. | | | | | | |
22
Indicators Development
Steps: choose area to evaluate → organize indicator team → literature & evidence review → select indicators → describe design specifications.

Information set:
- Define indicators and standards
- Identify the target population
- Determine inclusion and exclusion criteria
- Devise a risk-adjustment strategy
- Identify data sources
- Describe data collection procedures

The target population is the patient group whose care the clinical indicator is designed to assess. Specific inclusion and exclusion criteria have to be defined. It should also be decided whether selection is based on confirmed diagnoses, or on symptoms or signs, and whether prevalent or incident cases (or both) are included. It might be relevant to set upper or lower age limits. Finally, the time period for measurement should be decided.

Data collection form.
23
Numerator & Denominator
Total population → remove the denominator-excluded population → denominator → remove the numerator-excluded population → the remaining cases that meet the measure form the numerator.
24
Measure information set template:
- Measure ID
- Measure set
- Measure name
- Measure description
- Numerator statement
- Numerator excluded population
- Numerator data elements
- Denominator statement
- Denominator excluded population
- Denominator data elements
- Rationale
- Type of measure
- Data reported as
- Improvement as
- Sampling
- References
- Risk adjustment
- Suggested analysis
- Definitions
- Frequency of analysis
- Age group
- Data sources
- Data collection form
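One way to keep such a specification machine-readable is a simple data structure. The sketch below models a subset of the template's fields as a Python dataclass; the field selection, example values, and the "monthly" default are illustrative assumptions, not part of the original template.

```python
# Minimal sketch of the measure information set as a data structure.
# Field names follow the template above; values are placeholders.
from dataclasses import dataclass, field

@dataclass
class MeasureSpec:
    measure_id: str
    measure_set: str
    name: str
    description: str
    numerator_statement: str
    numerator_exclusions: list[str] = field(default_factory=list)
    denominator_statement: str = ""
    denominator_exclusions: list[str] = field(default_factory=list)
    data_sources: list[str] = field(default_factory=list)
    frequency_of_analysis: str = "monthly"   # assumed default

spec = MeasureSpec(
    measure_id="LAB-01",                     # hypothetical ID
    measure_set="Laboratory",
    name="Timely follow-up of abnormal test results",
    description="Proportion of abnormal test results with documented follow-up",
    numerator_statement="Abnormal results with documented follow-up",
    denominator_statement="All abnormal test results",
    denominator_exclusions=["Normal results"],
    data_sources=["Lab information system", "Patient records"],
)
```

A structured record like this makes the specification easy to validate, version, and share across indicator teams.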
25
Timely follow-up of abnormal test results
Example mapping for this indicator: total population = all tests; denominator = abnormal test results; denominator-excluded population = normal results; numerator = records of documented follow-up of abnormal test results.
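The narrowing from all tests down to the numerator can be sketched in code. The records below are invented for illustration; only the filtering logic mirrors the mapping above.

```python
# Hedged sketch of the numerator/denominator logic: all tests -> drop
# normal results (denominator exclusion) -> abnormal results form the
# denominator; those with documented follow-up form the numerator.

tests = [
    {"id": 1, "abnormal": True,  "follow_up_documented": True},
    {"id": 2, "abnormal": True,  "follow_up_documented": False},
    {"id": 3, "abnormal": False, "follow_up_documented": False},  # excluded
    {"id": 4, "abnormal": True,  "follow_up_documented": True},
]

denominator = [t for t in tests if t["abnormal"]]
numerator = [t for t in denominator if t["follow_up_documented"]]

rate = 100.0 * len(numerator) / len(denominator)
print(f"Timely follow-up rate: {rate:.1f}%")
```

Here two of the three abnormal results have documented follow-up, so the indicator is roughly 66.7%.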
28
Indicators Development
Steps: choose area to evaluate → organize indicator team → literature & evidence review → select indicators → describe design specifications → pilot the indicator.

Pilot the indicator: test reliability and validity.

Reliability expresses the extent to which repeated measurements of a stable phenomenon, by different providers and instruments at different times and places, obtain similar results. Reliability is important for comparing groups, or the same group over time. It can be tested as inter-rater reliability, where different people or methods provide data on the same indicator, or as internal consistency, where two indicators expected to measure the same aspect of quality of care are compared. Measuring inter-rater reliability, internal consistency, and test-retest reliability allows users to determine whether the data collection methods are precise enough to provide reproducible results; these methods are a powerful check on data quality and identify whether the measure and data collection procedures are well specified.

Validity is the degree to which an indicator measures what it is intended to measure, that is, whether the results of a measurement correspond to the true state of the phenomenon being measured. Validity can be tested by confirming that the scores of a measure are linked to specific outcomes, and that the measure can distinguish good from bad quality.
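As a concrete illustration of the inter-rater reliability check described above, the sketch below compares two hypothetical abstractors' judgements on the same ten cases, computing raw agreement and Cohen's kappa by hand. All ratings are invented for demonstration.

```python
# Illustrative pilot-phase reliability check: two raters independently
# judge the same cases (1 = criterion met, 0 = not met).

rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater_b = [1, 1, 0, 1, 1, 1, 0, 0, 1, 1]

n = len(rater_a)

# Raw (observed) agreement: fraction of cases where the raters agree.
observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

# Chance agreement from each rater's marginal proportions.
p_a1 = sum(rater_a) / n
p_b1 = sum(rater_b) / n
expected = p_a1 * p_b1 + (1 - p_a1) * (1 - p_b1)

# Cohen's kappa corrects observed agreement for chance.
kappa = (observed - expected) / (1 - expected)
```

With these invented ratings, the raters agree on 8 of 10 cases, and kappa is around 0.52, i.e. moderate agreement beyond chance.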
29
Indicators Development
Choose area to evaluate Organize indicator team Literature & evidence review Select indicators Describe design specifications Pilot the indicator Utilization of indicators Identify target audience Results feedback methods Poor performance actions
30
Indicators Development
Choose area to evaluate Organize indicator team Literature & evidence review Select indicators Describe design specifications Pilot the indicator Utilization of indicators Update, review and revision Regular & Periodic
31
Indicators Development
Choose area to evaluate Organize indicator team Literature & evidence review Select indicators Describe design specifications Pilot the indicator Utilization of indicators Update, review and revision
32
Thank You Questions? Dr Hanaa Al-Ghanim