1
Performance Improvement Projects Technical Assistance
Nursing Home Diversion Programs
Thursday, March 29, :30 a.m. – 10:30 a.m.
Cheryl L. Neel, RN, MPH, CPHQ, Manager, Performance Improvement Projects
David Mabb, MS, Sr. Director, Statistical Evaluation
2
Presentation Outline
- PIP Overall Comments
- Aggregate MCO PIP Findings
- Aggregate NHDP-Specific Findings
- Technical Assistance with Group Activities: Study Design, Study Implementation, Quality Outcomes Achieved
- Questions and Answers
3
Key PIP Strategies
- Conduct outcome-oriented projects
- Achieve demonstrable improvement
- Sustain improvement
- Correct systemic problems
This is consistent with CQI and the move to improve health outcomes.
4
Validity and Reliability of PIP Results
Activity 3 of the CMS Validating Protocol evaluates the overall validity and reliability of PIP results:
- Met = Confidence/High confidence in reported PIP results
- Partially Met = Low confidence in reported PIP results
- Not Met = Reported PIP results not credible
5
Summary of PIP Validation Scores
6
Proportion of PIPs Meeting the Requirements for Each Activity
7
Aggregate Valid Percent Met
[Chart: percent of evaluation elements Met, by Activity II–X]
8
NHDP Specific Findings
- 20 PIPs submitted
- Scores ranged from 17% to 75%
- Average score was 40%
- Assessed evaluation elements were scored as Met 40% of the time
9
Summary of NHDP Validation Score
10
Study Design
Four components:
- Activity I. Selecting an Appropriate Study Topic
- Activity II. Presenting Clearly Defined, Answerable Study Question(s)
- Activity III. Documenting Clearly Defined Study Indicator(s)
- Activity IV. Stating a Correctly Identified Study Population
11
Activity I. Selecting an Appropriate Study Topic - NHDP Overall Score
12
Activity I. Selecting an Appropriate Study Topic
Results:
- 71 percent of the six evaluation elements were Met
- 29 percent were Partially Met or Not Met
- None of the evaluation elements were Not Applicable or Not Assessed
13
Activity I: Review the Selected Study Topic
HSAG Evaluation Elements:
- Reflects high-volume or high-risk conditions (or was selected by the State).
- Is selected following collection and analysis of data (or was selected by the State).
- Addresses a broad spectrum of care and services (or was selected by the State).
- Includes all eligible populations that meet the study criteria.
- Does not exclude members with special health care needs.
- Has the potential to affect member health, functional status, or satisfaction.
Bolded evaluation elements show areas for improvement.
14
Activity II. Presenting Clearly Defined, Answerable Study Question(s) - NHDP Overall Score
15
Activity II. Presenting Clearly Defined, Answerable Study Question(s)
Results:
- 28 percent of the two evaluation elements were Met
- 73 percent were Partially Met or Not Met
- None of the evaluation elements were Not Applicable or Not Assessed
16
Activity II: Review the Study Question(s)
HSAG Evaluation Elements:
- States the problem to be studied in simple terms.
- Is answerable.
Bolded evaluation elements show areas for improvement.
17
Activity III. Documenting Clearly Defined Study Indicator(s) - NHDP Overall Score
18
Activity III. Documenting Clearly Defined Study Indicator(s)
Results:
- 27 percent of the seven evaluation elements were Met
- 54 percent were Partially Met or Not Met
- 19 percent were Not Applicable or Not Assessed
19
Activity III: Review Selected Study Indicator(s)
HSAG Evaluation Elements:
- Is well defined, objective, and measurable.
- Is based on practice guidelines, with sources identified.
- Allows for the study question to be answered.
- Measures changes (outcomes) in health or functional status, member satisfaction, or valid process alternatives.
- Has available data that can be collected on each indicator.
- Is a nationally recognized measure, such as HEDIS®, when appropriate.
- Includes the basis on which each indicator was adopted, if internally developed.
Bolded evaluation elements show areas for improvement.
20
Activity IV. Stating a Correctly Identified Study Population - NHDP Overall Score
21
Activity IV. Stating a Correctly Identified Study Population
Results:
- 48 percent of the three evaluation elements were Met
- 52 percent were Partially Met or Not Met
- None of the evaluation elements were Not Applicable or Not Assessed
22
Activity IV: Review the Identified Study Population
HSAG Evaluation Elements:
- Is accurately and completely defined.
- Includes requirements for the length of a member’s enrollment in the managed care plan.
- Captures all members to whom the study question applies.
Bolded evaluation elements show areas for improvement.
23
Group Activity
24
Study Implementation
Three components:
- Activity V. Valid Sampling Techniques
- Activity VI. Accurate/Complete Data Collection
- Activity VII. Appropriate Improvement Strategies
25
Activity V. Presenting a Valid Sampling Technique - NHDP Overall Score
26
Activity V. Presenting a Valid Sampling Technique
Results:
- 5 of the 20 PIP studies used sampling.
- 9 percent of the six evaluation elements were Met.
- 16 percent were Partially Met or Not Met.
- 75 percent of the evaluation elements were Not Applicable or Not Assessed.
27
Activity V: Review Sampling Methods
* This section is only validated if sampling is used.
HSAG Evaluation Elements:
- Consider and specify the true or estimated frequency of occurrence. (N=5)
- Identify the sample size. (N=5)
- Specify the confidence level to be used. (N=5)
- Specify the acceptable margin of error. (N=5)
- Ensure a representative sample of the eligible population. (N=5)
- Ensure that the sampling techniques are in accordance with generally accepted principles of research design and statistical analysis. (N=5)
Bolded evaluation elements show areas for improvement. (A sample-size sketch follows this list.)
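Where sampling is used, the estimated frequency of occurrence, confidence level, and margin of error jointly determine the required sample size. Below is a minimal sketch of that calculation for a simple random sample using the normal approximation; the function name and the example population of 2,500 are illustrative assumptions, not part of the HSAG protocol.

```python
import math

def required_sample_size(population, p=0.5, confidence=0.95, margin=0.05):
    """Sample size for estimating a rate via simple random sampling.

    p          -- true or estimated frequency of occurrence (0.5 is most conservative)
    confidence -- confidence level to be used
    margin     -- acceptable margin of error (as a proportion)
    """
    # Two-sided z critical values for common confidence levels.
    z = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}[confidence]
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population sample size
    n = n0 / (1 + (n0 - 1) / population)        # finite-population correction
    return math.ceil(n)

# Example: 2,500 eligible members, 95% confidence, +/-5% margin of error.
print(required_sample_size(2500))  # -> 334
```

Using p = 0.5 when the true frequency is unknown is the conservative choice, since it maximizes the required sample size.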
28
Populations or Samples?
All? Some?
- Generally, administrative data uses populations.
- The hybrid (chart abstraction) method uses samples identified through administrative data.
29
Activity VI. Specifying Accurate/Complete Data Collection - NHDP Overall Score
30
Activity VI. Specifying Accurate/Complete Data Collection
Results:
- 25 percent of the eleven evaluation elements were Met
- 66 percent were Partially Met or Not Met
- 10 percent of the evaluation elements were Not Applicable or Not Assessed
31
Activity VI: Review Data Collection Procedures
HSAG Evaluation Elements:
- Clearly defined data elements to be collected.
- Clearly identified sources of data.
- A clearly defined and systematic process for collecting data that includes how baseline and remeasurement data will be collected.
- A timeline for the collection of baseline and remeasurement data.
- Qualified staff and personnel to collect manual data.
- A manual data collection tool that ensures consistent and accurate collection of data according to indicator specifications.
Bolded evaluation elements show areas for improvement.
32
Activity VI: Review Data Collection Procedures (cont.)
HSAG Evaluation Elements:
- A manual data collection tool that supports interrater reliability.
- Clear and concise written instructions for completing the manual data collection tool.
- An overview of the study in the written instructions.
- Administrative data collection algorithms that show steps in the production of indicators.
- An estimated degree of automated data completeness (important if using the administrative method).
(An interrater-reliability sketch follows this list.)
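One way to show that a manual data collection tool supports interrater reliability is to have two abstractors code the same sample of charts and report an agreement statistic such as Cohen's kappa. Below is a minimal sketch; the abstraction results are made up, and nothing here is prescribed by the CMS protocol.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of the raters' marginal proportions per category.
    expected = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                   for c in categories)
    return (observed - expected) / (1 - expected)

# Two abstractors independently re-code the same 10 charts (1 = indicator met).
a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
b = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]
print(round(cohens_kappa(a, b), 2))  # -> 0.52
```

Kappa values of roughly 0.8 and above are commonly read as strong agreement; lower values suggest the tool or its instructions need revision.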
33
Where do we look for our sources of data?
In a nutshell, data can be found in:
- Medical records -- generally, the gold standard of “it was written, therefore it was done” has long been an accepted principle by HCFA.
- Administrative claims/encounter data.
- Survey data -- provider, HOS, CAHPS; other MCO program data, like lab data.
- Hybrid -- a combination of medical record and administrative data. Example: administrative data can be used to extract the MCO’s diabetic population and to identify those diabetics who had HbA1c’s drawn; chart abstraction can then review the HbA1c values in this patient population.
- HEDIS®.
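To make the hybrid example above concrete, here is a sketch of the administrative steps in code. The claims and lab tables, column names, and member IDs are hypothetical stand-ins (ICD-9 250.xx for diabetes, CPT 83036 for HbA1c); this is not any plan's actual extraction algorithm.

```python
import pandas as pd

# Hypothetical administrative extracts; columns and member IDs are illustrative.
claims = pd.DataFrame({
    "member_id": [1, 2, 3, 4, 5],
    "dx_code":   ["250.00", "401.9", "250.02", "250.00", "428.0"],
})
labs = pd.DataFrame({
    "member_id": [1, 3, 5],
    "cpt_code":  ["83036", "83036", "80061"],
})

# Step 1: administrative data extracts the MCO's diabetic population.
diabetics = set(claims.loc[claims["dx_code"].str.startswith("250"), "member_id"])

# Step 2: administrative data identifies which diabetics had an HbA1c drawn.
had_a1c = set(labs.loc[labs["cpt_code"] == "83036", "member_id"])

print(f"HbA1c rate: {len(diabetics & had_a1c)}/{len(diabetics)}")  # -> 2/3

# Step 3 (the hybrid part): abstract the charts of the members in
# diabetics & had_a1c to review the actual HbA1c values.
```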
34
Baseline Data Sources
- Medical records
- Administrative claims/encounter data
- Hybrid
- HEDIS
- Survey data
- MCO program data
- Other
QISMC: the organization must clearly specify what data are to be used to identify the population at risk, and that data must reliably and validly capture the entire population without systematically excluding a subset of the population.
- Appropriate clinical sources: administrative data, medical records
- Appropriate non-clinical sources: enrollee or provider surveys
35
Activity VII. Documenting the Appropriate Improvement Strategies - NHDP Overall Score
36
Activity VII. Documenting the Appropriate Improvement Strategies
Results:
- 15 percent of the four evaluation elements were Met
- 18 percent were Partially Met or Not Met
- 68 percent of the evaluation elements were Not Applicable or Not Assessed
37
Activity VII: Assess Improvement Strategies
HSAG Evaluation Elements:
- Related to causes/barriers identified through data analysis and Quality Improvement (QI) processes.
- System changes that are likely to induce permanent change.
- Revised if original interventions are not successful.
- Standardized and monitored if interventions are successful.
Bolded evaluation elements show areas for improvement.
38
Determining Interventions
Once you know how you are doing at baseline, what interventions will produce meaningful improvement in the target population?
- Type
- Target audience
- Evaluating interventions
39
First Do A Barrier Analysis
What did an analysis of baseline results show? How can we relate it to system improvement?
- Opportunities for improvement
- Identify barriers to reaching improvement
- Determine intervention(s)
*This is optional for the M+COs to fill out on the report form.
Look at what you need to improve (more women need to get mammography) and what obstacles are preventing performance (why aren’t all women getting a mammography?). Focus groups can sometimes be helpful here.
Root causes -- example barrier categories: knowledge, resources, organization, missed opportunities. (Or do standard QI: people, processes, etc.) A simple tally sketch follows.
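A barrier analysis often begins with a frequency tally of root causes, reviewed in descending order (a Pareto view). A minimal sketch with made-up barrier codes mirroring the categories above; it is only meant to show the bookkeeping.

```python
from collections import Counter

# Hypothetical barrier codes recorded while reviewing missed mammograms.
barriers = ["knowledge", "missed opportunity", "knowledge", "organization",
            "knowledge", "resources", "missed opportunity", "knowledge"]

tally = Counter(barriers)
total = sum(tally.values())
cumulative = 0.0
for barrier, count in tally.most_common():   # most frequent barriers first
    cumulative += 100 * count / total
    print(f"{barrier:20s} {count:2d}   cumulative {cumulative:5.1f}%")
```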
40
How was intervention(s) chosen?
- By reviewing the literature
- Evidence-based
- Pros & cons
- Benefits & costs
Develop a list of potential interventions -- what is most effective?
41
Types of Interventions
- Education
- Provider performance feedback
- Reminders & tracking systems
- Organizational changes
- Community-level interventions
- Mass media
RAND has done studies for HCFA to determine the most effective interventions.
42
Choosing Interventions
Balance potential for success with:
- ease of use
- acceptability to providers & collaborators
- cost considerations (direct and indirect)
Feasibility:
- adequate resources
- adequate staff and training to ensure a sustainable effort
43
Physician Interventions: Multifaceted Most Effective
Most effective (multifaceted):
- real-time reminders
- outreach/detailing
- opinion leaders
- provider profiles
Less effective:
- educational materials (alone)
- formal CME programs without enabling or reinforcing strategies
44
Patient Interventions
- Educational programs
- Disease-specific education booklets
- Lists of questions to ask your physician
- Organizing materials: flowsheets, charts, reminder cards
- Screening instruments to detect complications
- Direct mailing, media ads, websites
See the “Healthy Aging Study” done by RAND for HCFA.
45
Evaluating Interventions
- Does it target a specific quality indicator?
- Is it aimed at appropriate stakeholders?
- Is it directed at a specific process/outcome of care or service?
- Did the intervention begin after the baseline measurement period?
46
Interventions Checklist
- Analyze barriers (root causes)
- Choose & understand the target audience
- Select interventions based on cost-benefit
- Track intermediate results
- Evaluate effectiveness
- Modify interventions as needed
- Remeasure
You can use this guide when you are onsite at the M+CO to give advice.
47
Group Activity
48
Quality Outcomes Achieved
Three components:
- Activity VIII. Presentation of Sufficient Data Analysis and Interpretation
- Activity IX. Evidence of Real Improvement Achieved
- Activity X. Data Supporting Sustained Improvement Achieved
49
Activity VIII. Presentation of Sufficient Data Analysis and Interpretation - NHDP Overall Score
50
Activity VIII. Presentation of Sufficient Data Analysis and Interpretation
Results:
- 10 percent of the nine evaluation elements were Met
- 18 percent of the evaluation elements were Partially Met or Not Met
- 72 percent of the evaluation elements were Not Applicable or Not Assessed
51
Activity VIII: Review Data Analysis and Interpretation of Study Results
HSAG Evaluation Elements:
- Is conducted according to the data analysis plan in the study design.
- Allows for generalization of the results to the study population if a sample was selected.
- Identifies factors that threaten internal or external validity of findings.
- Includes an interpretation of findings.
- Is presented in a way that provides accurate, clear, and easily understood information.
(A confidence-interval sketch for generalizing from a sample follows this list.)
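One way to support generalization from a sample to the study population is to report a confidence interval around each measured rate. A minimal sketch using the normal approximation; the counts are invented for illustration.

```python
import math

def proportion_ci(hits, n, z=1.96):
    """Two-sided 95% confidence interval for a rate (normal approximation)."""
    p = hits / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p - half_width, p + half_width

# Example: 168 of 334 sampled charts met the study indicator.
low, high = proportion_ci(168, 334)
print(f"rate {168/334:.1%}, 95% CI ({low:.1%}, {high:.1%})")  # ~ (44.9%, 55.7%)
```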
52
Activity VIII: Review Data Analysis and Interpretation of Study Results (cont.)
HSAG Evaluation Elements:
- Identifies initial measurement and remeasurement of study indicators.
- Identifies statistical differences between initial measurement and remeasurement.
- Identifies factors that affect the ability to compare initial measurement with remeasurement.
- Includes the extent to which the study was successful.
Bolded evaluation elements show areas for improvement.
53
Changes in Study Design?
Study design should be the same as at baseline:
- Data source
- Data collection methods
- Data analysis
- Target population or sample size
- Sampling methodology
If there is a change, the rationale must be specified and appropriate.
Generally, changes in study design should be rare. However, occasionally something may occur that is not in the control of the M+CO. Examples:
- The HEDIS diabetes measure is changed
- The data collection period for CAHPS is changed
54
Activity IX. Evidence of Real Improvement - NHDP Overall Score
55
Activity IX. Evidence of Real Improvement
Results:
- 5 percent of the four evaluation elements were Met
- 5 percent were Partially Met or Not Met
- 90 percent of the evaluation elements were Not Applicable or Not Assessed
56
Activity IX: Assess the Likelihood that Reported Improvement is “Real” Improvement
HSAG Evaluation Elements:
- The remeasurement methodology is the same as the baseline methodology.
- There is documented improvement in processes or outcomes of care.
- The improvement appears to be the result of intervention(s).
- There is statistical evidence that observed improvement is true improvement.
Bolded evaluation elements show areas for improvement.
57
Statistical Significance Testing
Time Period | Measurement Period | Numerator | Denominator | Rate or Results | Industry Benchmark | Statistical Testing and Significance
CY 2003 | Baseline | 201 | 411 | 48.9% | 60% | N/A
CY 2004 | Re-measurement 1 | 225 | 411 | 54.7% | 60% | Chi-square = 2.8; not significant at the 95% confidence level
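The chi-square value in the table can be reproduced with a standard two-by-two test. A sketch using scipy, with the counts taken directly from the table (non-hits = denominator minus numerator); correction=False matches the uncorrected chi-square of 2.8.

```python
from scipy.stats import chi2_contingency

# Rows: baseline (CY 2003) and re-measurement 1 (CY 2004).
# Columns: members meeting the indicator, members not meeting it.
table = [
    [201, 411 - 201],  # baseline rate 48.9%
    [225, 411 - 225],  # re-measurement rate 54.7%
]
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi-square = {chi2:.1f}, p = {p:.3f}")  # -> chi-square = 2.8, p = 0.094

# p > 0.05, so the change is not significant at the 95% confidence level.
```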
58
Activity X. Data Supporting Sustained Improvement Achieved - NHDP Overall Score
No Met evaluation elements for this Activity
59
Activity X. Data Supporting Sustained Improvement Achieved
Results:
- 0 percent of the one evaluation element was Met
- 10 percent was Partially Met or Not Met
- 90 percent of the evaluation element was Not Applicable or Not Assessed
60
Activity X: Assess Whether Improvement is Sustained
HSAG Evaluation Elements:
- Repeated measurements over comparable time periods demonstrate sustained improvement, or that a decline in improvement is not statistically significant.
(A sketch testing whether a decline is statistically significant follows.)
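To check whether a decline in improvement is statistically significant, one option is a two-proportion z-test between the last two measurement periods. A minimal sketch with statsmodels; the counts are made up to show a small, non-significant dip.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical: re-measurement 1 = 225/411 (54.7%), re-measurement 2 = 214/411 (52.1%).
count = [225, 214]
nobs = [411, 411]
stat, p = proportions_ztest(count, nobs)
print(f"z = {stat:.2f}, p = {p:.2f}")  # p well above 0.05

# A non-significant decline is consistent with sustained improvement.
```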
61
Quality Outcomes Achieved
Baseline → 1st Year (Demonstrable Improvement) → Sustained Improvement
For the Q.A.P.I. projects there are 3 data points. Many M+COs will do multiple re-measurements, but only need to report on 3.
62
Sustained Improvement
- Modifications in interventions
- Changes in study design
- Improvement sustained for 1 year
One year after demonstrable improvement is achieved, the M+CO must submit a report including information on any changes.
Interventions: Were they modified or new ones added? Was the rationale clear? Were the interventions system-level?
Study design: Design modifications should be rare! Is the justification clear? Did they describe the change? Are the changes appropriate?
63
HSAG Contact Information
Cheryl Neel, RN, MPH, CPHQ -- Manager, Performance Improvement Projects
Denise Driscoll -- Administrative Assistant
64
Questions and Answers