Performance Improvement Projects Technical Assistance – PIP 101

Presentation transcript:

Performance Improvement Projects Technical Assistance – PIP 101 Health Services Advisory Group, Inc.

Outline Overview of the PIP Process PIP Summary Form Review PIP Scoring Methodology

Overview of PIPs What is a PIP? A quality improvement project whose purpose is to assess and improve processes and outcomes of care. The time frame is a minimum of 3 years from beginning to termination (according to CMS).

Overview of PIPs (cont.) The PIP process provides an opportunity to: Identify and measure a targeted area (clinical or nonclinical). Analyze the results. Implement interventions for improvement.

Overview of PIPs (cont.) HSAG’s role is to: Validate PIPs using CMS’ protocol, Validating Performance Improvement Projects: A Protocol for Use in Conducting Medicaid External Quality Review Activities, Final Protocol, Version 1.0. Evaluate the likely validity and reliability of each study’s results. Provide PIP Validation Reports.

PIP Completion Instructions Are used to ensure each HSAG evaluation element has been addressed. Help to simplify PIP submissions. Promote efficiency in preparing PIP documentation.

PIP Activities There are 10 PIP activities. For each activity, this training reviews the CMS rationale and the HSAG evaluation elements.

Activity I: Select the Study Topics CMS Rationale The study topics should: Impact a significant portion of the members. Reflect Medicaid enrollment in terms of demographic characteristics, prevalence of disease, and the potential consequences (risks) of the disease.

Activity I: Select the Study Topics (cont.) CMS Rationale The goal of the study should be to improve processes and outcomes of health care. The study topic may be specified by the State or selected on the basis of Medicaid member input.

Activity I: Select the Study Topics (cont.) HSAG Evaluation Elements The study topics: Reflect high-volume or high-risk conditions. Are selected following collection and analysis of data (including plan-specific data). Address a broad spectrum of care and services.

Activity I: Select the Study Topics (cont.) HSAG Evaluation Elements The study topic: Includes all eligible populations that meet the study criteria. Includes members with special health care needs. If any population is excluded, the exclusion must be explained. Has the potential to affect member health, functional status, or satisfaction.

Activity I: Study Topic Examples Improving Diabetic Screening. Improving Well-Child Visits in the First 15 Months of Life. Member or Provider Satisfaction. Access to Care and Services. Improving Blood Lead Screening.

Activity II: Define the Study Question CMS Rationale Stating the question(s) helps maintain the focus of the PIP and sets the framework for data collection, analysis, and interpretation.

Activity II: Define the Study Question (cont.) HSAG Evaluation Elements The study question: States the problem to be studied in simple terms. Is answerable. In general, should take this form: Does doing X result in an increase or decrease in Y?

Activity II: Study Question Example Do targeted interventions increase the percentage of children who receive 6 or more well-child visits in the first 15 months of life?

Activity III: Select the Study Indicators CMS Rationale The study indicators: Represent a quantitative or qualitative characteristic (a variable). Represent a discrete event (member has or has not experienced event X). Are appropriate for the study topic. Are objective, clearly and unambiguously defined.

Activity III: Select the Study Indicators (cont.) HSAG Evaluation Elements The study indicators: Are well-defined, objective, and measurable. Are based on practice guidelines, with sources identified.

Activity III: Select the Study Indicators (cont.) HSAG Evaluation Elements The study indicators: Allow for the study question to be answered. Align with the study question. Measure changes (outcomes) in health or functional status, member satisfaction, or valid process alternatives.

Activity III: Select the Study Indicators (cont.) HSAG Evaluation Elements The study indicators: Have available data that can be collected on each indicator. Are nationally recognized measures, such as HEDIS®, when appropriate. Include the basis on which each indicator was adopted, if internally developed. (HEDIS® is a registered trademark of the National Committee for Quality Assurance [NCQA].)

Activity III: Select the Study Indicators (cont.)

Activity IV: Use a Representative and Generalizable Study Population CMS Rationale The study population: Represents the entire Medicaid-eligible enrolled population. Allows systemwide measurement. Allows the implementation of improvement efforts to which the study indicators apply.

Activity IV: Use a Representative and Generalizable Study Population (cont.) HSAG Evaluation Elements The method for identifying the eligible population: Is accurately and completely defined. Includes requirements for the length of a member’s enrollment in the plan. Captures all members to whom the study question applies.

Activity IV: Study Population Example The study population is defined as follows: All members who turned 15 months of age during the measurement year who were continuously enrolled from 31 days through 15 months of age with no more than one gap in enrollment of up to 45 days during the continuous enrollment period. CPT codes: 99381, 99382, 99391, 99432 ICD-9 codes: V20.2, V70.0, V70.3, V70.5, V70.6, V70.8, V70.9
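To make this definition concrete, below is a minimal Python sketch of how a plan might apply it to administrative data. It is illustrative only: the claim dictionary fields (`cpt`, `icd9`) and the enrollment-span tuples are assumed record layouts, not HSAG or CMS requirements; the codes and the 45-day gap allowance come from the slide above.

```python
from datetime import timedelta

# Codes copied from the slide above; the record layouts are hypothetical.
WELL_CHILD_CPT = {"99381", "99382", "99391", "99432"}
WELL_CHILD_ICD9 = {"V20.2", "V70.0", "V70.3", "V70.5",
                   "V70.6", "V70.8", "V70.9"}

def is_well_child_visit(claim):
    """A claim counts if it carries a qualifying CPT or ICD-9 code."""
    return (claim.get("cpt") in WELL_CHILD_CPT
            or claim.get("icd9") in WELL_CHILD_ICD9)

def meets_enrollment(spans, start, end, max_gap_days=45, max_gaps=1):
    """Continuous enrollment from `start` through `end`, allowing at most
    `max_gaps` gaps of up to `max_gap_days` days each. `spans` is a list
    of (enroll_start, enroll_end) date tuples sorted by start date."""
    gaps, cursor = 0, start
    for span_start, span_end in spans:
        if span_end < cursor:
            continue                     # span ended before the window
        if span_start > cursor:          # a gap in coverage
            if (span_start - cursor).days > max_gap_days:
                return False
            gaps += 1
            if gaps > max_gaps:
                return False
        cursor = max(cursor, span_end + timedelta(days=1))
        if cursor > end:
            return True
    return cursor > end
```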

Activity V: Use Sound Sampling Techniques CMS Rationale Sample size affects the level of statistical confidence in the study. Statistical confidence is a numerical statement of the probable degree of certainty or accuracy of an estimate. The sample size should be large enough to detect improvement in indicators between measurement periods. The sample should be representative of the entire eligible population.

Activity V: Use Sound Sampling Techniques* (cont.) HSAG Evaluation Elements Sampling techniques: Consider and specify the true or estimated frequency of occurrence. Identify the sample size (or use the entire population). Specify the confidence interval to be used (or use the entire population). * Activity V is only scored if sampling techniques were used. If the entire population was used, document this in Activity V.

Activity V: Use Sound Sampling Techniques (cont.) HSAG Evaluation Elements Sampling techniques: Specify the acceptable margin of error. Ensure a representative sample of the eligible population. Are in accordance with generally accepted principles of research design and statistical analysis.
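The slides leave the calculation itself to the plan, but the elements above (estimated frequency of occurrence, confidence level, margin of error) map directly onto the standard sample-size formula for a proportion. A sketch, assuming a simple random sample with a finite population correction; the example inputs are invented:

```python
import math

def sample_size(population, p=0.5, z=1.96, margin=0.05):
    """Required sample size for estimating a proportion.

    population -- size of the eligible population
    p          -- true or estimated frequency of occurrence
                  (0.5 is the most conservative choice)
    z          -- z-score for the confidence level (1.96 ~ 95 percent)
    margin     -- acceptable margin of error

    Standard formula n0 = z^2 * p * (1 - p) / margin^2, followed by a
    finite population correction.
    """
    n0 = (z ** 2) * p * (1 - p) / (margin ** 2)
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# Example: 5,000 eligible members, 95 percent confidence, +/-5 percent
# margin of error -> 357 members, matching common online calculators.
print(sample_size(5000))
```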

Activity VI: Reliably Collect Data CMS Rationale Procedures used to collect data for a given PIP must ensure that the data collected on the PIP indicators are valid and reliable.

Activity VI: Reliably Collect Data (cont.) CMS Rationale Two data collection approaches: Administrative data collection. Manual data collection.

Activity VI: Reliably Collect Data (cont.) HSAG Evaluation Elements The data collection techniques: Provide clearly defined data elements to be collected. Clearly specify sources of data. Provide for a clearly defined and systematic process for collecting data that includes how baseline and remeasurement data will be collected.

Activity VI: Reliably Collect Data (cont.) HSAG Evaluation Elements The data collection techniques: Provide a timeline for the collection of baseline and remeasurement data. If data are collected manually, the PIP should: Provide the qualifications, training, and experience of manual data collection staff members.

Activity VI: Reliably Collect Data (cont.) HSAG Evaluation Elements The manual data collection tool: Ensures consistent and accurate collection of data according to indicator specifications. Supports inter-rater reliability. Has clear and concise written instructions that include an overview of the study.
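The slides require that the tool support inter-rater reliability without naming a statistic. Cohen's kappa between two abstractors is one common way to quantify it; the sketch below and its 0/1 decision lists are an illustration, not an HSAG requirement:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two abstractors' yes/no (1/0) decisions on the
    same set of records. 1.0 is perfect agreement; 0 is chance level."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_a, p_b = sum(rater_a) / n, sum(rater_b) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)   # agreement by chance
    return (observed - expected) / (1 - expected)

# Example: two abstractors review the same 10 records.
print(round(cohens_kappa([1, 1, 0, 1, 0, 1, 1, 0, 1, 1],
                         [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]), 2))  # 0.52
```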

Activity VI: Reliably Collect Data (cont.) HSAG Evaluation Elements For administrative data collection, the PIP provides: An administrative data collection algorithm, data flow chart, or narrative description that outlines the steps in the production of indicators. An estimated degree of administrative data completeness and supporting documentation for how the percentage was determined.
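The protocol likewise does not prescribe how the completeness percentage is determined. One common approach, sketched here as an assumption rather than a requirement, compares the hits found in administrative data alone with the total confirmed after supplementing with medical record review:

```python
def admin_completeness(admin_hits, total_hits):
    """Percent of qualifying events captured by administrative data
    alone, where total_hits is the count after supplementing with
    medical record review for the same measurement period."""
    return 100.0 * admin_hits / total_hits

# Example: claims identified 380 of the 422 visits ultimately confirmed.
print(f"{admin_completeness(380, 422):.1f}%")  # 90.0%
```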

Activity VII: Implement Intervention and Improvement Strategies CMS Rationale An intervention is designed to change behavior at an institutional, practitioner, or member level. An intervention increases the likelihood of measurable change.

Activity VII: Implement Intervention and Improvement Strategies (cont.) HSAG Evaluation Elements Planned/implemented strategies for improvement are: Related to causes/barriers identified through data analysis and quality improvement (QI) processes. System changes that are likely to induce permanent change. Revised if original interventions are not successful. Standardized and monitored if interventions are successful.

Activity VII: Implement Intervention and Improvement Strategies (cont.) HSAG Evaluation Elements Planned/implemented strategies for improvement: Should be realistic, feasible, and clearly defined. Need a reasonable amount of time to be effective.

Activity VII: Implement Intervention and Improvement Strategies (cont.) Examples of Improvement Strategies: Member: Reminder postcard mailings to children due for a well-child visit, which continue monthly based on the child’s date of birth. Member: Development of an on-hold message addressing the importance of well-child visits. Provider: Face-to-face meetings with low-reporting providers to provide them with educational materials.

Causal/Barrier Analysis Tools Methods: Quality improvement committee. Internal task force. Tools: Fishbone diagram. Process mapping. Barrier/intervention table.

Fishbone Diagram

Activity VIII: Analyze Data and Interpret Study Results CMS Rationale Data analysis begins with examining performance on the selected clinical or nonclinical indicators. Data analysis and interpretation are initiated using the statistical analysis techniques defined in the data analysis plan.

Activity VIII: Analyze Data and Interpret Study Results* (cont.) HSAG Evaluation Elements The data analysis: Is conducted according to the data analysis plan in the study design. Allows for generalization of the results to the study population (if a sample was selected). Identifies factors that threaten the internal or external validity of the findings. Includes an interpretation of the findings. * For PIPs that provide baseline data, Evaluation Elements 1–5 will be scored in Activity VIII of the PIP Validation Tool.

Activity VIII: Analyze Data and Interpret Study Results (cont.) HSAG Evaluation Elements The data analysis: Is presented in a way that provides accurate, clear, and easily understood information. Identifies initial measurement and remeasurement of study indicators. Identifies statistical differences between initial measurement and remeasurement. Identifies factors that affect the ability to compare initial measurement with remeasurement. Includes an assessment of the extent to which the study was successful.

Activity IX: Assess for Real Improvement CMS Rationale The observed change represents “real” change. Results show the probability that the improvement is true improvement. Results show the degree to which the change is statistically significant.

Activity IX: Assess for Real Improvement* (cont.) HSAG Evaluation Elements The remeasurement methodology is the same as the baseline methodology. There is documented improvement in processes or outcomes of care. The improvement appears to be the result of the intervention(s). There is statistical evidence that the observed improvement is true improvement. * Activity IX will be scored when the PIP has progressed to Year 2 (Remeasurement 1).
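The slides call for statistical evidence of true improvement but do not name a test. For rate-based indicators, a two-proportion z-test comparing baseline and remeasurement rates is a common choice; the sketch below uses invented numbers:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-proportion z-test comparing the baseline rate (x1/n1) with
    the remeasurement rate (x2/n2). Returns (z, two-sided p-value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # standard normal, two-sided
    return z, p_value

# Example: baseline 210/400 (52.5%) vs. remeasurement 252/400 (63.0%).
z, p = two_proportion_z(210, 400, 252, 400)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 -> statistically significant
```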

Activity X: Assess for Sustained Improvement CMS Rationale Change results from modifications in the processes of health care delivery. If real change has occurred, the project should be able to sustain improvement.

Activity X: Assess for Sustained Improvement* (cont.) HSAG Evaluation Elements Repeated measurements over comparable time periods demonstrate sustained improvement, or demonstrate that a decline in improvement is not statistically significant. * Activity X is not scored until the PIP has reported baseline data and at least two annual remeasurements (Year 3).

PIP Scoring Methodology HSAG Evaluation Tool 13 Critical Elements 53 Evaluation Elements (including the Critical Elements)

PIP Scoring Methodology (cont.) Overall PIP Score Percentage Score: Calculated by dividing the total number of elements Met by the sum of the elements Met, Partially Met, and Not Met. Percentage Score of Critical Elements: Calculated by dividing the total number of critical elements Met by the sum of the critical elements Met, Partially Met, and Not Met. Validation Status: Met, Partially Met, or Not Met.

PIP Scoring Methodology (cont.) Met: All critical elements were Met, and 80 percent to 100 percent of all elements were Met across all activities.

PIP Scoring Methodology (cont.) Partially Met: (1) All critical elements were Met, and 60 percent to 79 percent of all elements were Met across all activities, or (2) one or more critical elements were Partially Met.

PIP Scoring Methodology (cont.) Not Met: (1) All critical elements were Met, and less than 60 percent of all elements were Met across all activities, or (2) one or more critical elements were Not Met.

PIP Scoring Methodology (cont.) Not Applicable (NA): NA elements (including critical elements) are removed from all scoring. Not Assessed: Not Assessed elements (including critical elements) are removed from all scoring. Point of Clarification: Used when the documentation for an evaluation element includes the basic components to meet its requirements (as described in the narrative PIP), but enhanced documentation would demonstrate a stronger understanding of CMS protocols.
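Pulling the scoring rules from the preceding slides together, here is a small sketch of the status logic; the function name and interface are ours, not HSAG's, and Examples 1 and 2 on the next slides can be used to check it:

```python
def pip_validation(met, partially_met, not_met,
                   critical_partially_met=0, critical_not_met=0):
    """Overall percentage score and validation status. NA and Not
    Assessed elements are excluded from the counts before calling."""
    applicable = met + partially_met + not_met
    score = 100.0 * met / applicable
    if critical_not_met or score < 60:
        status = "Not Met"
    elif critical_partially_met or score < 80:
        status = "Partially Met"
    else:
        status = "Met"
    return round(score, 1), status

# Example 1 (next slide): Met = 43, Partially Met = 2, Not Met = 0, NA = 8.
print(pip_validation(43, 2, 0))                      # (95.6, 'Met')
# Example 2: one critical element Not Met, so the PIP is not valid.
print(pip_validation(52, 0, 1, critical_not_met=1))  # (98.1, 'Not Met')
```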

PIP Scoring Methodology (cont.) Example 1 Met = 43, Partially Met = 2, Not Met = 0, NA = 8 (leaving 45 applicable elements), and all critical elements were Met. The health plan receives an overall Met status, indicating the PIP is valid. The score for the health plan is calculated as 43/45 = 95.6 percent.

PIP Scoring Methodology (cont.) Example 2 Met = 52, Partially Met = 0, Not Met = 1, NA = 0, and one critical element was Not Met. The health plan receives an overall Not Met status and the PIP is not valid.

PIP Tips Complete the demographic page before submission. Label ALL attachments and reference them in the body of the PIP study. Make sure to submit all attachments with the PIP submission. HSAG does not require personal health information to be submitted; submit only aggregate results. Ensure all HSAG evaluation elements have been addressed in the PIP Summary Form. Notify HSAG when the PIP documents are uploaded to the secure FTP site and state the number of documents uploaded. This is a desk audit: document, document, document!

Resources These sites offer protocols, literature, guidelines, and tools used for quality improvement projects. Agency for Healthcare Research and Quality: www.ahrq.gov Center for Healthcare Strategies: www.chcs.org Centers for Medicare and Medicaid Services (CMS): www.cms.hhs.gov

Resources (cont.) National Committee for Quality Assurance (NCQA): www.ncqa.org Institute for Healthcare Improvement (IHI): www.ihi.org National Guideline Clearinghouse (NGC): www.guidelines.gov Sampling Calculator: www.surveysystem.com Statistical Testing Calculator: www.graphpad.com