
Performance Improvement Project Validation Process Outcome Focused Scoring Methodology and Critical Analysis Presenter: Christi Melendez, RN, CPHQ Associate Director, PIP Review Team

CMS PIP Protocol Changes
Activities III and IV, and Activities VII and VIII, have swapped positions in the protocol:
Activity III: Use a Representative and Generalizable Study Population
Activity IV: Select the Study Indicator(s)
Activity VII: Data Analysis and Interpretation of Results
Activity VIII: Improvement Strategies

Activity I: Choose the Study Topic
HSAG Evaluation Elements: The Study Topic
Is selected following collection and analysis of data (critical element)
Has the potential to affect member health, outcomes of care, functional status, or satisfaction

Activity II: State the Study Question
HSAG Evaluation Elements: The Study Question
States the problem to be studied in simple terms and is in the recommended X/Y format (critical element)

Activity III: Identify the Study Population
HSAG Evaluation Elements: The Study Population
Is accurately and completely defined and captures all members to whom the study question applies (critical element)

Activity IV: Select the Study Indicator
HSAG Evaluation Elements: The Study Indicator
Is well-defined, objective, and measures changes in health or functional status, consumer satisfaction, or valid process alternatives (critical element)
Includes the basis on which the indicator was adopted, if internally developed
Allows for the study question to be answered (critical element)

Activity V: Use Valid Sampling Techniques*
HSAG Evaluation Elements: Sampling Techniques
Specify the measurement period for the sampling methods used
Provide the title of the applicable study indicator
Identify the population size
Identify the sample size (critical element)
Specify the margin of error and confidence level
Describe in detail the methods used to select the sample
* Activity V is only scored if sampling techniques were used.
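The margin-of-error and confidence-level elements above can be made concrete with the standard sample-size formula for estimating a proportion, with a finite-population correction. This sketch is illustrative only and is not part of the HSAG tool; the function name and the example population of 2,000 eligible members are assumptions.

```python
import math

# Two-sided critical z-values for common confidence levels.
Z = {0.90: 1.645, 0.95: 1.960, 0.99: 2.576}

def required_sample_size(population_size, margin_of_error=0.05,
                         confidence=0.95, p=0.5):
    """Minimum sample size for estimating a proportion-based indicator,
    with finite-population correction. p=0.5 is the conservative choice
    (it maximizes the required sample size)."""
    z = Z[confidence]
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2  # infinite-population size
    n = n0 / (1 + (n0 - 1) / population_size)           # finite-population correction
    return math.ceil(n)

# e.g., 2,000 eligible members, 5% margin of error, 95% confidence
print(required_sample_size(2000))  # → 323
```

For very large populations the correction has little effect and the familiar "n = 385" for a 5% margin at 95% confidence emerges; documenting these inputs is what the "margin of error and confidence level" element asks for.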

Activity VI: Define Data Collection
HSAG Evaluation Elements: Data Collection
The data collection procedures:
Identify the data elements to be collected
Include a defined and systematic process for collecting baseline and remeasurement data

Activity VI: Define Data Collection
HSAG Evaluation Elements: Data Collection
The manual data collection procedures:
Include the qualifications of staff member(s) collecting manual data
Include a manual data collection tool that ensures consistent and accurate collection of data according to indicator specifications (critical element)

Activity VI: Define Data Collection
HSAG Evaluation Elements: Data Collection
The administrative data collection procedures:
Include an estimated degree of administrative data completeness
Describe the data analysis plan

Activity VII: Analyze Data and Interpret Study Results
HSAG Evaluation Elements: Data Analysis
Is conducted according to the data analysis plan in the study design
Allows for the generalization of results to the study population if a sample was selected (critical element)
Identifies factors that threaten the internal or external validity of the findings
Includes an interpretation of findings
* Evaluation Elements 1-5 in Activity VII are scored for PIPs that provide baseline data.

Activity VII: Analyze Data and Interpret Study Results
HSAG Evaluation Elements: Interpretation of Study Results
Is presented in a way that provides accurate, clear, easily understood information (critical element)
Identifies the initial measurement and the remeasurement of study indicators
Identifies statistical differences between the initial measurement and the remeasurement
Identifies factors that affect the ability to compare the initial measurement with the remeasurement
Includes an interpretation of the extent to which the study was successful

Activity VIII: Implement Interventions and Improvement Strategies
HSAG Evaluation Elements: Improvement Strategies
Are related to causes/barriers identified through data analysis and quality improvement processes (critical element)
Are system changes that are likely to induce permanent change
Are revised if the original interventions are not successful
Are standardized and monitored if interventions are successful

Activity IX: Real Improvement*
HSAG Evaluation Elements: Report Improvement
The remeasurement methodology is the same as the baseline methodology
There is documented improvement in processes or outcomes of care
There is statistical evidence that observed improvement is true improvement over baseline (critical element)
The improvement appears to be the result of planned intervention(s)
* Activity IX is scored once the PIP has progressed to Remeasurement 1, and is scored annually until statistically significant improvement from baseline to a subsequent remeasurement is achieved for all study indicators. Once Evaluation Element 3 receives a Met score, it remains Met for the duration of the PIP.
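The "statistical evidence" element is typically assessed with a significance test comparing the baseline rate to the remeasurement rate. For a proportion-based study indicator, one common choice is the two-proportion z-test. The sketch below is a hedged illustration: the counts are invented, and the HSAG protocol does not mandate this particular test.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for H0: baseline rate == remeasurement rate,
    using the pooled proportion for the standard error."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# Hypothetical counts: baseline 120/400 compliant (30%),
# Remeasurement 1: 160/400 compliant (40%)
z = two_proportion_z(120, 400, 160, 400)
significant = abs(z) > 1.96  # two-sided test at the 0.05 level
print(f"z = {z:.2f}, significant improvement: {significant}")
```

A chi-square test on the 2x2 table gives an equivalent result; what matters for Activity IX is that the comparison is pre-specified in the data analysis plan and applied to every study indicator.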

Activity X: Sustained Improvement*
HSAG Evaluation Elements: Sustained Improvement
Repeated measurements over comparable time periods demonstrate sustained improvement, or show that a decline in improvement is not statistically significant (critical element)
* HSAG will not validate Activity X until statistically significant improvement has been achieved across all study indicators. Once that threshold is reached, the MCO must document a subsequent remeasurement period showing that the improvement was sustained in order to receive an overall Met validation status.

PIP Tool Format
Old Tool Format:
10 Activities
53 Evaluation Elements
13 Critical Elements
Activity VII: Interventions
Activity VIII: Data Analysis
New Tool Format:
10 Activities
37 Evaluation Elements
12 Critical Elements
Activity III: Study Population
Activity IV: Study Indicator(s)
Activity VII: Data Analysis
Activity VIII: Interventions

Outcome Focused PIP Scoring
HSAG Evaluation Tool:
37 Evaluation Elements total
12 Critical Elements (CE) total, distributed across Activities I through X, with one or two CEs per activity

Outcome Focused PIP Scoring Changes
Activity VII: Evaluation Element 5 is now critical. MCOs should ensure that data reported in all PIPs are accurate and align with what has been reported in their IDSS.
Activity IX: Evaluation Elements 3 and 4 have swapped order, and new criteria apply to scoring Activity IX.
Activity X: New criteria apply to scoring Activity X.

Activity IX: Outcome Focused PIP Scoring
HSAG Evaluation Elements: Assessing for Real Improvement
The remeasurement methodology is the same as the baseline methodology
There is documented improvement in processes or outcomes of care
There is statistical evidence that observed improvement is true improvement over baseline and across all study indicators
The improvement appears to be the result of planned intervention(s)

Activity X: Outcome Focused PIP Scoring
HSAG Evaluation Elements: Assessing for Sustained Improvement
Repeated measurements over comparable time periods demonstrate sustained improvement, or show that a decline in improvement is not statistically significant, across all study indicators

Outcome Focused PIP Scoring
Activity IX:
Repeated measurement of the indicators demonstrates meaningful change in performance
Improvement must be statistically significant for all study indicators to receive an overall Met validation status
Is scored on an annual basis until statistically significant improvement over baseline has been achieved for all study indicators
Once Evaluation Element 3 receives a Met score, it will remain Met for the duration of the PIP
Evaluation Elements 3 and 4 are linked

Outcome Focused PIP Scoring
Activity X:
Repeated measurement of the indicators demonstrates sustained improvement
HSAG will not validate Activity X until Evaluation Element 3 of Activity IX is Met
Once statistically significant improvement has been achieved for all indicators, the MCO will need to document a subsequent measurement period demonstrating sustained improvement in order to receive a Met in Activity X

Outcome Focused PIP Rationale
Overall Met Validation Status:
The changes align the actual outcomes of the project with the overall validation status
Emphasis on statistically significant, sustained improvement in outcomes

Critical Analysis
HSAG will be evaluating whether or not a current causal/barrier analysis was completed. MCOs should conduct an annual causal/barrier and drill-down analysis, in addition to periodic analyses of their most recent data, and should include the updated causal/barrier analysis outcomes in their PIPs.

Critical Analysis
HSAG will be evaluating whether or not barriers and interventions were relevant to the focus of the study and can impact the study indicator(s) outcomes.

Critical Analysis
For any intervention implemented, the MCO should have a process in place to evaluate the efficacy of the intervention and determine whether it is having the desired effect. This evaluation process should be detailed in the PIP documentation. If the interventions are not having the desired effect, the MCO should discuss how it will address these deficiencies and what changes will be made to its improvement strategies.

Critical Analysis
The MCO should ensure that the intervention(s) implemented will impact the study indicator(s) outcomes. For example, in a Childhood Obesity PIP using the WCC HEDIS measure:
Member-focused interventions will not impact a study indicator measuring the quality of service provided by a PCP
Interventions focused on educating MCO staff on HEDIS measures will not impact members accessing care and seeking well-child visits

Critical Analysis The MCO should be cognizant of the timing of interventions. Interventions implemented in the last few months of the year will not have been in place long enough to have an impact on the results.

Questions and Answers