Onsite Quarterly Meeting SIPP PIPs June 13, 2012 Presenter: Christy Hormann, LMSW, CPHQ Project Leader-PIP Team

Overview of Presentation  Progression of SIPP PIPs  SFY 2012 validation results  Areas for improvement

SFY 2011 SIPP PIPs  First year SIPPs completed the PIP process  Each SIPP required to submit one PIP  Total of 14 PIPs were submitted

SFY 2011 SIPP PIPs cont.  One PIP was completed through Activity VI  13 were completed through Activity VIII  Initial scores were lower due to a lack of proper documentation

SFY 2012 SIPP PIPs  For SFY 2012, the SIPPs were required to submit the collaborative PIP and a PIP on a topic of their choosing  Some of the individual SIPP topics were increasing family participation in treatment, minimizing weight gain during treatment, and reducing readmissions

SFY 2012 SIPP PIPs cont.  There were a total of 27 PIPs submitted for validation  Six PIPs were assessed through Activity VI  Four PIPs were assessed through Activity VII  Five PIPs were assessed through Activity VIII

SFY 2012 SIPP PIPs cont.  Twelve PIPs were assessed through Activity IX  None of the PIPs were assessed for Activity X (Sustained Improvement)

PIP Stages

Study Design Stage  Establishes methodological framework for the PIP  Includes development of study topic, question, indicators, and population (Activities I through IV)  A strong study design is necessary for the successful progression of a PIP

Study Design Stage Evaluation Elements Activity I: Study Topic  Reflects high-volume or high-risk conditions  Is selected following collection and analysis of data

Study Design Stage Evaluation Elements Activity I: Study Topic  Addresses a broad spectrum of care and services  Includes all eligible populations that meet the study criteria  Does not exclude members with special health care needs  Has the potential to affect member health, functional status, or satisfaction

Study Design Stage Evaluation Elements Activity II: Study Question  States the problem to be studied in simple terms  Is answerable

Study Design Stage Evaluation Elements Activity III: Study Indicators  Are well-defined, objective, and measurable  Are based on current, evidence-based practice guidelines, pertinent peer-reviewed literature, or consensus expert panels  Allow for the study question to be answered

Study Design Stage Evaluation Elements Activity III: Study Indicators  Measure changes (outcomes) in health or functional status, member satisfaction, or valid process alternatives  Have available data that can be collected on each indicator

Study Design Stage Evaluation Elements Activity III: Study Indicators  Are nationally recognized measures, such as HEDIS technical specifications, when appropriate  Include the basis on which the indicator(s) was adopted, if internally developed
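
As a minimal illustration of how a well-defined, measurable indicator can be computed from member-level data, the sketch below expresses one hypothetical indicator (a 30-day readmission rate, echoing the readmission topic mentioned above) as a numerator over a clearly defined denominator. The record fields and eligibility rule are assumptions for illustration, not specifications from the presentation.

```python
from dataclasses import dataclass

@dataclass
class MemberEpisode:
    member_id: str
    discharged: bool             # eligibility criterion for the denominator (hypothetical)
    readmitted_within_30d: bool  # numerator event (hypothetical)

def indicator_rate(episodes):
    """Compute a study indicator as numerator / denominator.

    Denominator: episodes meeting the eligibility criteria.
    Numerator:   eligible episodes with the event of interest.
    """
    eligible = [e for e in episodes if e.discharged]
    hits = [e for e in eligible if e.readmitted_within_30d]
    if not eligible:
        return None
    return len(hits) / len(eligible)

episodes = [
    MemberEpisode("A1", True, True),
    MemberEpisode("A2", True, False),
    MemberEpisode("A3", False, False),  # excluded from the denominator
]
print(indicator_rate(episodes))  # 0.5
```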

Study Design Stage Evaluation Elements Activity IV: Study Population  Is accurately and completely defined  Includes requirements for the length of a member’s enrollment in the MCO  Captures all members to whom the study question applies

SIPP Design Stage Results (Design Stage)
Activity I. Appropriate Study Topic*: Met 87% (139/160), Partially Met 3% (4/160), Not Met 11% (17/160)
Activity II. Clearly Defined, Answerable Study Question(s): Met 74% (40/54), Partially Met 11% (6/54), Not Met 15% (8/54)
Activity III. Clearly Defined Study Indicator(s)*: Met 68% (86/127), Partially Met 9% (11/127), Not Met 24% (30/127)
Activity IV. Correctly Identified Study Population: Met 75% (48/64), Partially Met 14% (9/64), Not Met 11% (7/64)
Design Total*: Met 77% (313/405), Partially Met 7% (30/405), Not Met 15% (62/405)
* The activity or stage total may not equal 100 percent due to rounding.
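
The percentages in the results tables are simple proportions of evaluation elements scored Met, Partially Met, or Not Met, each rounded independently, which is why a row may not total exactly 100 percent. A small sketch of that arithmetic, using the Design stage totals shown above:

```python
def stage_percentages(met, partially_met, not_met):
    """Return (Met %, Partially Met %, Not Met %) for a stage or activity.

    Each percentage is rounded independently, so a row may not sum to
    exactly 100 percent -- the reason for the rounding footnote.
    """
    total = met + partially_met + not_met
    return tuple(round(100 * n / total) for n in (met, partially_met, not_met))

# Design stage totals from the table above: 313 Met, 30 Partially Met, 62 Not Met
print(stage_percentages(313, 30, 62))  # (77, 7, 15)
```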

Study Implementation Stage  Includes sampling, data collection, and interventions (Activities V through VII)  During this stage, MCOs collect data, evaluate and identify barriers to performance, and develop interventions targeted to improve outcomes  The implementation of effective improvement strategies is necessary to improve PIP outcomes

Study Implementation Stage Evaluation Elements Activity V: Sampling  Consider and specify the true or estimated frequency of occurrence  Identify the sample size  Specify the confidence level  Specify the acceptable margin of error

Study Implementation Stage Evaluation Elements Activity V: Sampling  Ensure a representative sample of the eligible population  Are in accordance with generally accepted principles of research design and statistical analysis
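
One common way to satisfy the frequency-of-occurrence, sample-size, confidence-level, and margin-of-error elements is the standard sample size formula for a proportion with a finite population correction. The sketch below assumes a 95 percent confidence level and a 5 percent margin of error; it illustrates generally accepted practice rather than a method prescribed in the presentation.

```python
import math

def sample_size_for_proportion(population, p=0.5, confidence_z=1.96, margin=0.05):
    """Minimum sample size for estimating a proportion.

    population   -- size of the eligible population
    p            -- true or estimated frequency of occurrence
    confidence_z -- z value for the confidence level (1.96 ~ 95%)
    margin       -- acceptable margin of error
    """
    n0 = (confidence_z ** 2) * p * (1 - p) / (margin ** 2)  # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)                     # finite population correction
    return math.ceil(n)

print(sample_size_for_proportion(population=2000))  # about 323 records
```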

Study Implementation Stage Evaluation Elements Activity VI: Data Collection  The identification of data elements to be collected  The identification of specified sources of data  A defined and systematic process for collecting baseline and remeasurement data  A timeline for the collection of baseline and remeasurement data

Study Implementation Stage Evaluation Elements Activity VI: Data Collection  Qualified staff and personnel to abstract manual data  A manual data collection tool that ensures consistent and accurate collection of data according to indicator specifications  A manual data collection tool that supports interrater reliability
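
Interrater reliability for a manual data collection tool is often checked by having two reviewers independently abstract the same records and computing an agreement statistic such as Cohen's kappa. The presentation does not prescribe a particular statistic; the following is a minimal sketch under that assumption.

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' yes/no abstraction decisions."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_a_yes = sum(rater_a) / n
    p_b_yes = sum(rater_b) / n
    expected = p_a_yes * p_b_yes + (1 - p_a_yes) * (1 - p_b_yes)
    return (observed - expected) / (1 - expected)

# Two reviewers abstract the same 10 records (1 = indicator met, 0 = not met)
a = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]
b = [1, 1, 0, 1, 1, 1, 1, 0, 0, 0]
print(round(cohens_kappa(a, b), 2))  # 0.58, moderate agreement
```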

Study Implementation Stage Evaluation Elements Activity VI: Data Collection  Clear and concise written instructions for completing the manual data collection tool  An overview of the study in written instructions

Study Implementation Stage Evaluation Elements Activity VI: Data Collection  Administrative data collection algorithms/ flow charts that show activities in the production of indicators  An estimated degree of administrative data completeness

Study Implementation Stage Evaluation Elements Activity VII: Interventions  Related to causes/barriers identified through data analysis and quality improvement processes  System changes that are likely to induce permanent change  Revised if the original interventions are not successful  Standardized and monitored if interventions are successful

SIPP Implementation Stage Results (Implementation Stage)
Activity V. Valid Sampling Techniques (if sampling was used): Met 86% (6/7), Partially Met 0% (0/7), Not Met 14% (1/7)
Activity VI. Accurate/Complete Data Collection*: Met 58% (159/272), Partially Met 13% (36/272), Not Met 28% (77/272)
Activity VII. Appropriate Improvement Strategies: Met 70% (45/64), Partially Met 11% (7/64), Not Met 19% (12/64)
Implementation Total: Met 61% (210/343), Partially Met 13% (43/343), Not Met 26% (90/343)
* The activity or stage total may not equal 100 percent due to rounding.

Outcomes Stage  The final stage of the PIP process (Activities VIII through X)  Involves data analysis and the evaluation of improvement based on the reported results and statistical testing  Sustained improvement is achieved when outcomes exhibit improvement over multiple measurements

Outcomes Stage Evaluation Elements Activity VIII: Data Analysis  Are conducted according to the data analysis plan in the study design  Allow for the generalization of results to the study population if a sample was selected  Identify factors that threaten the internal or external validity of findings  Include an interpretation of findings

Outcomes Stage Evaluation Elements Activity VIII: Data Analysis  Are presented in a way that provides accurate, clear, and easily understood information  Identify the initial measurement and the remeasurement of the study indicators  Identify statistical differences between the initial measurement and the remeasurement

Outcomes Stage Evaluation Elements Activity VIII: Data Analysis  Identify factors that affect the ability to compare the initial measurement with the remeasurement  Include an interpretation of the extent to which the study was successful

Outcomes Stage Evaluation Elements Activity IX: Real Improvement  The remeasurement methodology is the same as the baseline methodology  There is documented improvement in processes or outcomes of care  The improvement appears to be the result of planned intervention(s)  There is statistical evidence that observed improvement is true improvement
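
Statistical evidence of true improvement is commonly evaluated by comparing the baseline and remeasurement rates of a study indicator; a two-proportion z-test is one widely used choice. The sketch below uses hypothetical counts and is not the specific test required by the validation methodology.

```python
import math
from statistics import NormalDist

def two_proportion_z_test(x1, n1, x2, n2):
    """Compare baseline (x1/n1) and remeasurement (x2/n2) rates.

    Returns the z statistic and two-sided p-value under the pooled-proportion
    null hypothesis of no difference between the two measurements.
    """
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical indicator: 120/400 at baseline vs 160/400 at remeasurement
z, p = two_proportion_z_test(120, 400, 160, 400)
print(round(z, 2), round(p, 4))  # improvement is statistically significant if p < 0.05
```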

Outcomes Stage Evaluation Elements Activity X: Sustained Improvement  Repeated measurements over comparable time periods demonstrate sustained improvement or that a decline in improvement is not statistically significant
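
A hedged sketch of how sustained improvement might be screened: each subsequent remeasurement is compared with the first measurement that showed improvement, and a decline only counts against sustained improvement if it is statistically significant. It reuses the two_proportion_z_test function from the Activity IX sketch above; the comparison approach and significance threshold are assumptions, not HSAG scoring rules.

```python
def sustained_improvement(first_improved, later_periods, alpha=0.05):
    """Screen later remeasurements against the first improved measurement.

    first_improved -- (numerator, denominator) for the measurement that showed improvement
    later_periods  -- list of (numerator, denominator) for subsequent comparable periods
    Returns True if no later period shows a statistically significant decline.
    Reuses two_proportion_z_test from the Activity IX sketch above.
    """
    x0, n0 = first_improved
    for x, n in later_periods:
        z, p = two_proportion_z_test(x0, n0, x, n)
        if z < 0 and p < alpha:   # rate dropped and the drop is significant
            return False
    return True

print(sustained_improvement((160, 400), [(150, 400), (155, 400)]))  # True
```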

SIPPs Outcomes Stage Results (Outcomes Stage)
Activity VIII. Sufficient Data Analysis and Interpretation*: Met 54% (63/116), Partially Met 17% (20/116), Not Met 28% (33/116)
Activity IX. Real Improvement Achieved: Met 54% (26/48), Partially Met 21% (10/48), Not Met 25% (12/48)
Activity X. Sustained Improvement Achieved: ‡
Outcomes Total*: Met 54% (89/164), Partially Met 18% (30/164), Not Met 27% (45/164)
* The activity or stage total may not equal 100 percent due to rounding.
‡ The PIPs did not progress to this phase during the review period and could not be assessed for real or sustained improvement.

SIPP Indicator Results  There were a total of 44 study indicators  22 were not assessed for improvement  15 demonstrated improvement  Of those that demonstrated improvement, 11 demonstrated statistically significant improvement

SIPP Indicator Results
SFY 2012 Performance Improvement Project Outcomes for the SIPPs (N=27 PIPs)
Columns: SIPP; Total Number of Study Indicators; Comparison to Study Indicator Results from Prior Measurement Period (Declined, Statistically Significant Decline, Improved, Statistically Significant Improvement, Not Assessed); Sustained Improvement¹
Rows: Plan A through Plan N and Overall Totals; each plan is marked ‡ for sustained improvement.
1 One or more study indicators demonstrated sustained improvement.
‡ The PIP(s) did not progress to this phase during the review period and/or required an additional measurement period; therefore, sustained improvement could not be assessed.

Common Areas for Improvement Activity I: Study Topic  No historical plan-specific data provided to support the selection of the study topic (Evaluation Element #2)

Common Areas for Improvement Activity IV: Study Population  Length of enrollment required not specified (Evaluation Element #2)

Common Areas for Improvement Activity VI: Data Collection  Timeline for data collection not provided (Evaluation Element #4)  Not all information regarding manual data collection was provided (Evaluation Elements #5-9)

Common Areas for Improvement Activity VIII: Data Analysis  Baseline data and data analysis not reported in this year’s submission (Evaluation Elements #1-5)

Recommendations  Use the PIP Summary Form Completion Instructions when documenting the PIP Summary Form  If you have questions, contact HSAG for technical assistance

HSAG Contacts  For any PIP questions or to request PIP technical assistance, contact: Christy Hormann or Jenny Montano

Questions