
1 Onsite Quarterly Meeting: SIPP PIPs
June 13, 2012
Presenter: Christy Hormann, LMSW, CPHQ, Project Leader, PIP Team

2 Overview of Presentation
• Progression of SIPP PIPs
• SFY 2012 validation results
• Areas for improvement

3 SFY 2011 SIPP PIPs
• First year the SIPPs completed the PIP process
• Each SIPP was required to submit one PIP
• A total of 14 PIPs were submitted

4 SFY 2011 SIPP PIPs (cont.)
• One PIP was completed through Activity VI
• 13 were completed through Activity VIII
• Initial scores were lower due to a lack of proper documentation

5 SFY 2012 SIPP PIPs
• For SFY 2012, the SIPPs were required to submit the collaborative PIP and a PIP topic of their choosing
• Individual SIPP topics included increasing family participation in treatment, minimizing weight gain during treatment, and reducing readmissions

6 SFY 2012 SIPP PIPs (cont.)
• A total of 27 PIPs were submitted for validation
• Six PIPs were assessed through Activity VI
• Four PIPs were assessed through Activity VII
• Five PIPs were assessed through Activity VIII

7 SFY 2012 SIPP PIPs (cont.)
• Twelve PIPs were assessed through Activity IX
• None of the PIPs were assessed for Activity X (Sustained Improvement)

8 PIP Stages

9 Study Design Stage
• Establishes the methodological framework for the PIP
• Includes development of the study topic, question, indicators, and population (Activities I through IV)
• A strong study design is necessary for the successful progression of a PIP

10 Study Design Stage Evaluation Elements
Activity I: Study Topic
• Reflects high-volume or high-risk conditions
• Is selected following collection and analysis of data

11 Study Design Stage Evaluation Elements
Activity I: Study Topic
• Addresses a broad spectrum of care and services
• Includes all eligible populations that meet the study criteria
• Does not exclude members with special health care needs
• Has the potential to affect member health, functional status, or satisfaction

12 Study Design Stage Evaluation Elements
Activity II: Study Question
• States the problem to be studied in simple terms
• Is answerable

13 Study Design Stage Evaluation Elements
Activity III: Study Indicators
• Are well-defined, objective, and measurable
• Are based on current, evidence-based practice guidelines, pertinent peer-reviewed literature, or consensus expert panels
• Allow for the study question to be answered

14 Study Design Stage Evaluation Elements
Activity III: Study Indicators
• Measure changes (outcomes) in health or functional status, member satisfaction, or valid process alternatives
• Have available data that can be collected on each indicator

15 Study Design Stage Evaluation Elements
Activity III: Study Indicators
• Are nationally recognized measures, such as HEDIS technical specifications, when appropriate
• Include the basis on which the indicator(s) was adopted, if internally developed

16 Study Design Stage Evaluation Elements
Activity IV: Study Population
• Is accurately and completely defined
• Includes requirements for the length of a member's enrollment in the MCO
• Captures all members to whom the study question applies

17 SIPP Design Stage Results

Study Stage: Design
Activity | Met | Partially Met | Not Met
I. Appropriate Study Topic* | 87% (139/160) | 3% (4/160) | 11% (17/160)
II. Clearly Defined, Answerable Study Question(s) | 74% (40/54) | 11% (6/54) | 15% (8/54)
III. Clearly Defined Study Indicator(s)* | 68% (86/127) | 9% (11/127) | 24% (30/127)
IV. Correctly Identified Study Population | 75% (48/64) | 14% (9/64) | 11% (7/64)
Design Total* | 77% (313/405) | 7% (30/405) | 15% (62/405)

* The activity or stage total may not equal 100 percent due to rounding.
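The percentages above are simple proportions of all applicable evaluation elements, each rounded independently to the nearest whole percent, which is why a row can total 99 or 101 percent. A minimal sketch of the arithmetic, using the Design Total counts from the table:

```python
# Recompute the Design Total row from its raw counts.
# Each percentage is rounded independently, so the row need not sum to 100.
met, partially_met, not_met = 313, 30, 62
total = met + partially_met + not_met  # 405 applicable evaluation elements

for label, count in [("Met", met), ("Partially Met", partially_met), ("Not Met", not_met)]:
    print(f"{label}: {count / total:.0%} ({count}/{total})")

# Prints 77%, 7%, and 15%, which sum to 99 percent --
# a rounding artifact, not a scoring error.
```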

18 Study Implementation Stage
• Includes sampling, data collection, and interventions (Activities V through VII)
• During this stage, MCOs collect data, evaluate and identify barriers to performance, and develop interventions targeted to improve outcomes
• The implementation of effective improvement strategies is necessary to improve PIP outcomes

19 Study Implementation Stage Evaluation Elements
Activity V: Sampling
• Consider and specify the true or estimated frequency of occurrence
• Identify the sample size
• Specify the confidence level
• Specify the acceptable margin of error

20 Study Implementation Stage Evaluation Elements
Activity V: Sampling
• Ensure a representative sample of the eligible population
• Are in accordance with generally accepted principles of research design and statistical analysis
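The Activity V elements map directly onto the standard sample-size formula for estimating a proportion: the estimated frequency of occurrence, the confidence level (via its z-score), and the acceptable margin of error together determine the minimum sample size. A minimal sketch, assuming a 95 percent confidence level and a 5 percent margin of error; the example numbers are hypothetical, not drawn from any SIPP:

```python
import math

def sample_size(p_est, margin, z=1.96, population=None):
    """Minimum sample size for estimating a proportion.

    p_est      -- true or estimated frequency of occurrence
    margin     -- acceptable margin of error, e.g. 0.05
    z          -- z-score for the confidence level (1.96 for 95%)
    population -- eligible population size; if given, apply the finite
                  population correction so small populations are not oversampled
    """
    n = (z ** 2) * p_est * (1 - p_est) / margin ** 2
    if population is not None:
        n = n / (1 + (n - 1) / population)  # finite population correction
    return math.ceil(n)

# Hypothetical: a 30% estimated event rate in an eligible population of 600 members
print(sample_size(0.30, 0.05, population=600))  # -> 211
```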

21 Study Implementation Stage Evaluation Elements
Activity VI: Data Collection
• The identification of data elements to be collected
• The identification of specified sources of data
• A defined and systematic process for collecting baseline and remeasurement data
• A timeline for the collection of baseline and remeasurement data

22 Study Implementation Stage Evaluation Elements
Activity VI: Data Collection
• Qualified staff and personnel to abstract manual data
• A manual data collection tool that ensures consistent and accurate collection of data according to indicator specifications
• A manual data collection tool that supports interrater reliability
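The elements above do not prescribe how to show that a manual tool "supports interrater reliability," but a common approach is to have two abstractors independently score an overlap sample and compute an agreement statistic such as Cohen's kappa. A sketch under that assumption; the ratings below are invented for illustration:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters, corrected for chance.

    1.0 is perfect agreement; 0 is agreement no better than chance.
    """
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of the raters' marginal proportions per category.
    expected = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n)
        for c in set(rater_a) | set(rater_b)
    )
    return (observed - expected) / (1 - expected)

# Two abstractors independently code ten overlap records as met/not met.
a = ["met", "met", "not", "met", "not", "met", "met", "not", "met", "met"]
b = ["met", "met", "not", "met", "met", "met", "met", "not", "not", "met"]
print(round(cohens_kappa(a, b), 2))  # -> 0.52
```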

23 Study Implementation Stage Evaluation Elements
Activity VI: Data Collection
• Clear and concise written instructions for completing the manual data collection tool
• An overview of the study in written instructions

24 Study Implementation Stage Evaluation Elements
Activity VI: Data Collection
• Administrative data collection algorithms/flow charts that show activities in the production of indicators
• An estimated degree of administrative data completeness

25 Study Implementation Stage Evaluation Elements
Activity VII: Interventions
• Related to causes/barriers identified through data analysis and quality improvement processes
• System changes that are likely to induce permanent change
• Revised if the original interventions are not successful
• Standardized and monitored if interventions are successful

26 SIPP Implementation Stage Results

Study Stage: Implementation
Activity | Met | Partially Met | Not Met
V. Valid Sampling Techniques (if sampling was used) | 86% (6/7) | 0% (0/7) | 14% (1/7)
VI. Accurate/Complete Data Collection* | 58% (159/272) | 13% (36/272) | 28% (77/272)
VII. Appropriate Improvement Strategies | 70% (45/64) | 11% (7/64) | 19% (12/64)
Implementation Total | 61% (210/343) | 13% (43/343) | 26% (90/343)

* The activity or stage total may not equal 100 percent due to rounding.

27 Outcomes Stage
• The final stage of the PIP process (Activities VIII through X)
• Involves data analysis and the evaluation of improvement based on the reported results and statistical testing
• Sustained improvement is achieved when outcomes exhibit improvement over multiple measurements

28 Outcomes Stage Evaluation Elements
Activity VIII: Data Analysis
• Is conducted according to the data analysis plan in the study design
• Allows for the generalization of results to the study population if a sample was selected
• Identifies factors that threaten the internal or external validity of findings
• Includes an interpretation of findings

29 Outcomes Stage Evaluation Elements
Activity VIII: Data Analysis
• Is presented in a way that provides accurate, clear, and easily understood information
• Identifies the initial measurement and the remeasurement of the study indicators
• Identifies statistical differences between the initial measurement and the remeasurement

30 Outcomes Stage Evaluation Elements
Activity VIII: Data Analysis
• Identifies factors that affect the ability to compare the initial measurement with the remeasurement
• Includes an interpretation of the extent to which the study was successful
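Activity VIII asks plans to identify statistical differences between the initial measurement and the remeasurement. The presentation does not name a specific test, but for rate-based indicators a two-proportion z-test is a standard choice; the sketch below uses hypothetical numbers:

```python
import math

def two_proportion_z_test(hits1, n1, hits2, n2):
    """Two-sided z-test comparing a baseline rate with a remeasurement rate.

    Returns (z, p_value); a p-value below 0.05 is conventionally read as a
    statistically significant change rather than random variation.
    """
    p1, p2 = hits1 / n1, hits2 / n2
    pooled = (hits1 + hits2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Hypothetical indicator: 110/200 members met the indicator at baseline,
# 146/200 at remeasurement.
z, p = two_proportion_z_test(110, 200, 146, 200)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 -> statistically significant
```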

31 Outcomes Stage Evaluation Elements
Activity IX: Real Improvement
• The remeasurement methodology is the same as the baseline methodology
• There is documented improvement in processes or outcomes of care
• The improvement appears to be the result of planned intervention(s)
• There is statistical evidence that observed improvement is true improvement

32 Outcomes Stage Evaluation Elements
Activity X: Sustained Improvement
• Repeated measurements over comparable time periods demonstrate sustained improvement or that a decline in improvement is not statistically significant
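Activity X turns that comparison into a decision rule across repeated measurement periods: improvement is sustained if later measurements hold the gain, or if any decline is not statistically significant. A sketch of that rule, reusing two_proportion_z_test from the previous example; the data and the 0.05 threshold are illustrative:

```python
def sustained_improvement(measurements, alpha=0.05):
    """measurements[0] is the (hits, n) remeasurement where real improvement
    was demonstrated; the rest are repeated measurements over comparable
    time periods. Requires two_proportion_z_test from the sketch above."""
    base_hits, base_n = measurements[0]
    base_rate = base_hits / base_n
    for hits, n in measurements[1:]:
        if hits / n < base_rate:
            _, p = two_proportion_z_test(base_hits, base_n, hits, n)
            if p < alpha:  # a statistically significant decline breaks sustainment
                return False
    return True

# Hypothetical: improvement shown at 146/200; later periods score 140/200
# and 142/200 -- small, non-significant dips, so improvement is sustained.
print(sustained_improvement([(146, 200), (140, 200), (142, 200)]))  # -> True
```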

33 SIPPs Outcomes Stage Results

Study Stage: Outcomes
Activity | Met | Partially Met | Not Met
VIII. Sufficient Data Analysis and Interpretation* | 54% (63/116) | 17% (20/116) | 28% (33/116)
IX. Real Improvement Achieved | 54% (26/48) | 21% (10/48) | 25% (12/48)
X. Sustained Improvement Achieved | ‡ | ‡ | ‡
Outcomes Total* | 54% (89/164) | 18% (30/164) | 27% (45/164)

* The activity or stage total may not equal 100 percent due to rounding.
‡ The PIPs did not progress to this phase during the review period and could not be assessed for real or sustained improvement.

34 SIPP Indicator Results
• There were a total of 44 study indicators
• 22 were not assessed for improvement
• 15 demonstrated improvement
• Of those that demonstrated improvement, 11 demonstrated statistically significant improvement

35 SIPP Indicator Results

SFY 2012 Performance Improvement Project Outcomes for the SIPPs (N = 27 PIPs)
(The middle columns compare study indicator results to the prior measurement period.)

SIPP | Total Study Indicators | Declined | Statistically Significant Decline | Improved | Statistically Significant Improvement | Not Assessed | Sustained Improvement¹
Plan A | 3 | 0 | 0 | 0 | 2 | 1 | ‡
Plan B | 3 | 0 | 0 | 0 | 0 | 3 | ‡
Plan C | 3 | 0 | 1 | 0 | 1 | 1 | ‡
Plan D | 3 | 0 | 0 | 2 | 0 | 1 | ‡
Plan E | 3 | 1 | 1 | 0 | 0 | 1 | ‡
Plan F | 2 | 0 | 0 | 0 | 0 | 2 | ‡
Plan G | 3 | 0 | 0 | 0 | 2 | 1 | ‡
Plan H | 3 | 0 | 0 | 0 | 0 | 3 | ‡
Plan I | 3 | 0 | 0 | 0 | 2 | 1 | ‡
Plan J | 3 | 0 | 1 | 0 | 1 | 1 | ‡
Plan K | 3 | 0 | 0 | 1 | 1 | 1 | ‡
Plan L | 3 | 1 | 0 | 1 | 0 | 1 | ‡
Plan M | 4 | 0 | 2 | 0 | 0 | 2 | ‡
Plan N | 5 | 0 | 0 | 0 | 2 | 3 | ‡
Overall Totals | 44 | 2 | 5 | 4 | 11 | 22 | ‡

¹ One or more study indicators demonstrated sustained improvement.
‡ The PIP(s) did not progress to this phase during the review period and/or required an additional measurement period; therefore, sustained improvement could not be assessed.

36 Common Areas for Improvement
Activity I: Study Topic
• No historical plan-specific data provided to support the selection of the study topic (Evaluation Element #2)

37 Common Areas for Improvement
Activity IV: Study Population
• Required length of enrollment not specified (Evaluation Element #2)

38 Common Areas for Improvement
Activity VI: Data Collection
• Timeline for data collection not provided (Evaluation Element #4)
• Not all information regarding manual data collection was provided (Evaluation Elements #5-9)

39 Common Areas for Improvement
Activity VIII: Data Analysis
• Baseline data and data analysis not reported in this year's submission (Evaluation Elements #1-5)

40 Recommendations
• Use the PIP Summary Form Completion Instructions when documenting the PIP Summary Form
• If you have questions, contact HSAG for technical assistance

41 HSAG Contacts
For any PIP questions or to request PIP technical assistance, contact:
Christy Hormann, chormann@hsag.com, 602-801-6836
Jenny Montano, jmontano@hsag.com, 602-801-6851

42 Questions

