Encounter Data Validation: Review and Project Update
August 25, 2015

Presenters:
Amy Kearney, BA, Director, Research and Analysis Team
Thomas Miller, MA, Executive Director, Research and Analysis Team
Welcome
–About the presenters
–Rules for engagement
–Presentation overview
–Understanding Encounter Data Validation (EDV) studies
–SFY EDV Results and Recommendations
–Encounter data quality improvement activities
Meeting Objectives
1. To understand the structure and purpose of Encounter Data Validation studies
2. To review the SFY EDV results and recommendations
3. To review ongoing efforts to improve encounter data quality
What’s an EDV?
Encounter Data Validation
–An optional External Quality Review (EQR) activity
–Assesses the completeness, timeliness, and accuracy of encounter data submitted to a state by its managed care organizations (MCOs)
Importance of EDV
State Medicaid agencies rely on the quality of encounter data submissions to:
–Accurately and effectively monitor and improve their programs’ quality of care
–Establish appropriate performance measures and acceptable rates of performance
–Generate accurate and complete reports
–Obtain complete and accurate utilization information
AHCA’s Annual EDV Studies
EDV
–Information Systems Review
–AHCA Encounter Data File Review
–Medical Record Review (MRR)
EDV
–Encounter Data File Review
–Comparative Analysis
–MRR
EDV
–Study design in progress
SFY EDV: Study Design
Objectives
–Determine the extent to which encounters in Florida’s Medicaid Management Information System (FMMIS) are complete and accurate when compared to plans’ data
–Determine the completeness and accuracy of plans’ encounter data stored in FMMIS through MRR
SFY EDV: Study Design
Evaluation Components
–Encounter data file review
–Comparative analysis
–MRR
Dates of Service
–January 1, 2013 – March 31, 2014
Examined three encounter types
–Professional
–Dental
–Institutional
SFY EDV: Study Design
Encounter Data File Review
–Examined the extent to which data submitted by AHCA and the plans were reasonable and complete
–Unique encounters were identified using a combination of plan, recipient ID, provider identification number, and date of service
A unique control number was not utilized across plans and AHCA
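The composite-key matching described on this slide can be sketched as follows. This is an illustration only, assuming simple dictionary records; the field names are hypothetical, since the actual FMMIS and plan file layouts are not shown here.

```python
# Sketch: in the absence of a shared unique control number, identify a unique
# encounter by the combination of plan, recipient ID, provider ID, and date
# of service. Field names are illustrative, not the actual file layout.

def encounter_key(record):
    """Build the composite key that identifies a unique encounter."""
    return (record["plan"], record["recipient_id"],
            record["provider_id"], record["date_of_service"])

def unique_encounters(records):
    """Return one record per composite key (first occurrence wins)."""
    seen = {}
    for rec in records:
        seen.setdefault(encounter_key(rec), rec)
    return list(seen.values())
```

Because the key omits a control number, two genuinely distinct same-day encounters with the same plan, recipient, and provider would collapse into one — one reason a shared unique control number would strengthen matching.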
SFY EDV: Study Design
Encounter Data File Review, continued
–Key measures
Volume of submitted encounters over time
Percent of encounter data fields with a value present
Percent of encounter data fields with valid values
–Anomalies associated with data extraction and submission were documented
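The two field-level measures named above — percent with a value present, and percent with a valid value — can be sketched like this, assuming list-of-dictionary records with hypothetical field names:

```python
# Illustrative field-level completeness and validity measures.

def field_completeness(records, field):
    """Percent of records in which `field` is populated (non-empty)."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return 100.0 * filled / len(records)

def field_validity(records, field, valid_values):
    """Percent of records in which `field` holds a value from `valid_values`."""
    valid = sum(1 for r in records if r.get(field) in valid_values)
    return 100.0 * valid / len(records)
```

Note the distinction the study draws: a field can be present but still invalid (e.g., a malformed procedure code), so the two rates are computed separately.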
SFY EDV: Study Design
Comparative Analysis
–Based on encounter data records present in AHCA’s and plans’ encounter data:
Element Omission – Were data elements present in plans’ files but not present in AHCA’s files?
Element Surplus – Were data elements present in AHCA’s files but not present in plans’ files?
Element Agreement – For data elements present in both sources, did the values match?
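The three element-level comparisons above can be sketched as a single classification per matched encounter pair. This is a minimal sketch, assuming each element value arrives as a string (or None when absent):

```python
# For an encounter matched between a plan's file and AHCA's file, classify
# one data element as omission (plan has it, AHCA does not), surplus (AHCA
# has it, plan does not), or agree/disagree when both sources have a value.

def compare_element(plan_value, ahca_value):
    """Classify one data element for one matched encounter pair."""
    plan_present = plan_value not in (None, "")
    ahca_present = ahca_value not in (None, "")
    if plan_present and not ahca_present:
        return "omission"
    if ahca_present and not plan_present:
        return "surplus"
    if plan_present and ahca_present:
        return "agree" if plan_value == ahca_value else "disagree"
    return "both_missing"
```

Tallying these classifications over all matched encounters yields the per-element omission, surplus, and agreement rates reported in the results slides.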
SFY EDV: Study Design
Comparative Analysis, continued
–Three Key Steps:
1. Develop data submission requirements
2. Conduct file review
3. Conduct comparative analysis of encounter data
SFY EDV: Study Design
MRR
–Assessed whether key data elements in AHCA’s data were complete and accurate when compared to medical records
–Four Key Steps:
1. Identification of eligible population and generation of MRR samples
2. Medical record procurement
3. Medical record abstraction
4. Analysis of abstracted data
SFY EDV: Study Design
MRR, continued
Assessed whether key data elements were complete and accurate

Key Data Elements for Medical Record Review
Key Data Fields                               Professional  Dental  Institutional
Date of Service                                    √           √         √
Diagnosis Code                                     √                     √
CPT/CDT/HCPCS Code/Surgical Procedure Code         √           √         √
Procedure Code Modifier                            √           √         √
SFY EDV: Study Design
MRR, continued
–Four study indicators used to report results:
Medical Record Omission
Encounter Data Omission
Coding Accuracy
Overall Accuracy
Meeting Objectives
1. To understand the structure and purpose of Encounter Data Validation studies
2. To review the SFY EDV results and recommendations
3. To review ongoing efforts to improve encounter data quality
EDV Results: Encounter Data File Review
Figure 1—Monthly Variations in Professional Encounters for Plans and AHCA

EDV Results: Encounter Data File Review
Figure 2—Monthly Variations in Dental Encounters for Plans and AHCA

EDV Results: Encounter Data File Review
Figure 3—Monthly Variations in Institutional Encounters for Plans and AHCA
EDV Results: Encounter Data File Review
–Variation was present in the overall and month-to-month submission of encounters by type and source; the greatest variation was noted with institutional encounters
–Required data elements (e.g., Recipient ID, Procedure Code, and Primary Diagnosis) were consistently complete and contained reasonable values
EDV Results: Encounter Data File Review
Recommendations
–Investigate differences identified in monthly encounter data volume and reconcile where appropriate
–Review activities should focus on determining whether differences are due to failed or incomplete submissions or processing parameters
EDV Results: Comparative Analysis
Record Completeness
–Findings
Record omission and surplus rates varied considerably across plans
–Dental encounters were the most complete (11.9 percent omission rate and 30.0 percent surplus rate)
–Institutional encounters were the least complete (84.7 percent omission rate and 41.1 percent surplus rate)
–Recommendations
Review and update encounter data submission standards to ensure they meet current reporting requirements
Conduct root cause analyses related to low-performing encounter types and elements
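At the record level, the omission and surplus rates quoted above follow from simple set differences over the encounter keys. A minimal sketch, assuming each source is reduced to a collection of composite encounter keys:

```python
# Record-level completeness: an encounter in the plan's file with no AHCA
# counterpart counts toward the omission rate; an AHCA encounter with no
# plan counterpart counts toward the surplus rate.

def record_rates(plan_keys, ahca_keys):
    """Return (omission_pct, surplus_pct) for two key collections."""
    plan_keys, ahca_keys = set(plan_keys), set(ahca_keys)
    omission = 100.0 * len(plan_keys - ahca_keys) / len(plan_keys)
    surplus = 100.0 * len(ahca_keys - plan_keys) / len(ahca_keys)
    return omission, surplus
```

Note the two rates use different denominators (plan records for omission, AHCA records for surplus), which is why they need not sum to anything meaningful.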
EDV Results: Comparative Analysis
Encounter Data Element Completeness
–Overall, encounter data elements exhibited a high level of completeness across all encounter types
–Provider-related encounter data elements were most frequently associated with incomplete data
Referring Provider NPI—professional omission and surplus rates > 10%
Billing Provider NPI—dental omission rate = 13.6%
Rendering Provider NPI—dental surplus rate = 17.5%
Attending and Referring Provider—institutional surplus rate > 75%
–Encounter data element omission and surplus rates varied by field and plan
EDV Results: Comparative Analysis
Encounter Data Element Completeness, continued
–Dental encounters
Omission rates showed less variation across plans than other encounter types
Encounter data element surplus rate differences were greatest for Line Date of Service, Billing Provider NPI, and Rendering Provider NPI
–Institutional encounters
Omission and surplus rate variation was mixed
Omission rates for nearly half of the evaluated elements exhibited minimal variation across plans
EDV Results: Comparative Analysis
Encounter Data Element Completeness, continued
–Recommendations
Review State and plan processes for submitting, tracking, and storing provider information
Continue collaborative activities focused on exploring reasons for incomplete data submissions and developing improvement strategies
EDV Results: Comparative Analysis
Encounter Data Element Agreement
–Professional encounters
Agreement rates were high for key data elements, with Procedure Code, NDC, and Primary Diagnosis Code exhibiting agreement rates of at least 90%
–Dental encounters
High levels of agreement for key data elements, with the exception of Dental Procedure Code
–Institutional encounters
High levels of agreement for one-third of key data elements
EDV Results: Comparative Analysis
Encounter Data Element Agreement, continued
–Recommendations
AHCA should continue working collaboratively with key stakeholders to develop and implement a monitoring strategy to routinely examine claims and encounter volume
AHCA and the plans should regularly review existing contracts and encounter data documentation to ensure clear expectations surrounding the collection and submission of data
EDV Results: MRR
Medical Record Submission
–1,234 sample cases requested for MRR
981 records submitted by plans
192 records where the provider refused to submit
59 records where the provider was unable to locate the record
2 records submitted for the incorrect patient
EDV Results: MRR
Encounter Data Completeness
–AHCA’s encounter data were only moderately supported by enrollees’ medical records (medical record omission)
–Plans’ encounter data were only moderately supported by AHCA’s encounter data (encounter data omission)
EDV Results: MRR
Encounter Data Element Accuracy
–Overall, individual encounter data element accuracy was high
Diagnosis codes (95.4 percent)
Procedure codes (82.3 percent)
Procedure code modifiers (99.3 percent)
–Only one-third of encounters accurately represented all three elements relative to the medical record
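The gap between high per-element accuracy and the much lower one-third overall figure falls out of the joint requirement: an encounter counts as fully accurate only if all three elements match the medical record at once. A minimal sketch of that distinction, assuming per-encounter match flags:

```python
# Overall accuracy requires diagnosis, procedure, and modifier to ALL match
# the medical record for the same encounter, so it is at most the minimum
# of the three individual rates and usually lower.

def overall_accuracy(results):
    """`results`: list of (dx_ok, proc_ok, mod_ok) booleans per encounter."""
    full = sum(1 for dx, proc, mod in results if dx and proc and mod)
    return 100.0 * full / len(results)
```

For example, three encounters that each miss a different single element give individual accuracy of roughly 67 percent per element but 0 percent overall.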
EDV Results: MRR
Recommendations
–AHCA should continue working collaboratively with key stakeholders to develop and implement a monitoring strategy that audits provider encounter submissions for completeness and accuracy
Questions?
Meeting Objectives
1. To understand the structure and purpose of Encounter Data Validation studies
2. To review the SFY EDV results and recommendations
3. To review ongoing efforts to improve encounter data quality
Improving Encounter Data Quality
AHCA explored reasons for incomplete encounter data submissions from plans and began developing strategies to improve rates.
–Developed an encounter data support process
–Worked to resolve encounter data submission issues and improve timeliness and accuracy
Dedicated account
On-site plan visits
Webinars
Conference calls
Improving Encounter Data Quality
AHCA ensured there was a reliable process for timely submission of data from plans.
–Implemented timeliness reports that are provided to plan managers
–Plan managers work with the plans to ensure contract timeliness compliance
Improving Encounter Data Quality
AHCA organized a webinar explaining the comprehensive list of operational edits associated with error categories identified in the feedback/response files.
–HP created encounter data reports listing the operational edits the plans are receiving on encounter files
–Worked with plans by phone, webinar, and on-site visits to provide information regarding errors and feedback on resolution
Questions?
SFY EDV Study: Next Steps
–Study Design – July/August
–Data Request – September
–Data Collection – October
–Conduct Analysis
Encounter Data File Review – November/December
Comparative Analysis – December–March
Medical Record Review – January–May
–Reporting – May/June
Contact Information
Amy Kearney, Director, Research & Analysis Team
Tom Miller, Executive Director, Research & Analysis Team
Mary Wiley, Director, State & Corporate Services
Thank you! Please take a moment to complete the webinar survey that will pop up when the meeting is complete. We value and appreciate your feedback!