
1 Assessment and Feedback on Clinical Processes: Questioning Results and Conclusions
Mariam “Aria” Kwamin
Health Analysis Department, Navy and Marine Corps Public Health Center
2012 Navy Medicine Audit Readiness Training Symposium

2 Disclaimer
The views expressed in this presentation are those of the author and do not necessarily reflect the official policy or position of the Department of the Navy, the Department of Defense, or the U.S. Government.

3 Learning Objectives
- Introduce NMCPHC’s Health Analysis Department
- Describe how medical informatics are analyzed through performance measurement
- Explain how results can be misinterpreted
- Describe how NMCPHC develops performance metrics

4 Health Analysis Department

5 Health Analysis: Who We Are
- Department within the Population Health Directorate of the Navy and Marine Corps Public Health Center, Portsmouth, VA
- Epidemiologists
- Program Manager
- Technical Affairs Officer
- Biostatistician
- Physician Lead
- Navy Tumor Registry Consultant

6 Health Analysis Department Mission
Provide epidemiologic expertise and leadership to improve the value of Navy healthcare through evidence-based methods and clinical health analysis.

7 Health Analysis Department Functions
- Increase implementation of evidence-based care throughout Navy Medicine
- Develop performance metrics focused on improving clinical processes
- Review policy, programs, and publications relating to metrics and the management of health services outcomes
- Develop MTF-specific tools and services to promote efficient resource allocation and services
Goal: Improve processes and promote positive health outcomes

8 Analyzing Medical Informatics Through Performance Measurement

9 Who Wants Healthcare Quality Measured?
- Purchaser of Care:
  - US Government: interested in direct vs. purchased care cost, quality of care, and clinical efficiency
- Governing Bodies:
  - BUMED: identify areas of excellence as well as opportunities for improvement
  - Determine whether there are procedures or areas that put patients, MTFs, or providers at risk
- Providers:
  - Healthcare personnel care about encounters for a given disease and how well that cohort is being managed
  - How to identify and prevent diseases based on certain conditions
  - How to use evidence-based information in their practice

10 Health Care Improvements Using Performance Measurements

11 Performance Measurements: Measure & Data Collection
Measure:
- A valid and reliable indicator that can be used to monitor and evaluate the quality of important management, clinical, and support functions that affect patient outcomes.
- In other words, how well an organization does something.
Data Collection:
- Clinical data are collected and stored by clinicians in a database.
- Data are retrieved by analysts who measure the efficiency, performance, and progress of the organization among specific cohorts.
- The type of measurement used is determined by the type of data available.
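As a minimal sketch of the data-collection-to-measure step described above, the snippet below computes a simple cohort-level performance measure: the share of a hypothetical diabetes cohort with a recent HbA1c result. The record layout, field names, cohort, and 180-day look-back window are illustrative assumptions, not an actual NMCPHC measure or MHS schema.

```python
from datetime import date, timedelta

# Hypothetical encounter records pulled from a clinical database;
# field names and values are illustrative, not an actual MHS schema.
encounters = [
    {"patient_id": "A1", "cohort": "diabetes", "hba1c_date": date(2012, 3, 1)},
    {"patient_id": "A2", "cohort": "diabetes", "hba1c_date": None},
    {"patient_id": "A3", "cohort": "diabetes", "hba1c_date": date(2011, 5, 20)},
]

def percent_with_recent_test(records, as_of, window_days=180):
    """Share of the cohort with a qualifying test inside the look-back window."""
    eligible = [r for r in records if r["cohort"] == "diabetes"]
    cutoff = as_of - timedelta(days=window_days)
    met = [r for r in eligible if r["hba1c_date"] and r["hba1c_date"] >= cutoff]
    return 100.0 * len(met) / len(eligible) if eligible else 0.0

print(f"{percent_with_recent_test(encounters, date(2012, 6, 1)):.1f}% met the measure")
```

The same pattern, a defined denominator cohort plus a numerator criterion applied to stored clinical data, underlies the measures discussed later in this presentation.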

12 Performance Measurements: Design, Assess, Improve
Design/Redesign:
- A systematic process involving appropriate scientific methodology and data availability.
- Rework a methodology so that it accurately assesses and answers questions about a program.
Assess:
- Translate available data into aggregated analyses from which conclusions can be drawn about performance and the improvement process.
Improve:
- Evaluated after the analysis phase, with recommendations on how improvements can be achieved.

13 Information Flow (figure)
- Real-time data collection / transactional databases: CHCS, AHLTA, DEERS, PDTS, CDR
- Storage and data enhancement / data marts: MDR
- End-user applications for queries and analysis: M2, former CDM, ICDB, COHORT, MHSPHP, CarePoint PHN Dashboard

14 Types of MHS Data Availability
Data type               CHCS      M2           MDR
Allergies               Yes       No
Lab                     In/Out    Financial
Radiology               In/Out    No           Financial
Appointments            Yes
Enrollment              Yes
ICD & CPT               In/Out    Out <= 10    Date of Injury
Pharmacy                Yes
Network Care            No        Yes
Readiness               Labs      No
Height/Weight Vitals    No        Yes

15 Quality of Database
- Data Availability: metric development depends on the type of data that is available.
- Data Validity: the usefulness of the measures in performance assessment and improvement.
- Data Completeness: useful in gathering retrospective data over longer time periods more quickly than in a prospective study.
- Data Reliability: data used for analysis are only as good as the accuracy of data entry.
Poor data creates poor interpretations and undermines even the most sophisticated assessment tools.
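The data-quality dimensions above lend themselves to simple automated checks before a metric is computed. The sketch below illustrates completeness and plausibility (reliability) checks on a toy record set; the fields, values, and thresholds are assumptions made for illustration only, not an NMCPHC standard.

```python
# Illustrative data-quality checks run before a record set is used for a metric;
# the fields and thresholds are assumptions, not an actual NMCPHC standard.
records = [
    {"id": 1, "icd_code": "850.0", "encounter_date": "2012-01-04", "weight_kg": 81},
    {"id": 2, "icd_code": None,    "encounter_date": "2012-01-09", "weight_kg": 640},
    {"id": 3, "icd_code": "V70.5", "encounter_date": None,         "weight_kg": 75},
]

def completeness(rows, field):
    """Fraction of rows with a non-missing value for the field."""
    return sum(1 for r in rows if r.get(field) is not None) / len(rows)

def implausible_weights(rows, low=20, high=300):
    """Flag values outside a plausible range as likely data-entry errors."""
    return [r["id"] for r in rows if not (low <= r["weight_kg"] <= high)]

print("ICD code completeness:", completeness(records, "icd_code"))            # 2 of 3 present
print("Encounter date completeness:", completeness(records, "encounter_date"))  # 2 of 3 present
print("Suspect weight entries:", implausible_weights(records))                 # record 2 flagged
```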

16 Misinterpretation of Results
Quality of Data:
- Chance variability may falsely identify outliers for praise or blame.
- Changes in data recording over time can make reports show apparent improvement or deterioration.
- Errors in the transfer process from one data system to another.
- Poor data handling procedures and processes can be very costly to organizations.
Source: Modified from Quality improvement research: understanding the science of change in healthcare
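To make the chance-variability point concrete, the sketch below contrasts two hypothetical clinics: the small clinic's observed rate carries a confidence margin wide enough that labeling it an outlier would be premature. The numbers and the normal-approximation interval are illustrative, not drawn from Navy data or from the cited source.

```python
import math

# Illustrative only: two clinics with similar underlying quality can show
# very different observed rates when denominators are small.
def rate_with_margin(events, denominator, z=1.96):
    """Observed rate and an approximate 95% margin (normal approximation)."""
    p = events / denominator
    margin = z * math.sqrt(p * (1 - p) / denominator)
    return p, margin

small_clinic = rate_with_margin(events=3, denominator=20)    # 15.0% +/- ~15.6 points
large_clinic = rate_with_margin(events=45, denominator=600)  # 7.5% +/- ~2.1 points

for name, (p, m) in [("small clinic", small_clinic), ("large clinic", large_clinic)]:
    print(f"{name}: {p:.1%} (+/- {m:.1%})")
```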

17 The Impact of Misinterpreting Results
Potential to negatively impact an MTF, region, or Service:
- Monetary:
  - Decrease in available budget
  - Inaccurate demand forecasting for business decisions
- Productivity:
  - Under- or overestimation of the frequency with which a procedure or service was delivered (Quality of Care)
  - Inaccurate staff availability/utilization estimates (Access to Care)
- Surveillance:
  - Under- or overestimation of disease/condition prevalence

18 NMCPHC Developing Performance Metrics

19 Quality of a Performance Metric: The SMART Test
- S = Specific: clear and focused to avoid misinterpretation
- M = Measurable: can be quantified and compared to other data
- A = Attainable: achievable, reasonable, and credible under expected conditions
- R = Realistic: fits within NAVMED constraints
- T = Timely: doable within the time frame given
Source: Modified from Developing Performance Metrics – University of California Approach

20 Classification of Performance Metrics
Source: Modified from Developing Performance Metrics – University of California Approach

21 NMCPHC Example #1: Evaluation of a WII Program

22 Project Comprehensive Aesthetic Restorative Effort (C.A.R.E.) process flow chart (NMW and NME): flow chart covering NMC San Diego, Walter Reed National Military Medical Center, and NMC Portsmouth

23 Project C.A.R.E. Decision Tree

24 How to Identify Project C.A.R.E. Participants
- MEPRS code ELA2 was the proposed code to capture Project C.A.R.E. participants.
- ELA2 is a MEPRS code that case managers use when they consult with their wounded warrior patients.
- According to the DoD coding guidance for case management services, ELA2 is designated for Global War on Terrorism (GWOT) funded warriors in transition.
- However, there is an issue with using this MEPRS code as the designated code to identify Project C.A.R.E. participants. Why? (See the sketch below.)
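A hypothetical illustration of the issue: because ELA2 covers case-management encounters for GWOT-funded warriors in transition generally, filtering appointments on that MEPRS code alone would also capture wounded warriors who are not Project C.A.R.E. participants. The record layout and flags below are invented for the example.

```python
# Hypothetical appointment records; the layout is illustrative, not an actual
# MHS extract. ELA2 marks case-management encounters for GWOT-funded warriors
# in transition, so filtering on it alone over-captures Project C.A.R.E.
appointments = [
    {"patient_id": "P1", "meprs": "ELA2", "project_care": True},
    {"patient_id": "P2", "meprs": "ELA2", "project_care": False},  # WII, but not C.A.R.E.
    {"patient_id": "P3", "meprs": "BGA1", "project_care": False},
]

ela2_cohort = {a["patient_id"] for a in appointments if a["meprs"] == "ELA2"}
true_care_cohort = {a["patient_id"] for a in appointments if a["project_care"]}

print("Captured by ELA2 filter:", sorted(ela2_cohort))           # ['P1', 'P2']
print("Actual Project C.A.R.E.:", sorted(true_care_cohort))      # ['P1']
print("Over-captured:", sorted(ela2_cohort - true_care_cohort))  # ['P2']
```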

25 Project C.A.R.E. process flow chart (NMW and NME): flow chart covering NMC San Diego, Walter Reed National Military Medical Center, and NMC Portsmouth

26 Project C.A.R.E. Recommendations
- Surveillance:
  - How to track Project C.A.R.E. patients
- Measure the process of Project C.A.R.E.:
  - A designated code for Project C.A.R.E. patients
- Measure the outcome of Project C.A.R.E.:
  - Derriford Appearance Scale (DAS24)

27 NMCPHC Example #2: Traumatic Brain Injury Metrics

28 mTBI Metrics
Wounded, Ill, and Injured (WII) Program: a Navy Medicine effort to monitor and improve the care offered to wounded, ill, and injured service members and their families.
mTBI Metrics:
- TBI Screening: percent of coded head injury/trauma patients coded as screened for TBI
- Co-Occurring Conditions Screen: percent of coded mTBI patients coded as screened for co-occurring conditions
- Six-Week Follow-Up Visit: percent of coded mTBI patients with follow-up within six weeks
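The sketch below shows, on invented records, how a metric of the form "percent of coded head injury/trauma patients coded as screened for TBI" reduces to a numerator over a denominator; the patient flags are placeholders rather than actual coding logic or guidance.

```python
# Illustrative computation of a screening metric of the form
# "percent of coded head-injury patients also coded as screened for TBI".
# Patient records and the screening flag are placeholders, not real coding guidance.
patients = [
    {"id": "S1", "head_injury_coded": True,  "tbi_screen_coded": True},
    {"id": "S2", "head_injury_coded": True,  "tbi_screen_coded": False},
    {"id": "S3", "head_injury_coded": False, "tbi_screen_coded": False},
]

def screening_rate(rows):
    denominator = [r for r in rows if r["head_injury_coded"]]
    numerator = [r for r in denominator if r["tbi_screen_coded"]]
    return 100.0 * len(numerator) / len(denominator) if denominator else 0.0

print(f"TBI screening rate: {screening_rate(patients):.0f}%")  # 50%
```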

29 mTBI Metrics (cont’d)
- Metric Definitions:
  - The Department of Veterans Affairs (VA) and Department of Defense (DoD) Concussion and mTBI Clinical Practice Guideline (CPG)
- Coding Guidance:
  - Defense Centers of Excellence for Psychological Health and Traumatic Brain Injury (DCoE PH/TBI)
  - Navy Medicine TBI subject matter experts (SMEs)

30 mTBI Metrics (cont’d): General Coding Guidance & CPG Recommendation

31 mTBI Metrics (cont’d)
CPG Recommendation: “Regardless of the time that has elapsed since injury, management should begin with the patient’s first presentation for treatment,” and head injury cases should be screened for TBI.
- All Head Injury Cases: 5%
- Active Duty Only: 7%
Not using codes? Not screening for TBI? Actual process different than the recommendation?

32 Conclusion
- Introduced NMCPHC’s Health Analysis Department
- Described how medical informatics are analyzed through performance measurement
- Explained how results can be misinterpreted
- Described how NMCPHC develops performance metrics

33 Questions

