Validation of Results Leveraging Navy Medicine’s Analytic Resources.


1 Validation of Results Leveraging Navy Medicine’s Analytic Resources

2 Objectives
- Compare the analyst's information to real-world knowledge to assess reasonableness.
- Review an annotated query panel to confirm the analyst's understanding and implementation.
- Perform or review decompositions for reasonableness along appropriate dimensions.
- Selectively exploit the M2 Data Dictionary and the data-caveat "blasters" for their impact on validity.

3 Introduction
Critical thinking is required of all producers and recipients of information. Analysts' powerful tools are no substitute for managers' own assessment of validity.
In this block:
- Define validity
- Methods and illustrations for assessing validity
- Stewardship of a scarce resource: analysts!

4 VALIDITY
- Face validity – the measuring method seems reasonable on its face.
- Construct validity – the measure captures the right thing, or something highly correlated with it.
- External validity – the findings are likely to apply to "the bigger world".
- Reliability – repeated measurements give similar answers.
(Source: Healthcare Analytics Newsletter)

5 1. REAL WORLD KNOWLEDGE
[Diagram: the analyst's information is weighed against facts and common sense to judge its validity.]

6 ILLUSTRATION
HOW MANY ADMISSIONS LAST YEAR AT NAVY MTFs WERE RELATED TO OBSTETRICS?
ANALYST'S ANSWER: 286,727

7 ILLUSTRATION
ANALYST'S ANSWER: 286,727
MHS-WIDE, HOW MANY INPATIENTS IN A YEAR?

8 ILLUSTRATION
ANALYST'S ANSWER: 286,727
HAZARD A ROUGH GUESS FOR NAVY OB?
THE ANALYST'S ANSWER SEEMS ABOUT 7 TIMES TOO BIG!
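That back-of-the-envelope check can be written down explicitly. A minimal Python sketch follows; only the analyst's figure comes from the deck, while the MHS-wide total and the Navy and OB shares are illustrative assumptions, not real MHS data:

```python
# Sanity check: multiply rough, assumed shares and compare the result's
# order of magnitude against the analyst's answer.
mhs_inpatients_per_year = 1_000_000  # ASSUMED round MHS-wide total
navy_share = 0.25                    # ASSUMED rough Navy share of MHS workload
ob_share = 0.15                      # ASSUMED rough OB share of admissions

rough_navy_ob = mhs_inpatients_per_year * navy_share * ob_share  # ~37,500
analyst_answer = 286_727

print(f"rough guess: {rough_navy_ob:,.0f}")
print(f"analyst / guess: {analyst_answer / rough_navy_ob:.1f}x")  # ~7.6x
```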

9 2. REVIEWING AN ANNOTATED QUERY PANEL
- The analyst tells the computer what to extract via a query panel.
- M2 then converts that display into a program which retrieves the data from the M2 database.
- Management can ask the analyst to provide both the query panel and an explanation of it.
- BUT management must also check for filters applied AFTER the data were extracted! (A sketch of why follows this list.)
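A toy pandas sketch of the pitfall; the sites and counts are invented for illustration and do not come from M2:

```python
import pandas as pd

# Hypothetical extract standing in for an M2 pull. The query panel
# documents only how these rows were retrieved.
extract = pd.DataFrame({
    "enrollment_site": ["NH Jacksonville", "Branch clinic", "Other (non-Navy)"],
    "beneficiaries": [18_000, 3_500, 1_200],
})

retrieved_total = extract["beneficiaries"].sum()  # what the panel explains: 22,700

# A filter applied AFTER extraction changes the briefed number; the
# query panel alone would never reveal it.
briefed = extract.loc[extract["enrollment_site"] != "Other (non-Navy)"]
briefed_total = briefed["beneficiaries"].sum()    # what gets reported: 21,500

print(retrieved_total, briefed_total)
```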

10 ILLUSTRATION
HOW MANY ACTIVE DUTY NAVY LIVE AROUND NAVAL HOSPITAL JACKSONVILLE?
ANALYST'S ANSWER: 20,206

11 The analyst's annotations on the query panel:
- Used December, although January was available, because the most recent month is usually low.
- Included both Navy and Navy Afloat.
- Used the Relationship Summary, since all active duty are in it and details weren't needed.
- Included only the active duty.
- Residence ZIP in the Jacksonville catchment area OR enrolled to Jacksonville.

12 FILTERING THE EXTRACTED DATA?
The analyst chose to omit anyone enrolled to a non-Navy, non-contractor site that was not near Jacksonville.

13 FILTERING THE EXTRACTED DATA?
Check slice-and-dice also, since "ranking" can hide data! (See the sketch below.)
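A small pandas sketch of how a ranked view hides rows; the clinics and counts are invented:

```python
import pandas as pd

# Hypothetical counts by clinic. A "top 3" ranking view silently drops
# the remaining rows, so its total no longer matches the full extract.
counts = pd.Series({"Clinic A": 9_000, "Clinic B": 6_500, "Clinic C": 3_000,
                    "Clinic D": 1_200, "Clinic E": 800})

top3 = counts.nlargest(3)
print(top3.sum(), counts.sum())  # 18,500 vs. 20,500: 2,000 hidden by the ranking
```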

14 RELIABILITY?
The analyst chose December because the most recent month is often immature and unreliable.

15 THE INFORMATION COST TRADE-OFF
[Diagram: the trade-off among analyst effort, QC and documenting, and precision.]

16 3. DECOMPOSING RESULTS
- Big sums can hide obvious errors!
- The manager or analyst can decompose, if stratifiers were (or are) retrieved with the data.
- Just "TLAR" (that looks about right), but at a more granular level. (A sketch follows this list.)
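A minimal pandas sketch of decomposition; the stays and categories are invented, chosen only to show how one stratum can inflate a service-level mean:

```python
import pandas as pd

# Hypothetical pneumonia stays (all values invented). The service-level
# mean looks odd; decomposing by beneficiary category exposes the
# stratum driving it, which is the same TLAR check at a finer grain.
stays = pd.DataFrame({
    "service": ["AF"] * 4 + ["Navy"] * 4,
    "bencat":  ["Active Duty", "Active Duty", "Dependent", "Dependent"] * 2,
    "los":     [2, 30, 3, 2, 3, 2, 3, 3],
})

print(stays.groupby("service")["los"].mean())             # AF mean is inflated
print(stays.groupby(["service", "bencat"])["los"].mean()) # one stratum stands out
```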

17 ILLUSTRATION
HOW DO THE THREE SERVICES COMPARE ON AVERAGE LOS FOR SIMPLE PNEUMONIA?
ANALYST'S ANSWER:
Army: 2.36 days
Navy: 2.70 days
AF: 4.41 days

18 ILLUSTRATION
ANY SURPRISES IF DECOMPOSED BY GENDER?
What about Air Force males would cause unusually long LOS?

19 ILLUSTRATION
ANY SURPRISES IF DECOMPOSED BY BENCAT?
What about Air Force active duty males would cause unusually long LOS?

20 ILLUSTRATION
SURPRISES IF DECOMPOSED BY MTF (AF ONLY)?
Why would Wilford Hall (59th Med Wing) have so much longer stays than any other MTF?

21 ILLUSTRATION
DECOMPOSED BY PATIENT (WH ONLY)?

22 4. EXPLOITING THE DOCUMENTATION
M2 DATA DICTIONARY
WORLDWIDE BLASTERS

23 EXPLOITING DOCUMENTATION
AKIN TO HUNTING WHERE THE LIGHT IS, BECAUSE NOTHING CAN BE FOUND IN THE DARK.
NEITHER THE DICTIONARY NOR THE BLASTERS ARE NEAR PERFECT... BUT THEY ARE THE BEST WE HAVE.
THERE IS NO LIST ANYWHERE OF ALL THE PROBLEMS IN M2 DATA.

24 ILLUSTRATION
Both (CAPERs and SADRs) are in M2, and both report on the same events. Why do they differ by a million encounters?

25 ILLUSTRATION
The M2 Data Dictionary shows that:
- CAPERs have more procedure fields and so will produce different estimates of costs.
- SADRs are not updated on the same frequency as CAPERs, so the data are not equally fresh.
Worldwide blasters were sent cautioning against using the CAPERs, especially for costs; but subsequent blasters have said that is fixed.

26 ILLUSTRATION
Neither the Dictionary nor the blasters explain that:
- Edits on CAPERs prevent many events from being reported that are found in the SADRs.
- SADRs use an imperfect key, so there are duplicate records; these duplicates are not in CAPERs.
It is probably good to caveat any M2 answer, since no one knows the full extent and effect of validity problems in the data: "According to the data in M2..."
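A toy pandas sketch of how an imperfect key inflates counts; the rows below are invented, not real SADR records:

```python
import pandas as pd

# Hypothetical SADR-like rows where a weak key lets the same visit
# appear twice, inflating the encounter count. CAPER-style edits
# would have rejected one of the pair.
sadrs = pd.DataFrame({
    "patient_id": [101, 101, 102],
    "visit_date": ["2009-03-01", "2009-03-01", "2009-03-02"],
    "clinic":     ["Primary Care", "Primary Care", "Optometry"],
})

print(len(sadrs))                    # 3 raw rows
print(len(sadrs.drop_duplicates()))  # 2 after removing exact duplicates
```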

27 Questions?
Rich Holmes
Rich@RichardLHolmes.com

