An Assessment of DMSS Immunization Records 1998-2004
National Immunization Conference 2006 Daniel Payne, PhD, MSPH March 7, 2006
Objective of this data assessment
To estimate the agreement between chart-abstracted AVA vaccination data and electronic DMSS vaccination data from 1998 to 2004
Why is this data assessment important?
"Analysis of DMSS data should be the primary approach for investigation of possible AVA-related health effects of medical significance that occur within the typical period of active duty following vaccination…" (Institute of Medicine. An Assessment of the CDC Anthrax Vaccine Safety and Efficacy Research Program. Washington, DC: National Academy Press.)
Both the DoD and the Food and Drug Administration (FDA) have emphasized the quality of AVA vaccination data collection. The quality of AVA data is also important to the Centers for Disease Control and Prevention (CDC), which collaborates with the DoD in using DMSS surveillance data to conduct post-marketing surveillance investigations of possible AVA vaccine adverse events in the U.S. military.
Data Assessment Project Timeline
Dec 16, 2004: QA project proposed to TMA, SAP
Jan-Apr 2005: DoD service branch approvals obtained
Spring 2005: Abstractors trained; data collection tools created and tested
April-Dec 2005: Medical charts abstracted
Jan 4, 2006: CDC receives final abstracted dataset
Jan 19, 2006: Preliminary analysis complete
March 7, 2006: Results presented
Sampling Strategy
Representative sample of military treatment facilities (MTFs) serving small, medium, and large populations across the Army, Marine Corps, and Navy service branches
Stratified random sample of MTFs derived from PRISM data
Sampled 28 of the 146 MTFs for these service branches (19%)
Geographical considerations: continental U.S.
Air Force approval to access medical charts was requested but could not be obtained
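The stratified draw described above can be sketched as follows. The sampling frame, stratum labels, and facility counts here are illustrative stand-ins, not the actual PRISM data (which held 146 MTFs):

```python
import random

# Hypothetical sampling frame: facility_id -> (branch, size_stratum).
# 3 branches x 3 size strata x 16 facilities = 144 facilities in this
# sketch (the actual study frame had 146).
mtf_frame = {
    f"MTF-{i:03d}": stratum
    for i, stratum in enumerate(
        (b, s)
        for b in ("Army", "Marine Corps", "Navy")
        for s in ("small", "medium", "large")
        for _ in range(16)
    )
}

def stratified_sample(frame, fraction, seed=0):
    """Draw roughly the same fraction of facilities from every
    branch/size stratum, with at least one facility per stratum."""
    rng = random.Random(seed)
    strata = {}
    for fid, stratum in frame.items():
        strata.setdefault(stratum, []).append(fid)
    sampled = []
    for members in strata.values():
        k = max(1, round(len(members) * fraction))
        sampled.extend(rng.sample(members, k))
    return sampled

sites = stratified_sample(mtf_frame, fraction=0.19)  # ~19%, as in the study
```

Sampling within each stratum separately guarantees that small, medium, and large facilities from every branch all appear in the final site list.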
Geographical distribution of the 28 facilities sampled (map courtesy of Google Maps)
Abstraction Strategy
Records Coordinator at each MTF was contacted and asked to pull a pre-defined number of records
2 abstractors reviewed each medical chart independently, with adjudication by a 3rd abstractor
Medical charts were reviewed to collect information on vaccination type and the date(s) of vaccination
Data were entered into a pre-tested PC-based abstraction instrument
SSN was collected to link an individual's medical chart information to the electronic vaccination data in DMSS
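The double-blind abstraction step can be sketched as below; the field names and the simple field-by-field comparison rule are illustrative assumptions, not the project's actual instrument:

```python
def fields_needing_adjudication(entry_1, entry_2):
    """Compare two independent abstractions of the same chart and
    return the fields on which they disagree (these go to the 3rd
    abstractor for adjudication)."""
    keys = set(entry_1) | set(entry_2)
    return sorted(k for k in keys if entry_1.get(k) != entry_2.get(k))

# Hypothetical abstractions of one chart, disagreeing on the dose date.
abstractor_1 = {"vaccine": "AVA", "dose_date": "1999-03-15"}
abstractor_2 = {"vaccine": "AVA", "dose_date": "1999-03-18"}
disputed = fields_needing_adjudication(abstractor_1, abstractor_2)
```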
Measures of Assessment

2 x 2 table of AVA dose status (DMSS vs. abstracted records):

               Abstracted: AVA   Abstracted: no AVA
DMSS: AVA            A                   B
DMSS: no AVA         C                   D

Sensitivity = A / (A + C)
Specificity = D / (B + D)
PPV = A / (A + B)
NPV = D / (C + D)

1. Sensitivity: how accurately DMSS electronic data reflect AVA vaccinations abstracted from medical charts
2. Specificity: how accurately DMSS electronic data reflect the absence of AVA vaccinations in the medical charts
3. Positive predictive value (PPV): a measure of the overall usefulness of the data, based on the probability that an AVA vaccination reported in DMSS was actually present in the abstracted medical chart
4. Negative predictive value (NPV): a measure of the overall usefulness of the data, based on the probability that an AVA vaccination not reported in DMSS was actually absent from the abstracted medical chart
5. Proportion 3 (P3): the estimated probability that an AVA vaccination was recorded in both DMSS and the medical chart, conditional on its being captured by at least one of these sources. For this measure the 2 x 2 table is incomplete: cell D is blank, because a dose missing from both sources cannot be observed. Cell A holds doses found in both sources, cell B doses found in DMSS but not in the Constella abstraction, and cell C doses found in the Constella abstraction but not in DMSS.
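A minimal sketch of these five measures from the cell counts; the counts passed in at the bottom are illustrative, not study data:

```python
def agreement_measures(a, b, c, d):
    """Agreement between DMSS (the source being assessed) and chart
    abstraction (the reference), from 2 x 2 cell counts:
      a: dose recorded in both DMSS and the chart
      b: dose in DMSS only
      c: dose in the chart only
      d: dose in neither (unobservable for the P3 analysis)"""
    return {
        "sensitivity": a / (a + c),  # chart doses also found in DMSS
        "specificity": d / (b + d),  # chart non-doses also absent in DMSS
        "ppv": a / (a + b),          # DMSS doses confirmed in the chart
        "npv": d / (c + d),          # DMSS non-doses confirmed absent
        "p3": a / (a + b + c),       # both sources, given at least one
    }

m = agreement_measures(a=900, b=100, c=60, d=940)  # illustrative counts
```

Note that P3 needs only cells A, B, and C, which is why it remains computable even though cell D is blank in the vaccine-level analysis.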
Methods of assessing AVA data in DMSS
1. Accuracy of DMSS data for personnel having one or more abstracted AVA vaccination(s) (person-level analysis). This measure was based upon an AVA data quality study conducted by the Defense Manpower Data Center (DMDC), which assessed data in the Defense Enrollment Eligibility Reporting System (DEERS) by abstracting medical records located at the Veterans Administration Records Management Center for a sample of childbearing females [i].
2. Concordance of AVA vaccinations recorded in DMSS within different intervals of time, by branch and year (vaccine-level analysis), for four vaccination time windows: the exact date, within ± 1 day, within ± 2 days, and within ± 7 days of the vaccination date recorded in the medical chart.
3. Regression modeling analysis: a logistic regression model was fitted to statistically assess associations between AVA data quality measurements (sensitivity, PPV, and P3) and the following covariates: (a) calendar year of AVA administration and (b) service branch.

[i] Honner W, Ryan M, Aran R, et al. The US Military Immunization Database: Quality of data on anthrax vaccinations in military women. Immunization Registry Conference: October 27, 2003.
Chart Abstraction Results
Population of service members:
4,201 medical charts sampled from 28 MTFs
  less 27 with unmatched SSNs (0.6%) → 4,174
Approximately 45% had AVA vaccination(s) in medical charts → 1,866 medical charts with AVA
  less 24 records whose only AVA was before Jan 1, 1998 → 1,842
  less 25 records with an inaccurate service branch-level match → 1,817
1,817 medical charts: FINAL SAMPLE
Demographic Characteristics

Characteristic      Subjects (n=1,817)   AVA Vaccinations (n=7,400)
Service Branch
  Army                 878 (48.3%)         3,726 (50.4%)
  Marine Corps         269 (14.8%)         1,128 (15.2%)
  Navy                 670 (36.9%)         2,546 (34.4%)
Component
  Active             1,715 (94.4%)         7,095 (95.9%)
  Guard                 19 (1.0%)             46 (0.6%)
  Reserve               83 (4.6%)            259 (3.5%)
Gender
  Female               256 (14.1%)           960 (13.0%)
  Male               1,561 (85.9%)         6,440 (87.0%)
Race
  White              1,149 (63.2%)         4,649 (62.8%)
  Black                456 (25.1%)         1,907 (25.8%)
  Other                183 (10.1%)           712 (9.6%)
  Unknown               29 (1.6%)            132 (1.8%)

Note: due to low sample sizes in 2001 (resulting from the AVA distribution slow-down), AVA vaccinations for that period are grouped.
Demographic Characteristics (continued)

Characteristic      Subjects (n=1,817)   AVA Vaccinations (n=7,400)
Ethnicity
  Non-Hispanic       1,599 (88.0%)         6,525 (88.2%)
  Hispanic             189 (10.4%)           743 (10.0%)
  Unknown               29 (1.6%)            132 (1.8%)
Occupation
  Medical/Research     277 (15.2%)         1,138 (15.4%)
  Hazardous            129 (7.1%)            560 (7.6%)
  Frontline            544 (29.9%)         2,278 (30.8%)
  Office/Admin         863 (47.5%)         3,409 (46.1%)
  Communication          4 (0.2%)             15 (0.2%)
Age
  <18                    7 (0.4%)             26 (0.4%)
  18-24                474 (26.1%)         1,748 (23.6%)
  25-34                778 (42.8%)         3,258 (44.0%)
  35-44                457 (25.2%)         1,964 (26.5%)
  >=45                 101 (5.6%)            404 (5.5%)
1: Accuracy of DMSS AVA Data - Full Sample -
Preliminary Results -- Do Not Distribute
1: Accuracy of DMSS AVA Data by Branch
Measure        Army   Marine Corps   Navy
Sensitivity    98.0       95.2       86.7
Specificity    84.0       90.8       96.1
PPV            84.6       91.1       95.1
NPV            97.9       95.0       89.1
2: Concordance within different reporting intervals
Measurements of agreement by AVA vaccine doses administered within the time intervals ±0, ±1, ±2, and ±7 days, by service branch (n = 4,201):

Service Branch   Interval    PPV    Sensitivity
Army             ± 0 days    79.7      83.1
                 ± 1 day     83.6      87.1
                 ± 2 days    84.2      87.9
                 ± 7 days    86.3      90.0
Marine Corps     ± 0 days    85.1      87.3
                 ± 1 day     85.8      88.0
                 ± 2 days    86.5      88.8
                 ± 7 days    87.5      89.8
Navy             ± 0 days    81.3      80.7
                 ± 1 day     84.4      83.8
                 ± 2 days    85.9      85.2
                 ± 7 days    86.9       —
Overall          ± 0 days    81.0      83.0
                 ± 7 days    86.2      89.0
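The interval matching behind this table can be sketched as follows. The greedy one-to-one pairing rule and the example dates are illustrative assumptions, not the study's actual algorithm:

```python
from datetime import date

def matched_within(chart_dates, dmss_dates, window_days):
    """Greedily pair each chart-abstracted dose with an unused DMSS
    dose recorded within +/- window_days, returning the match count."""
    unmatched = list(dmss_dates)
    matches = 0
    for cd in chart_dates:
        hit = next(
            (dd for dd in unmatched if abs((dd - cd).days) <= window_days),
            None,
        )
        if hit is not None:
            unmatched.remove(hit)  # each DMSS dose can match only once
            matches += 1
    return matches

# Hypothetical dose histories for one service member.
chart = [date(1999, 3, 1), date(1999, 4, 1)]
dmss = [date(1999, 3, 3), date(1999, 5, 20)]
exact = matched_within(chart, dmss, 0)   # exact-date agreement
loose = matched_within(chart, dmss, 7)   # agreement within +/- 7 days
```

Widening the window can only add matches, which is why sensitivity and PPV rise monotonically from the ±0 to the ±7 day rows above.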
3: Regression model results
Measure of Agreement   Time Window   Variable         Wald Chi-Square (df)   P Value
Sensitivity            Same day      Year                  1.82 (5)           0.87
                                     Service branch        7.51 (2)           0.02 *
                       ± 7 days      Year                  4.14 (5)           0.53
                                     Service branch        4.80 (2)           0.09
PPV                    Same day      Year                  8.53 (5)           0.13
                                     Service branch        3.40 (2)           0.18
                       ± 7 days      Year                  4.78 (5)           0.44
                                     Service branch        0.13 (2)           0.94
Preliminary Results -- Do Not Distribute
3: Regression model results
Service Branch (Same day)   Odds Ratio   95% CI         P Value
Army vs. Navy                  1.19      (0.90, 1.57)    0.22
Marines vs. Navy               1.59      (1.14, 2.21)    0.01 *
Army vs. Marines               1.34      (0.96, 1.86)    0.09
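Odds ratios like these come from exponentiating logistic regression coefficients. A minimal sketch of that conversion; the coefficient and standard error below are back-calculated to roughly reproduce the reported Marines-vs.-Navy estimate, and are not taken from the study's model output:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and approximate 95% CI from a logistic regression
    coefficient (log-odds) and its standard error."""
    return (
        math.exp(beta),           # point estimate
        math.exp(beta - z * se),  # lower 95% bound
        math.exp(beta + z * se),  # upper 95% bound
    )

# Illustrative inputs: beta = ln(1.59), with an SE implied by the
# reported (1.14, 2.21) interval.
or_, lo, hi = odds_ratio_ci(math.log(1.59), se=0.169)
```

An odds ratio of 1.59 is what the conclusions summarize as "59% higher agreement" for the Marine Corps relative to the Navy.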
Limitations
Unforeseen circumstances at four facilities required adjustments or replacements to the original site list:
- hurricane
- construction at the records facility
- no active duty medical records
- only dependent records
Medical records may not always be a gold standard:
- Hand-written information in the medical record was occasionally illegible to all 3 abstractors
- Some standard vaccination data collection forms in the medical record were found to truncate or omit some data
- Records coordinators indicated that, infrequently, some records may be collected electronically but not recorded on the medical chart, especially during deployment preparation
Air Force medical records could not be sampled
Strengths
Robust sample
Considered generalizable to the Army, Marine Corps, and Navy for 1998-2004
Several analytical methods used
Effort to compare results to previous studies
Conclusions
Compared with Honner W, et al.* (convenience sample of childbearing women in the military), we found:
- a higher sensitivity (93.4 vs. 61.5)
- a lower specificity (89.5 vs. 97.5)
- a higher PPV (88.9 vs. 76.6)
- a similar NPV (93.8 vs. 95.0)
- fewer differences between service branches
(* estimates above use the same person-level methods as Honner, et al.)
Conclusions
Compared with Mullooly J, et al.** (Vaccine Safety Datalink childhood vaccines), we found:
- Service branch AVA sensitivities (range = ) were close to those found in the 3 VSD databases for childhood vaccines (range = )
- VSD noted that "relaxing the vaccination date agreement to 7 days increased [the relative sensitivity] by 1-5 percent." We found a mean increase of 6.0%.
(** estimates above use the same vaccine-level methods as Mullooly, et al.)
Note: VSD's Proportion 2 (P2) calculation is similar to our sensitivity calculation; VSD calls it a relative sensitivity. Both VSD and we analyzed this using exact date matches.
Conclusions
Measures of agreement varied by service branch:
- Person-level analysis: Army had the highest sensitivity; Navy had the best specificity and PPV
- Vaccine-level analysis: exact-date agreement was highest in the Marine Corps, but agreement within ± 7 days was highest in the Army
- Regression analysis: the only statistically significant difference by branch was for sensitivity (exact date match): Marines had 59% higher agreement than Navy
No statistically significant differences in agreement were detectable by year (1998-2004)
DMSS holds comparable estimates of vaccine data quality compared with other vaccine safety databases and studies
Acknowledgements
Yujia Zhang, PhD; Susan Duderstadt, MD, MPH; Charles Rose, Jr., PhD; Michael McNeil, MD, MPH; Emily Weston, MPH
Constella Inc.: Steve Wilkins, Tim Struttman
DoD: Army Medical Surveillance Activity, MILVAX, Tricare Management Activity
FDA: Center for Biologics Evaluation and Research
National Vaccine Advisory Committee workgroup
7 Safeguards to assure data quality, consistency, and confidentiality

1. Built-in abstraction instrument validity checks: the PC-based abstraction instrument included logic checks designed to alert the abstractor to inconsistencies and/or omissions, allowing corrections instantly at the point of entry. We did not have a validity check to determine whether a recorded vaccine was age-appropriate or followed DoD or ACIP recommendations; this may have been an issue with other vaccines, such as Hib.
2. Training program for abstractors: data abstractors completed a structured training program before being sent into the field, involving a review of the abstraction manual and assessment methods, coding and communication instructions, and data entry exercises using sample records.
3. 100% double-blind data entry: each medical record was abstracted by two independent abstractors.
4. On-site 3rd-person adjudication for data entry discrepancies: independently abstracted data files were forwarded to the abstraction team leader for adjudication when necessary.
5. Subset independently re-abstracted and re-adjudicated (100% correct): a final subset of medical charts was independently re-abstracted and re-adjudicated to observe whether any systemic differences in data collection were present.
6. Secure transfer of encrypted data for off-site storage: at the end of each abstraction day, the team leader sent one encrypted file to the secure, off-site host database for concatenation and storage.
7. On-site informatics review: an independent site review of these data collection and transmission methods was conducted by a CDC public health informatics fellow, and a formal report was issued to the CDC investigator.