
1 Healthcare-Associated Infections (HAIs): Reporting and Validating Data across the Nation
Tyler Whittington

2 Texas Information: Health and Safety Code, Chapter 98
SSIs related to 10 procedures and CLABSI
Required to review reporting activities to ensure data are valid
Currently: 390 licensed ambulatory surgical centers and 512 licensed hospitals in Texas
The law in Texas (Health and Safety Code, Chapter 98) requires general hospitals and ambulatory surgical centers to report surgical site infections (SSIs) related to the following procedures: colon surgeries, hip arthroplasties, knee arthroplasties, abdominal hysterectomies, vaginal hysterectomies, coronary artery bypass graft (CABG) surgery, and vascular procedures. The law also states that pediatric and adolescent hospitals are required to report SSIs associated with cardiac procedures, ventriculoperitoneal shunt procedures, and spinal surgeries. Furthermore, the law requires general hospitals to report the incidence of laboratory-confirmed CLABSIs in any special care setting as well as the incidence of respiratory syncytial virus (RSV) occurring in any pediatric inpatient unit. Health and Safety Code, Chapter 98 also requires that the Texas Department of State Health Services (DSHS) “review reporting activities of health care facilities to ensure the data provided are valid.”

3 Goal of Report Compile a comprehensive source of state-specific information related to HAI reporting and data validation
Qualitative surveys were chosen as the method to extract this information:
One survey to learn from states that are only required to report HAI data
Another survey to learn about specific methodologies used by states that perform the validation of HAI data

4 Methods Basic understanding of HAI reporting
“First State-Specific Healthcare-Associated Infections Summary Data Report”
Compiled a list of preliminary questions to learn about other state mandates for HAI reporting
Conference calls with subject matter experts: Rachel Stricof, Mary Andrus, Becky Heinsohn, and Karen Vallejo
This document gave a brief overview of HAI reporting and standardized infection ratios (SIRs) and also provided detailed tables of the reporting status of specific states around the country. The preliminary questions covered the types of facilities required to report HAI data; the number and credentials of staff members working on reporting (and more specifically auditing) HAI data; and the specific methodology used by states to audit HAI data. Conference calls with SMEs helped to fine-tune our question list and adjust it based on their knowledge and recommendations.

5 Subject Matter Experts: Take Home Points
Both numerator and denominator data are extremely important for calculating infection rates
It is imperative to understand the processes that facilities use to capture data
Central line days are one of the most inconsistently measured and reported components used to determine SIRs
Despite creating a sample size determination for auditing records, the bottom line is how many records can be audited in a given day
Administrative databases can be very valuable to provide monthly checks on data
It is crucial to understand how healthcare staff members are capturing numerator and denominator data and to determine if they are correctly gathering the information necessary to calculate the risk adjustment. What are their surveillance methods? Do they understand definitions of risk factors (such as cut time)? An example from New York of capturing the denominator improperly: a facility said they performed X number of surgeries, but they actually performed 20% more and did not import this data into the administrative database. This inflates the SSI rate for that facility. It is important to utilize the administrative database to check on the quality of data before actually visiting these facilities.
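To make the denominator point concrete, here is a minimal sketch in Python with hypothetical numbers (the slide only says a facility reported "X" surgeries and actually performed 20% more): an under-reported procedure count inflates the facility's SSI rate.

```python
# Illustration only: hypothetical counts showing how an under-reported
# denominator (surgeries performed) inflates a facility's SSI rate.

def ssi_rate_per_100(infections: int, procedures: int) -> float:
    """SSI rate expressed per 100 procedures."""
    return 100.0 * infections / procedures

infections = 6             # hypothetical SSIs reported by the facility
reported_procedures = 200  # procedures imported into the administrative database
actual_procedures = int(reported_procedures * 1.2)  # 20% more were actually performed

print(ssi_rate_per_100(infections, reported_procedures))  # 3.0 per 100 (inflated)
print(ssi_rate_per_100(infections, actual_procedures))    # 2.5 per 100 (true rate)
```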

6 State-Specific HAI Information
Taken from the CDC, compiled the following information into tables:
Contact information
Reporting status
Reporting indicators
Facilities required to report
The information taken from the CDC site combined with input from SMEs allowed us to create the two survey instruments.

7 Reporting Survey Do you currently audit HAI reporting in your state?
If you do audit, may we contact you again later to discuss more specifics about your audit methodology?
Do you plan to audit any of the reported HAI data in the next six months? May we contact you again later to discuss more specifics about this proposed audit methodology?
If you do not plan to audit in the next six months, what led you to this decision?
If you are not auditing due to a lack of resources, what resources would you need (e.g., more FTEs, staff with infection prevention experience, travel funds, etc.)?
How many FTEs do you currently have to implement mandatory reporting?
Did your state ever attempt to create a method for auditing facilities to validate reported data?
The main purpose of this survey was to determine whether states that do not currently audit have plans to audit in the future. If they did not plan to audit reported HAI data in the foreseeable future, we wanted to know what led them to this decision. We piloted the reporting survey with state contacts from Nevada, Alabama, and Illinois to identify any gaps in our questions and to be sure that our questions were easy to interpret. Based upon these calls, we added a question to find out how many FTEs states have on staff to implement mandatory reporting of HAIs. We converted the survey into QuestionPro, an online survey platform, which would allow state contacts to answer our short survey at their convenience.

8 Reporting Survey Results
Currently auditing HAI data: New Jersey
Plan to audit HAI data in the next 6 months: Ohio, District of Columbia, Montana, Nevada, Alabama, and Illinois
No immediate plan to audit HAI data: Rhode Island, Delaware, West Virginia, and Arkansas
Survey sent via QuestionPro to: Arkansas, Delaware, District of Columbia, Minnesota, Montana, Nevada, New Jersey, Ohio, Rhode Island, and West Virginia. Minnesota was the only state that failed to respond. These states were selected because they (1) do not currently audit reported data to our knowledge, (2) do not have any plans to audit that we know of, and (3) were not used to pilot test the survey.

9 Reporting Survey Results
Rhode Island, Delaware, West Virginia, and Arkansas all pointed to limited resources as the reason behind the lack of an auditing plan
There is a need for improved funding and increased staff
Delaware mentioned a lack of enthusiasm from healthcare facilities as a barrier
Rhode Island mentioned that they need increased funding and more staff. Delaware stated that they would require sufficient funds to contract out to complete audits because there is no internal capacity to complete them due to limited staff, not because of a lack of IP expertise. Arkansas cited a need for more FTEs to complete an audit in their state.

10 Auditing Survey How do you select facilities to be audited?
How do you select procedures to be audited?
How do you select medical records to be audited?
What is the timing cycle of the audit (do you visit every facility every year, every three years, or variably depending on findings)?
What is the ratio of auditors to facilities?
Is the audit performed blinded?
What is the actual number of charts you review, per whatever units of time you can report (e.g., per visit, per facility, per day, per year…)?
What are the error rates that you have found for specific variables?
We identified states that have begun auditing HAI data using the CDC state-specific information available online. We were mainly interested in knowing the specific methodology used by states to perform their HAI audits. The audit methodology survey was administered via phone call and/or through email, depending on the availability of the state contacts.

11 Auditing Survey Results
Connecticut, Maryland, and Washington: CLABSIs
New York and South Carolina: CLABSIs and SSIs
NY: SSIs for hip, colon, and CABG procedures
SC: SSIs for hip, knee, CABG, and hysterectomy procedures in all facilities; colon and abdominal hysterectomies in facilities with <200 beds
Pennsylvania: CLABSIs and CAUTIs
State contacts from Maryland and Washington chose to complete the survey at their convenience through email. Contacts for Connecticut, New York, Pennsylvania, and South Carolina completed the survey by phone.

12 Auditing CLABSIs: Connecticut
One auditor reviews all positive blood cultures from every facility reporting CLABSIs to NHSN (~30)
Blinded audit: all medical record reviews occur from January 1st to April 30th
For the 2008 audit: a 35-day time period with 770 positive blood cultures and 476 septic events
DPH conducted medical record reviews over a 35-day time period in 30 adult ICUs and 3 pediatric ICUs. A total of 770 positive blood cultures from 410 patients (395 adult and 15 pediatric) were reviewed. Of the total number of positive blood cultures, 476 septic events were identified.

13 Auditing CLABSIs: Maryland
Contracted with APIC
Five auditors complete blinded medical record reviews in all facilities reporting CLABSIs to NHSN (~45)
Review 5 charts in ICUs in the top and bottom 11 (25th percentile) facilities of their ranking list (based on CLABSI rates) and 4 charts in all other ICUs
Audit study time period: July 1, 2008 to June 30, 2009
Chart audits completed from December 9, 2009 to January 8, 2010
Rank ICUs by CLABSI rate; based on this list, select ICUs to audit from the top and bottom 25th percentile (11 ICUs from the top and 11 ICUs from the bottom). From the positive blood culture list, every 5th patient is selected until the target number for that ICU is reached (if in the top/bottom 11, the target number is 5; otherwise the target number is 4). These medical record numbers are cross-referenced with patient ID numbers from the CLABSI line list to ensure at least one of the selected records was reported as a CLABSI to NHSN. If none of them was, the last record on the target list is crossed off and the 5th record following it on the blood culture list is selected, repeating until at least one record on the target list was reported as a CLABSI to NHSN.
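A minimal sketch of the every-5th-record selection and line-list cross-check described above, assuming unique record IDs; the function name and arguments are illustrative rather than part of Maryland's protocol, and a real audit would resolve a run-out of candidates manually.

```python
def select_audit_records(blood_cultures, clabsi_line_list, target):
    """Illustrative sketch of the every-5th-record chart selection.

    blood_cultures:   ordered record IDs from the positive blood culture list
    clabsi_line_list: record IDs the facility reported to NHSN as CLABSIs
    target:           5 charts in top/bottom-quartile ICUs, 4 in all others
    """
    reported = set(clabsi_line_list)

    # Step 1: take every 5th record until the target number is reached.
    selected = blood_cultures[4::5][:target]

    # Step 2: if no selected record appears on the NHSN CLABSI line list,
    # replace the last one with the record 5 positions further down the
    # blood culture list, repeating until the two lists overlap.
    while selected and not reported & set(selected):
        next_pos = blood_cultures.index(selected[-1]) + 5
        if next_pos >= len(blood_cultures):
            break  # no more candidates; handled manually in practice
        selected[-1] = blood_cultures[next_pos]
    return selected
```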

14 Auditing CLABSIs: New York
Complete medical record reviews for at least 90% of the facilities reporting CLABSIs to NHSN (~200)
Facilities with high rates of HAI and low rates of HAI are given priority for the audit
Six auditors divided into regions of the state: 5 auditors each responsible for facilities in their region, and 1 auditor responsible for 9 facilities in the capital region
A high rate means any rate that is higher than the average state rate. For example, if the Medical ICU state rate is 2.3 and a hospital's rate is 6.3 (and is statistically higher than the state rate), that hospital will be placed in the priority group for the audit cycle. The same process applies to low rates: any facility that is statistically lower than the state average will be placed in the priority group for the audit. Furthermore, facilities that were not audited the year before are at the top of the list for the next year.

15 Auditing CLABSIs: New York
Each facility submits: a line list of NHSN CLABSIs and a laboratory list of positive ICU blood cultures
Randomly select patient records from the most recent ICU positive blood cultures
Currently, they select a total of 20 records to audit
New York's strategy for the CLABSI audit is limited to ICUs (adult, pediatric, and neonatal). Auditors go on-site to the laboratory and ask for a line list of positive blood cultures from time A to time B, select the 10 most recent positive blood cultures from each of the 3 ICU categories, check whether patients had central lines in place at the reported time of infection, and check whether they meet the criteria for CLABSI. If a facility has only one type of ICU, they review all 20 records from that ICU. If a facility has two ICU types, they review 10 records from each ICU type, and if there are more than two ICU types, they review at least 5 records from each. For the 2007 and 2008 audits, the auditors were completely blinded and carried an envelope that had the results (they checked their results against the envelope after the audits were completed). Now they are not completely blinded, but Carole said each auditor probably does not know the results because of the case-control style; the auditors are not printing out the results and checking them that way.
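A rough sketch of splitting the 20-record target across whichever ICU types a facility has, following the rule above; distributing any remainder as evenly as possible is an assumption, and all names are illustrative.

```python
def records_per_icu(icu_types, total=20, minimum=5):
    """Illustrative allocation of the 20-record CLABSI audit target.

    One ICU type  -> all 20 records from that ICU.
    Two ICU types -> 10 records from each.
    More than two -> at least `minimum` records from each; the near-even
                     split used here is an assumption, not stated in the source.
    """
    n = len(icu_types)
    if n == 0:
        return {}
    base, extra = divmod(total, n)
    counts = {icu: max(base, minimum) for icu in icu_types}
    if base >= minimum:
        # Hand out the remainder one record at a time (assumption).
        for icu in icu_types[:extra]:
            counts[icu] += 1
    return counts

print(records_per_icu(["adult"]))                           # {'adult': 20}
print(records_per_icu(["adult", "pediatric"]))              # 10 each
print(records_per_icu(["adult", "pediatric", "neonatal"]))  # 7, 7, 6
```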

16 Auditing CLABSIs: Pennsylvania
Contracted with APIC: 4 CIC IP auditors
12 facilities are selected to validate their CLABSI data
8 records are selected at each facility, and the auditor is blinded to the results
Based on annual reports, they select 6 facilities with significantly lower than predicted rates of CLABSI and 6 facilities with significantly higher than predicted rates of CLABSI. A list of positive blood cultures during a specified study frame (calendar year 2009) was used to select every 5th record until 8 records were selected for the audit. This target list was cross-referenced against the CLABSI line list (which was sent by the facility to the state health department) to be sure that at least one patient on the target list was reported to NHSN. If none of the 8 records from the target list was reported to NHSN, they crossed off that record and used the 5th patient following it, until at least one patient on the target list was reported to NHSN by the hospital.

17 Auditing CLABSIs: South Carolina
Two auditors review records in 60 of the 75 facilities required to report
No specific methodology for selecting records to review
Attempt to review between 20 and 30 records at each facility
Among the 60 facilities, they want to look at facilities with outliers/red flags (if a facility has done 100 procedures with no SSIs, for example) and to leave out the small facilities that have only performed a few procedures. The total number of charts selected varies from facility to facility and depends heavily on what the auditor can get done in a day (issues at a facility can take extra time). Larger facilities (teaching hospitals) will require 2-3 days of auditing; small hospitals can take half a day to one full day to complete an audit. The facility should fax a list of medical records to be pulled for each procedure. The SSI and CLABSI cases should be pulled for easy access (the control records should be pulled as well). ICD-9 codes can be searched within the medical records to ensure you are viewing records from the specified time frame of interest. For CLABSIs, all positive blood cultures from the time period should be available.

18 Auditing CLABSIs: Washington
Annual internal validation performed by all (62) facilities required to report
External validation done by the state health department depending on the results
Currently has two auditors to complete external validation, with a new hire currently being trained
External validation: select 40 records from the most recent positive blood culture list, obtain a list of all CLABSI records (based on discharge data) within the same time period, review both lists and note whether any record meets the NHSN definition of CLABSI, and compare the lists to check for discrepancies
The Washington State Health Department's Healthcare-Associated Infections Program will confirm the internal (hospital) validation through external validation for poor internal validation results or by random spot-checks. For internal validation: select 22 cases that meet CLABSI criteria and 22 cases that do not meet CLABSI criteria, and audit them. Hospitals pass if 17 or more of the 22 cases are correctly identified as true positives; they fail if only 15 or fewer are; and they are not failed but will need to be verified by a site visit if they correctly recognize 16 of the 22 cases. Furthermore, hospitals pass if their specificity determination shows no more than 1 case misidentified as a false positive, fail if 3 or more are incorrectly identified, and are not failed but need to be verified by an on-site visit if they misidentify 2 cases as false positives.
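The internal validation thresholds above amount to a simple decision rule. The sketch below is illustrative; combining the sensitivity and specificity criteria by taking the worse of the two outcomes is an assumption, since the slide does not say how they are combined.

```python
def internal_validation_result(true_positives: int, false_positives: int) -> str:
    """Illustrative scoring of a hospital's internal CLABSI validation.

    true_positives:  of 22 records that meet the NHSN CLABSI definition,
                     how many the hospital correctly identified.
    false_positives: of 22 records that do not meet the definition,
                     how many the hospital incorrectly called CLABSIs.
    """
    if true_positives >= 17:
        sensitivity = "pass"
    elif true_positives <= 15:
        sensitivity = "fail"
    else:  # exactly 16
        sensitivity = "verify"

    if false_positives <= 1:
        specificity = "pass"
    elif false_positives >= 3:
        specificity = "fail"
    else:  # exactly 2
        specificity = "verify"

    # Assumption: the overall outcome is the worse of the two criteria.
    if "fail" in (sensitivity, specificity):
        return "fail"
    if "verify" in (sensitivity, specificity):
        return "verify (on-site visit)"
    return "pass"

print(internal_validation_result(true_positives=18, false_positives=2))  # verify (on-site visit)
```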

19 Auditing SSIs: New York
Treat each SSI indicator as a separate audit (hip, colon, CABG)
9-18 medical records selected for each procedure (depending on the volume of procedures done)
Case-control format
Cases: taken from NHSN
Controls: hip and colon surgeries from the NY Statewide Planning and Research Cooperative System (SPARCS) not in NHSN, and CABG surgeries from the Cardiac Surgical Reporting System (CSRS) that do not appear in NHSN
These registries contain surgeries that can be used as controls in the audit, in order to be sure that facilities understand and utilize the correct NHSN definitions. Moreover, controls can be randomly selected.
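A minimal sketch of the control selection idea, assuming controls are simply registry surgeries (SPARCS or CSRS) that were not reported to NHSN, sampled at random; the identifiers and function name are hypothetical.

```python
import random

def select_controls(registry_ids, nhsn_ids, n_controls, seed=None):
    """Illustrative control selection: registry surgeries not reported
    to NHSN, sampled at random."""
    eligible = sorted(set(registry_ids) - set(nhsn_ids))
    rng = random.Random(seed)
    return rng.sample(eligible, min(n_controls, len(eligible)))

# Hypothetical IDs: cases come from NHSN; controls come from the remainder.
registry = ["A101", "A102", "A103", "A104", "A105", "A106"]
nhsn     = ["A102", "A105"]
print(select_controls(registry, nhsn, n_controls=3, seed=1))
```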

20 Auditing SSIs: South Carolina
Hip, knee, CABG, and abdominal hysterectomies in all hospitals, and colon surgeries in facilities with less than 200 beds
Stratified random method to select records to audit
The number of charts reviewed per facility varies depending on the size of the facility and the volume of procedures performed
If there are a large number of charts, they use a stratified random method to select charts to audit. First, they run a line listing of the procedures performed during the validation period using the NHSN analysis package. Once they decide on the total number of charts to review, they pick a random number (n) and use that as the starting point on the list, then pick every nth chart for review. For example, if 100 procedures were done and 10 charts are to be reviewed, they would pick every 10th chart (100/10). So they pick a random number from 1-10 (computer generated or manually picked) and go down the list.
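The random-start, every-nth-chart procedure in the worked example above could be sketched as follows; names are illustrative and the interval uses the simple 100/10 arithmetic from the slide.

```python
import random

def systematic_sample(line_listing, charts_to_review, seed=None):
    """Illustrative chart selection: compute the sampling interval,
    pick a random starting point within it, then take every nth chart
    from the NHSN line listing."""
    interval = max(1, len(line_listing) // charts_to_review)
    rng = random.Random(seed)
    start = rng.randint(1, interval)  # random start, 1-based
    return line_listing[start - 1::interval][:charts_to_review]

# Worked example from the slide: 100 procedures, 10 charts -> every 10th chart.
procedures = [f"chart_{i:03d}" for i in range(1, 101)]
print(systematic_sample(procedures, charts_to_review=10, seed=7))
```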

21 Common Error Rates Connecticut: improper understanding of CLABSI rules
Minimum time period, patient transfer, and the "two or more blood cultures drawn on separate occasions" rule
Minimum time period: there is no minimum time that a central line must be in place for a bloodstream infection to be central line associated. Patient transfer: if a patient develops a central line infection within 48 hours of transfer from one location to another, the CLABSI is attributed to the first location. Two or more blood cultures drawn on separate occasions: LCBI criterion 1 definition versus criterion 2 definition.

22 Common Error Rates NY and SC: wound class, procedure duration, ASA score, and improper classification of surgeries as clean versus clean-contaminated
Rachel Stricof: other issues that commonly lead to discrepancies during audits are cut time and depth of cut

23 Conclusion Lessons can be learned from other states and their validation programs
However, Texas has a drastically larger number of facilities without increased funding
It is vital for Texas to create an efficient validation plan
Think long-term goals with limited funding

24 Questions?

