Is Your Data Management System Flexible for Quality Control Activities?
Winny Roshala, CTR
Data Standards and Quality Control Unit
NAACCR: June 13-19, 2009, San Diego, CA

CCR Visual Editing Standards: Accuracy Rates
Implemented January 1, 2000, with 100% visual editing of 13 data items
Automated software was developed to calculate accuracy rates
Accuracy Rate Standard: 97%

CCR Visual Editing Standards: Purpose
Assure high-quality data for analysis
Provide consistency in the visual editing process
Quantify the accuracy of cancer data from cancer reporting facilities
Standardize accuracy rates
Standardize the format for reporting rates to registrars/facilities

CCR Visual Editing Discrepancies
Defined as the quality or state of being discrepant, i.e., disagreeing or being at variance
A discrepancy arises when a more appropriate code should have been selected for a data item based on the submitted documentation

CCR Visual Editing Discrepancies
Discrepancies are counted before cases are linked or consolidated
Each data item is considered one potential discrepancy, with the following exceptions:
◦ Site/Subsite
◦ LNs Positive/Examined
◦ Site Specific Factor fields

CCR Visual Editing Standards: Calculation of Accuracy Rates
Percent Discrepant: the number of discrepancies divided by the product of the number of abstracts and the number of data items
Accuracy Rate: 100% less the Percent Discrepant
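Written out (assuming the denominator groups abstracts and data items together, the reading consistent with a 97% standard):

\[ \text{Percent Discrepant} = \frac{\text{number of discrepancies}}{\text{number of abstracts} \times \text{number of data items}} \times 100\% \]
\[ \text{Accuracy Rate} = 100\% - \text{Percent Discrepant} \]

For example, 25 discrepancies found across 50 abstracts, each reviewed on 17 data items, would give 25 / (50 × 17) ≈ 2.9% discrepant, for an accuracy rate of roughly 97.1%.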

CCR Visual Editing Standards: Historical Perspective
In December 2005, in order to reduce a backlog, admissions from abstractors with an accuracy rate of 99% were no longer visually edited
This "push through" represented approximately 64% of admissions

CCR Visual Editing Standards: Historical Perspective
Due to budget cuts, the proportion of visually edited admissions was reduced again: abstractors with an accuracy rate of 98% were added to those whose admissions were no longer visually edited

CCR Visual Editing Standards: Historical Perspective
In February 2008, due to a further reduction in state funding, the CCR changed its approach to reducing the proportion of cancer registry abstracts that are visually edited
Instead of focusing on individual abstractors, the CCR moved to random sampling of cases for visual editing, reducing coverage from 100 percent to 40 percent

CCR Visual Editing Standards
Quality for the remaining 60% of abstracts is monitored by targeted visual editing and through recoding and reabstracting audits
Hospital registrars continue to receive monthly Discrepancy Reports

Visually Edited Data Items
County of Residence at Diagnosis
Sex
Race
Spanish/Hispanic Origin
Date of Diagnosis
Diagnostic Confirmation
Site/Subsite*
Laterality (only paired sites listed in Volume I)
Histology
Grade
CS Tumor Size
CS Extension
CS Lymph Nodes
Number of Regional Nodes Positive/Examined*
CS Metastasis at Diagnosis
CS Site Specific Factors 1-6*
Class of Case
* Counted as one discrepancy
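A sketch of how the starred groupings might be applied when tallying discrepancies per abstract; the field names and helper below are hypothetical, since Eureka's internals are not published:

```python
# Fields whose discrepancies collapse into a single counted discrepancy.
GROUPED_FIELDS = {
    "site": "site/subsite", "subsite": "site/subsite",
    "regional_nodes_positive": "nodes pos/examined",
    "regional_nodes_examined": "nodes pos/examined",
    **{f"cs_ssf{i}": "site specific factors" for i in range(1, 7)},
}

def count_discrepancies(discrepant_fields):
    """Count discrepancies for one abstract, collapsing each grouped set to one."""
    units = {GROUPED_FIELDS.get(field, field) for field in discrepant_fields}
    return len(units)

# Example: site and subsite both discrepant still count as one discrepancy.
assert count_discrepancies(["site", "subsite", "grade"]) == 2
```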

VISUAL EDITING SAMPLING PLAN
Run the edits against the admission and set a flag indicating whether any edit errors exist
Check whether the admission qualifies for required review and set a flag indicating true/false
Check whether the site is one of the sites requiring 100% visual editing and set a flag indicating true/false

The following sites had a high discrepancy rate and will continue to undergo 100% visual editing:
◦ Lip
◦ Nasal Cavity & Middle Ear
◦ Accessory Sinuses
◦ Thymus
◦ Heart, Mediastinum, and Pleura
◦ Retroperitoneum and Peritoneum
◦ Adrenal Glands
◦ Other Endocrine Glands
◦ Other Ill-Defined Sites
◦ Unknown Primary Site

VISUAL EDITING SAMPLING PLAN
If the site is determined not to require 100% visual editing, generate a number between 0 and 99 using the system function Random
Set the VE Required flag to true if the number generated is 37 or less; otherwise set the flag to false
Once all of the checks are complete, if all flags are set to false, the admission bypasses Visual Editing
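Taken together, the checks amount to the following decision logic. This is a minimal sketch, assuming hypothetical flag inputs (edit_errors_exist, required_review) and substituting Python's random module for the system Random function:

```python
import random

# High-discrepancy sites that always receive 100% visual editing (see the list above).
ALWAYS_EDIT_SITES = {
    "Lip", "Nasal Cavity & Middle Ear", "Accessory Sinuses", "Thymus",
    "Heart, Mediastinum, and Pleura", "Retroperitoneum and Peritoneum",
    "Adrenal Glands", "Other Endocrine Glands", "Other Ill-Defined Sites",
    "Unknown Primary Site",
}

def requires_visual_editing(site, edit_errors_exist, required_review):
    """Return True if the admission must be visually edited; False means it bypasses VE."""
    site_flag = site in ALWAYS_EDIT_SITES
    sample_flag = False
    if not site_flag:
        # 0-99 inclusive; 37 or less selects the case for visual editing
        # (38 of 100 outcomes, i.e., roughly the ~40% sample described above).
        sample_flag = random.randint(0, 99) <= 37
    # If every flag stays False, the admission bypasses visual editing.
    return edit_errors_exist or required_review or site_flag or sample_flag
```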

Issues to Consider
Percent of cases randomly selected for visual editing
List of sites that require 100% visual editing
New review tasks that need to be added to the database

Issues to Consider
Programming changes may require little time; however, deploying them may require a full build
Deployment may be delayed to comply with a scheduled release

EUREKA RECODING AUDIT MODULE (RAM)
What about the 60% of cases bypassing visual editing?

RAM Features
Ability to select data items and text fields
Accessibility
Audits current data
Automatically sends cases from the primary auditor to the secondary auditor
Generates reports
Multi-purpose tool

Audit Sample Request
Name: Lymphoma Recoding Audit
Data Level: Admission Level
Sample Size: 60 cases from each region
Case Status: BYPASS cases only
Sample Specifications:
1) Resident within reporting region only
2) Reporting facility within resident region
3) Combine regions 1 and 8 as one sample
4) Combine regions 7 and 10 and regions 3 and 4 as one sample (Region 4: 10 cases / Region 3: 20 cases / Regions 7/10: 30 cases)
5) Remove all ACTUR reporting facilities (159990, 349990, 379991, 429990, 489990)
Type of Cases (Year): Cases loaded 3/1/2008 to 12/31/2008
Diagnosis Year: 2007 and 2008
Primary Site: Lymph Nodes (C77.0–C77.9)

Audit Sample Request (continued)
Gender: All
Class of Case: Class 0, 1, and 2 ONLY
Histology (Parameters, if any): Histology_m3 = 2,3
Behavior: 3 – Invasive ONLY
CA residents only (at DX)
Type Reporting Source: 1 (hospital/clinic)
Recoding Dates (Start and End): May 26, 2009 to June 12, 2009
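For illustration, the same request can be captured as a structured parameter set. This is purely a sketch; the key names below are hypothetical and do not reflect Eureka's actual schema:

```python
# Hypothetical encoding of the audit sample request above.
audit_sample_request = {
    "name": "Lymphoma Recoding Audit",
    "data_level": "Admission Level",
    "sample_size_per_region": 60,
    "case_status": "BYPASS cases only",
    "cases_loaded": ("2008-03-01", "2008-12-31"),
    "diagnosis_years": [2007, 2008],
    "primary_site_range": ("C77.0", "C77.9"),  # lymph nodes
    "gender": "All",
    "class_of_case": [0, 1, 2],
    "behavior": 3,  # invasive only
    "residency": "CA residents only (at DX)",
    "type_reporting_source": 1,  # hospital/clinic
    "recoding_dates": ("2009-05-26", "2009-06-12"),
}
```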

Primary Auditor Recoding Screen (screenshot)

Reconciliation Screen (screenshot)

RAM Disposition Report (screenshot)

Summary
With diminishing resources, changes in the CCR's visual editing practices were necessary
The flexibility of our data management system, Eureka, allows us to quickly refine the sampling plan for visual editing as needed
As a result, we have been able to redirect resources while carefully monitoring the quality of our data

Summary
The development of Eureka RAM has enabled us to focus on the cases bypassing the visual editing process
RAM is instrumental in quickly identifying problem areas in coding and instruction
Training efforts can be mobilized earlier
Future uses for RAM may include training new staff, targeted visual editing, and special studies

Acknowledgements
Nancy Schlag, CCR Operations Section Chief
Andrew Sutliff, Eureka Programmer Analyst
Kyle Ziegler, Audit Coordinator, Quality Control Specialist
Vic Belen, Administrative Assistant

Contact Information
Winny Roshala, BA, CTR
California Cancer Registry
1825 Bell St., Suite 102
Sacramento, CA
Phone: (916)