Data Quality Toolbox for Registrars
MCSS Workshop
December 9, 2003
Elaine Collins
Quality Data Toolbox
– Artisan: Registrar
– Medium: Computerized data
– Raw materials: Medical information
– Shaping tools: Knowledge, skills
– Directions: Standards
– Measuring tools: Editing “tools”
– Final product: Cancer record
– Goodness: Match to standards
Quality Data – Goodness
– Accurate
– Consistent
– Complete
– Timely
– Maintains shape across transformation and transmission
Measuring Tools
– Reabstracting studies
– Structured queries and visual review
– Text editing
– EDITS
– MCSS routine review
Exercises
– MCSS reabstracting study – 2003
  – Sites: Breast, Corpus uteri, Lung, Melanoma, Testis, Soft tissue sarcoma
  – Diagnosis year 2000
  – 12 facilities
– Review of reported data – Structured query
– Review of reported data – Text editing
Reabstracting Studies
– Compares the original medical record with the reported cancer record (field-by-field agreement is sketched below)
– Considered the “gold standard”
– Limitations: labor-intensive; all records used at initial abstracting may not be available; biased by the reabstractor’s training and skills
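Once reabstracted records are paired with the originals, the tally reduces to field-by-field agreement. A minimal sketch of that comparison, assuming a simple dict representation of each abstract; the field names here are hypothetical illustrations, not a prescribed layout:

```python
# Minimal sketch of field-level agreement rates for a reabstracting study.
# Field names and the dict representation are illustrative assumptions.
from collections import defaultdict

FIELDS = ["primary_site", "histology", "laterality", "stage"]

def agreement_rates(originals, reabstracts):
    """Compare paired records field by field; return percent agreement per field."""
    agree = defaultdict(int)
    total = defaultdict(int)
    for orig, reab in zip(originals, reabstracts):
        for field in FIELDS:
            total[field] += 1
            if orig.get(field) == reab.get(field):
                agree[field] += 1
    return {f: 100.0 * agree[f] / total[f] for f in FIELDS if total[f]}

# Example: one pair agreeing on site and histology but not on stage.
orig = [{"primary_site": "C509", "histology": "8500/3", "laterality": "1", "stage": "IIA"}]
reab = [{"primary_site": "C509", "histology": "8500/3", "laterality": "1", "stage": "IIB"}]
print(agreement_rates(orig, reab))  # stage shows 0.0% agreement
```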
Structured Queries
– Compare coding across a series of records sorted by selected characteristics
– Useful for finding pattern discrepancies across many records
– Manual process; some comparisons may be converted to automated edits (see the sketch below)
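A structured query is essentially sort-and-scan: order the records by the characteristics of interest and let odd combinations stand out on review. A sketch of the idea in Python, with hypothetical field names; an actual registry would run the equivalent against its own database:

```python
# Sketch of a structured query: sort records by selected characteristics,
# then group them so discrepant coding patterns stand out on visual review.
from itertools import groupby

def site_histology_patterns(records):
    """Group records by primary site and histology; small, odd groups merit review."""
    keyfunc = lambda r: (r["primary_site"], r["histology"])
    for key, group in groupby(sorted(records, key=keyfunc), key=keyfunc):
        print(f"{key}: {len(list(group))} case(s)")

records = [
    {"primary_site": "C341", "histology": "8070/3"},  # lung, squamous cell
    {"primary_site": "C341", "histology": "8070/3"},
    {"primary_site": "C341", "histology": "9590/3"},  # lymphoma coded to lung: review
]
site_histology_patterns(records)
```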
Text Editing
– Compares supporting text with coded values for individual records
– Useful for immediately identifying coding problems (one narrow piece can be automated, as sketched below)
– Manual process; most effective on completion of each individual case
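One narrow fragment of text editing can be automated: scanning the abstract’s free text for cues that contradict a coded value. A minimal sketch, assuming laterality codes of “1” = right and “2” = left and a hypothetical text field; the full comparison of text against codes remains a manual task:

```python
# Sketch of an automated assist for text editing: flag records whose
# free text mentions a side that conflicts with the coded laterality.
# Laterality convention assumed here: "1" = right, "2" = left.
def check_laterality(record):
    text = record.get("text", "").lower()
    code = record.get("laterality")
    if code == "1" and "left" in text and "right" not in text:
        return "Text says left; laterality coded right: review"
    if code == "2" and "right" in text and "left" not in text:
        return "Text says right; laterality coded left: review"
    return None

rec = {"laterality": "1", "text": "Mass in the left upper outer quadrant of breast."}
print(check_laterality(rec))  # flags the conflict for manual review
```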
EDITS
– Checks range validity for many fields and comparability of a few fields for individual records
– Automated process; can be applied on completion of each record or on preparation of a batch report
– Warnings and overrides are alternatives to failures
– Expansion of interfield edits requires careful logic (see the sketch below)
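In spirit, an EDITS check is a small predicate over a single record: a single-field range edit, or an interfield edit comparing two fields. A sketch of both kinds in Python; the field names, messages, and severities are illustrative assumptions, not the EDITS metafile itself:

```python
# Sketch of two EDITS-style checks: a single-field range edit and an
# interfield edit. Field names and edit logic are illustrative only.
from datetime import date

def edit_age_range(record):
    """Range edit: age at diagnosis must fall within 0-120."""
    age = record.get("age_at_dx")
    if age is None or not (0 <= age <= 120):
        return ("FAIL", "Age at diagnosis out of range")

def edit_dx_before_last_contact(record):
    """Interfield edit: diagnosis date cannot follow date of last contact."""
    if record["date_of_dx"] > record["date_last_contact"]:
        return ("FAIL", "Diagnosis date after date of last contact")

record = {"age_at_dx": 67,
          "date_of_dx": date(2000, 5, 1),
          "date_last_contact": date(2000, 3, 1)}
for edit in (edit_age_range, edit_dx_before_last_contact):
    result = edit(record)
    if result:
        print(result)  # ('FAIL', 'Diagnosis date after date of last contact')
```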
Edits Analysis
– Edits to be included in MCSS set
– Edits in Hospital/Staging edit sets; C edits are included in confidential data set
– No text edits displayed
– Criteria:
  – Valid codes/dates
  – Alpha/numeric
  – Timing
  – Interfield comparisons
  – Absolute conditions
MCSS Review
– Requests values for missing or unknown data
– Resolves conflicts between data items from multiple facilities and between data items updated by a single facility (a simplified precedence rule is sketched below)
– Allows incorporation of information from multiple facilities
– Review for a limited number of conditions
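Part of the multi-facility consolidation can be expressed as a precedence rule: where facilities disagree on an item, a known value beats an unknown one, and genuine conflicts go to manual review. A simplified sketch, assuming “9”-style unknown codes; the actual MCSS review covers more conditions than this:

```python
# Simplified sketch of consolidating one data item reported by several
# facilities: known values beat unknowns; true conflicts need manual review.
UNKNOWN = {"9", "999", ""}

def consolidate(values):
    known = {v for v in values if v not in UNKNOWN}
    if not known:
        return ("9", "unknown at all facilities")
    if len(known) == 1:
        return (known.pop(), "consolidated")
    return (None, f"conflict {sorted(known)}: manual review")

print(consolidate(["9", "1", "1"]))  # ('1', 'consolidated')
print(consolidate(["1", "2"]))       # (None, "conflict ['1', '2']: manual review")
```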
Cancer Registrar – Resource for Quality Data
[Diagram: the registrar at the center, drawing on the facility system, medical record, physicians, other registries, and the patient; guided by ICD-O, COC, AJCC, SEER, and NAACCR standards; serving facility staff, committees, protocols, NCDB, the central registry, quality monitors, CDC, cancer research, cancer control, NAACCR, and the public]
Data Inputs
– Patient data from facility systems
– Medical record reports and notes
– Pathology reports
– Staging forms
– Communication with physician offices
– Communication with other registries
– Communication with patients
Process Inputs
– Registrar training, knowledge, skills
– Coding standards: ICD-O-3, COC, AJCC, SEER, NAACCR
– Interpretations of standards: I&R, SEER Inquiry, Ask NAACCR
– Medical literature, printed and online
– Registry software data implementations
Sources of Error
– Patient data from facility systems
– Medical record reports and notes
– Pathology reports
– Staging forms
– Communication with physician offices
– Communication with other registries
– Communication with patients
Sources of Error
– Registrar training, knowledge, skills
– Coding standards: ICD-O-3, COC, AJCC, SEER, NAACCR
– Interpretations of standards: I&R, SEER Inquiry, Ask NAACCR
– Medical literature, printed and online
– Registry software data implementations
Types of Errors
– Missing/conflicting data
– Shared data errors
– Timing/coding errors
– Standards and interpretations: ambiguities, omissions, confusions, contradictions
– Discrepancies among local/central registry practice and national standards
Software Implementations
– Discrepancies between implementations and national standards
– Lack of registrar knowledge/training on correspondence between registry and exported data
– Logic errors in matching registry data to reporting formats (see the sketch below)
– Conversion errors
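Export logic errors are often catchable by re-reading the exported record and comparing it, position by position, against the registry database. A sketch against a fixed-width layout in the NAACCR style; the field positions below are hypothetical, not the actual NAACCR record layout:

```python
# Sketch of verifying an exported fixed-width record against the registry's
# stored values. LAYOUT positions are hypothetical, not the NAACCR standard.
LAYOUT = {              # field: (start, end), 0-based half-open
    "primary_site": (0, 4),
    "laterality":   (4, 5),
    "histology":    (5, 10),
}

def verify_export(line, stored):
    """Return a list of fields where the export disagrees with the database."""
    problems = []
    for field, (start, end) in LAYOUT.items():
        exported = line[start:end]
        if exported != stored[field]:
            problems.append(f"{field}: exported {exported!r}, stored {stored[field]!r}")
    return problems

stored = {"primary_site": "C509", "laterality": "2", "histology": "85003"}
line = "C509" + "1" + "85003"   # laterality exported as '1': a conversion error
print(verify_export(line, stored))
```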
AJCC Staging Dilemma
– Are pathologic nodes required for pathologic stage grouping?
– How do Minnesota registrars answer this question?
Clinical/Pathologic Staging in Study
Collaborative Staging
– Provides specific rules for coding known vs. unknown staging elements
– Accommodates “best” stage for AJCC stage assignment
AHIMA 75th Annual Conference, October 2003, Minneapolis: Coming Events
– Data mining
– ICD-10-CM
– SNOMED
– Natural language processing
AHIMA 75th Annual Conference, October 2003, Minneapolis: Challenges
– What is our professional purpose?
– How do we envision ourselves as professionals?
Foundation for Quality Data
– Registrar’s commitment to registry purpose
– Registrar’s knowledge and understanding of cancer data
– Registrar’s management of communication technologies
– Registrar’s advocacy for data use
SUMMARY
– Consistent recording and reporting of quality cancer data requires commitment.
– Routine and regular review of data patterns builds data knowledge and quality.
– Passing EDITS assists but does not ensure data quality.
– Data standards change; use the manuals.
– Welcome Collaborative Stage.