Pathologist Performance Metrics

Pathologist Performance Metrics
John Sinard, MD, PhD
Vice Chair and Director of AP, Yale Pathology

Notice of Faculty Disclosure
In accordance with ACCME guidelines, any individual in a position to influence and/or control the content of this ASCP CME activity has disclosed all relevant financial relationships within the past 12 months with commercial interests that provide products and/or services related to the content of this CME activity. The individual below has responded that he/she has no relevant financial relationship(s) with commercial interest(s) to disclose: John Sinard

Disclaimers
- This is not a discussion of MIPS metrics
- This is part of an OPPE program
- This is a work in progress. There is still much to do, and the destination is still not clearly defined.
- "Dashboard" means different things:
  - Up-to-the-minute status of parameters
  - Quick view of multiple parameters
  - Both

Pathologist Performance
What we would like to measure:
- Diagnostic accuracy
- Attention to detail
- Responsiveness to clinician clients
- Willingness to work
- Ability to function as a member of a team

Preferable Metrics
- Measurable
  - Metric can be measured consistently
  - Metric is quantitative
- Aligned with (or a surrogate of) what you would really like to measure
- Trendable over time
- Interpreted against appropriate benchmarks
- Focused on identifying the cause, not just the result

Metric: FS/Perm Discrepancy
- Valuable in/of itself
- Surrogate for diagnostic accuracy overall
- Multiple dimensions: agreement level, disagreement reason, impact
- "Centralized" scoring
  - Can't rely on signout pathologist to flag cases
  - Easy to get hung up on differences in style
  - Maintain focus on whether or not the clinical question was answered

FS/Perm Agreement Levels
- Agree: answered implied clinical question correctly
- Defer: surgeon had to decide what to do without help from pathology
- Margin Assurance: "I didn't call it positive, but I can't call it negative"
- Disagree: separately score Reason and Impact

FS/Perm Disagree: Reason and Impact
Reason:
- Interpretive: dx material present on FS slide(s)
- Perms Only: dx material in FS block but not on FS slide(s)
- Sampling: dx material not in FS block
Impact:
- None: correct thing done (or interest only)
- Minor: might have done something different, or taken a small additional margin during the current surgery
- Significant: patient required additional surgery
- Major: unnecessary substantial surgery/treatment done
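The agreement/reason/impact scheme above amounts to a small structured record per frozen-section case. A minimal Python sketch, assuming a centralized scorer fills one record per case; all class and field names are hypothetical (not from any actual LIS or the Yale system), and the case ID is invented:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

# Hypothetical encoding of the centralized FS/Perm scoring scheme;
# category labels paraphrase the slide text.

class Agreement(Enum):
    AGREE = "answered implied clinical question correctly"
    DEFER = "surgeon decided without help from pathology"
    MARGIN_ASSURANCE = "not called positive, but couldn't call it negative"
    DISAGREE = "frozen and permanent diagnoses disagree"

class Reason(Enum):
    INTERPRETIVE = "dx material present on FS slide(s)"
    PERMS_ONLY = "dx material in FS block but not on FS slide(s)"
    SAMPLING = "dx material not in FS block"

class Impact(Enum):
    NONE = 0         # correct thing done (or interest only)
    MINOR = 1        # minor change during current surgery
    SIGNIFICANT = 2  # patient required additional surgery
    MAJOR = 3        # unnecessary substantial surgery/treatment done

@dataclass
class FrozenSectionScore:
    case_id: str
    agreement: Agreement
    reason: Optional[Reason] = None   # scored only for disagreements
    impact: Optional[Impact] = None

    def __post_init__(self):
        # Per the slides, Reason and Impact are scored separately,
        # and only when the case is a disagreement.
        if self.agreement is Agreement.DISAGREE and (
                self.reason is None or self.impact is None):
            raise ValueError("score both Reason and Impact for disagreements")

score = FrozenSectionScore("S24-1234", Agreement.DISAGREE,
                           Reason.SAMPLING, Impact.MINOR)
```

Keeping the three dimensions as separate enums, rather than free text, is what makes the metric consistently measurable and trendable over time.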

Metric: Amendment Rates
- Not all amendments are created equal
- Need a classification system: Reason
- Original classification system:
  - Change in Final Diagnosis Field
  - Change in Pathologist
  - Change in Synoptic Summary
  - Change in Cytotech
  - Change in Clinical Information
  - Add/Delete Parts
  - Change in Gross Description
  - Accessioned to Wrong Patient
  - Change in Labeling
  - Signed out prematurely
  - Change in Client/Location
  - Typographical Error
  - Change in Submitting Physician
- Problems with original system

Cause vs Result
[Diagram: Causes 1-3 mapped to Results A-D; the same result can arise from different causes]

New Amendment Classification
- Interpretive Error
  - At time of signout, pathologist did not know the complete and "correct" diagnosis
  - Includes incorporation of ancillary studies, changes to margin status, etc.
- Diagnostic Communication Error
  - At time of signout, pathologist knew the correct diagnosis, but it was not communicated accurately/fully in the report
  - Includes omitted data, inconsistent data, typographical errors
- Administrative
  - Non-diagnostic issues
  - Only use if there is NO CHANGE in the Final Diagnosis or Synoptic Summary
- (Other)

New Amendment Classification: Interpretive Errors
- Oversight/Overlooked ("I didn't see that")
- Didn't Consider ("I should have thought of that" or "I never heard of that")
- Considered/Excluded ("I thought about that; decided against it")
- Incorrect Context ("I didn't know the patient had …")
- Planned ("I signed out the case knowing it would be amended when additional results became available")
- Unknown Reason (doesn't fit other categories)

New Amendment Classification: Diagnostic Communication
- Final Diagnosis – Incomplete
  - Includes amending to report the results of special stains done but not reported
- Final Diagnosis – Inconsistent
  - Typically a discrepancy between final dx and synoptic
  - Use for incorrect staging if all of the elements are correctly reported
- Final Diagnosis – Typo – Potentially misleading
  - E.g., "Stains for HSV and Candida are"
- Final Diagnosis – Typo – Not misleading
  - Use ONLY for single-word changes, NOT for missing lines
- Wrong Case (case mix-up)
  - Correct diagnosis entered into incorrect case
- Synoptic Summary
  - Includes missing data, mis-entered data

New Amendment Classification: Diagnostic Communication (continued)
- Wrong Patient – External
  - Case accessioned to incorrect patient because it came mislabeled
- Wrong Patient – Internal
  - Case accessioned to incorrect patient because Pathology chose the wrong patient in CoPath
- Labeling (site/source) – External
  - Usually laterality; specimen came mislabeled
- Labeling (site/source) – Internal
  - Usually laterality; specimen reported incorrectly
- FS vs Final Discrepancy Not Addressed
  - FS/Permanent discrepancy not specifically addressed in initial report
- Additional Material Received/Submitted
  - E.g., clinician requested looking for more lymph nodes; taken-back blocks show additional information
- Other
  - Try real hard not to use this; give a detailed explanation if you do

New Amendment Classification: Administrative
- Add/Delete Parts
  - Additional material received, or two specimens merged into one
- Clinical Information Incorrect
  - Incorrect clinical information provided and entered onto the report, but did not change interpretation of the specimen
- Non-diagnostic Data – External
  - Incorrect submitting physician provided
- Non-diagnostic Data – Internal
  - Incorrect submitting physician chosen, incorrect client, incorrect outside case number
- Amended in Error
  - Should not have amended the case; no changes made to anything
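As a data structure, the new classification is just a two-level mapping, which makes validating entries and tallying amendment rates by reason straightforward. A minimal sketch; the subcategory strings paraphrase the slides, and the tallied cases are invented for illustration:

```python
from collections import Counter

# Two-level encoding of the amendment taxonomy described in these slides
# (illustrative only; labels abbreviated from the slide text).
AMENDMENT_TAXONOMY = {
    "Interpretive": {
        "Oversight/Overlooked", "Didn't Consider", "Considered/Excluded",
        "Incorrect Context", "Planned", "Unknown Reason",
    },
    "Diagnostic Communication": {
        "Final Dx - Incomplete", "Final Dx - Inconsistent",
        "Final Dx - Typo (misleading)", "Final Dx - Typo (not misleading)",
        "Wrong Case", "Synoptic Summary", "Wrong Patient - External",
        "Wrong Patient - Internal", "Labeling - External",
        "Labeling - Internal", "FS vs Final Not Addressed",
        "Additional Material Received", "Other",
    },
    "Administrative": {
        "Add/Delete Parts", "Clinical Information Incorrect",
        "Non-diagnostic Data - External", "Non-diagnostic Data - Internal",
        "Amended in Error",
    },
}

def classify(category: str, subcategory: str) -> tuple:
    """Validate a (category, subcategory) pair against the taxonomy."""
    if subcategory not in AMENDMENT_TAXONOMY.get(category, ()):
        raise ValueError(f"{subcategory!r} is not a valid {category} reason")
    return (category, subcategory)

# Tally amendments by top-level category (invented example cases)
amendments = [
    classify("Interpretive", "Oversight/Overlooked"),
    classify("Diagnostic Communication", "Synoptic Summary"),
    classify("Interpretive", "Planned"),
]
by_category = Counter(cat for cat, _ in amendments)
```

Because entries are validated against a closed list, the resulting counts can be trended over time and compared across pathologists, which is exactly what free-text amendment reasons prevented.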

Categorizing Amendments
- Amendment information needs to be entered by the Attending Pathologist
- Needed a tool to create amendments which:
  - Was available 24/7
  - Allowed amending only your own cases
  - Required entry of needed information
  - Added a requirement for an Amendment note
- Needed a tool for centralized review of amendments

Additional "Metrics"
- Case Volume
  - Not a metric per se, but places other metrics in perspective
- Turnaround Time
  - Average is easy to calculate, but less meaningful
  - Time to xx% of cases is more meaningful to our clinician colleagues
  - Must compare apples to apples
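The "time to xx% of cases" figure is a percentile rather than a mean, so a single slow outlier does not distort it. A minimal sketch using the nearest-rank percentile method; the TAT values (in days) are invented for illustration:

```python
import math

def time_to_pct(tats, pct):
    """Smallest TAT such that `pct` percent of cases are at or below it
    (nearest-rank percentile)."""
    ordered = sorted(tats)
    rank = math.ceil(pct / 100 * len(ordered))
    return ordered[rank - 1]

# Invented signout TATs in days, with one long outlier
tats = [1, 1, 2, 2, 2, 3, 3, 4, 6, 14]
mean_tat = sum(tats) / len(tats)  # 3.8 days, dragged up by the outlier
p90 = time_to_pct(tats, 90)       # 6 days: 90% of cases out in <= 6 days
```

"Ninety percent of cases signed out within 6 days" answers the clinician's question directly, while the 3.8-day average describes almost no individual case.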

Additional "Metrics"
- Use of ancillary testing (special stains, immunos)
  - Can be a clue to different practice patterns
- External Reviews
  - Agreement level
  - Percentage of cases within a month of s/o
- Case sharing / consensus conference
  - Surrogate for "teamness"