1
Screening Mammography Benchmarks - Modified Angoff: Screening Performance and Guidelines for Practice
Robert D. Rosenberg and Patricia Carney, for the Breast Cancer Surveillance Consortium
2
Overview - Performance Benchmarks
- Mammography audit and practice recommendations before the BCSC
- BCSC early efforts: methods
- BCSC screening audit results
- BCSC and the Modified Angoff process
3
Mammography Audit - History in the US
- Single practice: Frankl 1983 / Wolfe 1987 (3,000/yr); Spring-Kimbrel / Sickles 1987-1990 (6,000/yr)
- Community practice: Sienko 1993 and Rosenberg 1996 (42,000/yr)
- Multi-community practice: Yankaskas 2005 (240,000/yr)
4
BI-RADS Committee & Audit - ~1991 to Present
- Collaborative work: ACR, ACS, NIH
- Specific, discrete results: the importance of the audit
- Specified audit measurements
- Preliminary definitions
- Recommendations and assessments
- Screening and diagnostic
5
BCSC and the Audit
Methods:
- Standardized operational definitions
- Computerized methods, applicable across practices
- Assessment vs. recommendation
Extended the types of practices with audits: from selected individual practices to regionally selected practices
6
BCSC and the Audit II
- Creation of methods and definitions for research that are also clinically applicable
- Consultation with the ACR and members of the BI-RADS Committee
- Consultation with community radiologists
- Extensive validation of methods: how to ask questions, and specific answers
7
Screening Benchmarks - Distribution of Performance by Radiologists
- 2,500,000 screening studies, > 300 radiologists
- Audit measures: recall rate; PPV1, PPV2, PPV3; sensitivity; specificity; cancer detection rate; cancer size and stage (the core measures are defined in the sketch below)
- Radiology, Volume 241, Number 1, October 2006
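These audit measures have standard definitions in the screening setting. Below is a minimal sketch of how the core measures are computed from a 2x2 outcome table; the counts are invented for illustration, and PPV2/PPV3 would additionally need the numbers of biopsies recommended and performed as denominators.

```python
# Core screening-audit measures from a 2x2 outcome table.
# tp/fp/tn/fn compare the screening interpretation with cancer status.

def audit_measures(tp: int, fp: int, tn: int, fn: int) -> dict:
    n = tp + fp + tn + fn        # total screening exams
    recalled = tp + fp           # positive (abnormal) interpretations
    return {
        "recall_rate": recalled / n,             # fraction of exams recalled
        "ppv1": tp / recalled,                   # cancers per positive screen
        "sensitivity": tp / (tp + fn),           # detected / all cancers
        "specificity": tn / (tn + fp),           # normals read as normal
        "cancer_detection_rate": 1000 * tp / n,  # cancers per 1,000 exams
    }

# Hypothetical practice: 10,000 screens, 40 cancers, 9% recall rate.
print(audit_measures(tp=35, fp=865, tn=9095, fn=5))
```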
8
[Figure: distributions across radiologists of cancer detection rate, specificity, sensitivity, and PPV of biopsy recommendation]
9
Great Job, Dad - Now What?
- We've created performance measures for the community; what do we do with them?
- What should they be? What could they be?
10
Existing Guidelines: Expert Opinion
1994 - AHCPR Clinical Practice Guideline #13: "desirable goals achieved by highly skilled experts," i.e., expert radiologists at dedicated facilities
Three goals for mammography defined:
- High sensitivity
- Reasonable rates of recall and PPV2
- Cancers detected are small and localized
11
Modified Angoff Meeting
- Funded and assisted by the American Cancer Society
- Moderated and organized by Patricia Carney
- Assembled a group of 10 "expert" mammographers: a mix of academic and private-practice radiologists, and a mix of regions of the country
12
Angoff Method
- A process approach for setting cut-point criteria for low performers
- Developed in the 1970s and applied in international and national medical student assessment, e.g., the USMLE-CPX
- Purpose is to increase "accountability" for meeting a standard of proficiency
- The most commonly used method for setting educational performance standards today
13
Modified Angoff Process
- Question to the expert panel: beyond what range of performance would you recommend consideration for additional training?
- Anonymous answers are given and displayed
- Experts view the responses, discuss, and repeat their judgments
- Experts are then given actual performance ranges in clinical practice, and the process is repeated
- Repeated for each performance measure (one round is sketched below)
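The slides describe the round structure but not how the panel's answers are combined. Below is a minimal sketch of one round for a single measure, assuming, purely for illustration, that the consensus bounds are the medians of the experts' anonymous low and high suggestions; all numbers are invented.

```python
# One Modified Angoff round for a single measure (here, recall rate, %).
# Each expert anonymously proposes an acceptable (low, high) range; the
# spread is displayed back to the panel before discussion and re-rating.
from statistics import median

def summarize_round(ranges: list[tuple[float, float]]) -> dict:
    lows = sorted(lo for lo, hi in ranges)
    highs = sorted(hi for lo, hi in ranges)
    return {
        "low_bounds": lows,       # shown anonymously to the panel
        "high_bounds": highs,
        "consensus": (median(lows), median(highs)),  # illustrative rule
    }

# Ten experts' proposed acceptable recall-rate ranges (%), round 1.
round1 = [(4, 11), (5, 12), (5, 10), (6, 14), (5, 12),
          (3, 12), (5, 13), (6, 12), (4, 12), (5, 11)]
print(summarize_round(round1))   # consensus here: (5, 12)
```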
14
AHCPR Expert Performance 1994 vs. Angoff - Minimally Acceptable Expectations

Parameter             AHCPR          Angoff Meeting
Recall rate           ≤ 10%          5 to 12%
PPV1                  5-10%          3-8%
PPV2                  25-40%         20-40%
Cancer detection      2-10/1000      > 2.5/1000
Sensitivity           (not given)    > 75%
Specificity           (not given)    88% to 95%

Radiology, Volume 255, Number 2, May 2010
15
Angoff - Implications for Care
- The selected cut points could recommend 18% to 49% of physicians for additional training on one or more measures.
- If all physicians fell within the boundaries, there would be 14 more cancers detected per 100,000 women screened and 880 fewer false-positive studies.
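A small sketch of how such cut points are applied to per-radiologist audit results, using the Angoff ranges from the table above. The two radiologists and their numbers are invented, and flagging anyone who falls outside any acceptable range follows the "one or more measures" criterion.

```python
# Apply the Angoff acceptable ranges (table above) to audit results.
CUT_POINTS = {                    # acceptable (low, high) per measure
    "recall_rate": (5.0, 12.0),   # percent
    "ppv1": (3.0, 8.0),
    "ppv2": (20.0, 40.0),
    "cancer_detection_rate": (2.5, float("inf")),  # per 1,000 screens
    "sensitivity": (75.0, float("inf")),           # percent
    "specificity": (88.0, 95.0),
}

def flagged_measures(results: dict) -> list[str]:
    """Measures on which a radiologist falls outside the cut points."""
    return [m for m, (lo, hi) in CUT_POINTS.items()
            if not lo <= results[m] <= hi]

# Hypothetical radiologists' audit results.
panel = {
    "A": {"recall_rate": 9.0, "ppv1": 4.5, "ppv2": 30.0,
          "cancer_detection_rate": 4.0, "sensitivity": 85.0, "specificity": 92.0},
    "B": {"recall_rate": 15.0, "ppv1": 2.0, "ppv2": 18.0,
          "cancer_detection_rate": 3.0, "sensitivity": 80.0, "specificity": 86.0},
}
for name, res in panel.items():
    print(name, flagged_measures(res) or "within all cut points")
```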
16
Summary
- Creating valid benchmarks for community radiology is a complex process.
- These benchmarks inform experts making realistic recommendations for community performance.