
Measuring Non-technical Aspects of Surgical Clinician Development in an Otolaryngology Residency Training Program
Jennifer J. Shin, M.D., S.M.; Michael J. Cunningham, M.D.; Kevin G. Emerick, M.D.; Stacey T. Gray, M.D.
Society for University Otolaryngologists, November 13, 2015

The Constitution of a Surgeon
"It takes five years to learn how to operate and twenty years to learn when and when not to." -Anonymous
"I would like to see the day when somebody would be appointed surgeon somewhere who had no hands, for the operative part is the least part of the work." -Harvey Williams Cushing, letter to Dr. Henry Christian, quoted in 'The Best Hope of All', Time (3 May 1963)

Validated Instruments …Clinical Outcomes …Evidence-based Practice

Validated Instruments …Educational Outcomes …Evidence-based Residency
Novice → Intermediate → Competent
Instrument to measure the development of clinical practice ability?

Instrument Design
 Rapid/real-time evaluation
 Insight into each trainee's thought process
 Concrete feedback and data compilation
 Benefits beyond the learning/measurement itself

Instrument Administration
Mock oral board exam  case criteria:
 1) A single patient with a single main diagnostic issue
 2) The case culminates in an intervention, with one resulting complication
 3) Enough detail is available that a full history, physical examination, and complement of diagnostic testing may be obtained; actual radiologic images, lab values, audiometric studies, or other diagnostic test results are available

Instrument Development
 Iterative process: 22 successive drafts; 311 candidate questions
 Face validity evaluation: residency program director, division chairman, professionalism director, resident education curriculum supervisor, epidemiologist/instrument validation scientist
 Validation phase: instrument assessment

Inter-rater reliability: Cohen's kappa 0.66 (SE 0.03); 72.5% agreement
Internal consistency: Cronbach's alpha >0.87
Responsive to change
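The two reliability statistics above can be computed in a few lines. This is a minimal sketch with hypothetical rating data, not the study's analysis; a real validation would use a statistics package.

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n          # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    cats = set(c1) | set(c2)
    pe = sum((c1[k] / n) * (c2[k] / n) for k in cats)     # chance agreement
    return (po - pe) / (1 - pe)

def cronbach_alpha(items):
    """Cronbach's alpha: `items` is a list of per-item score lists
    (each list holds one item's scores across the same respondents)."""
    k = len(items)
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_var_sum = sum(var(it) for it in items)
    totals = [sum(col) for col in zip(*items)]            # each respondent's total
    return k / (k - 1) * (1 - item_var_sum / var(totals))

# Hypothetical data: two raters scoring four cases, and two perfectly
# correlated items scored by three respondents.
print(cohens_kappa(['a', 'b', 'a', 'b'], ['a', 'a', 'a', 'b']))  # 0.5
print(cronbach_alpha([[1, 2, 3], [1, 2, 3]]))                    # 1.0
```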

OBJECTIVES
 To utilize the clinical practice instrument (CPI) to measure non-technical diagnostic and management skills during otolaryngology residency training
 To determine whether there is demonstrable change in these skills between PGY-2, PGY-4, and PGY-5 residents
 To evaluate whether results vary according to subspecialty topic or method of administration

METHODS
 Prospective study of an otolaryngology residency training program; institutional review board approved
 n=248 evaluations of 45 otolaryngology resident trainees at regular intervals
 Analysis of variance with nesting and post-estimation pairwise comparisons to evaluate total and domain scores according to training level, subspecialty topic, and method of administration
 Examination preparation and security
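The analysis of variance used here compares mean scores across groups (e.g. PGY levels). A simplified one-way F statistic can be sketched as below; the study's actual model added nesting and post-estimation pairwise comparisons, and the score groups shown are hypothetical.

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over a list of score groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n               # grand mean
    # Between-group sum of squares: group sizes times squared mean deviations.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: deviations from each group's own mean.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between, df_within = k - 1, n - k
    return (ss_between / df_between) / (ss_within / df_within)

# Hypothetical scores for two training levels with separated means.
print(one_way_anova_f([[1, 2, 3], [4, 5, 6]]))  # 13.5
```

A large F relative to its degrees of freedom indicates that between-level differences outweigh within-level scatter, which is the pattern the RESULTS slides report.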

PGY-4 & 5

RESULTS
Total scores were significantly different among PGY levels of training, with lower scores seen at the PGY-2 level compared to the PGY-4 or PGY-5 level (p<0.0001).
[Table: total scores by PGY level — n, mean, standard deviation, minimum, maximum, 25th and 75th percentiles]

RESULTS

RESULTS
Residents scored higher in general otolaryngology than in the subspecialties (ANOVA, p<0.003). There were significant differences between general otolaryngology and pediatric otolaryngology (p<0.0001), and between general otolaryngology and head and neck surgery (post-ANOVA pairwise comparison, p<0.0033).

RESULTS
Administering the examination with an electronic scoring system, rather than a paper-based scoring system, did not affect these results. The calendar year of administration also did not affect these results.

CONCLUSIONS
Standardized interval evaluation with the CPI demonstrates improvement in qualitative diagnostic and management capabilities as PGY levels advance. Administration of the CPI has been formally incorporated into the Harvard otolaryngology residency curriculum. The CPI can potentially be adapted for use in any otolaryngology training program, and is potentially applicable to any surgical specialty.

FUTURE DIRECTIONS
Short-term educational missions… Milestones… Electronic case library…
Program update follow-up? Benchmarks? Other sites?

Thank you