Emergency Medicine Milestones: Longitudinal Interrater Agreement

Alan H. Breaud, MPH 1; Andrew L. Chu, BS 2; Lauren Sigman, MD 3; Kerrie P. Nelson, PhD 4; Kerry M. McCabe, MD 1,2
1 Boston Medical Center, Boston, MA; 2 Boston University School of Medicine, Boston, MA; 3 LAC+USC Medical Center, Los Angeles, CA; 4 Boston University School of Public Health, Boston, MA

INTRODUCTION
- The EM milestones were developed by EM experts for the Accreditation Council for Graduate Medical Education and are used to recurrently assess competency-based developmental outcomes of postgraduate trainees
- Little is known about how closely resident self-evaluations compare with faculty evaluations as determined by the training program's Core Competency Committee
- Statistics such as Cohen's kappa may be useful for measuring agreement between resident and faculty evaluation scores
- Interrater agreement statistically measures agreement between two or more independent raters [1]; it takes agreement due to chance into account and ranges from poorer-than-chance (-1) to better-than-chance (+1) agreement [2] (see Table 1)

OBJECTIVE
To determine whether resident self-evaluation scores were consistent with their corresponding Core Competency Committee faculty evaluation scores in semiannual EM milestones assessments over time

METHODS
- We collected milestone scores of postgraduate year (PGY) 1 through 4 EM residents training at one urban, academic medical center in spring 2014, fall 2014, and spring 2015
- Residents self-assessed using the milestones at each time point, and their scores were matched to the corresponding faculty evaluation scores; each faculty evaluation score was determined by Core Competency Committee consensus
- Quadratically weighted kappa statistics (with 95% CIs) were calculated to assess the degree of chance-corrected association between the self-evaluations and faculty evaluations at each time point
- Quadratic weighting allows "partial credit" for ordered data [3,4]: disagreements are penalized according to their degree of separation, with weights decreasing as the distance between categories increases (see the code sketch after the references)

Figure 1. Application of quadratic weighting. [Heat map; both axes show the milestone levels 1, 1.5, 2, ..., 5. The diagonal (dark purple) reflects perfect agreement; quadratic weighting grants "partial credit" to the off-diagonal cells (light purple).]

LIMITATIONS
- Limited number of time points collected
- Sample sizes varied among time points as well as within each time point, so individual PGY trends could not be analyzed
- A weighted kappa can be high even when exact agreement is low [4]

REFERENCES
1. Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20(1), 37.
2. Fleiss, J. L., & Cohen, J. (1973). The equivalence of weighted kappa and the intraclass correlation coefficient as measures of reliability. Educational and Psychological Measurement, 33(3), 613–619.
3. Cohen, J. (1968). Weighted kappa: Nominal scale agreement with provision for scaled disagreement or partial credit. Psychological Bulletin, 70(4), 213–220.
4. Graham, P., & Jackson, R. (1993). The analysis of ordinal agreement data: Beyond weighted kappa. Journal of Clinical Epidemiology, 46(9), 1055–1062.
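To make the weighting concrete, here is a minimal Python sketch, not the authors' analysis code: the paired ratings are invented for illustration, and scikit-learn's cohen_kappa_score with weights="quadratic" stands in for whatever software the study actually used. The quadratic agreement weights follow the standard form w_ij = 1 - (i - j)^2 / (k - 1)^2 for k ordered categories (Fleiss & Cohen, 1973).

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Milestone levels run 1 to 5 in half-point steps (9 ordered categories),
# as in Figure 1. sklearn expects discrete class labels, so each half-point
# score is encoded as an integer category index (1.0 -> 0, 1.5 -> 1, ...).
def encode(scores):
    return [int(round((s - 1.0) * 2)) for s in scores]

# Hypothetical paired ratings (invented for illustration): resident
# self-evaluation vs. the Core Competency Committee faculty evaluation.
resident = encode([2.0, 2.5, 3.0, 3.0, 3.5, 2.5, 4.0, 3.0])
faculty  = encode([2.0, 3.0, 3.0, 2.5, 3.5, 2.5, 3.5, 3.5])

# Quadratic weighting: the disagreement penalty grows with the squared
# distance between categories, so near-misses receive "partial credit".
kappa = cohen_kappa_score(resident, faculty, labels=list(range(9)),
                          weights="quadratic")
print(f"quadratically weighted kappa = {kappa:.2f}")

# For intuition, the agreement-weight matrix w_ij = 1 - (i - j)^2 / (k - 1)^2:
# 1.0 on the diagonal (perfect agreement), shrinking off the diagonal.
k = 9
i, j = np.meshgrid(np.arange(k), np.arange(k), indexing="ij")
print((1 - (i - j) ** 2 / (k - 1) ** 2).round(2))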
RESULTS
- Weighted kappas ranged from 0.43 to 0.88 in spring 2014, from 0.39 to 0.87 in fall 2014, and from 0.59 to 0.85 in spring 2015
- Nearly all milestone assessments showed moderate to strong chance-corrected association
- The milestone assessing competence with vascular access in spring 2014 had the highest chance-corrected association: kappa 0.88 (95% CI 0.81–0.94)
- The milestone assessing ultrasound showed consistently moderate chance-corrected association (kappa range 0.43–0.59)
- Sample sizes for each self-assessed milestone ranged from 32 to 45 responses

Table 1. General guideline for interpreting the quadratically weighted kappas (Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33(1), 159–174.); applied programmatically in the helper sketch below

Kappa Statistic    Strength of Agreement
< 0.00             Poor
0.00 - 0.20        Slight
0.21 - 0.40        Fair
0.41 - 0.60        Moderate
0.61 - 0.80        Substantial
0.81 - 1.00        Almost Perfect

Table 2. Quadratically weighted chance-corrected associations for milestones at each time point

CONCLUSIONS
- Residents' self-evaluation of their own competency-based development, as defined by the milestones assessment tool, is on average in alignment with the corresponding faculty Core Competency Committee evaluations
- Further directions include collecting more time points in order to examine postgraduate-year trends
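Table 1's bands are easy to mechanize. The helper below is a hypothetical illustration, not code from the study; it simply maps a kappa estimate to its Landis & Koch descriptor.

```python
# Hypothetical helper: map a kappa statistic to the Landis & Koch (1977)
# strength-of-agreement label shown in Table 1.
def agreement_label(kappa: float) -> str:
    """Return the Landis & Koch descriptor for a kappa in [-1, 1]."""
    if kappa < 0.00:
        return "Poor"
    bands = [(0.20, "Slight"), (0.40, "Fair"), (0.60, "Moderate"),
             (0.80, "Substantial"), (1.00, "Almost Perfect")]
    for upper, label in bands:
        if kappa <= upper:
            return label
    raise ValueError("kappa must lie in [-1, 1]")

# E.g., the poster's highest estimate (vascular access, spring 2014)
# and the lower bound of the ultrasound milestone's range:
print(agreement_label(0.88))   # -> "Almost Perfect"
print(agreement_label(0.43))   # -> "Moderate"
```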