1
Standard setting for clinical assessments Katharine Boursicot, BSc, MBBS, MRCOG, MAHPE Reader in Medical Education Deputy Head of the Centre for Medical and Healthcare Education St George’s, University of London The Third International Conference on Medical Education in the Sudan
2
WHAT are we testing in clinical assessments? Clinical competence What is it?
3
A popular modern model: elements of competence
Knowledge
o factual
o applied: clinical reasoning
Skills
o communication
o clinical
Attitudes
o professional behaviour
Tomorrow's Doctors, GMC 2003
4
Another popular medical model of competence: Miller's pyramid. From base to apex: Knows, Knows how (cognition ~ knowledge), Shows how, Does (behaviour ~ skills/attitudes); professional authenticity increases toward the apex. Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63-S67.
5
Assessment of competence A review of developments over the last 40 years
6
Knows
1960: the National Board of Medical Examiners in the USA introduced the MCQ
MCQs conquered the world
Dissatisfaction grew due to the limitations of MCQs
7
Knows how
1965: introduction of the PMP (Patient Management Problem)
8
[Figure: Patient Management Problem – a clinical scenario followed by a sequence of candidate actions]
9
Knows how
1965: introduction of the PMP (Patient Management Problem)
Well-constructed SBA-format MCQs can test the application of knowledge very effectively
10
Shows how
1975: introduction of the Objective Structured Clinical Examination (OSCE)
OSCEs are conquering the world
11
> 2000: emerging new methods
WBAs – Workplace-Based Assessments
o Mini Clinical Evaluation Exercise (mini-CEX)
o Direct Observation of Procedural Skills (DOPS)
o OSATS
o Masked standardised patients
o Video assessment
o Patient reports
o Peer reports
o Clinical work samples
o ………
12
Mini-CEX (Norcini, 1995)
Short observation (15–20 minutes) and evaluation of clinical performance in practice, using generic evaluation forms completed by different examiners (cf. http://www.abim.org/minicex/)
13
Example of mini-CEX form
14
DOPS – Direct Observation of Procedural Skills
15
OSATS – Objective Structured Assessment of Technical Skills
16
WBAs – Workplace-Based Assessments All based on the principle of an assessor observing a student/trainee in a workplace or practice setting
17
Past 40 years: climbing the pyramid
Knows – factual tests: SBA-type MCQs
Knows how – (clinical) context-based tests: SBA, EMQ, MEQ
Shows how – performance assessment in vitro: OSCEs
Does – performance assessment in vivo: mini-CEX, DOPS, OSATS, …
18
Standard setting – why bother? To assure standards
o at graduation from medical school
o for licensing
o for a postgraduate (membership) degree
o for progression from one grade to the next
o for recertification
19
At graduation from medical school
To award a medical degree to students who meet the University's standards (University interest)
To distinguish between the competent and the insufficiently competent (public interest)
To certify that graduates are suitable for provisional registration (regulatory/licensing body interest)
To ensure graduates are fit to undertake F1 posts (employer interest)
20
Definition of standards
A standard is a statement about whether an examination performance is good enough for a particular purpose
o a particular score that serves as the boundary between passing and failing
o the numerical answer to the question "How much is enough?"
21
Standard setting All methods described in the literature are based on ways of translating expert (clinical) judgement into a score
22
‘Classical’ standard setting methods
For written test items:
o Angoff’s method
o Ebel’s method
For OSCEs:
o Borderline group method
o Regression-based method
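The arithmetic behind Angoff's method is simple: each judge estimates, for every item, the probability that a minimally competent (borderline) candidate would answer correctly, and the cut score is the sum over items of the mean judge estimates. A minimal Python sketch with hypothetical judge data:

```python
# Sketch of Angoff's method for a written test (hypothetical data).
# Rows: judges; columns: items. Each value is a judge's estimate of the
# probability that a borderline candidate answers that item correctly.
judge_estimates = [
    [0.7, 0.5, 0.9, 0.6],
    [0.6, 0.6, 0.8, 0.5],
    [0.8, 0.4, 0.9, 0.7],
]

n_judges = len(judge_estimates)
# Mean estimate per item across judges.
item_means = [sum(col) / n_judges for col in zip(*judge_estimates)]
# Cut score = expected total score of a borderline candidate.
cut_score = sum(item_means)
print(round(cut_score, 2))  # 2.67 (out of 4 items)
```

The same idea scales to any test length; in practice judges discuss discrepant estimates before the final round.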
23
Performance-based standard setting methods
Borderline group method
Contrasting groups method
Regression-based standard method
Kramer A, Muijtjens A, Jansen K, Düsman H, Tan L, van der Vleuten C. Comparison of a rational and an empirical standard setting procedure for an OSCE. Medical Education 2003; 37(2): 132.
Kaufman DM, Mann KV, Muijtjens AMM, van der Vleuten CPM. A comparison of standard-setting procedures for an OSCE in undergraduate medical education. Acad Med 2000; 75: 267–271.
25
The examiner’s role in standard setting
Uses the examiner’s clinical expertise to judge the candidate’s performance
The examiner allocates a global judgement based on the candidate’s performance at that station – remember the level of the examination
Pass / Borderline / Fail
26
Borderline group method
[Figure: station checklist (items 1–7, total score) alongside a global Pass/Borderline/Fail rating; the passing score is derived from the checklist-score distribution of the candidates rated borderline, shown against the full test-score distribution]
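In the borderline group method, the station's passing score is typically the mean checklist score of the candidates whose global rating was "borderline". A short Python illustration with hypothetical candidates:

```python
# Borderline group method (hypothetical data).
# Each candidate gets a checklist score and a global rating at the station.
candidates = [
    ("A", 18, "pass"), ("B", 12, "borderline"), ("C", 9, "fail"),
    ("D", 14, "borderline"), ("E", 20, "pass"), ("F", 13, "borderline"),
]

# Passing score = mean checklist score of the borderline group.
borderline_scores = [score for _, score, rating in candidates
                     if rating == "borderline"]
passing_score = sum(borderline_scores) / len(borderline_scores)
print(passing_score)  # 13.0
```

This is why the method needs a reasonably large cohort: with too few borderline candidates, the mean is unstable.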
27
Contrasting groups method
[Figure: station checklist (items 1–7, total score) alongside a global Pass/Borderline/Fail rating; the passing score is placed where the test-score distributions of the pass and fail groups intersect]
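For the contrasting groups method, the cut score sits where the pass and fail score distributions cross. As a simple sketch (assuming two roughly symmetric groups with similar spread, a common simplification), the midpoint of the two group means approximates that intersection. Data below are hypothetical:

```python
# Contrasting groups method, simplified sketch (hypothetical data).
# Candidates are split into pass/fail groups by examiners' global judgement.
pass_scores = [16, 18, 20, 17, 19]
fail_scores = [8, 10, 9, 11]

mean_pass = sum(pass_scores) / len(pass_scores)
mean_fail = sum(fail_scores) / len(fail_scores)

# Midpoint of the group means as a proxy for where the distributions cross.
cut_score = (mean_pass + mean_fail) / 2
print(cut_score)  # 13.75
```

In practice the cut point can also be shifted toward one group to weight false passes against false fails differently.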
28
Regression-based standard
[Figure: checklist score (items 1–7, total) plotted against the overall global rating (1 = clear fail, 2 = borderline, 3 = clear pass, 4 = excellent, 5 = outstanding); the regression line gives the passing score at the borderline rating]
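The regression-based standard fits a line through checklist scores as a function of the 5-point global rating, then reads off the predicted checklist score at the borderline rating (2). A hand-rolled ordinary least squares sketch on hypothetical data:

```python
# Regression-based standard (hypothetical data).
# x: global rating (1 = clear fail ... 5 = outstanding); y: checklist score.
ratings = [1, 2, 2, 3, 3, 4, 5]
scores = [6, 10, 12, 14, 15, 18, 21]

n = len(ratings)
mean_x = sum(ratings) / n
mean_y = sum(scores) / n

# Ordinary least squares slope and intercept.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(ratings, scores))
         / sum((x - mean_x) ** 2 for x in ratings))
intercept = mean_y - slope * mean_x

# Passing score = predicted checklist score at the "borderline" rating (2).
passing_score = intercept + slope * 2
print(round(passing_score, 2))  # 10.58
```

Because the line is fitted on all candidates, not just the borderline group, this method is less sensitive to small borderline-group sizes than the borderline group method.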
29
Performance-based standard setting: advantages
Utilises the expertise of the examiners
o they are observing the performance of the students
o they are in a position to make a (global) judgement about the performance, based on their clinical expertise, the expected standards for the level of the test, and knowledge of the curriculum/teaching
30
Borderline / contrasting / regression-based methods: advantages
A large number of examiners set a collective standard while observing the candidates – not just an academic exercise
Reliable: the cut-off score is based on a large sample of judgements
Credible and defensible: based on expert judgement during direct observation
31
Performance-based standard setting: disadvantages
Requires a large(-ish) cohort of candidates to achieve enough numbers in the ‘borderline’ group
The passing score is not known in advance
Judgements are not independent of the checklist scoring
Requires expert processing of marks immediately after the exam
o checking of results
o delay in producing results
32
Workplace-Based Assessment tools
There is no gold-standard standard-setting method!
33
Standard setting
Standards are based on informed judgements about examinees’ performances against a social or educational construct, e.g.
o a competent practitioner
o a suitable level of specialist knowledge/skills
34
Standard setting for Workplace-Based Assessment tools
o based on descriptors for a particular level of training
o information gathering relies on descriptive and qualitative judgemental information
o descriptors agreed by a consensus/panel of clinical experts
o purpose of WBA tools: formative rather than summative – feedback
35
Feedback
Feedback to enhance learning involves some form of judgement by the feedback giver on the knowledge and performance of the recipient
It is a very powerful tool!
36
WBAs and feedback
The underlying principle of WBA tools is FEEDBACK from:
o teacher/supervisor
o peers/team members
o other professionals
o patients
37
Conclusions
It is not easy to set standards for Workplace-Based Assessments (in the ‘classic’ sense)
Expert professional judgement is required
Wide sampling from different sources is needed: a range of tools, contexts, cases and assessors
Feedback to the trainee