Making assessment meaningful and useful
Andrew Whitehouse, 25 January 2011
The Talk
- Remind us why we assess
- Remind us of GMC standards for assessment
- Draw attention to some flaws in our present practice
- Suggest a suitable approach for the two assessment purposes
Why do we assess? What meaning? What use?
GMC standards, April 2010. What is assessment?
"A systematic procedure for measuring a trainee's progress or level of achievement, against defined criteria, to make a judgement about a trainee."
"…either as national examinations or as assessments in the workplace."
GMC standards, April 2010. What is assessment?
"Competence (can do) is necessary, but not sufficient, for performance (does), and as trainees' experience increases so performance-based assessment in the workplace becomes more important."
Miller's pyramid (knows, knows how, shows how, does), with matched assessments. The lower two levels are cognitive, the upper two behavioural:
- DOES — performance, "in vivo": e.g. MSF (professional); supervision of work by multiple observers (clinical); clinical audit, complaints, etc.
- SHOWS HOW — competence, "in vitro": e.g. OSCE, simulation, miniCEX, DOPS
- KNOWS HOW — clinical test: exam, grey cases
- KNOWS — factual test: MCQ
Two central purposes of assessment
- For society's benefit: "Is this trainee OK to progress/qualify?" (screening/summative)
- For the trainee's benefit: "How am I doing?" (feedback/formative)
So how best to meet both purposes?
The summative (high-stakes) purpose needs:
- Validated tests of knowledge and clinical skills, i.e. exams!
- Real performance assessment in work (work they are doing all the time!): clinical, interpersonal (professionalism), administration, etc.
So how best to meet both purposes?
The formative (teaching) purpose needs:
- Topics chosen based on trainee need
- Direct trainer observation
- Immediate, useful feedback
- Not to be muddled with "need to pass test"
GMC Standard 4
"Assessments must systematically sample the entire content, appropriate to the stage of training, with reference to the common and important clinical problems that the trainee will encounter…"
So how well are we doing with WPBA, formatively?
"Impact of workplace based assessment on doctors' education and performance: systematic review", Miller & Archer, BMJ, Sep 2010:
- "…MSF can lead to performance improvement… feedback and facilitation have a profound effect on the response…"
- "There is no evidence that… miniCEX, DOPS, CBD lead to improvement in performance, although subjective reports on their educational impact are positive."
So how well is WPBA being done in Foundation (where it began in the UK, 2005)?
The Foundation curriculum specifies miniCEX, DOPS, CBD and MSF as WPBAs.
"An evaluation of the miniCEX in the Foundation Programme", Jackson & Wall, Hospital Medicine, Oct 2010:
- 196 FY1 miniCEX analysed six months into the FY1 year
- Only one score "below expectations for FY1 completion"
- 12% done by consultants
- 38% "rarely or never observed by assessor"
- Trainee satisfaction 3.8 on a 10-point scale
- Trainees regarded it as a "tick box exercise"
- Assessors thought it "did not give a realistic insight into trainee performance", as did 70% of trainees
- Assessors found the 6-point score ill defined and complex
- "Potential benefit" widely acknowledged
So how are we doing with WPBA in Foundation?
Foundation curriculum: EP data from NES, first year (in press, PGMJ), Tochel et al.:
- 2489 FY1 posts; 15 WPBAs done on each of 92% of trainees
- 2339 FY2 posts; 6 WPBAs done on each of 87% of trainees
- Rating scale of 1 (very unsatisfactory) to 7 (very satisfactory)
- Results? Scores below 5 in 1% of FY1 and 3% of FY2, i.e. overwhelmingly "very satisfactory"
What are specialty curricula doing?
The JRCPTB curriculum for CMT/GIM (the ARCP "decision aid" specifies miniCEX, DOPS, CBD, MSF and ACAT as WPBAs). Clinical competences consist of:
- 20 common conditions and 4 emergency conditions
- 40 other conditions
- 17 practical procedures
All the common and emergency conditions, 34 of the other conditions, and 15 of the procedures must be linked to "evidence" to pass the summative ARCP. "Evidence" normally consists of these WPBAs.
To summarise…
- Collins said "…they're being assessed to death…"
- Tooke said "aspire to excellence…"
- Watson said "half a million hours used in WPBA per year"
- The literature suggests we are abusing WPBA for summative input, and thereby losing its value for teaching.
- Consultants are busy with patients; trainees say "this is just box ticking, and we're busy too".
So how should we "evidence trainee ability" for summative purposes (ARCP)? Maximise performance assessment for this, so:
- Retain MSF for professionalism.
- Harvest multiple trainers' views on clinical ability in 3-monthly "local faculty group" (LFG) meetings.
- A beefed-up, structured ES report for ARCP.
- Use a fuller range of performance review, including OPD letters, casenotes, patient surveys, audit, complaints…
- Be less exhaustive; use "competence sampling".
- Retain mandatory exams.
- Perhaps retain DOPS for procedures, and target WPBAs, by trained assessors, at problem trainees…
(NB: assessment drives learning. Assess performance, and performance will improve.)
So how should we harness WPBA for its original "formative" value?
- Don't "require it as evidence of ability". Do require it as evidence of teaching.
- Let's not call it "assessment" at all.
- Get rid of the "scores"; concentrate on the "feedback".
- Keep the "results" away from ARCP; it's private!
- Trainees and ES should select WPBA topics based on "learning needs", not on "can already do it"!
- Perhaps target WPBAs on specific shortcomings for problem trainees.
And surely…? Poorly used doctor time, and inaccurate, invalid, uncalibrated workplace "assessment" of trainees with little or no educational value, are patient safety issues and need to be tackled.
Supporting Excellence In Medical Education: 9th National Multi-specialty Conference for Heads of Schools, Programme Directors, Directors of Medical Education. 25 & 26th January 2011. COGPED.