OSCEs Kieran Walsh

OSCE
- Means "objective structured clinical examination"
- Assesses competence
- Developed in light of the shortcomings of traditional assessment methods

Long case
- One hour with the patient
- Full history and examination
- Not observed
- Interrogated for 20 minutes afterwards ("bad cop … worse cop")
- Taken back to the patient

Long case – holistic assessment, BUT
- Unreliable: examiner bias, examiner stringency, unstructured questioning, little agreement between examiners
- Some easy patients, some hard ones
- Some co-operative patients, some not
- Case specificity
- In the NHS, when do you ever get an hour with a patient? (not valid)
- Not a test of communication skills

Objective Structured Long Examination Record (OSLER)
- 30 minutes with the patient
- Observed
- Prior agreement on what is to be examined
- All candidates assessed on the same items
- Case difficulty taken into account
- Communication skills assessed
- Two examiners

OSLER
- More reliable than the long case
- One study showed it to be highly reliable
- But that required 10 cases, 20 examiners and 300 minutes (? feasibility / cost / acceptability)
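To make those figures concrete (a back-of-the-envelope reading, assuming the 30-minute, two-examiner format from the previous slide): 10 cases × 30 minutes = 300 minutes of testing per candidate, and 10 cases × 2 examiners = 20 examiner slots, i.e. 600 examiner-minutes per candidate – hence the question marks over feasibility, cost and acceptability.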

Short cases
- 3-6 cases
- "Have a look at this person and tell us what you think…"
- A few minutes per station

Short cases fine, BUT
- Different students saw different patients
- Cases differed in complexity
- Halo effect
- Not structured
- Examiners asked what they wanted
- Communication with the patient was "incidental"

OSCEs
- Clinical skill – history, examination, procedure
- Marking structured and determined in advance
- Time limit
- Checklist/global rating scale
- Real patient/actor
- Every candidate has the same test

OSCEs – reliable
- Less dependent on an individual examiner's foibles (as there are lots of examiners)
- Less dependent on an individual patient's foibles (as there are lots of patients)
- Structured marking
- More stations … more reliable (see the sketch below)
- Wider sampling – clinical and communication skills
- Which is better: more stations with one examiner per station, or fewer stations with two examiners per station?
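One way to make "more stations … more reliable" concrete is the Spearman–Brown prophecy formula – a textbook illustration, not something taken from these slides. If a single station has reliability $r$, a test of $n$ comparable stations has predicted reliability

$$ r_n = \frac{n r}{1 + (n - 1) r} $$

With $r = 0.2$ per station, 5 stations give $r_5 \approx 0.56$ while 20 stations give $r_{20} \approx 0.83$. Generalisability studies of OSCEs generally point the same way as this toy calculation: for a fixed amount of examiner time, more stations with one examiner each tends to beat fewer stations with two examiners each, because case-to-case variation matters more than examiner-to-examiner variation.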

OSCEs – valid
- Content validity – how well the sampling of skills matches the learning outcomes of the course
- Construct validity – people who performed well on this test have better skills than those who did not
- Length of station should be "authentic"

OSCEs – educational impact
- Checklist – candidates learn to remember the steps in the checklist
- Global rating scale – holistic
- Or both
- If formative – feedback

OSCEs – cost
- Planning
- Examiners
- Real patients
- Actors
- Manikins
- Administrative staff
- Technical staff
- Facilities
- Materials
- Catering
- Collating and processing results
- Recruitment
- Training
- Cost effective? Only if used for the correct purpose

OSCEs – acceptability
- Perceived as fair by examiners and examinees
- Have become widespread

OSCE design – blueprinting
- Map the assessment to the curriculum
- Adequate sampling
- Feasibility – real patients, actors, manikins
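A minimal sketch of what a blueprint can look like in practice – the curriculum domains and station names below are purely illustrative assumptions, not taken from these slides. The idea is simply to lay the planned stations against the curriculum outcomes and check that every outcome is sampled.

```python
# Purely illustrative blueprint: curriculum domains mapped to planned stations.
blueprint = {
    "History taking":       ["Chest pain history", "Depression screen"],
    "Physical examination": ["Cardiovascular exam", "Knee exam"],
    "Practical procedures": ["IV cannulation"],
    "Communication skills": ["Breaking bad news", "Depression screen"],
    "Clinical management":  [],  # gap: no station samples this outcome yet
}

for domain, stations in blueprint.items():
    coverage = ", ".join(stations) if stations else "NOT SAMPLED - add a station"
    print(f"{domain:22} -> {coverage}")
```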

OSCE design – station development
- Develop each station in advance and trial it
- Construct statement
- Instructions for the candidate
- Instructions for the examiner
- List of equipment
- Personnel (real patient or …)
- Semi-scripted scenarios
- Marking schedule (global rating scale/checklist/both)

OSCE design – assessing
- Process skills
- Content skills
- Clinical management

OSCE design – piloting
- Return to slide 15 and go through it again

OSCE design – simulated patients
- Consistency – reliability
- Training
- Briefing
- Debriefing
- Database of actors
- Scenarios in advance
- Practice with each other and with the examiner

OSCE design – examiners
- Training
- Consulted

OSCE design – real patients
- Consistency – must give the same history each time
- Can fall sick
- Can develop new signs / lose old ones
- Can get tired (10 students/day)

Practical considerations
- Single rooms, or a hall with partitions
- One rest station for every 40 minutes of assessment (see the worked example below)
- Rooms to rest in – for candidates, patients and examiners
- Floor plan
- Ideally each station has the same duration
- Signs
- Floor map
- Stopwatch and bell
- Catering
- Transport
- Pay
- Acknowledgement
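A worked example of the timetabling arithmetic, assuming 5-minute stations (an assumed figure, not one given in the slides): 8 active stations = 40 minutes of assessment, so add 1 rest station, giving a 9-station, 45-minute circuit. Every place in the circuit is occupied at once, so 9 candidates complete every 45 minutes – roughly 36 candidates in a 3-hour session. Each station's patient would then see all 36 candidates, well beyond the 10-students-per-day guide on the earlier slide, so real patients need to be rotated or duplicated across the session.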

OSCE – standard setting (establishing the pass mark)
- Most tools were developed for MCQs
- All are complex and time-consuming
- None is ideal
- The "borderline group" method is emerging (see the sketch below)
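For orientation, here is a minimal sketch of how the borderline group method is usually applied – the scoring scale and numbers are illustrative assumptions, not taken from these slides. At each station the examiner records both a checklist score and a separate global rating; the station's pass mark is then set at the mean checklist score of the candidates rated "borderline".

```python
from statistics import mean

def borderline_group_pass_mark(results):
    """Station pass mark = mean checklist score of candidates whose
    examiner gave a global rating of 'borderline'.
    results: list of (checklist_score, global_rating) tuples."""
    borderline = [score for score, rating in results if rating == "borderline"]
    if not borderline:
        raise ValueError("no borderline candidates at this station")
    return mean(borderline)

# Illustrative data: checklist score out of 20, plus the examiner's global rating.
station = [(16, "pass"), (11, "borderline"), (9, "fail"),
           (12, "borderline"), (18, "good"), (10, "borderline")]
print(borderline_group_pass_mark(station))  # (11 + 12 + 10) / 3 = 11
```

The exam-level pass mark is then typically the sum (or mean) of the station pass marks.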

OSCEs
- Should we run OSCE assessment ourselves?
- Should we run OSCE preparation courses?
- What is the difference between these two in practical terms?
- Should we film OSCEs?
- Should we do podcast OSCEs?
- Are remote OSCEs possible (e.g. via Skype), and could we be an enabler?