Structured Objective Clinical Examination Practical

Presentation transcript:

Structured Objective Clinical Examination Practical

What is OSCE?
- a series of stations with tasks
- planned
- marking form
- examiner
- patient: SP (standardized patient)
- organization > examination

Why OSCE?
- before the OSCE (1975): viva (oral), long case, short case
- valid? tested "knows how", NOT "shows how"
- reliable? different patients, different examiners

Why OSCE?
- more valid: "shows how"
- more reliable:
  - same task | patient
  - same examiner, or
  - same structured marking sheet

Basic Structure

Basic Structure: Parallel (A, B, C, D)

Basic Structure: Double

How to start?
- blueprint the whole OSCE
- design the stations
- design the mark sheets

Blueprint
- a reversed table of specifications

Station Design
- station time: 4-15 min
- total time: < 2 hrs
- focused task
- examiner used? Y | N; if yes, who?
- pilot the station

Types of Stations
- static | written
- practical: technique
- clinical

Marking Sheet Design
- checklist
- rating scale
- score

Checklist
- dichotomous: Yes | No
- Pros: high objectivity; high reliability; easy to give feedback
- Cons: checks only quantity, not quality
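The "easy to score" point can be made concrete: a dichotomous checklist reduces to summing the points for each observed "Yes". A minimal Python sketch (the item names, weights, and observed marks are all hypothetical):

```python
# Scoring a dichotomous (Yes | No) OSCE checklist.
# Item list and observed marks below are hypothetical.
def score_checklist(items, observed):
    """Return (points scored, total points available) for a Yes/No checklist."""
    scored = sum(pts for name, pts in items if observed.get(name))
    total = sum(pts for _, pts in items)
    return scored, total

items = [
    ("washes hands", 1),
    ("introduces self", 1),
    ("explains procedure", 2),
    ("checks landmarks", 2),
]
observed = {"washes hands": True, "introduces self": True, "checks landmarks": True}
print(score_checklist(items, observed))  # → (4, 6)
```

Because the examiner only records Yes/No per item, two examiners with the same marked sheet always compute the same score, which is where the high objectivity comes from.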

Checklist: how to improve?
- each stem: clear, observable, not too long
- overall: not too long

Rating Scale
- rates quality (addresses the quality concern)
- lower objectivity → lower reliability

Rating Scale: how to improve?
- 3-7 scale points
- clearer definition of each scale point
- more raters
- rater training
- know the common errors of rating scales

Rating Scale: common errors
- leniency error
- central tendency error
- halo effect
- logical error
- proximity error
- contrast error

Examiner
- station developer
- non-developer: teacher | not a teacher
- other staff
- SP
- examiner participation => reliability

Observation
- direct
- indirect: one-way mirror, monitor, video

Getting Feedback: How?
- verbal
- marked checklist & being the subject
- marked checklist & watching the video
- printed answer
- relevant papers

Getting Feedback: When?
- during the exam: intra-station or in another station (stress? NB: too much information!)
- after the exam: at the end of all stations

Station & Marking
- learning by doing
- 8 groups: A-E in each group
- 2 x 4 stations
- 2(p) - 4(x) - 4(d)
- signal:
- materials & ID
- task (flexible)
- lunch: 3rd floor

Setting an OSCE
- learning by doing | 8 groups
- structured task | medical student V
- time: 7 min test (without feedback), or 5 min test + 2 min with feedback
- available tools: please ask
- draft of test and marking sheet: ~4 p.m.
- preparation: a.m.

Minimal Passing Score
- criterion-referenced:
  - holistic
  - modified Angoff
- norm-referenced:
  - borderline method
  - relative method

Holistic Method
- the medical school's faculty-wide pass mark, e.g. 60%

Modified Angoff Method
- a group of experts
- takes the OSCE
- "think of a group of minimally acceptable candidates"
- decides the probability that such a candidate would succeed on each item
- brief discussion, then decides again
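The arithmetic behind the Angoff judgments is simple: average the experts' per-item probability estimates, then sum the item averages to get the expected score of a borderline candidate. A minimal sketch (the expert ratings below are hypothetical):

```python
# Modified Angoff pass mark: experts estimate, per checklist item, the
# probability that a minimally acceptable candidate would score the point.
# The ratings below are hypothetical illustrative values.
def angoff_pass_mark(expert_probabilities):
    """Average each item's probability across experts; the sum of the
    item averages is the proposed pass mark (in raw points)."""
    item_means = [sum(col) / len(col) for col in zip(*expert_probabilities)]
    return sum(item_means)

# Three experts rate a 4-item station checklist (one point per item):
ratings = [
    [0.9, 0.7, 0.5, 0.8],   # expert 1
    [0.8, 0.6, 0.6, 0.7],   # expert 2
    [1.0, 0.8, 0.4, 0.9],   # expert 3
]
print(angoff_pass_mark(ratings))  # pass mark out of 4 points (about 2.9)
```

In practice the experts would discuss outlying estimates and re-rate before this average is taken, as the slide notes.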

Minimal Passing Score
- criterion-referenced:
  - holistic
  - modified Angoff
- norm-referenced:
  - borderline method
  - relative method

Borderline Method
- marking form: checklist + global rating
- collect all students rated 'borderline' on the global rating
- pass mark = mean checklist score of the 'borderline' group

Borderline Method (example): mean borderline score = ( ) / 3 = 74
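The borderline calculation itself is just a filtered mean. A sketch, assuming each student record pairs a checklist score with a global rating; the data are hypothetical, chosen so that three borderline students average to the slide's pass mark of 74:

```python
# Borderline standard-setting method: the pass mark is the mean checklist
# score of students the examiner globally rated as 'borderline'.
# The (score, rating) records below are hypothetical.
def borderline_pass_mark(results):
    scores = [score for score, rating in results if rating == "borderline"]
    return sum(scores) / len(scores)

results = [
    (82, "pass"),
    (74, "borderline"),
    (70, "borderline"),
    (78, "borderline"),
    (55, "fail"),
]
print(borderline_pass_mark(results))  # → 74.0
```

Note that the checklist scores of the clear passes and fails never enter the calculation; only the examiner's global ratings decide who counts as borderline.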

Relative Method
- 1st method: Wijnen method: passing mark = mean - 1.96 SE
- 2nd method: 60% of the score at the 95th percentile rank
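Both relative methods can be sketched in a few lines, given a list of candidates' total scores. The slide does not specify a percentile convention, so the ceiling-based rank used below is an assumption, and the score list is hypothetical:

```python
# Two norm-referenced (relative) pass marks; the score list is hypothetical.
import math
import statistics

def wijnen_pass_mark(scores):
    """Wijnen method: pass mark = mean - 1.96 * standard error of the mean."""
    se = statistics.stdev(scores) / math.sqrt(len(scores))
    return statistics.mean(scores) - 1.96 * se

def sixty_pct_of_95th_percentile(scores):
    """Second method: 60% of the score at the 95th percentile rank.
    The rank convention (ceiling of 0.95 * n) is an assumption."""
    ranked = sorted(scores)
    idx = math.ceil(0.95 * len(ranked)) - 1
    return 0.6 * ranked[idx]

scores = [55, 60, 62, 65, 68, 70, 72, 75, 78, 85]
print(round(wijnen_pass_mark(scores), 1))      # mean 69.0 minus about 5.6
print(sixty_pct_of_95th_percentile(scores))    # 60% of the top-ranked score here
```

Because both marks move with the cohort's performance, a strong cohort raises the bar and a weak one lowers it, which is exactly the norm-referenced behaviour criterion-referenced methods avoid.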

Minimal Passing Stations
- criterion-referenced

Staff & OSCE: Likes
- emotional comfort
- assesses validly
- consistent

Staff & OSCE: Dislikes
- too compartmentalized
- no opportunity to observe a student's complete patient evaluation
- repetitive nature => boring

Students & OSCE
- fairer than other methods
- less stressful
- unsure whether the important aspects were tested

Limitations of OSCE
- lengthy preparation
- demands more observational skill from staff
- costly
- low inter-station correlation
- test security?

What's next?
- evaluation => learning
- summative => formative

Innovations
- senior students as SPs and examiners in the OSCE
- a study sheet listing diagnoses (Dx) that might appear on the OSCEs
- adding a structured oral exam into the OSCE
- GOSCE

GOSCE: Group OSCE
- Pros: economy; mutual teaching; mutual support; opportunity to examine social skills
- Cons: lack of individual assessment; different participants do different tasks

Potential Uses of GOSCE
- formative assessment
- end-of-course assessment
- exploring interpersonal relationships
- a teaching method for short courses

Re-using OSCE Stations
- across rotations in the same academic year: statistically OK
- from year to year: statistically not OK

Conclusion: OSCE
- What? stations + tasks + checklists
- Why? more valid, more reliable
- How? how to organize; how to analyze