Clinical Skills Verification Rater Training, MODULE 2
Training Faculty Evaluators of Clinical Skills: Drivers of Change in Assessment
Joan Anzia, M.D.
Tony Rostain, M.D.

Outline
- Mini pre-test!
- Brief history of assessment in medical education
- Drivers of change in assessment in medical education
- Miller’s pyramid
- Why is faculty training necessary?
- Methods to train faculty to evaluate clinical skills
- Post-test

Module 2 Pre-Test
1. A clinical skills exam of a trainee assesses whether he or she “knows how” according to Miller’s Pyramid.
   a. True
   b. False

Module 2 Pre-Test
2. Faculty evaluators in a group are preparing their individual evaluation scores for a videotaped trainee clinical skills exam, and comparing their scores with the scores of “expert” raters. This activity is called:
   a. Behavioral Observation Training
   b. Frame of Reference Training
   c. Direct Observation of Competence Training
   d. Performance Dimension Training

Brief history of assessment in medical education
- Through the 1950s: knowledge was evaluated through essays and open-ended questions graded by faculty. Clinical skill and judgment were tested with live oral examinations, sometimes after bedside data-gathering by the examinee.
- 1960s: multiple-choice exams were introduced to test the knowledge base.

Clinical Skills Exams vs. Multiple-Choice Question Exams

New technologies come on the scene
- The introduction of computers in the 1980s enabled large-scale testing with machine-scanned, machine-scored MCQs.
- Computers also allow the assessment of clinical decision-making through interactive item formats.
- Advances in psychometrics allow shorter tests, reduction of bias, and identification of sources of error.

Since the 1980s
- OSCEs (Objective Structured Clinical Examinations) have been fine-tuned, with improved psychometric qualities.
- Assessment of clinical skills and performance has lagged behind: faculty are often inexperienced observers, do not share common standards, and have not been trained to apply standards consistently.

Drivers of change in medical education
- Outcomes-based education: a focus on the “end product” rather than the process. What should a psychiatrist “look like” at the end of training?
- National initiatives in accountability, patient safety, and quality assurance: maintaining public trust in the medical profession and improving the quality of healthcare.

Levels of assessment: Miller’s Pyramid

Miller’s Pyramid
- Knows: what a trainee “knows” in an area of competence. Assessed with an MCQ-based exam.
- Knows how: does the trainee know how to use the knowledge (acquire data, analyze and interpret findings)? Assessed with an interactive reasoning exam.
- Shows how: can the trainee deliver a competent performance of the skill with a patient? Assessed with clinical skills exams.
- Does: does the clinician routinely perform at a competent level outside of a controlled testing environment? Assessed with performance-in-practice assessment and critical-incident systems.

Why is faculty training necessary?
“Assessment methods based on observation are only as good as the individuals using them.” (Holmboe and Hawkins, 2008)
- Faculty sometimes do not possess sufficient knowledge, skills, and attitudes in particular competencies.
- Competencies evolve over time, and faculty may not have been trained in specific competencies.

How do we train evaluators? Empirically studied training methods:
- Behavioral Observation Training (BOT)
- Performance Dimension Training (PDT)
- Frame of Reference Training (FoRT)
- Direct Observation of Competence Training

Behavioral Observation Training
- Get faculty to increase the number of their observations of trainees.
- Provide an observational aid that raters can use to record observations (a “behavioral diary”).
- Help faculty members learn how to prepare for an observation (determining goals, evaluator position, etc.).

Performance Dimension Training
- Designed to familiarize faculty with the performance dimensions used in the evaluation system; a critical element of all rater-training programs.
- Goal: define the criteria for each dimension of performance.
- Faculty interact to refine the criteria (what constitutes “superior performance,” etc.) and work toward consensus on the framework and specific criteria.

Frame of Reference Training
- Performance Dimension Training must be completed first.
- FoRT targets accuracy in rating; the goal is consistent ratings across faculty.
- Minimal criteria for satisfactory performance are defined first, then criteria for marginal performance.
- Faculty are given clinical vignettes describing performance in different ranges.

Frame of Reference Training (cont.)
- Faculty rate the performance described in each vignette.
- The trainer provides feedback on what the “true” ratings should be, with an explanation for each rating.
- Discrepancies between faculty ratings and the trainer’s “true” ratings are discussed.
- Repeated practice: “calibration.”
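As a hypothetical illustration of the calibration step: one way to quantify it is to score each rater by how far their ratings sit from the experts’ “true” ratings across the practice vignettes. The short Python sketch below (rater names and scores are invented, not taken from the CSV program) computes each rater’s mean absolute deviation from the expert ratings; repeated FoRT practice should drive this number toward zero.

    # Minimal sketch of quantifying rater calibration against expert ratings.
    # All rater names and scores are invented for illustration.
    from statistics import mean

    # Expert ("true") ratings for five practice vignettes, on a 1-5 scale.
    expert = [2, 4, 3, 5, 1]

    # Each faculty rater's scores for the same five vignettes.
    faculty = {
        "Rater A": [3, 4, 3, 4, 2],
        "Rater B": [2, 5, 4, 5, 1],
    }

    for name, scores in faculty.items():
        # Mean absolute deviation from the expert ratings:
        # lower means better calibrated.
        mad = mean(abs(s - e) for s, e in zip(scores, expert))
        print(f"{name}: mean absolute deviation = {mad:.2f}")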

Module 2 Post-Test
1. A clinical skills exam of a trainee assesses whether he or she “knows how” according to Miller’s Pyramid.
   a. True
   b. False

Module 2 Post-Test
1. A clinical skills exam of a trainee assesses whether he or she “knows how” according to Miller’s Pyramid.
   Answer: b. False. A CSV exam assesses whether a resident can “show how.”

Module 2 Post-Test
2. Faculty evaluators in a group are preparing their individual evaluation scores for a videotaped trainee clinical skills exam, and comparing their scores with the scores of “expert” raters. This activity is called:
   a. Behavioral Observation Training
   b. Frame of Reference Training
   c. Direct Observation of Competence Training
   d. Performance Dimension Training

Module 2 Post-Test
2. Faculty evaluators in a group are preparing their individual evaluation scores for a videotaped trainee clinical skills exam, and comparing their scores with the scores of “expert” raters. This activity is called:
   Answer: b. Frame of Reference Training