The Mini-CEX and the Evaluation of Clinical Skills
National Health Service Foundation Training Program

Patient Care Competencies
Gather accurate, essential information from all sources, including medical interviews, physical examinations, medical records, and diagnostic/therapeutic procedures.
Make informed recommendations about preventive, diagnostic, and therapeutic options and interventions that are based on clinical judgment, scientific evidence, and patient preference.

Patient Care: Themes
Clinical skills are essential to patient care.
You cannot make “good” decisions unless you work with good, accurate information.
– GIGO (“garbage in, garbage out”) principle
Evaluation of clinical skills requires direct observation.

Workshop Objectives
Review the current state of:
– Physician clinical skills
– Faculty evaluation skills
Understand the importance of direct observation by faculty for the assessment of clinical skills.

Workshop Objectives
Discuss practical strategies for focused direct observation.
Review rater training methods:
– Direct Observation of Competence (DOC) training

Workshop Elements
Mini-lectures:
– State of clinical skills
– Quality of faculty ratings
Direct observation exercises:
– Performance dimension exercise
– Frame of reference training

Video Exercise
Situation: A foundation trainee performs a medical interview in the outpatient setting. Using the mini-CEX form provided, please rate the performance of this trainee.

Key “Basic” Clinical Skills
Medical interviewing
Physical examination
Counseling/patient education
Clinical judgment/reasoning
Reflective practice:
– Self-directed learning
– Professional growth and improvement
– Medical errors

Are Clinical Skills Important?
Where do clinical skills fall in the hierarchy of physician competencies and mastery in an era of advanced technology?

Diagnosis and the Medical Interview
Hampton (BMJ, 1975):
– Medical interview: 82%
– Physical exam: 9%
– Laboratory: 9%
Kirch (Medicine, 1996):
– Medical interview (+ physical exam): 70%
– Imaging: 35%

Importance of Sound Clinical Skills
Diagnostic errors:
– An inaccurate or incomplete medical interview is one of the leading causes (Bordage)
Wrong information leads to wrong decisions.
Patient satisfaction:
– Higher with better communication skills
Patient self-care:
– Better adherence and outcomes are associated with better physician communication skills

Clinical Reasoning: A Primer
(Model diagram) Elements: patient/situation characteristics, prior knowledge, context, problem representation, information gathering, evaluation, action.
Gruppen and Frohna, International Handbook on Research, 2002

Clinical Skills: U.K. Trainees
Fox (2000):
– Voluntary study of 22 PRHOs using an OSCE
– Only 45% achieved a passing score on the drug advice communication station
– 0% passed the locomotor system examination
Evans (2004):
– 26 new PRHOs
– All had passed a 22-station OSCE in medical school
– The majority failed skill stations in blood pressure measurement and cannulation

Clinical Skills: U.S. Trainees
Stillman (1990):
– OSCE: wide variability in graduating medical students’ clinical skills
Mangione (1997):
– Deficient cardiac and pulmonary auscultatory skills
– Medical students, family practice (FP) and internal medicine (IM) residents
– Findings replicated in Canada and the U.K.

Clinical Skills: Practicing MDs
Ramsey (1998):
– Incomplete history-taking and preventive health screening among primary care physicians in the northwest United States
Braddock (1999):
– Study of informed decision making (IDM) and counseling
– Simple analysis of the presence or absence of 7 key elements
– 1,058 outpatient visits: only 9% of visits met minimal criteria for IDM

Importance of Faculty: U.K. Studies
Grant (Med Educ, 2003):
– Inadequate coverage and frequency of supervision activities
– Discordance between specialist registrars and attendings
Kilminster (Med Educ, 2000):
– Systematic review of supervision
– Better supervision associated with improved patient safety and quality of care

Importance of Faculty: U.S. Studies
Inpatient study (Lancet, 2003):
– Reviewed 100 consecutive admissions to a U.S. teaching service
– Faculty detected 26 physical examination findings missed by house officers that affected patient care
Outpatient studies:
– Two separate studies showed that faculty assessment disagreed with that of the house officer in up to 30% of patients

Clinical Skills: Themes
Deficiencies exist across the continuum.
Specific skills are more “error-prone”:
– e.g., musculoskeletal and neurological exams
Deficiencies are not detected by other evaluation methods:
– Performance of basic clinical skills does not correlate with performance in other dimensions of competence

Clinical Skills: Themes
House officers:
– Are aware of the importance of clinical skills
– Recognize their under-emphasis
Without detection, deficiencies in clinical skills cannot be corrected.

Miller’s Pyramid
(Pyramid diagram) Levels and matching assessment methods:
– Knows: MCQ exam
– Knows how: extended matching / CRQ
– Shows how: OSCE
– Does: portfolios, faculty observation

Faculty Observation / Rating Skills
Patient care settings:
– Ratings based mostly on perceived knowledge and personality
– Little evidence of direct observation
– Significant “halo” effect
(Gray, Thompson, Haber, Grant, etc.)

Faculty Observation / Rating Skills
Research settings:
– Poor inter-rater reliability
– Brief rater training methods are ineffective (didactic instruction, demonstration videos without practice)
– Accuracy: structured forms > open-ended forms
– Increased accuracy → improved discriminative ability
(Kalet, Herbers, Noel, Kroboth)

Faculty as Raters – Key Issues
Faculty do not observe actual performance.
Faculty ratings lack:
– Reliability
– Accuracy/validity
Content specificity
– How comfortable are you with your own skills?

Improving Faculty Ratings: Solutions
Step 1: Getting faculty to observe
– Required as part of the Foundation Program
– Focused observations are logistically possible
– 5- to 10-minute observations are valuable
– Build on the faculty “epiphany”: the “you will not believe what I saw today” experience
– Provide a “usable tool”

Foundation Mini-CEX Tool
Simple rating scale using 6 dimensions plus an overall rating
A “structured” approach to direct observation
Direct assessment of actual patient care
Incorporation of the mini-CEX into daily activities
Evidence-based

Research: Mini-CEX Tool
Two large-scale U.S. studies involving a total of 36 residency programs
Logistically feasible to incorporate the mini-CEX into daily activities
High satisfaction among house staff
Good to excellent reliability characteristics
Overall and interpersonal scores correlated with trainees’ ECFMG OSCE scores

Logistics: Outpatient Clinic
One mini-CEX per trainee per day per week
– One attending observes a portion of the first visit of the day
– Minimizes disruption of the clinic
– Performed over the course of the academic year
– Easy to obtain 6-8 mini-CEXs per year per trainee in a single setting

The Patient Encounter
Sampling “parts” of the encounter: interview, physical exam, counseling

Solutions: Step 2
Improve reliability:
– Multiple brief observations
– Perform over time: the outpatient setting allows for longitudinal observation
– Involve multiple faculty
– Mini-CEX: sufficient reliability for pass/fail determinations after just 4 observations (see the note below)
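A psychometric aside, not from the original slides and using an illustrative single-encounter reliability rather than a figure reported in these studies: the Spearman-Brown prophecy formula estimates how the reliability of a composite score grows with the number of observations k,

R_k = k * r / (1 + (k - 1) * r)

where r is the reliability of a single mini-CEX encounter. If, for illustration, r = 0.4, then four encounters give R_4 = (4 * 0.4) / (1 + 3 * 0.4) = 1.6 / 2.2 ≈ 0.73, which is why several brief observations by different faculty support more defensible pass/fail decisions than any single encounter.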

Solutions: Step 3
Improve accuracy and validity:
– The most difficult step
– Use structured rating forms
– Rater training (faculty development)
Caveat: brief “one-time” interventions do not work.

Does Faculty Training Work?
Performance appraisal literature:
– Can reduce rating errors
– Can improve discriminative ability
– Can improve accuracy

Approaches to Rater Training
Behavioral Observation Training
Performance Dimension Training
Frame of Reference Training
Direct Observation of Competence Training

Videotape Exercise: Behavioral Observation Training (BOT)
Situation: An attending is performing a mini-CEX of a house officer performing a physical exam.
Questions:
– How well did this attending evaluate the house officer?
– How was the house officer-patient interaction affected?

Basic Faculty Observation Skills
Prepare for the observation:
– Faculty: know what you’re looking for
– Resident: let them know what to expect
– Patient: let them know why you are there
Minimize intrusiveness: correct positioning
Minimize interference with the house officer-patient interaction
Avoid distractions

Triangulation
(Room-layout diagram showing the positions of the desk, resident, patient, and attending)

Basic Observation Strategies
Increase the amount of “sampling”:
– More observations lead to more accurate evaluations (“practice makes perfect”)
Use observational “aids”:
– A behavioral diary to record observed performance
– U.S. study: a simple 3x5 card diary led to more comments on rating forms

Performance Dimension Training
Group exercises designed to familiarize faculty with the specific elements of a competency
Should involve discussion of the criteria required for each element
Use defined, agreed-upon elements of a competency to calibrate faculty:
– Playing from the “same sheet of music”

PDT Exercise
In your small group, discuss what the basic components of an effective medical interview should be for a foundation trainee performing an outpatient consultation.

Frame of Reference Training
The goal is to improve “judgment” and accuracy.
Steps in FOR training:
1. Group performance dimension training (PDT) exercise
2. Review clinical vignettes that describe critical incidents of performance, ranging from unsatisfactory to average to superior

Frame of Reference Training
3. Faculty, using the framework developed in the PDT exercise, provide ratings on a behaviorally anchored rating scale (BARS)
4. The session trainer provides feedback on what the “true” ratings should be for each vignette, along with the rationale
5. The group finishes by discussing discrepancies between the trainer’s ratings and the participants’ ratings

Frame of Reference Training
The most difficult aspect of FOR:
– Setting the actual standards that distinguish between levels of performance
– Reaching agreement and/or consensus among teaching faculty

DOC Training
A combination of:
– Behavioral observation training
– Performance dimension training
– Frame of reference training
– “Live” practice in observation with standardized residents/patients
Individual evaluation and feedback
Group debrief with evaluation and feedback

DOC Training Trial
Randomized controlled trial of 40 faculty from 16 residency programs
DOC training:
– High satisfaction (favorite aspect of the course)
– Increased comfort in observation
– Changed rating behavior at 8 months
– Increased accuracy in identifying unsatisfactory performance

Direct Observation: Challenges
Like all skills, direct observation requires training and practice.
Faculty “calibration” is important:
– Agreeing on the “metrics” of performance
– Faculty comfort with their own skills
Faculty training:
– Brief interventions are mostly ineffective

Observation: Helpful Hints
Sample “parts” of the visit:
– History-taking
– Physical examination
– Counseling
Perform longitudinally:
– No need to do it all at once
Agree on performance metrics with faculty

Summary
Basic clinical skills are important: so is the need to observe them!
Observation is a complex skill that requires training and practice.
Direct observation by educators will remain a critical component of both evaluation and feedback.