Direct Observation of Clinical Skills During Patient Care NEW INSIGHTS – REYNOLDS MEETING 2012 Direct Observation Team: J. Kogan, L. Conforti, W. Iobst, E. Holmboe


2 Medical Education Trend: Structure/Process → Competency Based Education
Structure/Process                | Competency Based
Fixed length, variable outcome   | Variable length, defined outcome
Knowledge acquisition            | Knowledge application
Single subjective measure        | Multiple objective measures
Norm-referenced evaluation       | Criterion-referenced evaluation
Evaluation setting removed       | Evaluation setting: direct observation
Emphasis on summative            | Emphasis on formative
Carraccio et al., 2002

3 In-Training Performance Assessment
 Assessment in authentic situations
 Learners' ability to combine knowledge, skills, judgments, and attitudes in dealing with realistic problems of professional practice
 Assessment in day-to-day practice
 Enables assessment of a range of essential competencies, some of which cannot be validly assessed otherwise
Govaerts MJB et al. Adv Health Sci Educ. 2007;12:239-60

4 Observation to Test Assumptions
 Direct observation can test assumptions
 4 observations needed to detect outliers
 Shared responsibility
[Diagram: over time/task, early observations serve to detect outliers; later observations shift toward feedback/development]

5 Observation and Safe Patient Care
 Importance of appropriate supervision
 Entrustment:
Trainee performance* × Appropriate level of supervision** = Safe, effective, patient-centered care
* a function of the trainee's level of competence in context
** a function of the attending's competence in context

6 Types of Supervision
 Routine oversight: clinical oversight planned in advance (i.e., what we normally do)
 Responsive oversight: clinical activities that occur in response to trainee- or patient-specific issues (i.e., you do more than usual)
 Direct patient care: when the supervisor moves beyond oversight to actively providing care for the patient
 Backstage oversight: clinical oversight of which the trainee is not aware
Kennedy TJT et al. JGIM

7 Your Supervision
 How do you usually supervise?
 When do you supervise more closely?
 How do you change your supervision to ensure patients get safe, effective, patient-centered care?
 What did you learn from observing that will change how you supervise going forward?
 Remember: supervision is also an opportunity for feedback

8 Entrustment  “A practitioner has demonstrated the necessary knowledge, skills, and attitudes to be trusted to independently perform this activity.” Ten Cate O, Scheele F. Acad Med 2007;82:542-7

9 Problems with Performance Assessment
 Poor accuracy:
   Focus on different aspects of clinical performance
   Differing expectations about levels of acceptable clinical performance
 Rating errors:
   Halo effect / "horn" effect
   Leniency/stringency effect

10 Factors That May Impact Ratings
 Minimal impact of demographics: age, gender, clinical and teaching experience
 Faculty's own clinical skills may matter: faculty with higher history-taking and patient-satisfaction performance scores provide more stringent ratings
Kogan JR et al. Acad Med. 2010;85(10 Suppl):S25-8

11 Factors Influencing Faculty Ratings
 Different frameworks for judgments/ratings:
   Self-as-reference (predominant)
   Trainee level
   Absolute standard
   Practicing physicians

12 Faculty OSCE Clinical Skills (N=44)
Competency           | Mean (SD)     | Range      | Generalizability
History Taking       | 65.5% (9.6%)  | 34%–79%    | 0.80
Physical Exam        | 78.9% (13.6%) | 36%–100%   | 0.52
Counseling           | 77.1% (7.8%)  | 60%–93%    | 0.33
Patient Satisfaction | (0.48)        | 4.43 –     | (on 7-point scale)
Kogan JR et al. Acad Med. 2010;85(10 Suppl):S25-8
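The generalizability column comes from generalizability theory: for a simple fully crossed persons × raters design, the relative G coefficient can be estimated from ANOVA variance components. A minimal sketch in plain Python, using a made-up 4 residents × 3 raters matrix (the data below are hypothetical, not the study's):

```python
def g_coefficient(scores):
    """Relative G coefficient for a fully crossed persons x raters
    design (one score per cell), from ANOVA variance components."""
    n_p, n_r = len(scores), len(scores[0])
    grand = sum(sum(row) for row in scores) / (n_p * n_r)
    row_means = [sum(row) / n_r for row in scores]
    col_means = [sum(scores[i][j] for i in range(n_p)) / n_p
                 for j in range(n_r)]
    # Sums of squares for persons, raters, and the residual.
    ss_p = n_r * sum((m - grand) ** 2 for m in row_means)
    ss_r = n_p * sum((m - grand) ** 2 for m in col_means)
    ss_tot = sum((x - grand) ** 2 for row in scores for x in row)
    ss_res = ss_tot - ss_p - ss_r
    ms_p = ss_p / (n_p - 1)
    ms_res = ss_res / ((n_p - 1) * (n_r - 1))
    # Variance components: person (true-score) and residual (error).
    var_p = max((ms_p - ms_res) / n_r, 0.0)
    var_res = max(ms_res, 0.0)
    denom = var_p + var_res / n_r
    return var_p / denom if denom > 0 else 0.0

# Hypothetical ratings: 4 residents, each scored by the same 3 raters.
ratings = [
    [6.0, 5.5, 6.5],
    [4.0, 4.5, 4.0],
    [7.0, 6.5, 7.5],
    [5.0, 5.5, 5.0],
]
print(round(g_coefficient(ratings), 2))
```

A G coefficient near 1 means rankings of residents are stable across raters; the low values for counseling (0.33) in the table imply many more raters or stations would be needed for dependable scores.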

13 Other Factors Influencing Ratings
 Factors external to resident performance:
   Encounter complexity
   Resident characteristics
   Institutional culture
 Emotional impact of constructive feedback
 Role of inference

14 Definition of Inference
a. The act or process of deriving logical conclusions from premises known or assumed to be true.
b. The act of reasoning from factual knowledge or evidence.

15 [Diagram: ladder of inference — concrete data (resident actions) → affix meaning → assumptions → conclusions]

16 The Problem with Inference
 Inferences are not recognized
 Inferences are rarely validated for accuracy
 Inferences can be wrong

17 Types of Inference about Residents
 Skills
 Knowledge
 Competence
 Work ethic
 Prior experiences
 Familiarity with scenario
 Feelings: comfort, confidence
 Intentions
 Ownership
 Personality
 Culture

18 High Level Inference

19 Direct Observation: A Conceptual Model Kogan JR, et al. Med Educ. 2011

20 International Comparative Work
 Yeates (UK):
   Differential salience
   Criterion uncertainty
   Information integration
 Govaerts (Netherlands):
   Use of task-specific and person schemas
   Substantial inference in person schemas
   Rater idiosyncrasy
 Gingerich (Canada):
   Impact of social models: clusters; person; labels

21 Achieving Accurate, Reliable Ratings
 The form is not the magic bullet
 Assessment requires faculty training:
   Similar frameworks
   Agreed-upon levels of competence
 Move to criterion-referenced assessment

22 Questions