Graduate Medical Education Stanford University Medical Center


Improving Academic Program Quality by Standardizing and Streamlining the Evaluation Process for Programs and Faculty (Ann)
Graduate Medical Education, Stanford University Medical Center
Ann M. Dohn, MA, Designated Institutional Official
Nancy Piro, PhD, Education Specialist/Program Manager

Session Objectives (Ann)
- The Situation and Problems/Issues: Why Did We Do This?
- What Did We Do?
- How Did We Do It?
- Results

Why Did We Do This? The Situation (Ann)
- The largest number of citations for our programs was in the area of evaluations
- Program Directors' needs assessment results: templates for evaluation were one of the top five requests for the Department of GME
- Results of the GME review of evaluations

What Did We Find? (Ann)
- Poor-quality questions
- Inappropriate response scales
- Questions not core-competency based
- Evaluations not even being conducted
- Program Director frustration with the whole process

Setting the Stage (Ann)
PROGRAM EVALUATION: 82 programs, 14 citations. What can we do?

What Did We Do? Step 1: Standardized Questions (Ann)
- Collaboratively developed prototype questions, core-competency based and unbiased, for programs to use in evaluating faculty and residents
- Educated faculty and Program Directors on removing unintended cognitive bias from their evaluations

Tools Were Not Sufficient (Ann)
- Used group consensus to finalize the questions
- Distributed the questions to the Program Directors
- Educated them on removing bias, but… the frustration still remained

What Could We Do? Step 2: Centralized the Annual Program Evaluation Process (Ann)
- GME now administers program evaluations for all 82 programs
- Used the standardized program evaluation for: faculty, residents/fellows

How Is This Implemented? (Nancy)
GME electronically delivers the evaluations to each program's faculty and residents/fellows… Programs can choose one of two dates: February or May.

Standardized Program Evaluation by Faculty Form (Nancy)

Standardized Program Evaluation by Residents/Fellows Form (Nancy)

Generate reports for each program: 1) Program Evaluation by Faculty (Nancy)

Generate reports for each program: 2) Program Evaluation by the Residents/Fellows (Nancy)

But Still Not Enough (Ann)
Program Directors had the data but didn't use it… SIGH…

The Total GME Annual Program Review Package: Aggregate Data Delivered "to You" (Ann)
1. Evaluation results posted for all Program Directors on their Residency Management System website:
- Program Evaluation by Faculty
- Program Evaluation by Residents/Fellows
- Internal GME House Staff Annual Survey
- ACGME Survey (if available)

The Total GME Annual Program Review Package (Nancy)
2. Review checklist: all the essentials that need to be covered

GME Website Support (Nancy)

The Total GME Annual Program Review Package (Nancy)
3. Template agenda for the Annual Review Meeting

The Total GME Annual Program Review Package (Nancy)
Sign-in sheet/documentation for the web

Standardized Program Evaluation (Nancy)
Minutes (meeting documentation) template

Standardized Program Evaluation (Nancy)
As soon as they are completed, we post the aggregated summary reports for both faculty and residents to our online Residency Management System for each program.

Standardized Program Evaluation (Nancy)
Action plan documentation template: example of an action plan from one of our programs…

Each Program Is Reviewed by GME (Nancy)
GME ensures all the requirements of the Annual Program Review (APR) are completed and documented…

Outcomes (Ann)
100% compliance in 2010!

Faculty Evaluations
At the Dean's request, GME developed standardized questions for residents and fellows to evaluate their faculty.

Annual Academic Evaluation of Faculty

Roadblock (Ann)
- Could not get Program Director consensus
- Some faculty did not want the results going directly to the Dean's office without Chair "review"…

Where Are We? (Ann)
- Some programs are using it
- The Dean is to present it to the Program Directors

Summative Evaluation (Ann)
What is summative evaluation? A summative evaluation is a method of judging the competence of a trainee at the end of the program (summation). The focus is on the outcome. "Assessment with the primary purpose of establishing whether or not performance measured at a single defined point in time meets established performance standards, permanently recorded in the form of a grade or score."

Summative Evaluation (Ann)
Why do we do this? A program director must provide timely verification (written documentation) of residency education by completing summative performance evaluations:
- for all graduating residents/fellows
- for residents who leave the program prior to completion
Continued…

Summative Evaluations (Ann)
The ACGME also requires this for residents who are transferring into your program from another program: "The program director must obtain written or electronic verification of previous educational experiences and a summative competency-based performance evaluation of the transferring resident/intern."

Summative Evaluations (Nancy)
GME created templates for programs to use in completing summative evaluations for trainees who are graduating, leaving the program, or transferring…

Step 2, Continued… (Nancy)
- Provided faculty/Program Director education on the tools and processes
- Reviewed the requirements of summative evaluation…

Summary (Nancy)
We have found that:
- Standardizing forms and developing templates saves a lot of time for our programs/PDs and improves feedback to the programs
- Standardizing program evaluations in the GME Office not only saves time for the programs but assures GME that there will be no citations for not having program evaluations
- Our faculty have been engaged and very positive about development sessions from GME on eliminating bias from their evaluation questions and response scales

Questions? Contact information:
Ann M. Dohn, MA, Designated Institutional Official: adohn1@stanford.edu
Nancy Piro, PhD, Education Specialist/Program Manager: npiro@stanford.edu

Faculty Development Session: Evaluations
Our focus today: How do we eliminate unintended bias from our evaluation process?

What is cognitive bias?
Cognitive bias is a distortion in the way we perceive reality or information. Response bias is a type of cognitive bias that can affect the results of an evaluation if evaluators answer questions in the way they think they are designed to be answered, or with a positive or negative bias toward the fellow being evaluated.

Where does response bias occur?
- Most often in the wording of the question: response bias is present when a question contains a leading phrase or words.
- In rating scales: for example, scales with more positive than negative choices, or unbalanced five-point scales.
- In the raters themselves: central tendency, the halo effect (and its corollary, the devil effect), and the similarity effect.

Examples of Question Bias
Example 1: "I can always talk to my Program Director about residency-related problems."
Problem: Terms such as "always" and "never" will bias the response in the opposite direction.
Result: Data will be skewed.

Examples of Question Bias
Example 2: "Career planning resources are available to me and my program director supports my professional aspirations."
Problem: Double-barreled question (resources and aspirations). Respondents may agree with one and not the other, and the evaluator cannot make assumptions about which part of the question respondents were rating.
Result: Data is useless.

Examples of Question Bias
Example 3: "Communication in my program is good."
Problem: The question is too broad. If the score is less than 100% positive, the researcher/evaluator still does not know what aspect of communication needs improvement.
Result: Data is of little or no use.
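
These three checks lend themselves to simple automation when reviewing a question bank. Below is a minimal, hypothetical Python sketch of such a question "linter"; the word list and heuristics are illustrative assumptions, not part of the GME process described in this presentation.

import re

# Hypothetical wording checks mirroring Examples 1-3 above.
ABSOLUTES = {"always", "never", "all", "none", "every"}

def flag_question(question: str) -> list:
    """Return a list of possible wording problems in a survey question."""
    issues = []
    words = set(re.findall(r"[a-z']+", question.lower()))
    if words & ABSOLUTES:
        issues.append("absolute term ('always'/'never') may skew responses")
    if "and" in words:
        issues.append("possible double-barreled question ('and' joins two claims)")
    if len(question.split()) <= 6:
        issues.append("may be too broad to yield actionable data")
    return issues

for q in [
    "I can always talk to my Program Director about residency-related problems.",
    "Career planning resources are available to me and my program director supports my professional aspirations.",
    "Communication in my program is good.",
]:
    print(q)
    for issue in flag_question(q):
        print("  -", issue)

Heuristics like these only surface candidates for review; a human still has to judge whether an "and" genuinely joins two separate claims.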

Rating Scale Bias
"Competence and knowledge in general medicine."
Poor / Fair / Good / Very Good / Excellent
The data will be artificially skewed in the positive direction with this scale because there are far more positive than negative rating options (4:1).
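
A toy simulation makes the skew concrete. The response model below is an assumption made for illustration (not data from this presentation): raters hold opinions symmetric around neutral, but on the Poor/Fair/Good/Very Good/Excellent scale only "Poor" reads as clearly negative, so mildly negative opinions land on "Fair" or "Good" and the mean inflates.

from statistics import mean
import random

random.seed(0)

# Latent opinions, symmetric around neutral (0).
opinions = [random.gauss(0, 1) for _ in range(10_000)]

def score(x, cuts):
    """Map a latent opinion to a 1-5 score given four category cut points."""
    return sum(x > c for c in cuts) + 1

# Balanced scale: 2 negative, 1 neutral, 2 positive categories.
balanced_cuts = [-1.5, -0.5, 0.5, 1.5]

# Poor/Fair/Good/Very Good/Excellent: assumed mapping in which everything
# above 'Poor' sounds acceptable, pushing the cut points downward.
skewed_cuts = [-1.5, -0.9, 0.0, 0.9]

print("balanced scale mean:", round(mean(score(x, balanced_cuts) for x in opinions), 2))
print("skewed scale mean:  ", round(mean(score(x, skewed_cuts) for x in opinions), 2))

With identical underlying opinions, the balanced scale averages about 3.0 while the skewed scale averages noticeably higher: exactly the artificial positive skew described above.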

Rater/Evaluator Bias
Response bias can be in the evaluators themselves:
- Central tendency
- Similarity effect
- Halo effect

Beware the Halo Effect
The halo effect is a cognitive bias whereby the perception of one behavior or trait is influenced by the perception of earlier traits in a sequence of interpretations. People seem not to think of other individuals in mixed terms; instead we seem to see each person as roughly good or roughly bad across all categories of measurement. Thorndike (1920) was the first to support the halo effect with empirical research: in a study published in 1920, he asked commanding officers to rate their soldiers and found high cross-correlation between all positive and all negative traits.
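
Thorndike's signature finding, uniformly high cross-correlations between trait ratings, is easy to reproduce on synthetic data. The Python sketch below is purely illustrative (the traits and noise levels are invented): one global impression per trainee drives every trait score, and the resulting pairwise correlations come out high across the board.

import random

random.seed(1)

TRAITS = ["knowledge", "professionalism", "communication"]

# Synthetic ratings: a single overall impression (the halo) plus a
# little trait-specific noise for each of 200 rated trainees.
ratings = []
for _ in range(200):
    impression = random.gauss(0, 1)
    ratings.append([impression + random.gauss(0, 0.3) for _ in TRAITS])

def corr(xs, ys):
    """Pearson correlation of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

for i in range(len(TRAITS)):
    for j in range(i + 1, len(TRAITS)):
        print(TRAITS[i], "vs", TRAITS[j], "r =",
              round(corr([r[i] for r in ratings], [r[j] for r in ratings]), 2))

Every pairwise r lands near 0.9 even though the traits share no content, only a shared rater impression; real evaluation data with correlations that high across unrelated competencies is a warning sign of halo.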

The Halo Effect and Expectations
The halo effect is involved in Kelley's implicit personality theory: the first traits we recognize in other people influence our interpretation and perception of later ones because of our expectations.

The Halo Effect Extends to Products/Marketing
The iPod has had positive effects on perceptions of Apple's other products…

Could This Impact Our Evaluations Here?
Empirical evidence from our house staff: the most recent GME house staff survey asked about the statement "The general feeling in my program is that your ability will be labeled based on your initial performance."
[Chart: responses to "If someone makes a mistake, it is often held against them," for SHC-LPCH overall and Peds fellows overall, on a six-point scale from Strongly Agree to Strongly Disagree; agreement flagged red/yellow, disagreement green.]

Reverse Halo Effect
A corollary to the halo effect is the reverse halo effect (devil effect): individuals, brands, or other things judged to have a single undesirable trait are subsequently judged to have many poor traits, allowing a single weak point or negative trait to influence others' perception of them in general.

Blind Spots
In the 1970s, the social psychologist Richard Nisbett demonstrated that we may have no awareness of when the halo effect influences us (Nisbett, R.E. and Wilson, T.D., 1977). The problem with blind spots is that we are blind to them…