Evaluating the Reliability and Validity of the Family Conference OSCE Across Multiple Training Sites
Jeffrey G. Chipman MD, Constance C. Schmitz PhD, Travis P. Webb MD, Mohsen Shabahang MD PhD, Stephanie F. Donnelly MD, Joan M. VanCamp MD, and Amy L. Waer MD
University of Minnesota, Department of Surgery
Funded by:
– Association for Surgical Education Center for Excellence in Surgical Education, Research & Training (CESERT)
– Surgical Education Research Fellowship (SERF)

Introduction
– ACGME Outcome Project
  – Professionalism
  – Interpersonal & communication skills
– Need for a test with validated measures

Professionalism & Communication
– More important than clinical skills in the ICU (Crit Care Clin 20:363-80, 2004)
  – Communication
  – Accessibility
  – Continuity
– 1 out of 5 deaths in the US occurs in an ICU (Crit Care Med 32(3):638, 2004)
– < 5% of ICU patients can communicate when end-of-life decisions are made (Am J Respir Crit Care Med 155:15-20, 1997)

Family Conference OSCE
– Two 20-minute encounters (cases)
  – End-of-life (EOL)
  – Disclosure of a complication (DOC)
– Literature-based rating tools
– Trained family actors and raters
– Ratings by family, clinicians, and self
– Debriefing with video
Chipman et al. J Surg Educ 64(2):79-87, 2007

Family Conference OSCE: Minnesota Experience
– High internal consistency reliability
– Strong inter-rater agreement
– Raw differences favored PGY3s over PGY1s
– Small numbers
Chipman et al. J Surg Educ 64(2):79-87, 2007
Schmitz et al. Crit Care Med 35(12):A122, 2007
Schmitz et al. Simulation in Healthcare 3(4), 2008

Replication Study: Purpose
– Test the feasibility of replicating the OSCE
– Examine generalizability of scores
  – Institutions
  – Types of raters (clinical, family, resident)
– Examine construct validity
  – PGY1s vs. PGY3s

Replication Study: Methods
– 5 institutions
– IRB approval at each site
– Training conference (Minnesota)
– Site training
  – Detailed case scripts
  – Role plays
  – Videos of prior "good" and "bad" performances

Replication Study: Methods – Learner Assessment
– Assessment by:
  – Clinical raters (MD & RN)
  – Family actors
  – Self
– Only family raters were blinded
– Rating forms sent to Minnesota
– Data analyzed separately for the DOC and EOL cases

Generalizability Theory
– Classical test theory considers only one type of measurement error at a time:
  – Test-retest
  – Alternate forms
  – Internal consistency
  – Inter-rater agreement
– Generalizability theory allows for errors that arise from multiple sources:
  – Institutions
  – Rater type
  – Family actors
– Provides an overall summary as well as a breakdown by error sources and their combinations
Mushquash & O'Connor. SPSS and SAS programs for generalizability theory analyses. Behav Res Methods 38(3):542-47, 2006
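
As a rough illustration of what "multiple sources of error" means here, a fully crossed design in which every resident (p) is scored by every rater (r) at every institution (i) decomposes each observed score into additive effects. This is the textbook two-facet sketch, not the study's actual (partially nested) model:

```latex
% Fully crossed resident (p) x rater (r) x institution (i) decomposition
X_{pri} = \mu + \nu_p + \nu_r + \nu_i + \nu_{pr} + \nu_{pi} + \nu_{ri} + \nu_{pri,e}
% Total score variance is the sum of the corresponding variance components
\sigma^2(X_{pri}) = \sigma^2_p + \sigma^2_r + \sigma^2_i
                  + \sigma^2_{pr} + \sigma^2_{pi} + \sigma^2_{ri} + \sigma^2_{pri,e}
```

Only the resident component, sigma^2_p, is "signal"; every other component is a potential source of error whose size the G-study estimates.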

Generalizability Theory: Summary Statistics
– Summary statistics range from 0 to 1
  – 1.0 = perfectly reliable (generalizable) assessment
– Relative generalizability: stability in relative position (rank order)
– Absolute generalizability: agreement in actual score
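
For the simplest one-facet case (residents crossed with n_r raters), the two coefficients differ only in which variance components are charged to error. These are the generic textbook forms, not the multi-facet formulas used in the study:

```latex
% Relative G coefficient: only components that can reorder residents count as error
E\rho^2 = \frac{\sigma^2_p}{\sigma^2_p + \sigma^2_{pr,e}/n_r}

% Absolute G coefficient (Phi): rater leniency/severity also counts as error
\Phi = \frac{\sigma^2_p}{\sigma^2_p + \left(\sigma^2_r + \sigma^2_{pr,e}\right)/n_r}
```

Because Phi charges rater main effects to error while E-rho-squared does not, Phi is never larger, which is why an exam can support rank-ordering decisions but still not a defensible absolute passing score.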

Results: Feasibility
– N = 61 residents
– Implementation fidelity was achieved at each site
– Key factors:
  – Local surgeon champions
  – Experienced standardized patient program
  – On-site training (4 hrs) by PIs
  – Standardized materials & processes

Results: Internal Consistency Reliability
Cronbach's alpha was computed by case (Disclosure, 14 items; End-of-Life, 14 items) for each institution:
– University of Minnesota
– Hennepin County Medical Center
– University of Arizona
– Mayo Clinic
– Medical College of Wisconsin
– Scott & White, Texas A&M
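
For readers who want to run the same reliability check on their own rating-form data, a minimal sketch of Cronbach's alpha for a residents-by-items score matrix follows; the simulated 1-5 scale and the 14-item forms are assumptions for illustration, not the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a 2-D score matrix: rows = residents, columns = items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items on the form
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of residents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative call on simulated 1-5 ratings for 30 residents on 14 items.
rng = np.random.default_rng(0)
true_skill = rng.normal(3.5, 0.5, size=(30, 1))
scores = np.clip(np.round(true_skill + rng.normal(0, 0.5, size=(30, 14))), 1, 5)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```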

Results: Generalizability
Relative and absolute G coefficients were computed for each case (End-of-Life, n = 61; Disclosure, n = 61).
The relative G coefficients we obtained suggest the exam results can be used for formative or summative classroom assessment.
The absolute G coefficients suggest we would not want to set a passing score for the exam.
Downing. Reliability: on the reproducibility of assessment data. Med Educ 38, 2004
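
A minimal one-facet G-study sketch (residents crossed with raters, one score per cell) shows how the relative and absolute coefficients fall out of ANOVA-based variance components. The study itself used a multi-facet, partially nested design analyzed with the SPSS/SAS programs cited earlier, so this is only an illustration.

```python
import numpy as np

def one_facet_g_study(scores):
    """Variance components and G coefficients for a fully crossed
    residents (p) x raters (r) design with one score per cell.
    scores: 2-D array, rows = residents, columns = raters."""
    scores = np.asarray(scores, dtype=float)
    n_p, n_r = scores.shape
    grand = scores.mean()
    p_means = scores.mean(axis=1, keepdims=True)   # per-resident means
    r_means = scores.mean(axis=0, keepdims=True)   # per-rater means

    # Mean squares from the two-way ANOVA without replication
    ms_p = n_r * ((p_means - grand) ** 2).sum() / (n_p - 1)
    ms_r = n_p * ((r_means - grand) ** 2).sum() / (n_r - 1)
    ms_pr = ((scores - p_means - r_means + grand) ** 2).sum() / ((n_p - 1) * (n_r - 1))

    # Expected-mean-square estimates of the variance components
    var_pr = ms_pr                            # interaction confounded with residual error
    var_p = max((ms_p - ms_pr) / n_r, 0.0)    # residents: the variance of interest
    var_r = max((ms_r - ms_pr) / n_p, 0.0)    # rater leniency/severity

    rel_g = var_p / (var_p + var_pr / n_r)               # relative G coefficient
    abs_g = var_p / (var_p + (var_r + var_pr) / n_r)     # absolute G coefficient (phi)
    return {"var_p": var_p, "var_r": var_r, "var_pr": var_pr,
            "relative_G": rel_g, "absolute_G": abs_g}

# Illustrative call on simulated ratings (61 residents x 3 raters).
rng = np.random.default_rng(0)
sim = rng.normal(3.5, 0.6, (61, 1)) + rng.normal(0, 0.5, (61, 3))
print(one_facet_g_study(sim))
```

In a result like this, raters dominating the error variance (as noted in the Conclusions slide) would show up as var_r and var_pr being large relative to var_p, pulling the absolute coefficient down more than the relative one.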

Results: Construct Validity (MANOVA)
– Disclosure: p = 0.44
– End-of-Life: p = 0.41
The between-subjects effect (PGY1 vs. PGY3) was not significant (p = 0.66 for DOC, p = 0.26 for EOL).
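
A between-subjects MANOVA of this kind can be reproduced with standard statistical software. The sketch below uses statsmodels on simulated scores; the variable names and score scale are assumptions for illustration, not the study's data.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Simulated data: one row per resident, total score per case, training level as factor.
rng = np.random.default_rng(1)
n = 61
df = pd.DataFrame({
    "pgy": rng.choice(["PGY1", "PGY3"], size=n),
    "doc_score": rng.normal(55, 6, size=n),   # disclosure-of-complication case total
    "eol_score": rng.normal(53, 6, size=n),   # end-of-life case total
})

# Do the two case scores jointly differ by PGY level?
manova = MANOVA.from_formula("doc_score + eol_score ~ pgy", data=df)
print(manova.mv_test())   # Wilks' lambda, Pillai's trace, etc.
```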

Study Qualifications
– Only family members were blinded
  – Clinician and family ratings were significantly correlated on both EOL and DOC cases
– Nested vs. fully crossed design

Conclusions: Family Conference OSCE
– Feasible at multiple sites
– Generalizable scores
  – Useful for formative and summative feedback
  – Raters were the greatest source of error variance
– Did not demonstrate construct validity
  – Questions the assumption that PGY-3 residents are inherently better than PGY-1 residents, particularly in communication

Study Partners – Lurcat Group
– Amy Waer, MD – University of Arizona
– Travis Webb, MD – Medical College of Wisconsin
– Joan Van Camp, MD – Hennepin County Medical Center
– Mohsen Shabahang, MD, PhD – Scott & White Clinic, Texas A&M
– Stephanie Donnelly, MD – Mayo Clinic Rochester
– Connie Schmitz, PhD – University of Minnesota

Acknowledgments
– Jane Miller, PhD, and Ann Wohl, University of Minnesota IERC (Inter-professional Education Resource Center)
– Michael G. Luxenberg, PhD, and Matt Christenson, Professional Data Analysts, Inc., Minneapolis, Minnesota