Comparison of Osteopathic Medical School Curricula in Teaching Clinical Reasoning
Amanda Kocoloski, OMS IV; Gordon Marler, OMS IV; Nicole Wadsworth, D.O.; Grace Brannan, Ph.D.; John George, Ph.D.; Melanie Davis, M.A.; Godwin Dogbey, Ph.D.

Abstract

Introduction: Osteopathic medical schools utilize different styles of curriculum for preclinical education. This study compared preclinical students from two schools with different curricular styles and measured their performance on the web-based clinical simulation program DxR Clinician™.

Methodology: A total of 17 students from School A and 51 from School B completed the same simulated case prior to beginning clinical clerkships. Grade point average (GPA) and Medical College Admission Test (MCAT®) scores were analyzed to assess homogeneity between the groups, and then DxR Clinician™ performance was compared.

Results: There were significant differences between the two groups in mean cumulative GPA (3.60 vs. 3.45, p = .046), composite MCAT® score (24.5 vs. 26.6, p = .012), and biological science MCAT® score (8.41 vs. 9.59, p = .003). Students from School A performed significantly fewer exams (13.1 vs. 30.2, p = .003) and completed a lower percentage of required exams (22.2 vs. 35.0, p = .034) than School B. However, School A ordered a higher percentage of required labs (13.6 vs. 5.00, p = .050) and generated significantly more hypotheses than School B (5.35 vs. 3.57, p = .011).

Conclusion: We observed some significant differences between the two schools in their performance on the DxR Clinician™ case, but the groups also differed in baseline GPA and MCAT® scores. There was no significant difference between the two groups on overall, diagnostic, or clinical scores.
Introduction

 Since the conceptual development of problem-based learning (PBL) in the 1960s, many medical schools throughout the world have implemented a form of this teaching methodology.
 PBL emphasizes learning basic sciences within a clinical framework, fostering clinical reasoning [1].
 Integrated curricula that combine features of both traditional and PBL-style learning have met with some success [2]; however, to date the influence of curriculum on medical decision-making has not been studied in osteopathic medical students.
 The DxR Clinician™ generates feedback comparable to an observed simulated patient encounter and can be more practical and cost-effective than organizing and evaluating simulated patient encounters [3].
 Our hypothesis was that students in the systems-based integrated track at School A and students educated in the traditional, discipline-based curriculum at School B would differ in clinical reasoning performance on the DxR Clinician™.

Methods

 Baseline comparison between the two groups of students was made using grade point average and MCAT® scores at the time of medical school matriculation.
 Participants were recruited from the integrated, systems-based curriculum of School A's class of 2012 and from the entire class of 2012 at School B.
 All participants completed the DxR Clinician™ case prior to starting clinical clerkships.

References

1. Barrows HS. A taxonomy of problem-based learning methods. Medical Education. 1986;20(6).
2. Miller AP, Schwartz PL, Loten EG. 'Systems Integration': a middle way between problem-based learning and traditional courses. Medical Teacher. 2000;22(1).
3. Turner MK, Simon SR, Facemyer KC, Newhall LM, Veach TL. Web-based learning versus standardized patients for teaching clinical diagnosis: a randomized, controlled, crossover trial. Teaching and Learning in Medicine. 2006;18(3).
 Participants from School A completed the case in the summer of 2010; students from School B completed it in the spring of 2010 as part of their pharmacology course.
 Because School B uses the DxR Clinician™ in its curriculum, its participants had experience using the program prior to the study. To control for this difference, participants from School A were provided a 15-minute tutorial on how to use the program and worked through a practice case before completing the study case.
 Students completed the case on their own time. They were given a time frame in which to complete the case, but no time limit was imposed once the case was opened.
 Participants had to spend a minimum of 15 minutes on the case in order for their data to be analyzed.
 Measures of clinical and diagnostic reasoning were generated by the DxR Clinician™ and analyzed.

Table 1. Measures of Clinical and Diagnostic Reasoning
 Overall Score
 Diagnostic Score
 Clinical Score
 Clinical Level
 Questions asked / % of required
 Exams performed / % of required
 Labs ordered / % of required
 Number of hypotheses generated

Data

 Data were collected from the DxR Clinician™ and matched with students' GPA and MCAT® scores; all participant identifiers were removed prior to analysis.
 Multivariate analysis detected overall significance.
 Tests of between-subjects effects looked for significant differences between schools on GPA, MCAT® scores, and the DxR Clinician™ measures listed in Table 1.

Table 2. Entering GPA and MCAT® Scores

                        School A   School B   p-value
GPA: Cumulative         3.60       3.45       .046
GPA: Science
MCAT®: Composite        24.5       26.6       .012
MCAT®: Physical
MCAT®: Biological       8.41       9.59       .003
MCAT®: Verbal

Table 3. DxR Clinician™ Data

                        School A   School B   p-value
Overall Score
Diagnostic Score
Clinical Score
Questions Asked
  % of Required
Exams Performed         13.1       30.2       .003
  % of Required         22.2       35.0       .034
Labs Ordered
  % of Required         13.6       5.00       .050
Hypotheses              5.35       3.57       .011
Time Spent (min.)

Multivariate tests for significance yielded a Pillai's Trace = .206, df = 62, p = .012, and a Wilks' Lambda = .633, df = 58, p = .003.
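The multivariate comparison reported above can be illustrated with a small sketch. The function below computes Pillai's trace for a one-way MANOVA from the between-group (hypothesis) and within-group (error) SSCP matrices; the two-group sample data are hypothetical (loosely seeded with means from Table 3), not the study's actual dataset or analysis code, and the sketch assumes only NumPy.

```python
import numpy as np

def pillai_trace(groups):
    """Pillai's trace for a one-way MANOVA.

    groups: list of (n_i x p) arrays, one per school/group.
    Returns tr(H @ inv(H + E)), where H is the between-group
    (hypothesis) SSCP matrix and E the within-group (error) SSCP matrix.
    """
    data = np.vstack(groups)
    grand_mean = data.mean(axis=0)
    p = data.shape[1]
    H = np.zeros((p, p))
    E = np.zeros((p, p))
    for g in groups:
        offset = g.mean(axis=0) - grand_mean      # group centroid vs. grand mean
        H += len(g) * np.outer(offset, offset)
        centered = g - g.mean(axis=0)             # within-group deviations
        E += centered.T @ centered
    return np.trace(H @ np.linalg.inv(H + E))

# Hypothetical example: two groups, three outcome measures each
rng = np.random.default_rng(0)
school_a = rng.normal([13.1, 22.2, 5.35], 2.0, size=(17, 3))
school_b = rng.normal([30.2, 35.0, 3.57], 2.0, size=(51, 3))
V = pillai_trace([school_a, school_b])
# For two groups, 0 <= V <= 1; values near 1 indicate strong separation
```

With two groups, Pillai's trace is a monotone function of Hotelling's T², and packages such as statsmodels (`MANOVA.from_formula(...).mv_test()`) report it alongside Wilks' Lambda; the p-values in Tables 2 and 3 would come from the corresponding F approximation.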
Discussion

 This study had several limitations. Ideally, the sample would have included the entire class of 2012 from both schools; however, we were only able to enroll 17 participants from School A (out of 97) and to use data from 51 students (out of 105) from School B, based on the minimum time requirement of 15 minutes.
 While we compared entering GPA and MCAT® scores to evaluate the similarity of the two groups prior to exposure to either curriculum, we had no baseline measurement of diagnostic reasoning or clinical decision-making ability.
 The groups did differ significantly on some DxR Clinician™ parameters, including the number of exams performed, the percentage of required exams performed, the percentage of required labs ordered, and the number of hypotheses generated. There was no significant difference between the groups in diagnostic or clinical scores.
 The DxR Clinician™ is a customizable program, but for this study the default settings were used. If this study were to be repeated, customizing the parameters of the DxR Clinician™ program, implementing measures to increase sample size, and obtaining a clinical diagnostic baseline would improve external validity and applicability.

Figure 1. The DxR Clinician™ program uses an algorithmic decision tree to evaluate students' stepwise progression through the case and place each participant into a diagnostic level of competence.

Acknowledgements

For all their help and support, I would like to thank Dr. Grace Brannan and the CORE Research Office, as well as the Academic Affairs departments at both participating institutions.