Curriculum Inventory Administrators’ Group October 11, 2017


Curriculum Inventory Administrators’ Group October 11, 2017 Terri Cameron, MA Director, Curriculum Programs

Agenda
CI 2016-2017 Upload Statistics
Verification Report User Guide: Assessment Tables
Planning for Veronica Catanese, MD, LCME Secretariat, CIAG Presentation
October CI in Context: Electronic Health Record / Ensuring Core Values in Age of Technology
September CI in Context: Simulation Centers in US Medical Schools
SGEA 2017 Presentation: Defining the Medical School Learning Environment: An Exploratory Survey by Dr. David Musick, Virginia Tech Carilion SOM
Next meeting: Wednesday, November 8, 1 pm ET

Review of 2016-2017 Upload Cycle
2012-2013: 90 Schools
2013-2014: 120 Schools (82 of 90 retained; 28 new schools)
2014-2015: 135 Schools (114 retained; 21 new schools)
2015-2016: 141 Verified (10 new schools)
2016-2017 (Preliminary): 106 Verified; 17 in process; 21 in Staging; Potential: 144
By school type, 2012-2013 through 2016-2017 (Preliminary):
US Medical Schools: 85, 115, 127, 134, 104
Canadian Medical Schools: 5, 2
US Osteopathic Schools (Pilot 2015-2016): n/a, 3

[Chart: 25th percentile: 1552; Median: 1213; Average: 1189; 75th percentile: 872]

[Table: 2016-2017 Required Sequence Blocks by Academic Level (1 to 4 only), with counts of Integrated, Rotation, and Course blocks and the corresponding numbers of Schools and Sequence Blocks; cell alignment was lost in the transcript. Values shown: 1 2 111 1232 9 71 102 987 3 15 61 72 503 76 325 4 7 45 198 515]

[Chart: Lecture percentage by year, 2013-2017; values shown: 54%, 52%, 49%]

Data Validation Efforts
Do the sequence blocks look about the same? The same number of events, or referencing the same events?
If the participant provided only one academic level, did they provide only one last season as well?
Are there fewer academic levels this year than in previous years?
Are the start and end dates for the sequence blocks too far apart?
Are titles and descriptions approaching the 4000-character limit?
Do any sequence blocks have an unusual number of events?
Are there repeating expectation titles?
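
To make these checks concrete, here is a minimal sketch of year-over-year validation logic in Python. It is illustrative only: the record layout, field names, and thresholds are assumptions for the example, not the actual CI upload schema or the AAMC's validation code.

```python
from datetime import date

# Illustrative stand-ins for two consecutive CI uploads from one school.
# Field names and structure are assumptions, not the actual CI upload schema.
previous_upload = {
    "academic_levels": 4,
    "sequence_blocks": [
        {"title": "Foundations of Medicine", "description": "Introductory block",
         "start": date(2015, 8, 3), "end": date(2015, 12, 18), "events": 310},
    ],
}
current_upload = {
    "academic_levels": 1,
    "sequence_blocks": [
        {"title": "Foundations of Medicine", "description": "Introductory block",
         "start": date(2016, 8, 1), "end": date(2017, 12, 15), "events": 310},
    ],
}

MAX_TEXT_CHARS = 4000        # CI character limit for titles and descriptions
MAX_SPAN_DAYS = 400          # assumed threshold for "dates too far apart"
MAX_EVENTS_PER_BLOCK = 600   # assumed threshold for "unusual number of events"

def validate(current, previous):
    """Return a list of warnings comparing this year's upload to last year's."""
    warnings = []
    if current["academic_levels"] < previous["academic_levels"]:
        warnings.append("Fewer academic levels than the previous year.")
    if current["academic_levels"] == 1 and previous["academic_levels"] == 1:
        warnings.append("Only one academic level two years in a row.")
    for block in current["sequence_blocks"]:
        label = block["title"]
        if max(len(block["title"]), len(block["description"])) > 0.9 * MAX_TEXT_CHARS:
            warnings.append(f"{label}: title/description approaching the 4000-character limit.")
        if (block["end"] - block["start"]).days > MAX_SPAN_DAYS:
            warnings.append(f"{label}: start and end dates unusually far apart.")
        if block["events"] > MAX_EVENTS_PER_BLOCK:
            warnings.append(f"{label}: unusual number of events.")
    return warnings

for warning in validate(current_upload, previous_upload):
    print(warning)
```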

CI 2016-2017 Data Issues
Validations run weekly
Schools uploading a single academic level two years in a row
Schools uploading fewer academic levels than previous year

Challenging Documentation Areas
Clerkships: Current documentation practices make it difficult to analyze and report on clerkship activities. Several groups will be looking at the issue and working with schools that have begun to develop innovative documentation practices. This is a discussion area for the October Vendor Retreat. Best current option: create a 'model' clerkship in the curriculum management system and document each event (no summaries), with metadata.
Tracks: Working with schools that documented tracks to develop Best Practices.

Assessment Tables in Verification Report
Two tables are meant to replicate the LCME DCI tables, including 'summarizing' assessment methods into an abbreviated list of LCME terms (see the Crosswalk at www.aamc.org/cir under Training and Resources):
Methods of Assessment – Courses
Methods of Assessment – Clerkships
Two tables show all Assessment Methods listed in the CI Standardized Vocabulary document:
Summative Assessment Methods
Formative Assessment Methods

Assessment Tables in Verification Report
Issue in previous Verification Reports: double-counting of Assessment Events in Summative Assessment Methods.
Solution (system change for 2016-2017): the tallies of assessment methods in the Summative and Formative Assessment tables were changed so that events are counted in one table or the other. In short, the Methods of Assessment table counts only Summative Assessment Methods in Assessment Events, and the Summative Assessment table does NOT count Assessment Events.
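
As a rough illustration of the new tallying rule, here is a minimal Python sketch assuming simplified event records; the field names and method names ("Written Exam", "Quiz", "Lecture") are placeholders, not the CI schema, the standardized vocabulary, or the AAMC's implementation. Each event's assessment methods feed either the Methods of Assessment tally (assessment events, summative methods only) or the Summative/Formative tallies (events that also carry an instructional method), never both.

```python
from collections import Counter

# Hypothetical event records; an "assessment event" has assessment methods
# but no instructional methods. Method names are placeholders.
events = [
    {"instructional_methods": [],
     "assessment_methods": [("Written Exam", "Summative")]},   # assessment event
    {"instructional_methods": ["Lecture"],
     "assessment_methods": [("Quiz", "Formative")]},           # teaching event with assessment
]

methods_of_assessment = Counter()   # feeds the Methods of Assessment table
summative_table = Counter()         # feeds the Summative Assessment Methods table
formative_table = Counter()         # feeds the Formative Assessment Methods table

for event in events:
    is_assessment_event = (not event["instructional_methods"]
                           and bool(event["assessment_methods"]))
    for method, role in event["assessment_methods"]:
        if is_assessment_event:
            # Counted once, in the Methods of Assessment table (summative only),
            # and deliberately excluded from the Summative/Formative tables.
            if role == "Summative":
                methods_of_assessment[method] += 1
        else:
            # Events with both an instructional and an assessment method are
            # counted in the Summative or Formative table instead.
            target = summative_table if role == "Summative" else formative_table
            target[method] += 1

print(dict(methods_of_assessment), dict(summative_table), dict(formative_table))
```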

Assessment Tables in Verification Report
Unexpected consequences:
Schools that were not experiencing the double-counting issue, or that had adjusted data entry to avoid it, are not seeing all of their Summative Assessment Methods counted.
A miscommunication resulted in this change also being applied to the Formative Assessment table in error. Because Formative Assessment methods are, by definition, not Summative (and so are not picked up by the Methods of Assessment table), schools are reporting that the Formative Assessment table is not being populated as expected.
We are investigating both of these issues and will be working with you as you find problems. We apologize for the confusion.

Verification Report User Guide
Methods of Assessment: This table lists all non-clerkship Sequence Blocks (SBs) and shows how Summative Assessment Methods linked to Events in those SBs are mapped to the list of LCME DCI Assessment Methods (see Crosswalk). The Total Number of Exams is calculated by totaling the number of Assessment Events in that SB (Events that have Assessment Methods but do not have Instructional Methods) that have Summative Assessment Methods. The Formative column is checked if at least one event contains one or more Assessment Methods tagged as "Formative."
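
A minimal sketch of how the Total Number of Exams and the Formative column could be derived for one sequence block, reusing the same hypothetical record layout as in the sketch above; this is illustrative only, not the actual report code.

```python
# Hypothetical sequence block; field and method names are placeholders.
sequence_block = {
    "title": "Cardiovascular System",
    "clerkship": False,
    "events": [
        {"instructional_methods": [],
         "assessment_methods": [("Written Exam", "Summative")]},   # assessment event
        {"instructional_methods": ["Lecture"],
         "assessment_methods": [("Quiz", "Formative")]},
        {"instructional_methods": ["Small Group"],
         "assessment_methods": []},
    ],
}

def total_number_of_exams(block):
    # An "exam" here is an assessment event (assessment methods, no
    # instructional methods) carrying at least one Summative method.
    return sum(
        1
        for e in block["events"]
        if not e["instructional_methods"]
        and any(role == "Summative" for _, role in e["assessment_methods"])
    )

def formative_column_checked(block):
    # Checked if at least one event has a method tagged as Formative.
    return any(
        role == "Formative"
        for e in block["events"]
        for _, role in e["assessment_methods"]
    )

print(sequence_block["title"],
      "exams:", total_number_of_exams(sequence_block),
      "formative:", formative_column_checked(sequence_block))
```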

Verification Report User Guide Methods of Assessment: Clerkships: Sequence Blocks will only be included in this table if they are tagged as clerkships. Summative Assessment methods linked to Assessment Events in those SBs are mapped to the list of LCME DCI Assessment Methods (see Crosswalk). Academic Level is based on the AL referenced by each SB in the CI Upload.

Verification Report User Guide Summative Assessment Methods: This table shows the number of Events linked to CI Assessment Methods tagged as Summative. For Events to be counted in this table, both an Instructional Method AND an Assessment Method must be linked. Assessment Events are not shown in this Table.

Verification Report User Guide Formative Assessment Methods: This table shows the number of Events linked to CI Assessment Methods tagged as Formative. For Events to be counted in this table, both an Instructional Method AND an Assessment Method must be linked. Assessment Events are not shown in this Table.

Planned changes for 2017-2018 CI Upload Cycle:
New AAMC Business Rule: Competency Object titles will be required.
New AAMC Business Rule: The year value for dates will need to be within 6 years, in either direction, of the report start and end dates.
Competency Objects that are placed deeply in the competency framework will appear in the Verification Report.
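
For illustration, here is a minimal sketch of how the two new business rules might be checked before upload; the field names ("title", "date_year") and the record structure are assumptions for the example, not the AAMC's actual validation.

```python
REPORT_START_YEAR = 2017
REPORT_END_YEAR = 2018
MAX_YEAR_OFFSET = 6  # dates must fall within 6 years of the report window

# Hypothetical Competency Objects; "title" and "date_year" are placeholder fields.
competency_objects = [
    {"title": "Patient Care", "date_year": 2016},
    {"title": "", "date_year": 2030},
]

def check_business_rules(obj):
    """Return rule violations for one Competency Object."""
    errors = []
    if not obj["title"].strip():
        errors.append("Competency Object title is required.")
    if not (REPORT_START_YEAR - MAX_YEAR_OFFSET
            <= obj["date_year"]
            <= REPORT_END_YEAR + MAX_YEAR_OFFSET):
        errors.append("Date year must be within 6 years of the report start/end dates.")
    return errors

for co in competency_objects:
    print(co["title"] or "<untitled>", check_business_rules(co))
```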

Goals for 2017-2018
Implementation of remainder of CIAB Website / Report Recommendations
Improved Search Options on CI Report page
Guide for Documenting Curricula in CI for CQI and Accreditation
CQI and Accreditation 'Manual'
Pilot CI Report Login Screen
Discipline Reports
Documentation of AAMC Core EPAs
CI Publication(s) from CI Research Group

CI Reports: Adding Multi-Year Comparisons
Issues: Participation Rate vs. Trends; Space
Which parts of the report would most benefit from these comparisons?

Current Content Report

CI in Context: July – December 2017
Jul-17: Mapping Clerkship Metadata (Cinda Stone, Arizona – Phoenix)
Aug-17: Core EPA pilot (Kim Loomis, MD, Vanderbilt)
Sep-17: Simulation in US Medical Schools (Gerald Wickham, PhD, U of I Peoria)
Oct-17: Electronic Health Record / Ensuring Core Values in Age of Technology (Elizabeth Toll, MD, Brown University)
Nov-17: Boot Camps (Tentative Author)
Dec-17: Tracks (Sheila Crowe, EdD, University of Oklahoma)
What topics should we be planning for?

Planning for Veronica Catanese, MD, LCME Secretariat, CIAG Presentation
Introduction
LCME Update
What questions would you like for her to answer?
What is the LCME position on AAMC Core EPAs?
Faculty access to the curriculum management system to view curriculum mapping?
What characteristics must a curriculum track have to be considered a track by the LCME?

Curriculum Inventory at Learn Serve Lead
Focused Discussion and Lunch – 2 Curriculum Inventory tables: Curriculum Inventory Focus Group (Table 46) and Curriculum Inventory Q & A (Table 47); Sat., Nov. 4, 11:45 AM – 1:15 PM; Center: AAMC Connect, Hall D
Medical Education Research Toolkit: Sun., Nov. 5, 10:30 AM – 11:45 AM; Sheraton: Backbay CD
Curriculum Inventory Update: Sun., Nov. 5, 4:45 – 5:45 PM; location TBD

Proposals for Spring/Summer Education Meetings: making schools aware of the potential of the CI for research, benchmarking, curriculum development, curriculum reform, etc.

Defining the Medical School Learning Environment: An Exploratory Survey

Felicity Adams-Vanke, MD; David W. Musick, PhD; Daniel P. Harrington, MD; Richard C. Vari, PhD; Aubrey L. Knight, MD; Tracey M. Criss, MD
Acknowledgement: Members of the Learning Environment Advocacy Committee

Authors have no relevant disclosures

Objectives:
Explain the concept of the learning environment and its importance in medical education
Describe a survey-based approach to identifying various factors that impact the learning environment

LCME Standard 3.5 A medical school ensures that the learning environment of its medical education program is conducive to the ongoing development of explicit and appropriate professional behaviors in its medical students, faculty, and staff at all locations and is one in which all individuals are treated with respect.

LCME Standard 3.5 The medical school and its clinical affiliates share the responsibility for periodic evaluation of the learning environment in order to: Identify positive and negative influences on the maintenance of professional standards; Develop and conduct appropriate strategies to enhance positive and mitigate negative influences; Identify and promptly correct violations of professional standards.

Definitions of Learning Environment? But what exactly IS the learning environment anyway? “The physical, social and psychological context for learning.” Outcomes of a supportive learning environment include enhanced student learning, sense of achievement, and humanism. (Sochet et al, 2013) “What’s it like to…. go to medical school here? be a resident here?”

What We Did

Learning Environment Survey Methodology
Over 2 academic years, we surveyed the groups listed, using very similar questions with wording tailored to each group: M1 Students, M2 Students, M3 Students, M4 Students, Faculty, VTCSOM Administrative Staff, Residents, and Nurses (some groups were combined in the first survey).
The survey contained a total of 33 items.

Learning Environment Survey Methodology
Major topic areas included: curricular experiences, interprofessional team experiences, professionalism, and existing mechanisms for addressing student concerns.

Sample Questions (survey items available upon request)
Items were administered to Phase 1 students, Phase 2 students, faculty, residents, and nursing, with wording tailored to each group. For example:
Student version: "I am able to contribute to the learning environment in a positive way." Faculty/resident/nursing version: "I am able to contribute to the VTC medical students' learning environment in a positive way."
Student version: "My experiences have helped me learn about cultural biases in myself and in the health care system." Faculty/resident/nursing version: "VTC medical students exhibit awareness of cultural biases in themselves and in the health care system."
Rating scales used: Never, Rarely, Sometimes, Often, Very Often, Unable to Rate/NA; Yes/No.

Learning Environment Survey Response Rates (2 years)
M2 and M4 students had the highest response rates; nurses and residents had the lowest.
[Chart: response rates by group: M1, Residents/Faculty, VTC Admin Staff, M2, M3, M4, Nurses]
Total of 494 responses; overall response rate 39%.

Results

Medical Student Results High degree of knowledge of mistreatment policies and procedures (M1 and M2 students)

Medical Student Results High degree of knowledge of mistreatment policies and procedures (M3 and M4 students)

M1 and M2 Student Results: high and low scoring items, percentage responding "Often" + "Very Often" (2014 and 2015)
High Scoring Items (95% or greater, either year) and Low Scoring Items (75% or fewer, either year) included:
The teaching time is used effectively
The medical school facilities are conducive to effective learning and growth
The faculty are effective at giving feedback
While in med school I have maintained work/life balance
Working on a team w/ PA and nursing students has a positive impact on my learning
My experiences have helped me to learn about cultural and gender biases in myself and in the health care system
During LACE I contributed positively to patient care
(Percentages from the original two-column table, per-item alignment lost in the transcript: 95, 63, 97, 43, 86, 74, 77, 74, 36, 24, 53, 53, 77, 46%)

M3 Student Results
M3 students experienced negative team dynamics; they also witnessed disparities in health care delivery.
Items with 10% or greater responding "Sometimes," "Often," or "Very Often" (2014 / 2015):
I have been part of a team in which the team dynamics negatively impacted the outcome or experience: 30% / 42%
I have witnessed disparities in health care delivery based on gender, race, sexual orientation, ability to pay, etc.: N / 10%

M3 Student Results: high and low scoring items, percentage responding "Often" + "Very Often" (2014 and 2015)
High Scoring Items (95% or greater, either year) and Low Scoring Items (75% or fewer, either year) included:
The learning environment encourages me to treat others with empathy
The faculty are effective at giving feedback
While in medical school I have maintained work/life balance
Working on a clinical care team including pharmacists, nurses, medical students, and residents has a positive impact on my learning
My experiences have helped me to learn about cultural and gender biases in myself and in the health care system
VTC students, faculty and staff generally behave professionally
I contributed positively to patient care
Medical student complaints are responded to in a meaningful fashion
(Percentages from the original two-column table, per-item alignment lost in the transcript: 74, 61, 100, 58, 45, 69, 62, 97, 53, 70, 93, 77, 97, 72, 71, 72%)

M4 Student Results
Items with greater than 10% responding "Sometimes," "Often," or "Very Often" (2014 / 2015):
I have experienced uncalled-for competition from classmates: 24% / 26%
I have been part of a team in which the team dynamics negatively impacted the outcome or experience: 30% / 32%
I have witnessed disparities in health care delivery based on gender, race, sexual orientation, ability to pay, etc.: N / 15%
I have witnessed a patient being treated disrespectfully: 13% / 12%
I have been a witness or recipient of mistreatment by other students, residents, faculty or nursing staff: 21%
I have been a witness or recipient of mistreatment based on race or culture

M4 Student Results: high and low scoring items, percentage responding "Often" + "Very Often" (2014 and 2015)
High Scoring Items (95% or greater, either year) and Low Scoring Items (75% or fewer, either year) included:
VTC students, faculty and staff generally behave professionally
The teaching time in the clinical setting is used effectively
The faculty are effective at giving feedback
While in medical school I have maintained a work/life balance
The LE fosters a culture that values diversity and inclusion
My experiences have helped me learn about cultural biases in myself and in the health care system
I have contributed positively to patient care
Medical student complaints are responded to in a meaningful fashion
(Percentages from the original two-column table, per-item alignment lost in the transcript: 93, 69, 100, 71, 61, 59, 67, 68, 74, 68, 45, 53, 78, 74, 71, 65%)

Faculty Results Faculty had a high degree of knowledge of mistreatment policies and procedures

Faculty Results
Faculty experienced negative team dynamics.
Item with greater than 10% responding "Sometimes," "Often," or "Very Often" (2014 / 2015):
I have been part of a team in which the team dynamics negatively impacted the outcome or experience: 12% / 13%

Faculty Results
Faculty were very positive, scoring the top items at 96% and above, with only one item below the 75% mark, which relates to students requesting feedback.
High Scoring Items (95% or greater) and Low Scoring Items (75% or fewer) included:
The learning environment encourages me to treat others with empathy
The VTC medical students request feedback
The faculty, residents and nurses strive to create an atmosphere of trust and support
I enjoy working with VTC medical students
The VTC medical student workload is manageable
VTC medical student complaints are responded to in a meaningful fashion
VTC students, faculty, residents and nursing staff generally behave professionally
(Percentages from the original two-column table, per-item alignment lost in the transcript: 94, 61, 96, 69, 97, 97, 96, 96, 95, 96, 97, 96, 99, 99%)

Residents Results Residents had a high degree of knowledge of mistreatment policies and procedures but were lower than the other groups

Residents Results
Residents experienced negative team dynamics, but the percentage decreased from the first to the second year of the survey.
Item with greater than 10% responding "Sometimes," "Often," or "Very Often" (2014 / 2015):
I have been part of a team in which the team dynamics negatively impacted the outcome or experience: 29% / 14%

Residents Results
Residents believe that students, residents and nursing staff generally behave professionally, but gave low marks for students requesting feedback.
High and low scoring items, percentage responding "Often" + "Very Often" (2014 and 2015); High Scoring Items (95% or greater) and Low Scoring Items (75% or fewer) included:
The environment fosters opportunities for positive social interaction
The VTC medical students request feedback
VTC students, faculty, residents and nursing staff generally behave professionally
VTC medical students exhibit awareness of cultural biases in themselves and in the health care system
(Percentages from the original two-column table, per-item alignment lost in the transcript: 38, 84, 96, 57, 95, 69, 97, 71%)

Themes Across Respondent Groups

Positive Themes
All groups indicated a high degree of knowledge on how to report mistreatment
General perception that students, faculty, residents, nurses and staff all exhibit strong professionalism
Faculty received strong ratings for creating an environment of trust and support for student learning
Medical school facilities are conducive to learning and growth
The learning environment encourages stakeholders to treat others with empathy

Improvement Themes
Giving and receiving feedback
Effective use of teaching time
Student work/life balance
Negative impressions of health care teamwork experiences
Witnessing disparities in health care delivery
Experiencing competition from classmates
Learning about cultural and gender biases in self and health care system
Contributing to patient care during LACE experiences (years one and two)

Next Steps
1. Identify and prioritize key areas that need to be addressed
2. Perform root cause analysis to identify causes of negative impact on the learning environment
3. Work with stakeholders to develop a work plan and timeline to address areas of deficiency
4. Implement the action plan and measure again to observe the impact of corrective action

What Happened as a Result?

Steps Taken After Survey
Established survey permanently (every 2 years)
Disseminated results widely to constituent groups (e.g., Department Chairs, GMEC, Curriculum Committee, Dean's Council)
Established Wellness Committees
Policy revisions
Annual learning environment presentations to all 4 classes (M3 year in particular)
General enhancement of monitoring via LEAC and other mechanisms
Positive culture change???

Questions/Comments to: David W. Musick, PhD, dwmusick@carilionclinic.org
REFERENCES:
1. LCME, Functions and Structure of a Medical School, Standard 3.5.
2. Sochet RB, Colbert-Getz JM, Levine RB, Wright SM. Gauging Events that Influence Students' Perceptions of the Medical School Learning Environment: Findings From One Institution. Academic Medicine 2013; 88(2): 246-242.
3. Schonrock-Adema J, et al. Key Elements in Assessing the Educational Environment: Where is the Theory? Advances in Health Sciences Education 2012; 17: 727-742.

CI in Context: October 2017 Electronic Health Record / Ensuring Core Values in Age of Technology by Elizabeth Toll, MD, Warren Alpert Medical School of Brown University

CI in Context: November 2017 – Transitions to Residency in US Medical Schools (seeking author)
Will highlight a new chart based on the LCME Annual Questionnaire, Part II: Does the medical school require a separate course in the fourth year of the curriculum that aims to prepare medical students for the transition to residency? If so:
provide the length (number of weeks) of the course;
indicate if the course is specialty specific or generic for all fourth-year medical students;
indicate topics addressed in the course.

CI for CQI and Accreditation (CICA) CICA will resume meetings in November.

Curriculum Inventory Research Group
The October 4 Retreat resulted in the kick-off of three research projects:
What is a clerkship?
What changes are we seeing in instructional methodology? (Decline in lecture and what is replacing it)
Life expectancy of curriculum (and facilitators and barriers of reform)

Next meeting: Wednesday, November 8, 1 pm ET (second Wednesday of each month, 1 pm ET)
Registration links posted in the Training and Resources section of www.aamc.org/cir
Please send agenda items to tcameron@aamc.org