Standardizing Learner Surveys Across the Enterprise
Francis Kwakwa, MA, Radiological Society of North America
Valerie Smothers, MA, MedBiquitous
Disclosure We have no financial relationships to disclose.
Objectives
At the completion of this session, you will be able to:
Adopt strategies to improve the collection of consistent evaluation data from learners
Adopt strategies to improve the analysis of evaluation data across the CME enterprise
Overview
1. Challenges in analyzing learner surveys
2. MedBiquitous and MEMS
3. RSNA’s implementation of a standardized survey
4. Results of RSNA course evaluation
5. Challenges faced by RSNA
6. Key strategies for improving data collection and analysis
Challenges in Analyzing Learner Surveys
Most of us use surveys
Surveys often differ based on activity
Survey data may be in different systems or formats
The result: it’s hard to analyze results across activities
RSNA
Radiological Society of North America
“to promote and develop the highest standards of radiology and related sciences through education and research”
Over 40,000 members
Online and in-person CME activities
Member of MedBiquitous
Francis Kwakwa, Chair of the MedBiquitous Metrics Working Group
MedBiquitous
Technology standards developer for healthcare education
ANSI accredited
Develops open XML standards
60 members (societies, universities, government, industry)
7 working groups
The Focus on Metrics
“Without the creation of a standard data set for reporting CME program outcomes … it is difficult to obtain consistent metrics of those outcomes. And if you can’t measure it, you can’t improve it.”
-- Ross Martin, MD, Director, Healthcare Informatics Group, Pfizer
Medical Education Metrics – MEMS
Another Perspective
“I need this to better understand how my program as a whole is doing.”
-- Nancy Davis, American Academy of Family Physicians
The MedBiquitous Metrics Working Group
Mission: to develop XML standards … for the exchange of aggregate evaluation data and other key metrics for health professions education.
Originally a subcommittee of the Education Working Group
Became a working group in April 2005
“We’re all using the same measuring stick…” --Francis
Who is Involved?
Francis Kwakwa, RSNA, Chair
Linda Casebeer, Outcomes Inc.
Nancy Davis, AAFP
Michael Fordis, Baylor College of Medicine
Stuart Gilman, Department of Veterans Affairs
Edward Kennedy, ACCME *
Jack Kues, University of Cincinnati
Tao Le, Johns Hopkins University
Ross Martin, Pfizer
Jackie Mayhew, Pfizer
Mellie Pouwels, RSNA
Andy Rabin, CE City
Donna Schoonover, Department of Veterans Affairs
* Invited experts
What’s in MEMS
Participation Metrics: how many participants
Learner Demographics: profession, specialty
Activity Description: name, type
Participant Activity Evaluation: survey results
Other types of evaluation metrics planned for future versions
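To make the four categories above concrete, the sketch below assembles a MEMS-style evaluation report as XML using only the Python standard library. The element and attribute names here are illustrative guesses for this talk, not the actual MEMS schema; consult the published MedBiquitous specification for the real element names.

```python
# Sketch of a MEMS-style report. Element names are ILLUSTRATIVE,
# not taken from the real MEMS schema.
import xml.etree.ElementTree as ET

def build_report():
    report = ET.Element("MetricsReport")

    # Activity Description: name, type
    activity = ET.SubElement(report, "ActivityDescription")
    ET.SubElement(activity, "Name").text = "COD-45"
    ET.SubElement(activity, "Type").text = "online-case"

    # Participation Metrics: how many participants
    participation = ET.SubElement(report, "ParticipationMetrics")
    ET.SubElement(participation, "TotalParticipants").text = "24"

    # Learner Demographics: profession, specialty
    demographics = ET.SubElement(report, "LearnerDemographics")
    ET.SubElement(demographics, "Group",
                  profession="physician", specialty="radiology", count="24")

    # Participant Activity Evaluation: survey results
    evaluation = ET.SubElement(report, "ParticipantActivityEvaluation")
    item = ET.SubElement(evaluation, "Item",
                         text="The course achieved its learning objectives")
    ET.SubElement(item, "Response", value="agree", count="20")

    return ET.tostring(report, encoding="unicode")

print(build_report())
```

Because every provider emits the same structure, reports from different activities (or different organizations) can be merged and compared without per-survey data wrangling.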
For more information:
Metrics Working Group page: http://www.medbiq.org/working_groups/metrics/index.html
MedBiquitous website: http://www.medbiq.org
Discussion
Describe the learner surveys that you are using and how they differ from or are similar to the survey described.
What are the benefits or drawbacks of using a standardized survey?
RSNA’s Project
Adoption of the MEMS survey instrument coincided with implementation of a new Learning Management System
Currently MEMS is used to evaluate over 300 online courses
RSNA’s Project: Types of online courses using MEMS
Cases of the Day (COD)
RadioGraphics CME Tests/Education Exhibits (EE)
Refresher Courses (RSP)
Results…
[Charts not reproduced. Each of the following survey items was charted separately for three courses: COD-45 (N = 24), EE-355 (N = 32), and RSP-2904 (N = 43).]
1. The course achieved its learning objectives
2. The course was relevant to my clinical learning needs
3. The course was relevant to my personal learning needs
4. The online method of instruction was conducive to learning
5. The course validated my current practice
6. I plan to change my practice based on what I learned in the course
7. The faculty provided sufficient evidence to support the content presented
8. Was the course free of commercial bias towards a particular product or company?
9. Did the course present a balanced view of clinical options?
Group Discussion
What challenges to survey data collection and analysis have you faced?
Challenges Faced by RSNA
Survey is optional; little data available for some courses
Little variation in the data
Some disconnect with educators on how the data is used
Difficult to get data out of the LMS
Surveys for live events are not included
Key Strategies
Data Collection
Common core set of survey questions
Common format for evaluation data
Data Analysis
Compare within type and modality
Compare across type and modality
Look for trends and variation
Look for red flags
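Once every activity reports against a common core of questions in a common format, the analysis strategies above reduce to simple aggregation. The sketch below is illustrative only: the course names echo the results section, but the ratings and the 3.5 red-flag threshold are invented for the example; a real system would load exported evaluation data instead.

```python
# Sketch: cross-activity analysis over a common evaluation format.
# Ratings and the flag threshold are MADE UP for illustration.
from collections import defaultdict
from statistics import mean

# (activity_type, course, question, mean rating on a 1-5 scale)
rows = [
    ("COD", "COD-45",   "achieved objectives", 4.6),
    ("EE",  "EE-355",   "achieved objectives", 4.4),
    ("RSP", "RSP-2904", "achieved objectives", 4.5),
    ("COD", "COD-45",   "free of bias",        4.9),
    ("EE",  "EE-355",   "free of bias",        3.1),  # potential red flag
    ("RSP", "RSP-2904", "free of bias",        4.8),
]

def summarize(rows, flag_below=3.5):
    """Mean rating per question across all activities, plus red flags."""
    by_question = defaultdict(list)
    for _, _, question, rating in rows:
        by_question[question].append(rating)
    means = {q: round(mean(r), 2) for q, r in by_question.items()}
    flags = [(course, q, r) for _, course, q, r in rows if r < flag_below]
    return means, flags

means, flags = summarize(rows)
print(means)   # cross-activity mean per question
print(flags)   # courses scoring below the threshold
```

Grouping the same rows by activity type instead of by question gives the within-type comparison; tracking the per-question means over reporting periods surfaces trends and variation.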
An Added Benefit
Assists with the program analysis and improvement required by the ACCME
“The provider gathers data or information and conducts a program-based analysis on the degree to which the CME mission of the provider has been met through the conduct of CME activities/educational interventions.”
-- ACCME Updated Accreditation Criteria, September 2006