CME Compliance: Evaluation

Presentation transcript:

CME Compliance: Evaluation
One of the most important aspects of any CME activity is evaluation, or outcomes measurement. Measuring the educational outcomes of your activity is an essential part of the Plan, Do, Study, Act cycle.

CME Compliance: Evaluation
Some examples of effective evaluation methods are:
Pre- and post-tests
Post-activity evaluation forms
Informal polling at the activity (show of hands, audience response systems)
Follow-up surveys
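For pre- and post-tests, the simplest outcome measure is the average score improvement across paired learners. A minimal sketch, using hypothetical scores and a hypothetical function name:

```python
def knowledge_gain(pre_scores, post_scores):
    """Average percentage-point improvement from pre-test to post-test.

    pre_scores and post_scores are per-learner scores (0-100),
    paired by learner in the same order.
    """
    if len(pre_scores) != len(post_scores):
        raise ValueError("scores must be paired by learner")
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# Hypothetical scores for four learners
pre = [55, 60, 70, 65]
post = [75, 80, 85, 80]
print(knowledge_gain(pre, post))  # average gain in points: 17.5
```

Reporting the average gain alongside the raw scores gives activity directors a quick sense of whether the educational objectives moved the needle.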

Evaluations are designed to collect data about:
1. Whether the learning objectives have been met
2. How participation will impact the physician's abilities and/or strategies, help modify his or her practice, and lead to improved patient outcomes
3. The faculty's knowledge and presentation skills
4. The quality of the educational design and format
5. How well the activity met the learners' needs
6. Whether the audience perceived any commercial bias in the presentations
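In practice, each of the six categories above typically maps to one or more Likert-scale questions on the evaluation form. The sketch below tallies hypothetical five-point responses per category (all data and key names are illustrative, not LSU's actual form):

```python
from statistics import mean

# Hypothetical 5-point Likert responses (1 = strongly disagree ... 5 = strongly agree),
# one list of learner responses per question, keyed by the data category it measures.
responses = {
    "objectives_met":        [5, 4, 4, 5, 3],
    "will_change_practice":  [4, 4, 3, 5, 4],
    "faculty_effectiveness": [5, 5, 4, 4, 4],
    "design_and_format":     [4, 3, 4, 4, 5],
    "met_learner_needs":     [5, 4, 4, 4, 4],
    "commercial_bias_free":  [5, 5, 5, 5, 4],
}

# Average score per category, rounded for the summary report
summary = {q: round(mean(scores), 2) for q, scores in responses.items()}
for question, avg in summary.items():
    print(f"{question}: {avg}")
```

A low average on any single category (for example, a dip in "commercial_bias_free") flags a specific compliance issue for the planners to follow up on.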

CME Compliance: Evaluation
At LSU School of Medicine, we typically use the following method: the CME office generates evaluation forms through Survey Monkey, a web-based evaluation tool which allows results to be analyzed and reported to the Activity Medical Director shortly after an activity ends. Evaluations can be completed on paper or online.

CME Compliance: Evaluation
For online evaluation, the tag reader app uses the learner's phone camera to scan an image like this one to access the evaluation form on Survey Monkey: [QR code image]

Evaluations distributed and collected on paper at each activity are manually entered into Survey Monkey so that all feedback can be analyzed and reported to activity directors and planners quickly.

CME Compliance: Evaluation
Outcomes are very important to LSU's continued CME accreditation. They are an important part of our ongoing educational planning process and are used to track the overall effectiveness of the CME program to ensure we are meeting our mission. Therefore, evaluation return rates are continually monitored by the CME office.
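The return rate being monitored is simply the share of attendees who submitted an evaluation, counting both paper and online responses. A minimal sketch with hypothetical numbers and a hypothetical function name:

```python
def return_rate(attendees, evaluations_returned):
    """Percentage of attendees who returned an evaluation (paper + online)."""
    if attendees == 0:
        return 0.0
    return 100.0 * evaluations_returned / attendees

# Hypothetical activity: 120 attendees, 70 paper + 20 online evaluations
rate = return_rate(120, 70 + 20)
print(f"{rate:.1f}%")  # 75.0%
```

Tracking this percentage per activity over time lets the CME office spot series whose feedback collection is slipping before an accreditation review.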

CME Compliance: Evaluation
How you can use the evaluation results: Survey Monkey compiles the results of your evaluation data, and the CME office sends you a summary of these results. You can use the summary to:
Monitor your program's attendance
Send feedback to presenters
Customize the evaluation for the activity or group
Share feedback with your overall educational program/faculty/resident evaluation

CME Compliance: Evaluation
Post-Test: Evaluation
Which of the following is NOT the kind of data CME evaluation is meant to collect?
A. Whether the learning objectives were met
B. How comfortable the meeting space is
C. The faculty's knowledge and presentation skills
D. If any commercial bias was perceived

CME Compliance: Evaluation
Post-Test: Evaluation
Which of the following is NOT the kind of data CME evaluation is meant to collect?
A. Whether the learning objectives were met
B. How comfortable the meeting space is
C. The faculty's knowledge and presentation skills
D. If any commercial bias was perceived
Answer: B. Evaluations focus on educational outcomes, not the comfort of the venue.

CME Compliance: Evaluation
Post-Test: Evaluation
True or False: Evaluation data is used to track the overall effectiveness of the CME program to ensure we are meeting our mission.

CME Compliance: Evaluation
Post-Test: Evaluation
True or False: Evaluation data is used to track the overall effectiveness of the CME program to ensure we are meeting our mission.
True: Many of the questions asked on evaluations are designed to measure whether activities are successfully meeting specific educational goals as outlined by the ACCME and the LSU CME Mission Statement.

CME Compliance: Evaluation
Post-Test: Evaluation
True or False: Evaluation forms must be completed on paper.

CME Compliance: Evaluation
Post-Test: Evaluation
True or False: Evaluation forms must be completed on paper.
False: Evaluations can be completed online via smartphone through the Survey Monkey website, or on paper if they are distributed at the time of the activity. They can also take the form of a show of hands, a live post-activity feedback session, or an audience response system.

CME Compliance: Evaluation
QUESTIONS? Please contact the LSU CME office at (504) or Doug Grigsby at