Program Evaluation Intensive Research and Development in Medical Education: Faculty Development Workshop Series 2011-2012.

Presentation transcript:

Program Evaluation Intensive Research and Development in Medical Education: Faculty Development Workshop Series

Creative Commons License: Attribution-NonCommercial-ShareAlike 3.0 Unported
You are free:
– to Share: to copy, distribute, and transmit the work
– to Remix: to adapt the work
Under the following conditions:
– Attribution. You must give the original authors credit (but not in any way that suggests that they endorse you or your use of the work).
– Noncommercial. You may not use this work for commercial purposes.
– Share Alike. If you alter, transform, or build upon this work, you may distribute the resulting work only under a license identical to this one.
See the Creative Commons website for the full license.

Purpose of Workshop
Describe program evaluation, its importance, and its purposes
Identify models used in evaluation
Identify considerations and barriers to program evaluation
Describe the steps in designing an evaluation
Develop an evaluation plan and one instrument

Overview of Session
Brief didactic review of the purpose and importance of program evaluation
Review principles and models for program evaluation
Consider the elements of a successful program evaluation design
Develop an evaluation plan, including one instrument

What is Program Evaluation?
The systematic collection of information about a broad range of topics, for use by specific people, for a variety of purposes
The collection and analysis of quality information for decision makers

What are the Purposes of Program Evaluation?
Maintain and improve services
Protect citizens
Improve the program
– Help decide whether to replace, develop further, eliminate, or accredit it
Determine next steps and make decisions
Measure reliability, cost-effectiveness, efficiency, safety, and ease of use
Determine effectiveness
Measure outcomes

For curricular purposes, evaluation helps
Ensure teaching is meeting learners' needs
Identify where teaching can be improved
Inform the allocation of resources
Provide support to faculty and learners
Diagnose and document program strengths and weaknesses
Articulate what is valued by the institution
Determine whether educational objectives are met
Examine program goals, structure, and process

Influences on the evaluation
External
– Accrediting agencies
– The public
– Funding priorities
Internal
– Who needs what answers?
– Who gets to pose the questions?
– How will the answers be made known?

Barriers to Program Evaluation
Tension between implementing and evaluating
Lack of skills in conducting applied social science research
Paucity of funding, time, and publication outlets
Failure to recognize evaluation as scholarship with a place in the literature

Many Models
Goal-Oriented/Objective-Based Evaluation (Tyler)
Goal-Free Evaluation (Scriven)
Judicial/Adversary Evaluation
CIPP: Context, Input, Process, Product (Stufflebeam)
Kirkpatrick's four-level model
Situated Evaluation
Connoisseurship Evaluation (Eisner)
Utilization-Focused Evaluation (Patton)

Definitions
Formative evaluation: focuses on the process of the activity being evaluated; ongoing; allows changes to be made while the activity is under way
Summative evaluation: focuses on the outcomes of the activity
Assessment: a measure of individual performance

Exercise 1
During the next 25 minutes, please complete:
Evaluation Background (5 mins)
Purpose of the Evaluation (5 mins)
Evaluation Users (2 mins)
Evaluation Framework (10 mins)
We will walk through each of these sections before you launch into writing. Your responses are guides. Once you are done, spend 5 minutes with a partner discussing questions that pertain to your own project or to both of your projects, to bring back to the group for discussion.

Evaluation Background
What is the merit, worth, and need of the program?
– Merit: Does the program do what it is supposed to do?
– Worth: Is the program needed? What gap in education does it fill?
– Need: Why was your program needed? What need do the program goals address? Can you defend the need for those goals?
Who will enroll, and how long will the program be?

Purpose of Evaluation
What do you hope to achieve by evaluating the program?
Are you trying to improve the program, determine next steps, or make decisions about the program's viability?
Are you trying to document successes and outcomes?
Are there other outcomes, not currently part of the objectives, that your program is likely to affect?

Evaluation Users
Who, besides yourself, will be using your evaluation findings?
– Learners
– Faculty
– Curriculum developers
– Administrators
– Agencies
– Other stakeholders
What do these users want from the evaluation?

Evaluation Framework
Is your evaluation framework based on combining an outcome evaluation with a process evaluation? (This was decided in the Purpose section.)
Are there any other components you will consider in your evaluation (e.g., a needs assessment, documenting work with collaborators or program developers)?
Include in this section a summary of relevant literature about your program or about studies of programs similar to yours.

Exercise 1 - Large Group
Questions from the group
In thinking about what you wanted to achieve from the evaluation, how did you choose your final focus?
Do any projects have a summative (keep or discontinue) focus? If so, how did you address that in the first four sections of your plan?
As you wrote down your users, could you think of a logical way of grouping them? That is, are there groups of users who would care to see similar types of information in your program evaluation results?
Did any of you do a full review of the literature and/or discuss with others who have programs similar to yours? Did you build your program on what others already know, or is it built completely from the ground up? What does your program add to what is currently known in the literature about either the teaching method or the educational outcome of interest?

Exercise 2
During the next 25 minutes, please list the objectives of your educational intervention (Table 1) and any additional processes or outcomes (Table 2) related to your educational intervention, program, or curriculum. Move back into your pairs as you go through this process. You can work individually and discuss with each other as needed.

Exercise 2
Objective: the objective or goal your program is created to achieve.
Other processes or outcomes: additional, not explicitly defined program goals, structures, or processes that affect program implementation, improvement, adoption, and adaptation.
Method: the data collection instrument(s) used to gather information on whether the objectives are achieved.
Frequency: the timeline on which you will administer each method.
Standard: the defensible standard by which you will determine whether the method indicates the objectives or goals are being met.
Responsible person: the individual in charge of ensuring each method is executed.
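For participants who keep their plan electronically rather than in the paper table, the sketch below shows one way a single row of the Exercise 2 plan could be represented as a small data structure. It is purely illustrative: the class name, field names, and the sample entry are assumptions for demonstration, not part of the workshop materials.

```python
from dataclasses import dataclass

@dataclass
class EvaluationPlanRow:
    """One row of an Exercise 2-style evaluation plan (illustrative only)."""
    objective: str           # objective or goal the program is created to achieve
    method: str              # data collection instrument used to gather evidence
    frequency: str           # timeline on which the method is administered
    standard: str            # defensible standard for judging whether the objective is met
    responsible_person: str  # individual who ensures the method is executed

# Hypothetical example row for a workshop-style educational program
example_row = EvaluationPlanRow(
    objective="Residents can write measurable learning objectives",
    method="Post-workshop objective-writing exercise scored with a rubric",
    frequency="At the end of each workshop offering",
    standard="At least 80% of submitted objectives rated 'measurable'",
    responsible_person="Course director",
)

print(example_row)
```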

Exercise 2 – Discussion Questions about the process from groups

Exercise 3 – Instrument Development
1. Satisfaction
2. Advances in knowledge, skills, and attitudes
3. Skills used in the everyday environment of the learner
– Kirkpatrick
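If it helps to see the three levels side by side while drafting, the following is a hypothetical sketch that tags invented instrument items with the Kirkpatrick level each is meant to capture. The items themselves are made-up examples, not workshop content.

```python
# Hypothetical mapping of draft instrument items to Kirkpatrick levels 1-3
kirkpatrick_items = {
    1: [  # Reaction / satisfaction
        "Overall, I was satisfied with the workshop.",
    ],
    2: [  # Learning: knowledge, skills, attitudes
        "List the components of an evaluation plan.",
        "I feel confident writing an evaluation question.",
    ],
    3: [  # Behavior: skills used in the learner's everyday environment
        "In the past month, I have used a logic model in my own course planning.",
    ],
}

for level, items in sorted(kirkpatrick_items.items()):
    print(f"Level {level}:")
    for item in items:
        print(f"  - {item}")
```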

Exercise 3 and Summary
Questions as you went through the process?
Considerations:
Select items with relevance to each objective
Consider the eventual reporting, and whether stakeholders would care about each individual item or about aggregate results
A timeline is provided for you to complete; as you do, consider:
1. Timing and frequency
2. Data collection methods (qualitative/quantitative, and existing vs. de novo)
3. Resources required
4. When and to whom you need to report, and what they would care about
5. Institutional Review Board requirements
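One consideration above is whether stakeholders would care about individual items or aggregate results. Once pilot data exist, that comparison is simple to run; the sketch below uses made-up 5-point Likert satisfaction ratings to contrast item-level means with an overall mean. The item wording and the numbers are invented for illustration only.

```python
# Illustrative only: compare item-level and aggregate satisfaction results
# from made-up 5-point Likert ratings (1 = strongly disagree, 5 = strongly agree).
from statistics import mean

responses = {
    "The session met its stated objectives": [4, 5, 4, 3, 5],
    "The facilitator encouraged participation": [5, 5, 4, 4, 5],
    "I can apply this content in my own teaching": [3, 4, 3, 4, 4],
}

item_means = {item: mean(scores) for item, scores in responses.items()}
overall_mean = mean(score for scores in responses.values() for score in scores)

for item, m in item_means.items():
    print(f"{item}: {m:.2f}")
print(f"Overall satisfaction: {overall_mean:.2f}")
```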

Teaching Material Drawn From
Stufflebeam DL, Shinkfield AJ. Evaluation Theory, Models, and Applications. John Wiley & Sons.
Patricia O'Sullivan and the AAMC Medical Education Research Certificate Program.
Kern DE, et al. Curriculum Development for Medical Education: A Six-Step Approach. Johns Hopkins University Press, Baltimore.
Kirkpatrick D. Evaluating Training Programs: The Four Levels (second edition). Berrett-Koehler Publishers.
Patton MQ. Utilization-Focused Evaluation. Sage Publications, Newbury Park, 1986.