MERC Ten Steps to Designing an Evaluation for Your Educational Program
Linda Perkowski, Ph.D.
University of Minnesota Medical School

Readiness Assurance Test

What is Program Evaluation? The systematic collection of information about a broad range of topics, for use by specific people, for a variety of purposes (Patton, 1986).

Definitions
Evaluation: program
Assessment: individual
Formative evaluation: to improve
Summative evaluation: to prove
Outcomes research: patient care

Purposes of Program Evaluation
- To improve the program
- To determine next steps and make decisions (replace, develop further, eliminate, accredit)
- To determine effectiveness
- To document success
- To measure outcomes

For curricular purposes, evaluation helps:
- ensure teaching is meeting learners' needs
- identify where teaching can be improved
- inform the allocation of resources
- provide support to faculty and learners
- diagnose and document program strengths and weaknesses
- articulate what is valued by the institution
- determine whether educational objectives are met
Adapted from Morrison (2003)

Influences on the Evaluation
External: accrediting agencies, the public, funding priorities
Internal: Who needs what answers? Who gets to pose the questions? How will the answers be made known?

Barriers to Program Evaluation
- Tension between implementing and evaluating
- Lack of skills in conducting applied social science research
- Paucity of funding, time, and publication outlets
- Failure to recognize evaluation as scholarship with a place in the literature
(Wilkerson, 2000)

What is the biggest barrier for you or your institution in collecting and analyzing program evaluation data?
- Tension between getting a program implemented and evaluating it
- Lack of skills
- Paucity of funding or time
- Limited outlets to present or publish findings

Many Models
- Goal-Oriented/Objective-Based (Tyler)
- Goal-Free Evaluation (Scriven)
- Judicial/Adversary Evaluation
- CIPP (Stufflebeam)
- Kirkpatrick's four-level model
- Situated Evaluation
- Connoisseurship Evaluation (Eisner)
- Utilization-Focused Evaluation (Patton)
- Logic Model

Program Logic Model - MERC

Tyler Model - MERC
Objectives: (1) increase participation in medical education research activities (research presentations and publications) (outcome evaluation); (2) apply medical education research principles from MERC to daily work (outcome evaluation)
Methods: short survey; retrospective pre/post survey
Content/Specifics: 12 closed-ended dichotomous items on participation in medical education research activities (i.e., collaborating in a medical education research project, publishing a peer-reviewed publication), plus an open-ended question
Frequency/Timing: 6-12 months after completion of MERC
Person: MERC Evaluation Committee to launch the survey and analyze the data
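
One way to make such a matrix actionable is to encode each row as a record the evaluation team can track; a minimal Python sketch, with field names chosen for illustration (they are not part of the MERC plan):

```python
# One row of the Tyler-style evaluation matrix, encoded as a simple record.
merc_objective = {
    "objective": "Increase participation in medical education research activities",
    "evaluation_type": "outcome evaluation",
    "methods": ["short survey", "retrospective pre/post survey"],
    "content": "12 closed-ended dichotomous items plus one open-ended question",
    "timing": "6-12 months after completion of MERC",
    "person": "MERC Evaluation Committee",
}

# A workplan is then just a list of such records, one per objective.
workplan = [merc_objective]
print(workplan[0]["timing"])
```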

Kirkpatrick's Four Levels of Outcomes
1. Satisfaction
2. Advances in knowledge, skills, and attitudes
3. Skills used in the everyday environment of the learner
4. Bottom line:
   a. Effect on participants' "learners"
   b. Effect on participants' careers
   c. Institutional improvements
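
To make the levels concrete, each can be paired with the kinds of data sources that typically feed it; a minimal Python sketch (the example instruments are illustrative assumptions, not prescribed by the model):

```python
# Kirkpatrick levels paired with illustrative (hypothetical) data sources.
kirkpatrick_measures = {
    1: ("Satisfaction", ["end-of-session rating forms"]),
    2: ("Knowledge, skills, attitudes", ["pre/post tests", "retrospective pre/post surveys"]),
    3: ("Skills used in everyday environment", ["direct observation", "performance audits"]),
    4: ("Bottom line", ["effects on learners", "career outcomes", "institutional improvements"]),
}

for level, (label, sources) in kirkpatrick_measures.items():
    print(f"Level {level} ({label}): {', '.join(sources)}")
```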

Overview of 10 Program Evaluation Steps (Workplan)
Step 1: Identify Users
Step 2: Identify Uses
Step 3: Identify Resources
Step 4: Identify Evaluation Questions/Objectives
Step 5: Choose Evaluation Design
Step 6: Choose Measurement Methods and Construct Instruments
Step 7: Address Ethical Concerns
Step 8: Collect Data
Step 9: Analyze Data
Step 10: Report Results

Step 1: Identify Users
Who will use the evaluation?
- Learners
- Faculty
- Workshop developers
- Administrators
- Agencies
- Other stakeholders
What do they want from the evaluation?

Step 2: Identify Uses
- Generally both formative and summative
- Individual and program decisions
- Qualitative and/or quantitative information
- Consider the specific needs of each user
- Judgments about individuals
- Judgments about project management and processes

What uses do you have for program evaluation?
- Improving existing or new programs
- Proving that a program works

Step 3: Identify Resources
- What time is needed from everyone?
- What personnel are needed?
- What equipment? What facilities? What funds?

Step 4: Identify Evaluation Questions/Objectives
Questions go back to the model chosen, but generally relate to specific, measurable objectives for:
- the learner
- the process
- outcomes
It is wise to include some questions that capture what was not anticipated, both strengths and weaknesses.

Step 4: Identify Evaluation Questions/Objectives (cont.)
Evaluation questions should:
- be clear and specific
- be congruent with the literature
- focus on outcomes rather than process; outcomes imply change (e.g., "the workshop will improve educators' skills" rather than "how the workshop was given")
- align with goals and objectives

What are the questions?
Evaluation of Learning:
- Process: ease of use, efficiency, relevance, language
- Outcomes: knowledge, attitudes, behaviors
Evaluation of Content: authority, accuracy, appropriateness, breadth, depth
Evaluation of Cost:
- Development: needs assessment, objectives, materials, staffing, design
- Implementation: staff time, materials, recruitment, facilities, hardware
- Maintenance: portability, coordination, durability, tech support
Presentation & Organization:
- Pedagogy: instructional method, active learning, objectives ~ methods, feedback
- Interface: structure, learner differences, interaction, clarity, quality, organization
Adapted from Elissavet & Economides (2003)

Step 5: Choose Evaluation Designs
Which designs are appropriate to the questions? (X = intervention, O = observation)
- Posttest only: X - O (satisfaction/reactions)
- Retrospective pretest: X - O (attitudes, measured post-then-pre)
- Pretest-posttest: O - X - O (changes in knowledge/attitudes)
- Quasi-experimental (with comparison group): O - X - O versus O - - - O
- Cross-over: groups receive the intervention at different times (e.g., O - X - O - - O and O - - O - X - O)
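
As a concrete illustration of the pretest-posttest (O - X - O) design, the sketch below runs a paired comparison on invented knowledge scores; the data, scale, and choice of a paired t-test are assumptions for illustration, not part of the MERC plan:

```python
from scipy import stats

# Hypothetical knowledge scores (0-100) for one cohort, invented for illustration.
pretest  = [55, 62, 48, 70, 58, 66, 51, 60]
posttest = [68, 75, 59, 78, 70, 77, 63, 72]

# Paired t-test: each learner serves as their own control in an O - X - O design.
t_stat, p_value = stats.ttest_rel(posttest, pretest)
mean_gain = sum(b - a for a, b in zip(pretest, posttest)) / len(pretest)
print(f"Mean gain: {mean_gain:.1f} points, t = {t_stat:.2f}, p = {p_value:.3f}")
```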

Step 6: Choose Measurement Methods and Construct/Adapt Instruments
Common methods:
- rating forms
- self-assessments
- essays
- exams
- questionnaires
- interviews/focus groups
- direct observations
- performance audits
- existing data (AAMC questionnaires, course evals, JAMA)
Collect appropriate demographics.
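
For instance, the closed-ended dichotomous items from the Tyler matrix above could be encoded as simple records before the instrument is built; a minimal Python sketch (the structure and abbreviated item wording are assumptions):

```python
# Two of the closed-ended dichotomous survey items, encoded for instrument construction.
items = [
    {"id": 1, "text": "Collaborated in a medical education research project", "scale": ("yes", "no")},
    {"id": 2, "text": "Published a peer-reviewed publication", "scale": ("yes", "no")},
]

# Ask each item twice in a retrospective pre/post format: "before MERC" and "now".
for item in items:
    for frame in ("Before MERC", "Now"):
        print(f"{frame}: {item['text']} ({'/'.join(item['scale'])})")
```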

SOURCES OF DATA What do we have? What do we need? What, realistically, can we do?

Group Assignment 1
During this workshop, you will begin to evaluate the effectiveness of the CORD program (see handout). Use your experiences and the information in the handout to address the first four steps.

What would be the best model to use as we begin to develop our plan?
- Goal-oriented/objective-based
- Kirkpatrick's 4-level model
- Logic model

Assignment 2
Take your own project/program and begin filling in one of the blank matrices. Be prepared to discuss it with the group.

Step 7: Address Ethical Concerns
- Confidentiality
- Access to data
- Consent
- Resource allocation
- Seek IRB approval

Step 8: Collect Data
- Timing and response rate
- Already existing data collection
- Impact on instrument design (e.g., mail vs. web survey)
- Assignment of responsibility
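
Since timing and response rate drive data quality, it can help to check the running response rate against a target before closing collection; a minimal sketch with hypothetical numbers:

```python
# Track survey response rate against a hypothetical target before closing collection.
invited, returned, target_rate = 120, 78, 0.70

response_rate = returned / invited
print(f"Response rate: {response_rate:.0%} ({returned}/{invited})")
if response_rate < target_rate:
    print("Below target: plan a reminder wave before closing data collection.")
```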

Step 9: Analyze Data
- Plan the analysis at the same time as the rest of the evaluation
- Aim for congruence between the question asked and an analysis that is feasible
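
Congruence between question and analysis matters: the dichotomous pre/post items sketched under Step 6 call for a paired test, and a McNemar test is one feasible choice; the counts below are hypothetical:

```python
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical paired yes/no counts for one item ("collaborated in a research project"):
# rows = before MERC (no, yes), columns = after MERC (no, yes).
table = [[30, 25],   # "no" before: 30 still "no", 25 switched to "yes"
         [4, 19]]    # "yes" before: 4 switched to "no", 19 still "yes"

result = mcnemar(table, exact=True)  # exact binomial test on the discordant pairs
print(f"McNemar p = {result.pvalue:.3f}")
```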

Step 10: Report Results
- Timely
- Format fits the needs of users
- Display results in a succinct and clear manner

QUESTIONS???