
1 MERC Ten Steps to Designing an Evaluation for Your Educational Program Linda Perkowski, Ph.D. University of Minnesota Medical School

2 Readiness Assurance Test

3 What is Program Evaluation? Systematic collection of information about a broad range of topics, for use by specific people, for a variety of purposes (Patton, 1986)

4 Definitions
Evaluation: program
Assessment: individual
Formative evaluation: to improve
Summative evaluation: to prove
Outcomes research: patient care

5 Purposes of Program Evaluation
To improve the program
To determine next steps and make decisions: replace, develop further, eliminate, accredit
To determine effectiveness
To document success
To measure outcomes

6 For curricular purposes, evaluation helps:
Ensure teaching is meeting learners' needs
Identify where teaching can be improved
Inform the allocation of resources
Provide support to faculty and learners
Diagnose and document program strengths and weaknesses
Articulate what is valued by the institution
Determine that educational objectives are met
Adapted from Morrison (2003)

7 Influences on the evaluation
External: accrediting agencies, the public, funding priorities
Internal: Who needs what answers? Who gets to pose the questions? How will the answers be made known?

8 Barriers to Program Evaluation
Tension between implementing and evaluating
Lack of skills in conducting applied social science research
Paucity of funding, time, and publication outlets
Failure to recognize evaluation as scholarship with a place in the literature
Wilkerson, 2000

9 What is the biggest barrier for you or your institution to collecting and analyzing program evaluation data?
Tension between getting a program implemented and evaluating it
Lack of skills
Paucity of funding or time
Limited outlets to present or publish findings

10 Many Models
Goal-oriented / objective-based evaluation (Tyler)
Goal-free evaluation (Scriven)
Judicial / adversary evaluation
CIPP (Stufflebeam)
Kirkpatrick's four-level model
Situated evaluation
Connoisseurship evaluation (Eisner)
Utilization-focused evaluation (Patton)
Logic model

11 Program Logic Model - MERC

12 Tyler Model - MERC
Objectives: 1) Increase participation in medical education research activities (research presentations and publications) (outcome evaluation); 2) Apply medical education research principles from MERC to daily work (outcome evaluation)
Methods: Short survey; retrospective pre/post survey
Content/Specifics: 12 closed-ended dichotomous items on participation in medical education research activities (i.e., collaborating in a medical education research project, publishing a peer-reviewed publication); open-ended question
Frequency/Timing: 6-12 months after completion of MERC
Person: MERC Evaluation Committee to launch the survey and analyze the data

13 Kirkpatrick's Four Levels of Outcomes
1. Satisfaction
2. Advances in knowledge, skills, and attitudes
3. Skills used in the everyday environment of the learner
4. Bottom line:
   a. Effect on participants' learners
   b. Effect on participants' careers
   c. Institutional improvements
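As an aside, the taxonomy can be kept as a small lookup structure when planning which instrument targets which level. The sketch below is illustrative only; the example measures are assumptions, not MERC's actual instruments.

```python
# Illustrative mapping of Kirkpatrick's four levels to example measures.
# The measures listed are hypothetical examples, not MERC's actual instruments.
kirkpatrick_levels = {
    1: ("Satisfaction", ["end-of-workshop rating form"]),
    2: ("Knowledge, skills, and attitudes", ["pre/post knowledge test", "retrospective pre/post survey"]),
    3: ("Skills used in the learner's everyday environment", ["6-12 month follow-up survey", "direct observation"]),
    4: ("Bottom line", ["publications and presentations", "institutional improvements"]),
}

for level, (label, measures) in kirkpatrick_levels.items():
    print(f"Level {level} ({label}): {', '.join(measures)}")
```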

14 Overview of 10 Program Evaluation Steps (Workplan)
Step 1: Identify Users
Step 2: Identify Uses
Step 3: Identify Resources
Step 4: Identify Evaluation Questions/Objectives
Step 5: Choose Evaluation Design
Step 6: Choose Measurement Methods and Construct Instruments
Step 7: Address Ethical Concerns
Step 8: Collect Data
Step 9: Analyze Data
Step 10: Report Results

15 Step 1: Identify Users
Who will use the evaluation?
Learners
Faculty
Workshop developers
Administrators
Agencies
Other stakeholders
What do they want from the evaluation?

16 Step 2: Identify Uses
Generally both formative and summative
Individual and program decisions
Qualitative and/or quantitative information
Consider the specific needs of each user
Judgments about individuals
Judgments about project management and processes

17 What uses do you have for program evaluation?
Improving existing or new programs
Proving that a program works

18 Step 3: Identify Resources
What time is needed from everyone?
What personnel are needed?
What equipment?
What facilities?
What funds?

19 Step 4: Identify Evaluation Questions/Objectives
These go back to the model chosen, but generally relate to specific, measurable objectives for:
Learner
Process
Outcomes
It is wise to include some questions that get at what was not anticipated, both as strengths and as weaknesses

20 Step 4: Identify Evaluation Questions/Objectives (cont.)
Evaluation questions should:
Be clear and specific
Be congruent with the literature
Focus on outcomes versus process; outcomes imply change (e.g., "the workshop will improve educators' skills" rather than "how the workshop was given")
Align with goals and objectives

21 What are the questions?
Evaluation of learning: process (ease of use, efficiency, relevance, language); outcomes (knowledge, attitudes, behaviors)
Evaluation of content: authority, accuracy, appropriateness, breadth, depth
Evaluation of cost: development (needs assessment, objectives, materials, staffing, design); implementation (staff time, materials, recruitment, facilities, hardware); maintenance (portability, coordination, durability, tech support)
Presentation and organization: pedagogy (instructional method, structure, active learning, learner differences, objectives ~ methods, interaction, feedback); interface (clarity, quality, organization)
Adapted from Elissavet & Economides (2003)

22 Step 5: Choose Evaluation Designs
Which designs are appropriate to the questions?
Posttest only: X - - O (satisfaction/reactions)
Retrospective pretest: X - - O (attitudes)
Pretest-posttest: O - - X - - O (changes in knowledge/attitudes)
Quasi-experimental: O - - X - - O - - - - - O
Cross-over: O - - - - - O - - X - - O
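To make the pretest-posttest design concrete, the sketch below runs a paired t-test on hypothetical knowledge scores for the same learners before and after a workshop. The numbers and the use of scipy are assumptions for illustration, not part of the MERC materials.

```python
# Minimal sketch of analyzing a pretest-posttest (O - - X - - O) design.
# Scores are hypothetical; in practice they come from the instrument chosen in Step 6.
from scipy import stats

pre = [62, 55, 70, 48, 66, 59, 73, 51]    # pretest knowledge scores
post = [71, 60, 74, 55, 70, 68, 80, 58]   # posttest scores for the same learners

# Paired t-test: did scores change for the same learners from pre to post?
result = stats.ttest_rel(post, pre)
mean_change = sum(b - a for a, b in zip(pre, post)) / len(pre)

print(f"Mean change: {mean_change:.1f} points")
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```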

23 Step 6: Choose Measurement Methods and Construct/Adapt Instruments
Common methods:
Rating forms
Self-assessments
Essays
Exams
Questionnaires
Interviews / focus groups
Direct observations
Performance audits
Existing data (AAMC questionnaires, course evaluations, JAMA)
Collect appropriate demographics

24 Sources of Data
What do we have?
What do we need?
What, realistically, can we do?

25 Group Assignment 1
During this workshop, you will begin to:
Evaluate the effectiveness of the CORD program (see handout)
Use your experiences and the information in the handout to address the first four steps

26 What would be the best model to use as we begin to develop our plan?
Goal-oriented / objective-based
Kirkpatrick's 4-level model
Logic model

27 Assignment 2
Take your own project/program and begin filling in one of the blank matrices
Be prepared to discuss with the group

28 Step 7: Address Ethical Concerns
Confidentiality
Access to data
Consent
Resource allocation
Seek IRB approval

29 Step 8: Collect Data
Timing and response rate
Already-existing data collection
Impact on instrument design (e.g., mail vs. web survey)
Assignment of responsibility

30 Step 9: Analyze Data
Plan the analysis at the same time as the rest of the evaluation
Aim for congruence between the question asked and the analysis that is feasible
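A minimal sketch of Step 9, assuming the survey responses from Step 8 were exported to a CSV; the file name, the columns respondent_id, item, and rating (1-5), and the invitation count are hypothetical, chosen only to illustrate keeping the analysis congruent with the question.

```python
# Minimal sketch of Step 9: keep the analysis congruent with the question asked.
# File name, column names, and the number invited are hypothetical.
import pandas as pd

responses = pd.read_csv("merc_followup_survey.csv")   # columns: respondent_id, item, rating

# Response rate (ties back to Step 8: timing and response rate).
invited = 120
response_rate = responses["respondent_id"].nunique() / invited
print(f"Response rate: {response_rate:.0%}")

# For satisfaction-type questions, a per-item summary is often all that is needed.
summary = responses.groupby("item")["rating"].agg(["mean", "std", "count"])
print(summary.sort_values("mean", ascending=False))
```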

31 Step 10: Report Results
Timely
Format fits the needs of users
Display results in a succinct and clear manner

32 QUESTIONS???


