Chapter 14 Evaluation in Healthcare Education

An Evaluation Is:
- A process
- A critical component of other processes: the nursing process, the decision-making process, the education process
- A way to provide data that demonstrate effectiveness
- The bridge at the end of one process that guides the direction of the next

Definition of Evaluation
Evaluation: a systematic and continuous process by which the significance or worth of something is judged; the process of collecting and using information to determine what has been accomplished, and how well, in order to guide decision making.

Steps in Evaluation
1. Focus
2. Design
3. Conduct
4. Analyze
5. Interpret
6. Report
7. Use

Evaluation, EBP, and PBE
Evaluations are not intended to be generalizable; they are conducted to determine the effectiveness of a specific intervention, in a specific setting, with an identified individual or group. Practice-based evidence (PBE) is just beginning to be defined.

Assessment = Input
Evaluation = Output

The Difference between Assessment and Evaluation
Assessment and evaluation are highly interrelated and often used interchangeably, but they are not synonymous.
- Assessment: a process to gather, summarize, interpret, and use data to decide a direction for action.
- Evaluation: a process to gather, summarize, interpret, and use data to determine the extent to which an action was successful.

Five Foci of Evaluation
In planning any evaluation, the first and most crucial step is to determine the focus of the evaluation. Evaluation focus includes five basic components:
- audience
- purpose
- questions
- scope
- resources
To determine these components, ask the following five questions:

Evaluation Focus
- For what audience is the evaluation being conducted?
- For what purpose is the evaluation being conducted?
- What questions will be asked?
- What is the scope of the evaluation?
- What resources are available to conduct the evaluation?
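The answers to these five questions are worth recording before any data are collected. Below is a minimal sketch of such a record in Python; the class name, field names, and sample values are illustrative assumptions, not terms from the chapter.

    from dataclasses import dataclass, field

    @dataclass
    class EvaluationFocus:
        """Answers to the five focus questions, captured up front.
        All names here are illustrative, not from the chapter."""
        audience: str                # for whom is the evaluation conducted?
        purpose: str                 # why is it being conducted?
        questions: list[str]         # what will be asked?
        scope: str                   # how broad, over what time frame?
        resources: list[str] = field(default_factory=list)  # what is available?

    focus = EvaluationFocus(
        audience="education team",
        purpose="determine whether learners met the stated objectives",
        questions=["To what degree did learners achieve the objectives?"],
        scope="a single class session, evaluated immediately afterward",
        resources=["post-test", "educator time"],
    )

Writing such a record forces the scope and resource questions to be answered explicitly before the design step begins.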

RSA Evaluation Model
[Figure: the five types of evaluation arranged along two axes. Process evaluation occurs with high frequency at low time and cost; content, outcome, impact, and total program evaluation occur progressively less frequently at progressively greater time and cost.]

Process (Formative) Evaluation
- Audience: individual educator
- Purpose: to make adjustments as soon as needed during the education process
- Question: What can better facilitate learning?
- Scope: limited to a specific learning experience; frequent; concurrent with learning
- Resources: inexpensive and available

Content Evaluation
- Audience: educator/clinician, individual or team
- Purpose: to determine whether learners have acquired the knowledge/skills just taught
- Question: To what degree did learners achieve the specified objectives?
- Scope: limited to a specific learning experience and its objectives; immediately after education is completed (short term)
- Resources: relatively inexpensive; available

Outcome (Summative) Evaluation
- Audience: educator, education team/director, education funding group
- Purpose: to determine the effects of teaching
- Question: Were goals met? Did the planned change in behavior occur?
- Scope: broader, more long term, and less frequent than content evaluation
- Resources: expensive, sophisticated; may require expertise that is less readily available

Impact Evaluation
- Audience: institutional administration, funding agency, community
- Purpose: to determine the relative effects of education on the institution or community
- Question: What is the effect of education on long-term changes at the organizational or community level?
- Scope: broad, complex, sophisticated, long term; occurs infrequently
- Resources: extensive; resource-intensive

Total Program Evaluation
- Audience: education department, institutional administration, funding agency, community
- Purpose: to determine the extent to which the total program meets or exceeds long-term goals
- Question: To what extent did all program activities meet annual departmental, institutional, and community goals?
- Scope: broad, long term/strategic; lengthy, and therefore conducted infrequently
- Resources: extensive; resource-intensive

Evaluation vs. Research
Evaluation:
- Audience specific to a single person, group, institution, or location
- Conducted to make decisions in a specific setting
- Focused on the needs of the primary audience
- Time constrained by the urgency of the decisions to be made
Research:
- Audience generic
- Conducted to generate new knowledge and/or expand existing knowledge
- Focused on sample representativeness and generalizability of findings
- Time constrained by study funding

Five Levels of Learner Evaluation
- Level I: learner's dissatisfaction and readiness to learn (needs assessment)
- Level II: learner's participation and satisfaction during the intervention (initial; process)
- Level III: learner's performance and satisfaction after the intervention (short term; content)
- Level IV: learner's performance and attitude in the daily setting (long term; outcome)
- Level V: learner's maintained performance and attitude (ongoing; impact)
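The parenthetical labels tie each learner level back to an RSA evaluation type. The pairings below come from the table; the Python dictionary holding them is merely an illustrative way to keep the mapping at hand.

    # Learner-evaluation levels paired with RSA evaluation types.
    # The pairings come from the table above; the data structure is illustrative.
    LEARNER_LEVELS = {
        "I":   ("dissatisfaction and readiness to learn", "needs assessment"),
        "II":  ("participation and satisfaction during intervention", "process"),
        "III": ("performance and satisfaction after intervention", "content"),
        "IV":  ("performance and attitude in daily setting", "outcome"),
        "V":   ("maintained performance and attitude", "impact"),
    }

    for level, (measures, rsa_type) in LEARNER_LEVELS.items():
        print(f"Level {level}: {measures} ({rsa_type})")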

Evaluation Methods
What types of data will be collected?
- Complete (people, program, environment)
- Concise (will answer the evaluation questions)
- Clear (use operational definitions)
- Comprehensive (quantitative and qualitative)
From whom or what will data be collected?
- From participants, surrogates, documents, and/or preexisting databases
- Include the population or a sample

Evaluation Methods (cont’d)
How, when, and where will data be collected?
- By observation, interview, questionnaire, test, record review, or secondary analysis
- Consistent with the type of evaluation
- Consistent with the questions to be answered
By whom will data be collected?
- By the learner, educator, evaluator, and/or a trained data collector
- Select collectors so as to minimize bias
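A practical consequence of these choices is that each data point should carry its context: from whom it came, how it was collected, and by whom, so that potential collector bias can be checked later. A minimal sketch follows; all keys and values are invented for illustration.

    from collections import Counter

    # Tag every collected data point with its context so the
    # how/when/where/by-whom decisions stay auditable.
    responses = [
        {"source": "participant", "method": "questionnaire",
         "collector": "trained data collector", "value": 4},
        {"source": "participant", "method": "questionnaire",
         "collector": "educator", "value": 5},
    ]

    # One simple bias check: how many data points did each collector gather?
    print(Counter(r["collector"] for r in responses))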

Evaluation Barriers
- Lack of clarity: resolve by clearly describing the five evaluation components; specify and operationally define terms.
- Lack of ability: resolve by making the necessary resources available; solicit support from experts.

Evaluation Barriers (cont’d)
- Fear of punishment or loss of self-esteem: resolve by being aware that fear exists among those being evaluated. Focus on the data and results without personalizing or blaming. Point out achievements. Encourage ongoing effort. COMMUNICATE!

Selecting an Evaluation Instrument
Identify existing instruments through a literature search and a review of similar evaluations conducted in the past. Critique potential instruments for:
- Fit with the definitions of the factors to be measured
- Evidence of reliability and validity, especially with a similar population
- Appropriateness for those being evaluated
- Affordability and feasibility
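For the reliability criterion, one widely used summary of an instrument's internal consistency is Cronbach's alpha: alpha = k/(k-1) x (1 - sum of the item variances / variance of the total scores), where k is the number of items. The short sketch below computes it from scratch; the sample scores are invented for illustration.

    # Cronbach's alpha for internal-consistency reliability:
    # alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)).

    def cronbach_alpha(items: list[list[float]]) -> float:
        """items[i] holds every respondent's score on item i."""
        def var(xs):  # sample variance
            m = sum(xs) / len(xs)
            return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
        k, n = len(items), len(items[0])
        totals = [sum(item[j] for item in items) for j in range(n)]
        return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

    # Three items answered by five respondents (invented scores):
    scores = [[4, 5, 3, 4, 4],
              [3, 5, 3, 4, 5],
              [4, 4, 2, 4, 5]]
    print(round(cronbach_alpha(scores), 2))  # -> 0.85

Values around 0.7 or higher are commonly read as acceptable internal consistency, although the appropriate threshold depends on the stakes of the decision being made.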

When conducting an evaluation:
- Conduct a pilot test first: assess the feasibility of conducting the full evaluation as planned, and assess the reliability and validity of the instruments.
- Include "extra" time: be prepared for unexpected delays.
- Keep a sense of humor!

Data Analysis and Interpretation
The purpose of conducting data analysis is twofold:
1. To organize data so that they provide meaningful information, such as through the use of tables and graphs, and
2. To provide answers to the evaluation questions.
Data can be quantitative and/or qualitative in nature.
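As a rough illustration of the first purpose, quantitative scores can be organized into a small summary table using only the Python standard library; the pre- and post-test scores below are invented.

    from statistics import mean, stdev

    # Invented pre/post test scores for six learners.
    pre  = [62, 70, 55, 68, 74, 60]
    post = [78, 85, 70, 80, 88, 75]

    # Organize the raw numbers into a summary table.
    print(f"{'Measure':<10}{'n':>4}{'Mean':>8}{'SD':>8}")
    for label, xs in (("Pre-test", pre), ("Post-test", post)):
        print(f"{label:<10}{len(xs):>4}{mean(xs):>8.1f}{stdev(xs):>8.1f}")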

Reporting Evaluation Results
- Be audience focused: begin with a one-page executive summary; use a format and language that are clear to the audience; present results both in person and in writing; provide specific recommendations.
- Stick to the evaluation purpose: directly answer the questions that were asked.

Reporting Evaluation Results (cont’d)
- Stick to the data: maintain consistency between the results and the interpretation of the results; identify limitations.

Summary of Evaluation Process
Evaluation in healthcare education is the process of gathering, summarizing, interpreting, and using data to determine the extent to which an educational activity is efficient, effective, and useful to learners, teachers, and sponsors. Each aspect of the evaluation process is important, but all of them are meaningless unless the results are used to guide future action in planning and carrying out interventions.