Chapter 14 Evaluation in Healthcare Education
An Evaluation Is
The final component of the education process, the nursing process, and the decision-making process. Because these processes are cyclic, evaluation serves as a bridge at the end of one process that guides the direction of the next.
Definition of Evaluation
Gathering, summarizing, interpreting, and using data to determine the extent to which an action was successful. A systematic process by which the worth or value of something, in this case teaching and learning, is judged.
Evaluation
Evaluations are not intended to be generalizable; they are conducted to determine the effectiveness of a specific intervention in a specific setting with an identified individual or group.
What is the relationship between evaluation, evidence-based practice (EBP), and practice-based evidence (PBE)?
Evidence-Based Practice (EBP)
EBP has evolved and expanded over decades and can be defined as the conscientious use of current best evidence in making decisions about patient care (Melnyk & Fineout-Overholt, 2005, p. 6). It includes the results of systematically conducted evaluations from research.
Practice-Based Evidence (PBE)
PBE is just beginning to be defined and includes the results of systematically conducted evaluations drawn from practice and clinical experience rather than from research.
The Difference between Assessment and Evaluation
Assessment = Input
Evaluation = Output
The Difference between Assessment and Evaluation
Assessment and evaluation are two highly interrelated concepts that are often used interchangeably, but they are not synonymous.
Assessment: a process to gather, summarize, interpret, and use data to decide a direction for action.
Evaluation: a process to gather, summarize, interpret, and use data to determine the extent to which an action was successful.
Formative and Summative Assessment
Formative assessment is a set of formal and informal assessment methods undertaken by teachers during the learning process. It is part of the instructional process, carried out with the objective of enhancing students' understanding and competency by modifying teaching and learning methods.
Formative assessment attempts to provide direct and detailed feedback to both teachers and students regarding student performance and learning. It is a continuous process that observes students' needs and progress during learning. The goal of formative assessment is to monitor student learning and provide ongoing feedback that instructors can use to improve their teaching and students can use to improve their learning.
More specifically, formative assessments:
help students identify their strengths and weaknesses and target areas that need work
help faculty recognize where students are struggling and address problems immediately
Formative assessments are generally low stakes, which means that they have low or no point value. Examples of formative assessments include asking students to:
draw a concept map in class to represent their understanding of a topic
submit one or two sentences identifying the main point of a lecture
turn in a research proposal for early feedback
Summative assessment refers to the evaluation of students that focuses on results. It is part of the grading process, administered periodically to participants, usually at the conclusion of a course, term, or unit. Its purpose is to check students' knowledge, i.e., the extent to which they have learned the material taught to them.
Summative assessment seeks to evaluate the effectiveness of the course or program and to check learning progress. The scores, grades, or percentages obtained act as indicators of the quality of the curriculum and form a basis for rankings in schools.
The goal of summative assessment is to evaluate student learning at the end of an instructional unit by comparing it against some standard or benchmark. Summative assessments are often high stakes, which means that they have a high point value. Examples of summative assessments include:
a midterm exam
a final project
a paper
a senior recital
Information from summative assessments can be used formatively when students or faculty use it to guide their efforts and activities in subsequent courses.
Basic Differences between Formative and Summative Assessment
Meaning: formative assessment refers to a variety of assessment procedures that provide the information required to adjust teaching during the learning process; summative assessment is defined as a standard for evaluating student learning.
Nature: diagnostic (formative) vs. evaluative (summative).
What is it? An assessment for learning (formative) vs. an assessment of learning (summative).
Frequency: monthly or quarterly (formative) vs. at term end (summative).
Aims at: enhancing learning (formative) vs. measuring student competency (summative).
Goal: monitor student learning (formative) vs. evaluate student learning (summative).
Weight of grades: low (formative) vs. high (summative).
Steps in Evaluation
1. Focusing the evaluation
2. Designing the evaluation
3. Conducting the evaluation
4. Analyzing and interpreting the data
5. Reporting and using the results
STEP ONE: Focusing the Evaluation
RSA Evaluation Model
The RSA model places five basic types of evaluation in relation to one another based on the purpose, related questions, scope, and resources components of evaluation focus (Figure 14-1). These five types range from the simple to the complex.
[Figure 14-1. RSA Evaluation Model: the five evaluation types arranged along two axes, frequency (process highest, total program lowest) and time & cost (process lowest, total program highest), in the order process, content, outcome, impact, total program.]
Process (Formative) Evaluation
Purpose: to make adjustments as soon as they are needed during the education process.
Scope: limited to a specific learning experience; frequent; concurrent with learning.
Content Evaluation
Purpose: to determine whether learners have acquired the knowledge and skills just taught.
Scope: limited to a specific learning experience and its objectives; conducted immediately after education is completed (short term).
Outcome (Summative) Evaluation
Purpose: to determine the effects of teaching.
Scope: broader, more long term, and less frequent than content evaluation.
Impact Evaluation
Purpose: to determine the relative effects of education on the institution or community.
Scope: broad, complex, sophisticated, long term; occurs infrequently.
Total Program Evaluation
Purpose: to determine the extent to which the total program meets or exceeds its long-term goals.
Scope: broad, long term, strategic; lengthy, and therefore conducted infrequently.
STEP TWO: Designing the Evaluation
Designing the Evaluation
An important question to answer in designing an evaluation is "How rigorous should the evaluation be?" All evaluations should be systematic and carefully planned and structured before they are conducted. An evaluation design can be structured from a research perspective.
Essential Questions to Ask When Designing an Evaluation
What types of data will be collected? Complete data (people, program, environment).
From whom or what will data be collected? From participants, surrogates, documents, and/or preexisting databases; include the whole population or a sample.
How, when, and where will data be collected? By observation, interview, questionnaire, test, record review, or secondary analysis; consistent with the type of evaluation and with the questions to be answered.
By whom will data be collected? By the learner, educator, evaluator, and/or a trained data collector, selected to minimize bias.
Evaluation Barriers
Lack of clarity: resolve by clearly describing the five evaluation components; specify and operationally define terms.
Lack of ability: resolve by making the necessary resources available; solicit support from experts.
Fear of punishment or loss of self-esteem
Resolve by being aware of the existence of fear among those being evaluated. Focus on data and results without personalizing or blaming. Point out achievements. Encourage ongoing effort. COMMUNICATE!!!
Selecting an Evaluation Instrument
Identify existing instruments through a literature search and a review of similar evaluations conducted in the past. Critique potential instruments for:
Fit with the definitions of the factors to be measured
Evidence of reliability and validity, especially with a similar population (a reliability check is sketched below)
Appropriateness for those being evaluated
Affordability and feasibility
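One common piece of reliability evidence is internal consistency. As a minimal sketch, assuming hypothetical pilot data from a 4-item satisfaction scale (the item scores and the 0.70 rule of thumb below are illustrative assumptions, not from the chapter), Cronbach's alpha can be computed in a few lines of Python:

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Internal-consistency reliability: rows = respondents, columns = items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_var = scores.var(axis=0, ddof=1)        # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the total score
    return (k / (k - 1)) * (1 - item_var.sum() / total_var)

# Hypothetical pilot data: 5 respondents rating 4 items on a 1-5 scale.
pilot = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 4, 5],
    [2, 3, 3, 2],
    [4, 4, 5, 4],
])
print(f"Cronbach's alpha: {cronbach_alpha(pilot):.2f}")  # >= 0.70 is a common rule of thumb
```

An instrument that falls short on a check like this during pilot testing would be revised or replaced before the full evaluation.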
STEP THREE: Conducting the Evaluation
When conducting an evaluation:
Conduct a pilot test first.
Assess the feasibility of conducting the full evaluation as planned.
Assess the reliability and validity of the instruments.
Allow extra time; be prepared for unexpected delays.
Keep a sense of humor!
STEP FOUR: Data Analysis and Interpretation
Data Analysis and Interpretation
The purpose of conducting data analysis is twofold:
1. To organize the data so that they provide meaningful information, such as through tables and graphs, and
2. To provide answers to the evaluation questions.
Data can be quantitative and/or qualitative in nature. A small worked example follows.
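As a small worked example, assuming quantitative data in the form of hypothetical pretest/posttest knowledge scores and satisfaction ratings (all names and values below are invented for illustration), a few lines of Python with pandas organize the raw responses into a summary table that speaks to the evaluation question "Did knowledge improve, and were learners satisfied?":

```python
import pandas as pd

# Hypothetical evaluation data for five learners.
data = pd.DataFrame({
    "learner":      ["A", "B", "C", "D", "E"],
    "pretest":      [55, 60, 48, 70, 62],   # knowledge score before teaching
    "posttest":     [78, 82, 70, 85, 80],   # knowledge score after teaching
    "satisfaction": [4, 5, 3, 5, 4],        # 1-5 Likert rating
})

# Derive the knowledge gain for each learner.
data["gain"] = data["posttest"] - data["pretest"]

# Organize the raw data into a summary table (means and standard deviations).
summary = data[["pretest", "posttest", "gain", "satisfaction"]].agg(["mean", "std"]).round(1)
print(summary)
```

Qualitative data, by contrast, would typically be coded into themes before being summarized.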
STEP FIVE: Reporting and Using the Results
Reporting and Using Evaluation Results
Be audience focused.
Begin with a one-page executive summary.
Use a format and language clear to the audience.
Present results in person and in writing.
Provide specific recommendations.
Stick to the evaluation purpose.
Directly answer the questions asked.
Use data as intended.
Maintain consistency between results and the interpretation of results.
Identify limitations.
Summary of the Evaluation Process
Evaluation in healthcare education is the process of gathering, summarizing, interpreting, and using data to determine the extent to which an educational activity is efficient, effective, and useful to learners, teachers, and sponsors. Each aspect of the evaluation process is important, but all are meaningless unless evaluation results are used to guide future action in planning and carrying out interventions.