Evaluating Clinical Simulations Pamela R. Jeffries, DNS, RN, FAAN Johns Hopkins University School of Nursing Mississippi – February 25, 2011
Objectives of the Session: The participant will be able to: (1) discuss the evaluation process using simulations; (2) describe strategies for using simulations as an evaluation tool in courses or programs; (3) describe the different components that need to be assessed when using simulations
WHAT IS EVALUATION? Feedback; coaching; assigning grades; judgment (objective or subjective); a form of quality improvement; assessment
ASSESSMENT What it is: “Systematic collection, review and use of information about educational programs undertaken for the purpose of improving student learning and development.” (Palomba and Banta, 1999) Focus: development and improvement rather than judgment and grading
WHY EVALUATE? To determine whether learning outcomes and program goals are achieved; to give feedback during learning; to improve the effectiveness of teaching and learning; to attain performance standards; to ensure patient safety
WHEN TO EVALUATE? 1. Frequently, when learning is complex, there is a high risk of failure, or the consequences of error would be serious. 2. When outcomes are critical: to ensure that learners are prepared for clinical practice, or when findings can be used to alter the direction of the project. 3. At the end of the module, activity, or project, to be certain the learning outcomes were met.
EVALUATION PROCESS Is… a systematic effort that involves identifying the what, who, when, and why, and then gathering, analyzing, and interpreting the data. Concludes when… the findings of the evaluation are reported and used.
EVALUATION PROCESS Judgment: Relative worth or value of something Focus: Specifically on the student learner in e-learning Throughout educational process: Interwoven throughout the learning process usually in the form of feedback
EVALUATION INSTRUMENT 1. Norm-referenced: Focus: how learners rank in comparison to each other Outcome: The interpretation of the evaluation is to determine who has the most knowledge, best skill performance, etc. and who has the least
Evaluation Instruments 2. Criterion-referenced Focus: the learner’s ability to attain the objectives or competencies that have been specified Purpose: to determine who has achieved the outcomes and who has not Outcome: The outcome standard must be specified by the educator, and then the learner is evaluated to determine if the standard is met
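The contrast between the two instrument types can be sketched in code. A minimal illustration in Python, where the learner names, scores, and cut score are all invented for the example:

```python
# Hypothetical checklist scores from a simulation exam (0-100)
scores = {"Learner A": 88, "Learner B": 72, "Learner C": 95, "Learner D": 64}

# Norm-referenced interpretation: rank learners against each other;
# ranking[0] is the strongest performer, ranking[-1] the weakest
ranking = sorted(scores, key=scores.get, reverse=True)

# Criterion-referenced interpretation: compare each learner to a
# preset outcome standard specified by the educator
CUT_SCORE = 75
met_standard = {name: s >= CUT_SCORE for name, s in scores.items()}
```

Note that the same raw scores support both interpretations; what differs is whether the referent is the peer group (ranking) or the educator-specified standard (pass/fail against the cut score).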
PRINCIPLES OF EVALUATING ADULT LEARNERS Involve in planning Capable of self-evaluation Motivated to achieve Deal with real world problems and need real world evaluation Like variety in teaching, learning, and evaluation
PRINCIPLES OF EVALUATING ADULT LEARNERS Respond to feedback (evaluation) Need frequent and informative feedback Are self-directed Learn from each other and can evaluate each other Respond to choices…provide options for evaluation
GUIDELINES FOR SELECTING EVALUATION STRATEGIES Appropriate for the domain of learning The learner should have the opportunity to practice in the way he/she will be evaluated Used to provide multiple ways of assessing learning Ease of development/use on the part of the educator Evaluation instrument should be valid and reliable
Steps to Take When Evaluating 1. Identify the purpose of the evaluation 2. Determine a time frame 3. Identify when to evaluate 4. Develop the evaluation plan 5. Select the instrument(s) 6. Collect the data 7. Interpret the data
Evaluations and Simulations. Four areas of evaluation: evaluating the simulation itself; evaluating the implementation phase; evaluating student learning outcomes; using simulation as an evaluation tool
Evaluating the simulation (design): Good simulation design and development are needed to obtain the outcomes you want. The Simulation Design Scale (SDS) provides a measure of the importance of each design feature.
Data Results: The Simulation Design Scale (SDS) was analyzed using factor analysis with varimax rotation on the items for each scale. Cronbach’s alphas were calculated on each subscale and on the overall scale to assess instrument reliability.
Simulation Design Scale
Information/Objectives: 5 items, alpha 0.87
Student Support: 4 items, alpha 0.90
Problem-Solving: 5 items, alpha 0.82
Feedback/Debriefing: 4 items, alpha 0.89
Fidelity (realism): 2 items, alpha 0.76
Overall Scale: 20 items, alpha 0.95
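The subscale reliabilities above come from Cronbach's alpha, α = k/(k−1) · (1 − Σσ²_item / σ²_total). A minimal sketch of the computation in Python (the respondent data below is invented for illustration; it is not the SDS data):

```python
from statistics import pvariance

def cronbach_alpha(responses):
    """Cronbach's alpha: rows are respondents, columns are scale items."""
    k = len(responses[0])                    # number of items
    item_cols = list(zip(*responses))        # column-wise view of the data
    sum_item_var = sum(pvariance(col) for col in item_cols)
    total_var = pvariance([sum(row) for row in responses])
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

# Invented Likert (1-5) responses to a hypothetical 4-item subscale
responses = [
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 4, 5],
]
alpha = cronbach_alpha(responses)  # about 0.91 for this toy data
```

Using sample variance consistently would give the same alpha, since the n/(n−1) correction factor cancels in the variance ratio.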
Incorporating the Educational Practices into the Simulation: Assemble and implement the simulation with the educational practices in mind: active learning, collaboration, diverse ways of learning, high expectations
Data Results: The Educational Practices within a Simulation Scale (EPSS) was analyzed using factor analysis with varimax rotation on the items for each scale. Cronbach’s alphas were calculated on each subscale and on the overall scale to assess instrument reliability.
Educational Practices Scale: Factor analysis with varimax rotation on the items for each scale revealed 4 factors.
Active Learning: 10 items, alpha 0.91
Collaboration: 2 items, alpha 0.92
Diverse Ways of Learning: 2 items, alpha 0.87
High Expectations: 2 items, alpha 0.83
Overall Scale: 16 items, alpha 0.92
Instrument Testing: Testing of the two instruments, the Simulation Design Scale and the Educational Practices Scale, continued in this phase. Reliabilities for both scales were found to be good. It was important to have measures for assessing the quality of the designed simulations being used.
Evaluating Student Learning Outcomes: Simulation technologies used to measure both process and outcomes range from case studies to standardized patients (as in an OSCE), to task trainers and high-fidelity mannequins
Example of a Clinical Competency Measure to Measure a Curricular Thread (J. Cleary – Ridgewater, MN)
Rubric columns: Critical competency | Performance standard | Novice (1) | Accomplished (2) | Exemplary (3)
Critical competencies and standards: Selects appropriate channels of communication (applies professional communication theories and techniques); Participates in an interdisciplinary team (uses interdisciplinary resources to assist clients)
Sample level descriptors: Novice (1) – attempts to establish rapport; tentative in communicating with the interdisciplinary team. Accomplished (2) – generally establishes rapport; written communication is generally accurate. Exemplary (3) – consistently establishes rapport and uses therapeutic communication.
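A leveled rubric like this maps naturally onto a simple lookup structure for scoring. A minimal sketch in Python: the descriptors come from the rapport row of the rubric, while the encoding and the formatting function are assumptions made for illustration:

```python
# Hypothetical encoding of one rubric row: level -> descriptor
RAPPORT = {
    1: "Attempts to establish rapport",
    2: "Generally establishes rapport",
    3: "Consistently establishes rapport and uses therapeutic communication",
}
LEVEL_NAMES = {1: "Novice", 2: "Accomplished", 3: "Exemplary"}

def rate(level: int) -> str:
    """Format an observed level as a readable rating line."""
    return f"{LEVEL_NAMES[level]} ({level}): {RAPPORT[level]}"
```

An evaluator records one level per competency during the simulation; summing the recorded levels across competencies then yields a criterion-referenced total that can be compared against a preset standard.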
Evaluating outcomes Formative evaluation measures: simulation is used by the learner/faculty to mark progress toward a goal Summative evaluation measures: include determining course competencies, licensing and certification examinations, and employment decisions
Exemplars of student outcome measures used today: knowledge; skill performance; learner satisfaction; critical thinking; self-confidence; skill proficiency; teamwork/collaboration; problem-solving skills
Using simulations as the Evaluation Tool When skill sets, clinical reasoning, and selected clinical competencies need to be measured, simulations can be used as the mechanism to do this.
Simulations to Evaluate: Set up a simulation as an evaluation activity. Issues to address: make sure the student is aware it is an evaluation; describe the evaluation metrics; determine whether the evaluation is objective.
Ways to Use Simulations as an Evaluation Metric: As an Objective Structured Clinical Exam (OSCE), in which simulated patients are portrayed by hired actors and students are immersed in specific scenarios
Ways to Use Simulations as an Evaluation Metric: Set up a simulation to measure concepts such as teamwork, patient safety competencies, the ACLS protocol, communication skills, or any selected intervention
Ways to Use Simulations as an Evaluation Metric: Simulations used for evaluation can take the form of computer-based learning, with scores and metrics that demonstrate knowledge, skill competency, and proficiency
Examples of products that can be used for evaluation purposes: MicroSim DVD (scoring mechanism); ACLS computer-based software scenario packages (scoring); cath simulators (scoring devices, standardized scales, benchmarks); CPR models/mannequins (programming and scoring)
Summary Simulations require evaluation of many variables, including the simulation design, the implementation process, and learning outcomes In addition, simulations can serve as the mechanism to evaluate students
Nursing Implications: When developing courses and planning curricula in nursing, decide what purpose the simulation encounters serve, and evaluate to make sure that purpose and its outcomes are being achieved. More evidence/documentation is needed that simulations are serving the need for improved clinical performance, critical thinking, and diagnostic reasoning.
Conclusion “How to tell students what to look for without telling them what to see is the dilemma of teaching.” Lascelles Abercrombie
Any Questions? pjeffri2@son.jhmi.edu