Observational Evaluation of Simulation in Undergraduate Nursing Programs using Quality Indicators
NETNEP Conference 2014
Ashley Kable, Tracy Levett-Jones, Carol Arthur, Kerry Reid-Searl, Melanie Humphreys, Sara Morris, Pauline Walsh
Funded by the University of Newcastle, Faculty of Health Pilot Grant Scheme 2011
Background
Previous student evaluations of simulation learning experiences have reported student satisfaction and the knowledge and skills gained. These evaluations did not measure the quality of the design and delivery of the simulation activities.
Quality indicators for the design and implementation of simulation learning experiences were developed in 2010 using a Delphi technique to achieve international consensus. The indicators articulate five key elements of effective simulation design and implementation. They can be applied to a variety of simulation activities and emphasize integration of simulation across the curriculum, scaffolding of simulation sessions, and adequacy of physical and staff resources.
These indicators were used to construct a set of tools for evaluating the quality of simulation activities in undergraduate nursing curricula, and a pilot study was conducted to test them.
Study Objective
To report an evaluation of the implementation of evidence-based quality indicators for simulation learning experiences, using an observation schedule, in undergraduate nursing programs at three universities.
Study Design, Study Settings & Participants
Observational evaluation instrument comprising four domains derived from the quality indicators: pedagogical principles, fidelity, student preparation and orientation, and debriefing.
Settings: two Australian universities and one UK university, all with undergraduate nursing programs.
17 simulation sessions. Participants and simulation methods:
- Undergraduate nursing students in the first, second and third years of their programs
- Actors: professional standardised patients (mental health simulations)
- Actors: trained tutors using Mask-Ed (KRS Simulation)
- High-fidelity patient simulation manikins (Laerdal SimMan 3G)
- Tutors with various levels of immersion in the simulation sessions
8 Clinical Scenario-Based Situations Simulated
- Mental health assessment and communication with a patient with depression (3 sessions)
- Mental health behaviour management for a patient presenting with symptoms of mania
- Aged care assessment and management of pain and delirium (2 sessions)
- Surgical fluid status assessment and management: hypovolaemia or hypervolaemia (2 sessions)
- Communication and managing patient distress: basic skills
- Elderly falls risk and pressure area assessment: basic skills
- Stoma care and uridome application: basic skills
- Evolving sepsis (6 sessions)
The Observational Evaluation Instrument
The observational evaluation instrument was designed using student-specific measures derived from the quality indicators. The observation form consisted of 27 questions designed to measure the extent to which the simulation activity achieved the following elements:
- Pedagogical principles (3 items)
- Student preparation and orientation (6 items)
- Fidelity (clinical authenticity, relevance and fidelity) (5 items)
- Debriefing (7 items)
- Session details, including clinical scenario, teaching strategies, simulation modality, location of simulation, number of students, and student roles (6 items)
Study Methods
Students were invited to participate prior to the simulation sessions, and voluntary consent was sought from all participants. First-year sessions were introductory skills sessions; second- and third-year sessions were fully immersive. Observation schedules were completed during the simulation learning sessions by trained observers. Analyses included comparisons between programs and year levels.
Results (n = 143 students) Session details:
- Most sessions were experiential (13); 2 were demonstrations and 2 were facilitated sessions.
- Simulation modalities used were actors (4), Mask-Ed (5) and high-fidelity manikins (8).
- Most simulations were conducted in clinical laboratories (15); 2 were conducted in simulated learning environments.
- The number of students in each session ranged from 2 to 25 (mean = 8, median = 4). First-year sessions had significantly higher numbers of students (~24) (p = 0.02).
- Most students had an active role in the simulated learning session (15 sessions); in 7 of these sessions some students also had an observer role, and in 2 sessions students had an observational role only.
Pedagogical principles: Overall mean score 5.4 (max 6)
- Most sessions had both course and session objectives (76%); however, 24% (second-year sessions at one site) had course objectives only.
- Most sessions were fully immersive (82%); 12% were partially immersive.
- Most scenarios addressed the stated learning objectives (88%); however, one site had sessions that only partially achieved this (p = 0.007).
Student Preparation and Orientation: Overall mean score 8 (max 12)
- All sessions were preceded by an orientation to the simulation session: comprehensive in 53% and brief in 47%.
- Orientation to the manikin or equipment to be used was brief in over 40% of sessions, comprehensive in 30%, and absent in 30%.
- Session structure was outlined briefly for 59% of sessions and in detail for the others.
- Students were advised of the learning objectives briefly (40%) or in detail (47%) in most sessions; 12% received no advice about objectives.
- Provision of preparatory activities such as lectures, tutorials, online learning and readings varied significantly between sites: sites 1 and 2 provided extensive preparation compared with site 3, which provided limited preparation (p = 0.002).
- Teaching and practice of the skills needed prior to the simulation sessions also varied significantly: sites 1 and 3 provided extensive skills preparation compared with site 2, which provided none (p = 0.006); however, site 2's sessions were basic skills sessions.
Fidelity: Overall mean score 6 (max 9)
- Both the scenario and the manikin or standardised patient were clinically realistic in most sessions (59%) and somewhat realistic in the others (40%).
- The simulation modality used supported the learning objectives in 16 sessions.
- Most session environments (16) provided clinically realistic equipment; however, at site 3 the equipment was only somewhat realistic (p = 0.024).
- Medical charts and records were frequently not provided (70% of sessions), particularly at site 3 (100%).
Debriefing: Overall mean score 6.5 (max 9)
- Debriefing was conducted within 30 minutes of every session, and 16 sessions included debriefing about non-technical skills.
- Students were encouraged to reflect on and self-evaluate their practice during debriefing after 59% of sessions.
- 76% of debriefing sessions included feedback to students about their strengths, 71% included feedback about their weaknesses, and 59% included both.
- Only 14% of debriefing sessions included support for students who were disappointed with their performance during the simulation. Site 1 differed significantly, with support provided during some of its sessions (p = 0.015).
Discussion
Pedagogical principles scored higher than all other observed components of the simulation sessions. Preparation and orientation and fidelity scored lowest, suggesting scope for improving these components. Debriefing was conducted within 30 minutes for all sessions and usually included debriefing about non-technical skills.
There were few differences between sites; however, some observed indicators were of concern:
- 30% of sessions provided no equipment orientation.
- 12% of sessions provided no advice about learning objectives.
- 70% of sessions provided no medical charts or records.
- 40% of debriefing sessions did not include reflection and self-evaluation.
- There was scope to improve feedback about students' strengths and weaknesses.
- 86% of debriefing sessions did not include support for disappointed students.
Conclusion
The observation schedule effectively measured observable quality indicators for a range of simulation sessions at all study sites, confirming the utility of these quality indicators. The evaluation data provided valuable information about the quality of the simulation sessions, identifying components that could be improved and components that were done well. Additional testing of this evaluation instrument in other programs is desirable.
Thank you
Contact details: Dr Ashley Kable
Publications:
Kable A, Arthur C, Levett-Jones T, Reid-Searl K. (2013) Student evaluation of simulation in undergraduate nursing programs in Australia using quality indicators. Nursing and Health Sciences, 15(2).
Arthur C, Levett-Jones T, Kable A. Quality indicators for the design and implementation of simulation experiences: A Delphi study. Nurse Education Today, doi: 10.1016/j.nedt
Further information about Quality Indicators:
Further information about Mask-Ed™ (KRS Simulation):