Observational Evaluation of Simulation in Undergraduate Nursing Programs using Quality Indicators NETNEP Conference 2014 Ashley Kable, Tracy Levett-Jones, Carol Arthur, Kerry Reid-Searl, Melanie Humphreys, Sara Morris, Pauline Walsh Funded by University of Newcastle, Faculty of Health Pilot Grant Scheme 2011

Background Previous student evaluations of simulation learning experiences have reported students' satisfaction and the knowledge and skills gained, but they did not measure the quality of the design and delivery of the simulation activities themselves. Quality indicators for the design and implementation of simulation learning experiences were developed in 2010, using a Delphi technique to achieve international consensus. The indicators articulate five key elements of effective simulation design and implementation. They can be applied to a variety of simulation activities, and they emphasize integration of simulation across the curriculum, scaffolding of simulation sessions, and adequacy of physical and staff resources. These indicators were used to construct a set of tools for evaluating the quality of simulation activities in undergraduate nursing curricula, and a pilot study was conducted to test them.

Study Objective To report an evaluation of the implementation of evidence-based quality indicators for simulation learning experiences, using an observation schedule, in undergraduate nursing programs at three universities.

Study Design, Study Settings & Participants
Design: observational evaluation using an instrument comprising four domains derived from the quality indicators: pedagogical principles, fidelity, student preparation and orientation, and debriefing.
Settings: two Australian universities and one UK university with undergraduate nursing programs; 17 simulation sessions were observed.
Participants and simulation methods:
- Undergraduate nursing students from the first, second and third years of the program
- Actors: professional standardised patients (mental health simulations)
- Actors: trained tutors using Mask-Ed (KRS Simulation)
- High-fidelity patient simulation manikins (Laerdal SimMan 3G)
- Tutors with various levels of immersion in the simulation sessions

8 Clinical Scenario-Based Situations
- Mental health assessment and communication with a patient with depression (3 sessions)
- Mental health behaviour management for a patient presenting with symptoms of mania
- Aged care assessment and management of pain and delirium (2 sessions)
- Surgical fluid status assessment and management: hypovolaemia or hypervolaemia (2 sessions)
- Communication and managing patient distress: basic skills
- Elderly falls risk and pressure area assessment: basic skills
- Stoma care and uridome application: basic skills
- Evolving sepsis (6 sessions)

The Observational Evaluation Instrument The instrument was designed using student-specific measures derived from the quality indicators. The observation form consisted of 27 questions measuring the extent to which each simulation activity achieved the following elements:
- Pedagogical principles (3 items)
- Student preparation and orientation (6 items)
- Fidelity (clinical authenticity, relevance and fidelity) (5 items)
- Debriefing (7 items)
- Session details: clinical scenario, teaching strategies, simulation modality, location of simulation, number of students, and student roles (6 items)
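To make the scoring concrete, a minimal sketch of how per-domain scores from observation forms like this might be aggregated is shown below. The item names, the two example forms, and the 0-2 scoring scale (not observed / partially observed / fully observed) are assumptions for illustration, not details taken from the study instrument.

```python
# Hypothetical aggregation of domain scores from completed observation forms.
# Item names, example data, and the 0-2 per-item scale are assumed, not from
# the study instrument.
from statistics import mean

# One completed observation form per session: domain -> list of item scores
# (0 = not observed, 1 = partially observed, 2 = fully observed).
sessions = [
    {"pedagogy": [2, 2, 2], "preparation": [2, 1, 1, 2, 0, 2]},
    {"pedagogy": [2, 1, 2], "preparation": [1, 1, 2, 1, 1, 1]},
]

def domain_totals(forms, domain):
    """Total score per session for one domain of the observation schedule."""
    return [sum(form[domain]) for form in forms]

pedagogy_scores = domain_totals(sessions, "pedagogy")
print(mean(pedagogy_scores))  # overall mean domain score across sessions
print(len(sessions[0]["pedagogy"]) * 2)  # maximum possible domain score
```

Reporting each domain as "mean score out of a stated maximum", as the slides do, falls out directly from this kind of per-session total.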

Study Methods Students were invited to participate prior to the simulation sessions, and voluntary consent was sought from all participants. First-year sessions were introductory skills sessions; second- and third-year sessions were fully immersive. Observation schedules were completed during the simulation learning sessions by trained observers. Analyses included comparisons between programs and year levels.

Results (n = 143 students) Session details:
- Most sessions were experiential (13); 2 were demonstrations and 2 were facilitated sessions.
- Simulation modalities used were actors (4), Mask-Ed (5) and high-fidelity manikins (8).
- Most simulations were conducted in clinical laboratories (15); 2 were conducted in simulated learning environments.
- The number of students in each session ranged from 2 to 25 (mean = 8, median = 4). First-year sessions had significantly higher numbers of students (~24) (p = 0.02).
- Most students had an active role in the simulated learning session (15 sessions), some of whom also had an observer role (7 sessions); in 2 sessions students had an observational role only.

Pedagogical principles: Overall mean score 5.4 (max 6)
- Most sessions had both course and session objectives (76%); 24% had course objectives only, all at one site (second year).
- Most sessions were fully immersive (82%); 12% were partially immersive.
- Most scenarios addressed the stated learning objectives (88%); one site had some sessions that only partially achieved this (p = 0.007).

Student Preparation and Orientation: Overall mean score 8 (max 12)
- All sessions were preceded by either a comprehensive orientation (53%) or a brief orientation (47%) to the simulation session.
- For the manikin or equipment to be used, over 40% of sessions had a brief orientation, 30% had a comprehensive orientation, and 30% had none.
- The session structure was outlined briefly in 59% of sessions and in detail in the others.
- In most sessions, students were advised about the learning objectives briefly (40%) or in detail (47%); 12% received no advice about them.
- Provision of preparatory activities (lectures, tutorials, online learning and readings) varied significantly between sites: sites 1 and 2 provided extensive preparation compared with site 3, which provided limited preparation (p = 0.002).
- Teaching and practice of the skills needed prior to the simulation also varied significantly: sites 1 and 3 provided extensive skills preparation compared with site 2, which provided none (p = 0.006); however, these were basic skills sessions.
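The slides report between-site differences with p-values but do not state which statistical test was used. As one plausible illustration only, a chi-square test of independence on a site-by-category contingency table is sketched below; the session counts are invented for the example and are not the study's data.

```python
# Illustrative between-site comparison, in the spirit of the reported
# preparatory-activities difference (p = 0.002). The test choice is an
# assumption and the counts are invented, not taken from the study.
from scipy.stats import chi2_contingency

# Rows: sites 1-3; columns: sessions with extensive vs limited preparation.
table = [
    [5, 1],  # site 1 (hypothetical counts)
    [4, 1],  # site 2 (hypothetical counts)
    [0, 6],  # site 3 (hypothetical counts)
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, dof = {dof}")
```

With counts this unbalanced the test flags a significant site effect; with small expected cell counts, Fisher's exact test would be another reasonable choice.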

Fidelity: Overall mean score 6 (max 9)
- The scenario and the manikin or standardised patient were clinically realistic in most sessions (59%) and somewhat realistic in the others (40%).
- The simulation modality used supported the learning objectives in 16 sessions.
- Most session environments (16) provided clinically realistic equipment; at site 3 the equipment was only somewhat realistic (p = 0.024).
- Medical charts and records were frequently not provided (70% of sessions), particularly at site 3 (100%).

Debriefing: Overall mean score 6.5 (max 9)
- Debriefing was conducted within 30 minutes of every session, and 16 sessions included debriefing about non-technical skills.
- Students were encouraged to reflect on and self-evaluate their practice during debriefing after 59% of sessions.
- 76% of debriefing sessions included feedback to students about their strengths, 71% included feedback about their weaknesses, and 59% included both.
- Only 14% of debriefing sessions included support for students who were disappointed with their performance during the simulation. Site 1 differed significantly: support was provided during some of its sessions (p = 0.015).

Discussion Pedagogical principles scored higher than all other observed components of the simulation sessions. Preparation and orientation and fidelity scored lowest, suggesting scope for improving these components. Debriefing was conducted within 30 minutes of all sessions and usually included debriefing about non-technical skills. There were few differences between sites; however, some observed indicators were of concern:
- 30% of sessions provided no equipment orientation.
- 12% of sessions provided no advice about learning objectives.
- 70% of sessions provided no medical charts or records.
- 40% of debriefing sessions did not include reflection and self-evaluation.
- There was scope to improve feedback about students' strengths and weaknesses.
- 86% of debriefing sessions did not include support for disappointed students.

Conclusion The observation schedule effectively measured observable quality indicators for a range of simulation sessions at all study sites, confirming the utility of these quality indicators. The evaluation data provided valuable information about the quality of the simulation sessions, identifying components that were done well and components that could be improved. Additional testing of this evaluation instrument in other programs is desirable.

Thank you
Contact details: Dr Ashley Kable, Ashley.kable@newcastle.edu.au
Publications:
- Kable A, Arthur C, Levett-Jones T, Reid-Searl K. (2013) Student evaluation of simulation in undergraduate nursing programs in Australia using quality indicators. Nursing and Health Sciences, 15(2), 235-243.
- Arthur C, Levett-Jones T, Kable A. Quality indicators for the design and implementation of simulation experiences: A Delphi study. Nurse Education Today. doi:10.1016/j.nedt.2012.07.012
Further information about the quality indicators:
http://ogma.newcastle.edu.au:8080/vital/access/manager/Repository/uon:12883
http://www.newcastle.edu.au/Resources/Projects/Nursing%20and%20Midwifery%20Projects/Clinical%20Reasoning/Quality-Indicators-for-the-Design-and-Implementation-of-Simulation-Experiences.pdf
Further information about Mask-Ed™ (KRS Simulation): http://www.cqu.edu.au/masked