Evaluating simulations


Evaluating simulations. Susan Prion, EdD, RN, CNE. Professor, School of Nursing and Health Professions, University of San Francisco, San Francisco, CA, USA.

“’Tis better to have assessed incompletely than to never have evaluated at all.” More is better: multiple methods, multiple perspectives, multiple times.

Purpose of evaluation: assess learner/participant outcome achievement; quality improvement; evidence for additional resources.

Evaluation, assessment & debriefing. Evaluation: making a judgment about performance. Assessment: giving feedback for the purpose of improvement. Debriefing: a purposeful, facilitated discussion and analysis of an experience by its participants.

Evaluation or assessment? In simulation learning experiences, assessment and evaluation are frequently used interchangeably.

Evaluation or assessment? Learners/participants need to be clear about whether the purpose of the simulation is constructive feedback (assessment) or a test (evaluation) of their performance.

Measurement in nursing education and research. Valid and reliable measurement tools are essential components of nursing research. Poor or inadequate measurement tools limit the scope, potential, and quality of research (DeVellis, 2003). “Excellence in nursing education requires evidence-based curricula, teaching approaches, and evaluation methods” (Oermann, 2009, p. 2).
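
One practical way to act on the DeVellis point is to check an instrument's internal consistency. Below is a minimal sketch of Cronbach's alpha (a topic that also appears in the statistics list near the end of this deck) in Python; the function and the scores are illustrative, not from the presentation.

```python
# Sketch: Cronbach's alpha for a (respondents x items) matrix of
# Likert-scale ratings. Illustrative data only.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha: (k/(k-1)) * (1 - sum(item variances)/total variance)."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Example: 5 students rating 4 items on a 1-5 scale.
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 3, 3, 2],
    [4, 4, 5, 4],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")  # ~0.94 here; higher = more consistent
```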

What to assess/evaluate?

What is the goal of the simulation? What are the learner/participant learning outcomes? What are the critical learning points for the simulation? Is the purpose practice, feedback or testing?

Perspectives to consider: participant, facilitator.

Possible outcomes of simulation: knowledge, skills, self-confidence, satisfaction, critical thinking.

Other outcomes of simulation? Teamwork, delegation, communication, integration of theory and practice, time management, other?

HOW to assess/evaluate?

Direct information vs. indirect information; program outcomes vs. student/participant learning outcomes; formative vs. summative.

Direct: observation or demonstration of performance. Indirect: self-report or another’s report about performance.

Program outcomes: did the program meet its stated goals? Student/participant learning outcomes: did the students/participants accomplish or achieve the expected learning?

Formative: feedback is collected during the learning experience to make ongoing improvements. Summative: feedback is collected at the conclusion of the learning experience to make improvements for the next time.

LEARNING. Self-report or instructor report; pre-/post-test; tests in theory class; academic paper; demonstration (“tell me why…”); concept map; observation of change in behavior; journal or blog; checklist of critical knowledge elements; subsequent clinical simulation performance.
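
To make the pre-/post-test option above concrete, here is a hedged sketch of a paired t-test on the same learners' scores before and after a simulation. The numbers are invented for illustration, and scipy is assumed to be available.

```python
# Sketch: paired (dependent) t-test for pre/post knowledge scores.
from scipy import stats

pre  = [62, 55, 70, 58, 64, 60, 67, 53]   # knowledge test before the simulation
post = [74, 61, 78, 65, 70, 72, 75, 60]   # same students after the simulation

t, p = stats.ttest_rel(post, pre)          # paired: same students measured twice
print(f"t = {t:.2f}, p = {p:.4f}")         # a small p suggests a real pre-post gain
```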

Sample theory questions to evaluate student learning.

Sample clinical instructor report of increased student learning.

SKILLS. Skills checklist; repeat demonstration; self-assessment of increased skill proficiency; instructor/preceptor observation of increased skill proficiency.
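
Critical-elements checklists like the sample referenced below are often scored on an all-or-nothing basis. A minimal sketch of that logic, with hypothetical element names that are not from the presentation:

```python
# Sketch: all-or-nothing scoring of a critical-elements skills checklist.
CRITICAL_ELEMENTS = [
    "Performs hand hygiene",
    "Identifies patient with two identifiers",
    "Maintains sterile field",
    "Verifies medication order",
]

def skill_passed(observed: dict[str, bool]) -> bool:
    """Pass only if every critical element was demonstrated."""
    return all(observed.get(element, False) for element in CRITICAL_ELEMENTS)

observed = {"Performs hand hygiene": True,
            "Identifies patient with two identifiers": True,
            "Maintains sterile field": False,   # one missed element -> fail
            "Verifies medication order": True}
print("PASS" if skill_passed(observed) else "FAIL")  # prints FAIL
```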

Sample critical elements checklist.

Sample self-report of skill proficiency.

Rubrics for clinical evaluation* *Isaacson, J.J. & Stacy, A.S. (2009). Rubrics for clinical evaluation: Objectifying the subjective experience. Nurse Education in Practice, 9(2), 134-140.

Creighton Simulation Evaluation Instrument*: assessment, communication, critical thinking, technical skills. *Todd, M., Manz, J.A., Hawkins, K.S., Parsons, M.E. & Hercinger, M. (2008). The development of a quantitative evaluation tool for simulations in nursing education. International Journal of Nursing Education Scholarship, 5(1), article 41.

STUDENT SATISFACTION Qualitative versus quantitative? Social desirability of responses? Relationship of satisfaction to learning? Relationship of satisfaction to events?

Sample student satisfaction open-ended questions:
1. What was the best or most useful part of the simulation? Why?
2. What was the least useful part of the simulation? Why?
3. What parts of the simulation experience would you change? Why?
4. Overall, how satisfied are you with the simulation as a learning experience?

Sample student satisfaction items.

STUDENT SELF-CONFIDENCE. Qualitative versus quantitative? Social desirability of responses? Relationship of confidence to learning? Relationship of satisfaction to confidence to learning? Relationship of confidence to simulation events?

Sample student self-confidence items.

Sample student self-confidence items: Do you feel more confident about XXX after participating in the clinical simulation? Why or why not? What was most influential in changing your self-confidence about performing Y procedure? What activities could have helped increase your confidence about working with a patient with XXX?

CRITICAL THINKING. Qualitative versus quantitative? Novice characteristics? Relationship of learning, skills, confidence, and satisfaction to critical thinking? Measurement issues for critical thinking. Critical thinking or clinical judgment?

“…the ways in which the nurse comes to understand the problems, issues, or concerns of clients/patients, to attend to salient information and to respond in concerned and involved ways.” Benner, P., Tanner, C.A. & Chesla, C.A. (1996). Expertise in nursing practice: Caring, clinical judgment and ethics. New York: Springer.

Tanner Clinical Judgment Model: noticing, interpreting, responding, reflecting. Tanner, C.A. (2006). Thinking like a nurse: A research-based model of clinical judgment in nursing. Journal of Nursing Education, 45(6), 204-211.

CRITICAL THINKING. Self-report; clinical instructor/preceptor report; demonstration; observation; written case study analysis with questions; repeat simulation demonstration.

Lasater Clinical Judgment Rubric. Lasater, K. (2007). High-fidelity simulation and the development of clinical judgment: Students’ experiences. Journal of Nursing Education, 46(6), 269-276. Lasater, K. (2007). Clinical judgment development: Using simulation to create an assessment rubric. Journal of Nursing Education, 46(11), 496-503.

Prion, S.K., Gilbert, G.E., Adamson, K.A., Kardong-Edgren, S.E. & Quint, S. (2016, accepted for publication). Development and testing of the Quint Leveled Clinical Competency Tool (QLCCT). Clinical Simulation in Nursing.

QLCCT: 10 competencies rated at 4 proficiency levels.
Competencies: focused observation & information seeking; recognizing deviations from expected patterns; prioritizing data; making sense of data; response demeanor; clear communication; effective interventions; nursing skills; nursing evaluation; self-reflection & improvement.
Proficiency levels: 4 = Graduating nurse; 3 = Advancing; 2 = Progressing; 1 = Novice.
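
As a concrete illustration of the tool's structure, here is a sketch that records one rating per QLCCT competency and maps the mean back to the proficiency labels above. Averaging the ratings is an assumption made for illustration, not the published scoring procedure.

```python
# Sketch: recording QLCCT-style ratings (10 competencies, 1-4 scale).
# NOTE: averaging to an overall level is an illustrative assumption.
COMPETENCIES = [
    "Focused observation & information seeking",
    "Recognizing deviations from expected patterns",
    "Prioritizing data",
    "Making sense of data",
    "Response demeanor",
    "Clear communication",
    "Effective interventions",
    "Nursing skills",
    "Nursing evaluation",
    "Self-reflection & improvement",
]
LEVELS = {4: "Graduating nurse", 3: "Advancing", 2: "Progressing", 1: "Novice"}

def overall_level(ratings: list[int]) -> str:
    """Mean competency rating, labeled with the nearest proficiency level."""
    assert len(ratings) == len(COMPETENCIES)
    mean = sum(ratings) / len(ratings)
    return f"{mean:.1f} ({LEVELS[round(mean)]})"

print(overall_level([3, 2, 3, 2, 3, 3, 2, 4, 3, 2]))  # "2.7 (Advancing)"
```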

WHEN to assess/evaluate?

Before the experience, for a baseline. During, for formative improvements. Immediately after, for summative changes. Later, for application, transfer, and retention.

HOW to assess/evaluate?

Hard-earned advice… Don’t do it alone. Don’t develop your own tool unless absolutely necessary. Consult an expert at least three times during the process: beginning, middle, and end. Use a statistician from the very beginning. Links, not leaps!

Resources

International Nursing Association for Clinical Simulation and Learning (INACSL) http://www.inacsl.org/

Clinical Simulation in Nursing professional journal http://www.nursingsimulation.org/

Prion, S.K. (2008). A practical framework for assessing the impact of clinical simulation experiences in the nursing curriculum. Clinical Simulation in Nursing Education, 4(5). Prion, S.K. (2016, under review). Updating the framework for assessing the impact of clinical simulation experiences in the nursing curriculum. Clinical Simulation in Nursing.

Society for Simulation in Healthcare http://www.ssih.org/public/

Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare

Simulation Innovation Resource Center (SIRC) http://sirc.nln.org/

Prion, S.K. (2008, 2015). Evaluation involved in using simulations. Available at http://www.sirc4nln.org/dev/mod/resource/view.php?id=38. New York: National League for Nursing.

Kardong-Edgren, S., Adamson, K.A. & Fitzgerald, D. (2010). A review of currently published evaluation instruments for human patient simulation. Clinical Simulation in Nursing, 6, e25-e35. Adamson, K.A., Kardong-Edgren, S. & Willhaus, J. (2013). An updated review of published simulation evaluation instruments. Clinical Simulation in Nursing, 9, e393-e400.

Statistics and research-methods topics: simple linear regression; multiple regression; latent variable analysis; Pearson product-moment correlation coefficient; MANOVA; ANOVA; independent t-tests; the normal curve; measures of variability; significance; Cronbach’s alpha; central tendency; dependent t-tests; frequencies; effect size; statistical power; the differences between parametric and non-parametric analysis; chi-square; post-hoc power analysis; logistic regression; choosing the right correlation coefficient; generalizability theory; ANCOVA; repeated-measures studies; multiple comparisons for ANOVA; transformations to achieve normality; real-world research; helpful resources; ethics and participant recruitment; rigor in qualitative research; validity; reliability; the need for rigor in simulation research; operationalizing research variables; formative and summative evaluation during simulation; feedback, assessment and evaluation; quantitative data analysis software; qualitative data analysis software; the role of qualitative research in simulation practice; developing a data analysis plan for your research study; program evaluation; cost-effective simulation research; developing strong research questions; testing validity using Cain’s model; quality improvement research and the SQUIRE Guidelines; odds ratio; Lawshe’s content validity index; is my data normal?; parametric and nonparametric data analysis; nonparametric measures of association.
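
As one worked example from the topic list above, here is a short sketch of Cohen's d, the classic effect-size measure, for two independent groups. The group scores are invented for illustration.

```python
# Sketch: Cohen's d effect size with a pooled standard deviation.
import numpy as np

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """(mean difference) / (pooled standard deviation)."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2)
    return (a.mean() - b.mean()) / np.sqrt(pooled_var)

sim_group  = np.array([78, 82, 75, 88, 80, 85])  # taught with simulation
trad_group = np.array([70, 74, 68, 80, 72, 77])  # traditional instruction
print(f"d = {cohens_d(sim_group, trad_group):.2f}")  # by convention, d >= 0.8 is "large"
```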

Obstacles to evaluation?

Time; resources; creativity; valid and reliable instruments; clear student learning outcomes; opportunity for follow-up; recognition of the importance of the information.

“’Tis better to have assessed incompletely than to never have evaluated at all.” Multiple methods, multiple perspectives, multiple times.

Questions? Comments? Great evaluation ideas? prions@usfca.edu