International Journal of Healthcare Simulation
A virtual debriefing professional development module for nurse educators

DOI: 10.54531/wqlc6185, Pages: 1-3
Article Type: Short Reports on Simulation Innovations Supplement (SRSIS)
Luctkar-Flude, Silva, Killam, Bruneau, Ziegler, and Tyerman: A virtual debriefing professional development module for nurse educators

Introduction

Virtual simulation is an innovative knowledge translation strategy that can be as effective as in-person simulation, while being more accessible and cost-efficient, particularly in situations where in-person interaction is difficult (e.g. during online or remote courses) [1]. According to the International Nursing Association for Clinical Simulation and Learning (INACSL), there are four main phases in the design and implementation of simulation-based education: presimulation activities, the simulation experience, debriefing and evaluation [2]. However, robust debriefing following virtual simulations is often neglected, which can negatively impact student learning [3].

Debriefing is essential for learning and includes guided reflection to provide feedback, identify existing gaps in knowledge or competencies, and discuss how to improve future performance [4]. For effective implementation, the facilitator needs adequate knowledge and skills related to debriefing frameworks [4]. We use the term virtual debriefing to refer to synchronous online debriefing. Although a virtual debriefing is similar to an in-person debriefing, the facilitator must also manage technical challenges related to engaging learners on an online platform. Virtual debriefing of virtual simulations poses additional challenges: instructors may not observe learner performance directly and must therefore support learners' self-assessment of performance and identification of learning needs. Additionally, nurse educators in our networks reported gaps in experience or training in virtual debriefing and/or debriefing virtual simulations.

Innovation

To support educators in using virtual simulations to augment or replace clinical experiences, the Canadian Alliance of Nurse Educators using Simulation (CAN-Sim) developed an online module to address identified professional development needs. The online debriefing module consists of two components: (1) a Facilitator Guide to Virtual Debriefing (https://can-sim.ca/); and (2) a Synchronous Virtual Debriefing Virtual Simulation Game (VSG) (https://can-sim.ca/). The debriefing guide is a 2-hour interactive presentation that focuses on Debriefing for Meaningful Learning [5]. The guide includes a video demonstration of a virtual debriefing using the Zoom videoconferencing platform. The VSG provides the opportunity for participants to apply the knowledge learned from the debriefing guide as they progress through a series of decision points. This study aims to evaluate the efficacy of the virtual debriefing professional development module in improving nurse educators' competency.
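The game content itself is hosted on the CAN-Sim platform; purely as an illustration of the decision-point pattern described above, a minimal sketch in Python might look as follows (all prompts, choices and names are hypothetical placeholders, not the actual game content).

```python
# Illustrative sketch only: a minimal branching decision-point structure of the
# kind used in virtual simulation games. All content here is a hypothetical
# placeholder, not the actual CAN-Sim VSG.
from dataclasses import dataclass, field


@dataclass
class DecisionPoint:
    prompt: str                                   # situation shown to the learner
    options: dict = field(default_factory=dict)   # choice -> key of the next decision point
    feedback: dict = field(default_factory=dict)  # choice -> formative feedback


game = {
    "opening": DecisionPoint(
        prompt="The virtual simulation has ended. How do you open the debriefing?",
        options={"Set ground rules and confidentiality": "reactions",
                 "Critique performance immediately": "reactions"},
        feedback={"Set ground rules and confidentiality": "Supports psychological safety.",
                  "Critique performance immediately": "Risks undermining psychological safety."},
    ),
    "reactions": DecisionPoint(
        prompt="Learners are quiet on camera. What do you do next?",
        options={"Invite initial reactions with an open-ended question": "end",
                 "Summarize the scenario for them": "end"},
        feedback={"Invite initial reactions with an open-ended question": "Encourages reflection.",
                  "Summarize the scenario for them": "Closes down learner reflection too early."},
    ),
    "end": DecisionPoint(prompt="Debriefing complete."),
}

# A learner's choice advances the game to the next decision point and surfaces
# the corresponding formative feedback.
current = game["opening"]
choice = "Set ground rules and confidentiality"
print(current.feedback[choice])
current = game[current.options[choice]]
```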

Evaluation

A pretest–post-test survey design was used to evaluate the effect of the debriefing module. To measure self-perceived competence, the professional development for virtual debriefing learning outcomes assessment rubric was administered at three time points: (1) baseline pretest; (2) post-test 1, after the facilitator guide; and (3) post-test 2, after the synchronous virtual debriefing VSG. Participants were nurse educators who were faculty members, clinical instructors and/or lab instructors at a college in Texas, USA. Ethical approval was obtained from the Austin Community College Institutional Research Review Committee and the Queen’s University Health Sciences & Affiliated Teaching Hospitals Research Ethics Board. Survey data were analysed in SPSS Version 25.0 using descriptive statistics (frequencies, means and standard deviations), repeated measures ANOVA and post hoc analysis.
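For readers who prefer open-source tools, the same analysis can be approximated outside SPSS; the following is a minimal sketch assuming long-format survey data and the Python pingouin library (the file and column names are hypothetical, and the study's actual analysis was performed in SPSS 25).

```python
# Illustrative sketch only: repeated-measures ANOVA with Greenhouse-Geisser
# correction, approximating the SPSS analysis described above. The file and
# column names below are hypothetical placeholders.
import pandas as pd
import pingouin as pg

# Long-format data: one row per participant per time point, with the rubric
# total score (in practice, one column per competency as well).
scores = pd.read_csv("debriefing_rubric_scores.csv")  # columns: participant, time, total_score

# Keep complete cases only, since a repeated-measures ANOVA needs all three
# time points for each participant.
complete = scores.groupby("participant").filter(lambda g: g["time"].nunique() == 3)

# Descriptive statistics (means and standard deviations) by time point.
print(complete.groupby("time")["total_score"].agg(["mean", "std"]))

# correction=True applies the Greenhouse-Geisser adjustment to the degrees of
# freedom; the output table includes partial eta-squared (np2) as the effect size.
aov = pg.rm_anova(data=complete, dv="total_score", within="time",
                  subject="participant", correction=True, detailed=True)
print(aov)
```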

Outcomes

In the development of our e-learning module, we emphasized the creation of interactive and engaging content to enhance the learning experience and potential impact. By measuring the outcomes and impact of this intervention, we sought to assess the effectiveness and practical relevance of the module.

The total number of participants varied across measurement time points: pretest (n = 34); post-test 1 (n = 27); post-test 2 (n = 30). The mean pretest knowledge of participants for each competency is displayed in Table 1. Using a repeated measures ANOVA with a Greenhouse–Geisser correction (see Table 2), the mean total learning outcome scores differed significantly across time points [F(1.665, 34.972) = 22.259, p < .001]. In addition, we report the partial eta-squared (ηp2), which reflects the effect size, with a value greater than 0.14 indicating a large effect. Similarly, the differences between the pretest and the two post-test scores for each competency were also statistically significant, with large effect sizes.
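For readers less familiar with this effect-size measure, partial eta-squared can be recovered directly from the reported F statistic and its (Greenhouse–Geisser corrected) degrees of freedom; as a check on the total-score result:

\[
\eta_p^2 = \frac{F \cdot df_1}{F \cdot df_1 + df_2} = \frac{22.259 \times 1.665}{22.259 \times 1.665 + 34.972} \approx .51,
\]

which is consistent with the value of .515 reported for the total score in Table 2.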

Table 1:
Mean learning outcome scores at pretest, post-test 1 and post-test 2

Measure | Pretest, mean (SD) | Post-test 1, mean (SD) | Post-test 2, mean (SD)
1. Applying debriefing | 3.22 (1.45) | 3.83 (0.98) | 4.43 (0.66)
2. Comparing facilitation methods | 2.64 (1.29) | 3.73 (1.03) | 4.45 (0.74)
3. Establishing a climate of respect and confidentiality | 3.52 (1.34) | 4.30 (0.88) | 4.70 (0.47)
4. Facilitating engagement in reflection | 3.39 (1.27) | 3.96 (0.88) | 4.48 (0.67)
5. Providing formative feedback | 3.30 (1.11) | 3.83 (0.98) | 4.35 (0.78)
Total score | 15.77 (5.42) | 19.45 (4.07) | 22.32 (2.92)

Note: Scores for outcomes 1 through 5 were measured out of five.

Table 2:
Repeated measures ANOVA for learning outcome scores

Measure | Pretest to post-test 1: mean difference (std. error), sig. | Post-test 1 to post-test 2: mean difference (std. error), sig. | F | p | ηp2
Competency 1 | −.609 (.241), .058 | −.609 (.151), .002 | 14.63 | <0.001 | .399
Competency 2 | −1.091 (.236), <.001 | −.727 (.210), .007 | 29.97 | <0.001 | .588
Competency 3 | −.783 (.235), .009 | −.391 (.186), .141 | 12.51 | <0.001 | .363
Competency 4 | −.565 (.216), .048 | −.522 (.207), .059 | 10.78 | <0.001 | .329
Competency 5 | −.522 (.176), .022 | −.522 (.165), .013 | 14.14 | <0.001 | .391
Total score | −3.682 (.977), .003 | −2.864 (.774), .004 | 22.56 | <0.001 | .515

Competency 1 – Apply debriefing in a synchronous virtual debriefing session to support learners in gaining a deeper understanding of their actions and thought processes.

Competency 2 – Compare various methods of facilitating a virtual debriefing for nursing students to identify the debriefing method that best aligns with the learning experience.

Competency 3 – Establish a climate of respect and confidentiality related to the content of the debriefing discussion to support psychological safety.

Competency 4 – Facilitate participants’ engagement in the reflective process of the virtual simulated experience to assist with knowledge translation and application.

Competency 5 – Provide formative feedback following a virtual experience to ensure participants have met the learning objectives.

Post hoc analysis of the learning outcomes assessment rubric total scores demonstrated a significant effect from pretest to post-test 1 (p = .003), reflecting the impact of the facilitator guide, and a significant effect from post-test 1 to post-test 2 (p = .004), reflecting the impact of the VSG on nurse educators’ perceived competency related to virtual debriefing. Pairwise comparisons for the individual competencies were significant for four of the five competencies from pretest to post-test 1 (following the facilitator guide) and for three of the five from post-test 1 to post-test 2 (following the VSG).
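As an illustrative companion to the earlier analysis sketch, the post hoc pairwise comparisons can be approximated with Bonferroni-adjusted paired t-tests on the same hypothetical long-format data (SPSS's pairwise adjustment method may differ from this sketch).

```python
# Illustrative sketch only: Bonferroni-adjusted pairwise comparisons between
# time points, mirroring the post hoc analysis reported above. File and column
# names are hypothetical placeholders, as in the earlier sketch.
from itertools import combinations

import pandas as pd
from scipy import stats

scores = pd.read_csv("debriefing_rubric_scores.csv")  # columns: participant, time, total_score
complete = scores.groupby("participant").filter(lambda g: g["time"].nunique() == 3)
wide = complete.pivot(index="participant", columns="time", values="total_score")

pairs = list(combinations(["pretest", "posttest1", "posttest2"], 2))
for a, b in pairs:
    res = stats.ttest_rel(wide[a], wide[b])
    p_adj = min(res.pvalue * len(pairs), 1.0)  # Bonferroni adjustment
    print(f"{a} vs {b}: mean diff = {(wide[a] - wide[b]).mean():.3f}, "
          f"t = {res.statistic:.2f}, adjusted p = {p_adj:.3f}")
```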

What is next?

Our findings demonstrate that an online professional development module that includes a virtual simulation game can significantly improve nurse educators’ self-perceived competency related to facilitating virtual debriefing. Pre-packaged professional development modules are a cost-effective way to provide flexible, easily accessible education; they also allow greater scalability, as they can be used by large numbers of nurse educators globally and help overcome geographical barriers. Modules may also be used as an introduction to, or a refresher on, debriefing in a virtual environment. Collecting data from a diverse group of nurse educators strengthened the potential validity, reliability and generalizability of our findings and supports the module’s potential utility for other end users.

Further modules could be developed and evaluated that focus on different debriefing frameworks for virtual debriefing, providing nurse educators with debriefing options that best align with their context and learner needs. Lastly, a multi-site investigation is needed to further assess the effectiveness of this module; a larger evaluation study is therefore currently being conducted by our research team.

Declarations

Authors’ contributions

MLF: conceptualization and design of study, major contributor to manuscript. AS: major contributor to manuscript. LK: major contributor to manuscript. JB: contributor to manuscript. EZ: conceptualization and design of study, contributor to manuscript. JT: conceptualization and design of study, contributor to manuscript.

Funding

None declared.

Availability of data and materials

None declared.

Ethics approval and consent to participate

Ethics approval was received from the Austin Community College IRB Committee and the Queen’s University Health Sciences and Affiliated Teaching Hospitals Research Ethics Board.

Competing interests

None declared.

References

1. Harder N. Simulation as a knowledge translation strategy. Clinical Simulation in Nursing. 2023;76:A1–A2.

2. Tyerman J, Luctkar-Flude M, Graham L, Coffey S, Olsen-Lynch E. A systematic review of health care presimulation preparation and briefing effectiveness. Clinical Simulation in Nursing. 2019;27:12–25.

3. Dreifuerst KT, Bradley CS, Johnson BK. Using debriefing for meaningful learning with screen-based simulation. Nurse Educator. 2021;46(4):239–244.

4. Decker S, Alinier G, Crawford SB, Gordon RM, Jenkins D, Wilson C. Healthcare simulation standards of best practice™: the debriefing process. Clinical Simulation in Nursing. 2021;58:27–32.

5. Dreifuerst KT. Getting started with debriefing for meaningful learning. Clinical Simulation in Nursing. 2015;11(5):268–275.