Improved self-efficacy in human factors skills in early-stage psychiatric trainees following online simulation: a quantitative comparison study with in-person training

Article Type: Original Research
Abstract

Background

Simulation-based education (SBE) supports early-stage psychiatric doctors to bridge educational and clinical practice while encouraging reflective practice. Research comparing the efficacy of in-person and online mental health SBE is limited. In a large-scale comparison study, we assessed for significant differences in course evaluation measures between in-person and online participants attending an SBE course for early-stage psychiatric doctors.

Methods

A full-day in-person course was adapted for online delivery over a half-day. It focused on developing confidence and clinical skills relevant to early-stage psychiatric doctors. In-person (n = 228) and online (n = 90) participants were early-stage psychiatric doctors based in South London mental health trusts. Pre- and post-course quantitative data, collected using the Human Factors Skills for Healthcare Instrument (HuFSHI) and Course-Specific Questions (CSQ) measures, were compared across the two delivery formats: data from previous in-person deliveries were compared with newly collected online delivery data.

Results

Paired-samples t-tests comparing pre- and post-course HuFSHI and CSQ scores indicated significant improvements across both measures for the two delivery formats. Large and very large effect sizes were observed for HuFSHI and CSQ scores, respectively, in both delivery formats. Participants reported greater benefits from in-person delivery on the CSQ measure and from online delivery on the HuFSHI measure. Independent-samples t-tests assessing for differences between online and in-person delivery formats indicated no significant differences in HuFSHI or CSQ scores favouring either format.

Discussion

The data suggest online mental health SBE potentially represents an effective adjunct or alternative to in-person delivery. Further research is required to better understand these differences to support learners, educators, and commissioners.


What this study adds

  • Self-report data suggest that online mental health simulation-based education (SBE) potentially represents an effective adjunct or alternative to conventional in-person delivery.
  • Effective online delivery may hold promising implications for increasing accessibility of mental health simulation training, which has traditionally been offered in-person with the associated cost and physical infrastructure requirements.
  • To our knowledge, this is the first such comparison of SBE delivery formats to be made on this scale.
  • This research will support others in the field to investigate, develop and deploy effective online simulation-based training.

Introduction

Simulation-based education (SBE) supports early-stage psychiatric doctors to bridge the gap between educational and clinical practice. It enables this through exposure to a variety of clinical presentations and a safe space to hone communication and de-escalation techniques, whilst also encouraging reflective practice [1,2]. Achieving key learning objectives during the initial stages of psychiatric training presents inherent challenges. Opportunities for learning can be limited to patient crises, or to interactions with patients with severe mental illness that occur without close supervision by senior colleagues (such as out-of-hours). Moreover, these patients may be made anxious by early-stage trainee doctors [3].

COVID-19 required simulation faculties to provide online alternatives to in-person training. Online SBE offers psychiatric trainees the opportunity to continue to access high-fidelity experiential learning irrespective of location or shielding status. In this paper, we define high-fidelity SBE as the development and deployment of immersive scenarios that exclusively employ professional actors and offer a high degree of physical, environmental and psychological realism for our participants. Beyond the pandemic context, online SBE ostensibly also presents several further advantages in terms of equity of access for trainees in remote areas or smaller training schemes, in addition to potentially reduced costs for providers.

Across medical sub-specialties, however, the evidence base comparing the efficacy of online and in-person SBE is heterogeneous. Comparisons of the two modalities with medical student participants have shown higher knowledge and self-confidence scores for in-person SBE; however, both in-person and online SBE were emphasized as effective methods of course delivery [4–6]. The effectiveness of both modalities has been demonstrated in nursing training, and both have been deemed suitable and effective options for delivering simulation to off-site participants when considering social distancing, and for assessment, including virtual objective structured clinical examinations [7–9].

With respect to psychiatric training, there are limited large-scale research studies comparing the efficacy of online and in-person SBE. Despite the many recent innovations in online learning, educators have become increasingly aware of some potential negative aspects of online delivery. A prominent example highlighted in the recent literature is the inherently fatiguing effect of prolonged videoconferencing (VC) [10,11]. As universities, training bodies and healthcare organizations seek to develop their simulation capacity, further research comparing and evaluating online delivery is necessary to determine its strengths and most appropriate applications, and to identify and understand the nature of any consistent limitations. This evidence will be vital in optimizing course design. It will also support commissioners and training bodies to integrate online SBE into their training appropriately, such that learners are afforded high-quality and educationally effective experiential learning.

During the pandemic, our organization pivoted fully to large-scale online delivery of mental health SBE, defined as simulation training delivered entirely via a VC platform to a participant group remote from one another and from the simulation team. As such, we are well-positioned to evaluate and elucidate its strengths and limitations, a task that has become a pressing need for learners, educators and commissioners. By comparing human factors skills self-efficacy in addition to course-specific measures, we aimed to assess for significant differences across a large sample of early-stage psychiatric doctors attending in-person and online versions of an SBE course.

Methods

Course development

Based in South London, our organization has several years’ experience of training early-stage psychiatric doctors through its regular delivery of Practicing Psychiatric Competencies 1.0 (PPC 1.0), an in-person simulation course focused on developing confidence and skills in psychiatric history-taking, mental state examination, and risk assessment and formulation, meeting the relevant learning outcomes of the Royal College of Psychiatrists [12].

Scenarios for PPC 1.0 were originally developed by our faculty team, comprising nurses, psychologists and psychiatrists with experience across a range of sub-specialties within mental healthcare. Our simulation faculty are trained in simulation scenario development, and design courses in collaboration with professional actors. In scenario development, we relied on Kolb’s experiential learning theory as the conceptual framework [13]. This theory allowed our faculty to develop scenarios that maximize participants’ learning through the simulation experience. We worked with professional actors from a specialist medical role-play agency, trained to portray patients in mental health simulations. Our SBE practice relies primarily on a debrief focused on communication processes. The multidisciplinary faculty undergo dedicated debrief training and peer debrief reflective practice sessions. In addition to comprehensive experience in the delivery of high-fidelity SBE, our team has extensive experience in its evaluation [14–17]. The existing full-day PPC 1.0 in-person course was adapted for online delivery over a half-day.

SBE application

Regardless of delivery format, the PPC 1.0 course followed the same structure and principles. Within each course, the simulation scenarios were preceded by an introductory session. This encompassed ice-breaker sessions to engage participants, information on key concepts related to SBE such as psychological safety, and operational and scheduling plans for the session.

The scenarios introduced participants to common mental health conditions, with a focus on the de-escalation of acutely agitated patients in different clinical contexts (see Table 1). Each scenario was followed by a structured debrief using a modified version of Pendleton’s model [18]. The simulation training day finished with a summary of the learning, and an opportunity for feedback and reflection on the course.

Table 1:
Scenario summaries with selected learning objectives
Summary | Selected learning objectives to guide debrief
1 Setting: Home visit (in-person version) or remote consultation (online).
Context: A 36-year-old junior doctor has been referred by her GP. A health visitor doing a recent 6-week check on her new baby encouraged her to seek help for low mood. She is showing signs of postnatal depression.
Task: The participant must begin to take a relevant history.
• To demonstrate effective skills in managing difficult dynamics with relatives in consultations.
• To discuss the differences between risk factors for harm to self in the perinatal period, versus the non-perinatal period.
• To analyse how certain personal attributes, e.g. health professional as a patient, can impact the dynamics of a consultation.
2 Setting: Inpatient psychiatric unit.
Context: A 29-year-old woman was admitted overnight following a crisis. She is now requesting to leave.
Task: The participant must attempt to come to an understanding of her presentation and risk and begin to collaborate with her on a management plan.
• To outline and define a framework for risk assessment with personality disorders.
• To review and demonstrate skills in managing a personality disorder, including the maintenance of both boundaries and empathy.
• To evaluate the impact of stigma in mental health.
3 Setting: Inpatient psychiatric unit.
Context: A 32-year-old man has recently returned from unescorted leave in an agitated state.
Task: The participant must assess his mental state and demonstrate appropriate verbal de-escalation skills and situational awareness.
• To review protocols and demonstrate both technical and non-technical skills in the assessment and management of violence and aggression in an inpatient psychiatric setting, including the management of personal safety.
• To demonstrate the ability to work effectively with colleagues, including team working.
• To recognize and identify the rapid tranquillization protocol.

Both delivery modalities featured two trained facilitators, one of whom was a higher psychiatric trainee. At least one simulation technician assisted in all deliveries regardless of format, providing operational support, moderation (in the online format) and general technical support. Different faculty members led this training throughout the year, in line with the organization’s yearly faculty rotation schedule.

Delivery method

In-person

The in-person course was delivered at our simulation centre in South London. The original full-day in-person course comprised 12 high-fidelity live 10-minute simulation scenarios. The active participants entered an immersive environment. Live footage of this scene was broadcast to a debrief room containing the non-active participants and simulation faculty.

Online

Zoom was our VC platform of choice [19]. Each online session was preceded by a half-hour platform orientation session delivered by our technicians, who supported the delivery throughout. Actors, participants and faculty were fully remote and relied on built-in webcams and microphones to participate. Participants accessed the course primarily via personal or work computers; a minority used tablets. The half-day online course comprised seven simulation scenarios (five live and two video-based) of 10 minutes each. Non-active participants observed the scenarios with their cameras off, to limit distraction for the active participant.

The PPC 1.0 course content was the same in both delivery formats, except for the number of scenarios. In both delivery formats, each scenario was followed by a 15-minute structured debrief that encouraged participants to explore consultation dynamics, with a strong emphasis on communication and human factors skills.

Participants

The maximum number of participants for each PPC 1.0 course delivery was 12, in both in-person and online delivery. In-person (n = 228) and online (n = 90) participants comprised early-stage psychiatric doctors (at either core psychiatric trainee or GP trainee level) based in mental health trusts in South London (i.e. South London and Maudsley NHS Foundation Trust, and South West London and St George’s Mental Health NHS Trust). In terms of recruitment, this course was part of participants’ mandatory induction for psychiatric placements. In total, 318 junior doctors attended the PPC 1.0 course: in-person between September 2018 and December 2019, and online between February 2020 and April 2021.

Data collection

Ethical approval for this project was provided by the Psychiatry, Nursing and Midwifery Research Ethics Subcommittee at King’s College London on behalf of the Health Research Authority (ref. PNM1314173).

Quantitative pre- and post-course self-report data were collected from participant groups in both delivery formats (i.e. in-person and online) using the Human Factors Skills for Healthcare Instrument (HuFSHI) [20] and course-specific questions (CSQs).

The HuFSHI is a validated 12-item tool for assessing interprofessional learning across healthcare practice settings. It was developed amongst a group of healthcare professionals working in both acute and community care settings and attending acute care and mental health simulation training courses in London. It has been demonstrated to be a reliable and valid method of assessing trainees’ human factors skills self-efficacy across acute and mental health settings with good internal consistency and sensitivity to change. We deemed this scale to be applicable to our training and participant group given the overlap in its development and delivery contexts.

CSQs were developed internally by the simulation faculty to assess the course-specific learning objectives. The measure comprised 14 items designed to assess participants’ confidence, skills and knowledge of the course content on a 10-point scale ranging from totally disagree to totally agree (see Table 2 for examples; a hypothetical scoring sketch follows the table).

Table 2:
Examples of course-specific questions
No. Item
1. I am able to assess a patient’s risk of harm to themselves and others.
2. I am able to manage risk in my surroundings when working with patients with mental illness.
3. I can collaborate effectively with multidisciplinary colleagues to support people experiencing mental illness.
4. I can demonstrate effective communication with patients with severe psychiatric presentations.
5. I feel confident in de-escalating a situation that might involve violence and aggression.
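
The scoring logic implied by this description is simple: 14 items rated on a 10-point scale and summed give a possible total of 14–140, which is consistent with the CSQ means reported in the Results. The sketch below is a hypothetical reconstruction only; the paper does not specify the exact scoring procedure, and the function name is our own.

```python
# Hypothetical sketch: the paper does not detail CSQ scoring. This assumes each
# of the 14 items is rated 1-10 and that items are summed, giving totals of
# 14-140 (consistent with the CSQ means reported in the Results).
from typing import List

def csq_total(item_ratings: List[int]) -> int:
    """Sum the 14 course-specific question ratings into a total score."""
    assert len(item_ratings) == 14, "the CSQ comprises 14 items"
    assert all(1 <= r <= 10 for r in item_ratings), "items use a 10-point scale"
    return sum(item_ratings)

# Example: a participant rating every item 7 would score 98 out of 140.
print(csq_total([7] * 14))  # -> 98
```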

Previously collected in-person delivery data, alongside newly collected online delivery data, were used to compare the two course formats. Quantitative measures were completed immediately before and after both course formats. In-person data were collected using tablets distributed by simulation technicians, and online data were collected via an online survey platform.

Completed response rates were greater than 90% for both modalities, except for the CSQ in in-person deliveries, for which the rate was relatively low. This may partly have been due to data loss and difficulties in completing this measure using the tablets distributed during in-person training.

Statistical analysis

Statistical analyses were conducted using SPSS [21]. The pre- and post-course data were screened for participant outliers using Mahalanobis distance [22]; with four degrees of freedom, this equated to a chi-square value of 1.83 (p = 0.127), and cases (n = 3) with a distance score exceeding this value were excluded. The final sample size for further statistical analysis was 315 participants. Levene’s test confirmed that there was no violation of the assumption of homogeneity of variance for HuFSHI pre-course scores, F(1, 307) = 0.66, p = 0.417, HuFSHI post-course scores, F(1, 304) = 0.62, p = 0.430, CSQ pre-course scores, F(1, 139) = 1.68, p = 0.197, and CSQ post-course scores, F(1, 145) = 1.36, p = 0.245.
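
As a hypothetical illustration of this screening step (the study used SPSS; this sketch uses Python with NumPy/SciPy on placeholder data, and the chi-square cut-off shown is a common convention rather than the exact criterion reported above):

```python
# Hypothetical re-creation of the outlier screening in Python (the study used SPSS).
# Squared Mahalanobis distances are computed across four score variables
# (pre/post HuFSHI and pre/post CSQ), flagged against a chi-square criterion,
# and Levene's test then checks homogeneity of variance between groups.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
scores = rng.normal(90, 14, size=(315, 4))       # placeholder data: 315 cases x 4 measures

diffs = scores - scores.mean(axis=0)
inv_cov = np.linalg.inv(np.cov(scores, rowvar=False))
d_squared = np.einsum("ij,jk,ik->i", diffs, inv_cov, diffs)  # squared Mahalanobis distance

# Squared Mahalanobis distances are referred to a chi-square distribution with
# df = number of variables (here 4); the p < .001 cut-off below is an assumption.
criterion = stats.chi2.ppf(0.999, df=4)
keep = d_squared < criterion
screened = scores[keep]

# Levene's test for homogeneity of variance between two groups (e.g. delivery formats).
group = rng.integers(0, 2, size=int(keep.sum()))  # placeholder group labels
f_stat, p_val = stats.levene(screened[group == 0, 0], screened[group == 1, 0])
print(f"Levene's test: F = {f_stat:.2f}, p = {p_val:.3f}")
```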

Paired-samples t-tests were conducted to assess for significant changes in HuFSHI and CSQ scores from pre- to post-course within each course delivery format. Independent-samples t-tests were conducted to assess for significant differences between online and in-person delivery formats in HuFSHI and CSQ scores.
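
A minimal sketch of these two tests, assuming Python with SciPy and placeholder data (the group sizes below are chosen only to echo the degrees of freedom reported in the Results):

```python
# Sketch of the reported analyses using SciPy (the study itself used SPSS).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pre = rng.normal(78, 13, size=219)               # placeholder in-person HuFSHI pre-course
post = pre + rng.normal(11, 13, size=219)        # placeholder post-course scores, same people

# Paired-samples t-test: within-participant change from pre- to post-course.
t_paired, p_paired = stats.ttest_rel(post, pre)

# Independent-samples t-test: compare pre-to-post change scores between formats.
change_in_person = post - pre
change_online = rng.normal(12.6, 12.6, size=81)  # placeholder online change scores
t_ind, p_ind = stats.ttest_ind(change_online, change_in_person)

print(f"paired: t({len(pre) - 1}) = {t_paired:.2f}, p = {p_paired:.3g}")
print(f"independent: t({len(pre) + len(change_online) - 2}) = {t_ind:.2f}, p = {p_ind:.3g}")
```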

Results

Paired-samples t-tests

In-person delivery

For the HuFSHI scores, there was a significant improvement between the pre-course scores (M = 78.33, SD = 13.18) and post-course scores (M = 89.30, SD = 14.55), t(218) = 12.44, p < 0.001. Likewise, for CSQ scores, a significant improvement was observed between the pre-course scores (M = 91.62, SD = 15.45) and post-course scores (M = 119.96, SD = 14.46), t(54) = 16.63, p < 0.001 for in-person delivery.

Online delivery

For the HuFSHI scores, there was a significant improvement between the pre-course scores (M = 76.72, SD = 12.70) and post-course scores (M = 89.40, SD = 13.48), t(80) = 9.08, p < 0.001. Likewise, for CSQ scores, a significant improvement was observed between the pre-course scores (M = 90.12, SD = 14.22) and post-course scores (M = 115.71, SD = 17.12), t(84) = 16.28, p < 0.001 for online delivery.

Effect sizes

A large effect size was observed for the HuFSHI scores in both online (d = 0.96) and in-person delivery (d = 0.79). A very large effect size was observed for the CSQ scores in both online (d = 1.62) and in-person delivery (d = 1.90).
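
The paper does not state the exact effect-size formula used; however, classical Cohen’s d computed from the reported pre- and post-course means, with the root mean square of the two SDs as the denominator, reproduces the reported values to within rounding. A sketch:

```python
# Cohen's d from the reported summary statistics. The exact formula used in the
# study is not stated; this common variant reproduces the published values to
# within rounding of the reported means and SDs.
from math import sqrt

def cohens_d(m_pre: float, sd_pre: float, m_post: float, sd_post: float) -> float:
    pooled_sd = sqrt((sd_pre**2 + sd_post**2) / 2)  # root mean square of the two SDs
    return (m_post - m_pre) / pooled_sd

print(round(cohens_d(76.72, 12.70, 89.40, 13.48), 2))   # HuFSHI, online: 0.97 (reported 0.96)
print(round(cohens_d(78.33, 13.18, 89.30, 14.55), 2))   # HuFSHI, in-person: 0.79 (reported 0.79)
print(round(cohens_d(90.12, 14.22, 115.71, 17.12), 2))  # CSQ, online: 1.63 (reported 1.62)
print(round(cohens_d(91.62, 15.45, 119.96, 14.46), 2))  # CSQ, in-person: 1.89 (reported 1.90)
```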

Independent-samples t-tests

For the HuFSHI scores, there was no significant difference in score improvements between online and in-person delivery modalities, t(298) = 1.02, p = 0.851, 95% CI = [−1.59, 5.02]. However, mean improvement was slightly greater for online delivery (M = 12.58, SD = 12.57) than for in-person delivery (M = 10.96, SD = 13.05).

Similarly, for the CSQ scores, there was no significant difference in score improvements between online and in-person delivery modalities, t(138) = 1.15, p = 0.263, 95% CI = [−7.48, 1.96]. However, mean improvement was slightly greater for in-person delivery (M = 28.35, SD = 12.64) than for online delivery (M = 25.59, SD = 14.49).

Discussion

With respect to a comparison between in-person and online SBE in this context, what remains unclear is whether one is superior in relation to reported self-efficacy in human factors skills and meeting specific course learning objectives. Higher means and larger effect sizes were observed in favour of in-person delivery for the CSQ, and in favour of online delivery for the HuFSHI. Large and very large effect sizes, respectively, were observed for HuFSHI and CSQ scores in both delivery formats.

Given these results, our data suggest both participant groups benefitted from the training in terms of self-efficacy in human factors skills and meeting course-specific learning objectives. Interestingly, these data suggested the benefit was greater in terms of course-specific learning objectives for in-person participants, and greater in terms of self-efficacy in human factors skills for online participants. However, we found no evidence that one delivery modality was significantly better than the other.

Previous research supports our findings, with studies reporting high participant satisfaction rates after attending a remote simulation training programme [23], and significantly increased subject knowledge post-online simulation training [24]. A study comparing the outcomes of switching from in-person to online delivery of an educational course (combining lectures and simulation-based training) found similar results to ours, wherein there was no clear preference in favour of either in-person or online simulation training [25]. Rather, learners identified merits to both training modalities and showed a preference towards one or the other modality based on specific aspects of the course. Our finding of no significant differences in post-course scores between the two delivery modalities further strengthens support for the hypothesis that online delivery may represent a viable adjunct to in-person delivery.

It is noteworthy that whilst significant efforts were made in terms of technical orientation, course design and the use of breaks, the potential latent effects of VC fatigue may have impacted the participants’ overall comfort and ability to concentrate over the online sessions’ duration [26]. The inherent constraints of the VC platform limit interactivity, fluency and engagement [27], and in turn, may account for the higher means and larger effect sizes for in-person delivery in terms of the CSQ measure to some unknown degree. In view of the growing understanding in this area, there may be a need for a reliable measure of perceptions of VC fatigue for online training.

In terms of strengths and limitations for each delivery modality, one obvious difference, and indeed advantage, of online simulation is that it offers the observing participant a far richer sensory experience. During in-person deliveries, observers typically view the live scenario collectively on a television screen in a separate debrief room. By contrast, during online SBE, participants are far more immersed and benefit from improved visual and audio input, with enhanced views of facial expressions and body language, for example, through their individual devices. We remain cautious, however, not to extrapolate beyond our data in terms of what factors may be facilitating higher scores in the online format. Additionally, online SBE offers extremely high-fidelity learning opportunities with respect to virtual consultations.

An obvious limitation of online simulation is that debriefing can prove demanding: debriefers experience a high cognitive load online, with potentially adverse implications for performance and learner outcomes [28]. In-person delivery offers its own unique advantages in that it affords participants the opportunity to meet and interact without the inherent constraints of VC platforms. It takes place at a specialist centre and represents an ‘away-day’ for staff, removed from their usual working environment. As a result, participants spend the break periods together rather than alone, as with online delivery. The nature of in-person delivery also renders inter-participant engagement within the debriefing phase more fluent. We argue that this potentially enhances cohesion as a learner group and adds to psychological safety. However, in-person deliveries may limit access for those who are geographically remote or have disabilities, in a way that is easily overcome through the online modality.

Implications

The COVID-19 pandemic has fast-tracked digital health innovations and heralded the widespread adoption of virtual consultations. The National Health Service (NHS) Long Term Plan [29] includes the mainstream adoption of digitally enabled care, which online SBE supports by uniquely offering staff the ability to train in remote consultations with high fidelity. There are numerous positive implications for learners, such as increased access to training and greater flexibility. For example, healthcare professionals in rural regions would ordinarily have to take prolonged time away from clinical practice to attend training in a larger urban area. Another benefit of increased accessibility is the ability to meet more specialized learning objectives, e.g. perinatal mental health, by accessing expertise remotely. The UK postgraduate training system is internationally renowned, attracting doctors from all over the world. Through online SBE, there is scope for international medical graduates to gain experience and build confidence by familiarizing themselves with the NHS prior to arrival, which would support the recruitment and retention of healthcare professionals as set out in numerous government policy documents [29,30].

Strengths

This study benefited from the use of a human factors instrument validated for the context in which we worked, and from a uniquely large sample: the final sample included pre- and post-course data from 315 participants. To the authors’ knowledge, there are no comparable studies using such large samples. Data were collected systematically across both measures before and after each delivery and maintained on a database. Data were collected from the in-person and online cohorts over periods of 15 and 14 months, respectively, spanning several training intakes. As several different faculty members delivered this training, this relatively prolonged data collection window mitigated biases that might otherwise have arisen from the facilitation styles and patterns of engagement associated with individual staff members. This variation in facilitators over time therefore represents a key strength of this study, in addition to its considerable scale.

Additionally, our debriefers delivered weekly online simulation from July 2020 onwards. As such, they were able to quickly develop confidence and fluency in using this modality. In this new operational context, we continued to pay extremely close attention to the development, maintenance and protection of a high degree of group psychological safety [31]. We plan to undertake further research to better understand the various challenges faced by debriefers delivering online SBE.

Limitations

Both the HuFSHI and CSQ are self-report measures, which limits their objectivity. Increased scores signify increased human factors self-efficacy and subject-matter competence, respectively, by self-appraisal rather than objective assessment. It is also plausible that the Dunning–Kruger effect [32] was at play, biasing our relatively early-stage participant group towards overestimating their capabilities and giving themselves higher scores in the post-course survey. Additionally, the CSQs are not a validated measure, and individual items were specific to the course content. The low response rates for CSQs noted for in-person deliveries may have been due to data loss and difficulties in completing this measure using tablets distributed during the sessions. Lastly, this study relied solely on quantitative data; supplementing this with qualitative data could yield a richer understanding of the differences between the chosen modalities.

Future research recommendations

Further research is required to better understand the differences between these modalities. Specifically, it will be important to understand what was driving higher HuFSHI scores in online delivery and whether this effect persists across multiple studies. Whilst online SBE may offer scope for improved self-efficacy in human factors skills, there are likely to be limitations in terms of practical skills acquisition. The different development processes of group cohesion online represent another key area of future research for online SBE, and this has relevance for interactive experiential online learning more broadly. We also recommend that future studies incorporate qualitative data.

Conclusion

Our understanding of the educational differences between in-person and online mental health SBE remains at an early stage. Using a relatively large data set for the field, our self-report data suggest that online mental health SBE potentially represents an effective adjunct, or even alternative, to in-person delivery. For psychiatric training schemes covering large geographical areas – or where funding or resources are limited – online SBE may present an attractive option. To our knowledge, this is the first comparison of delivery formats for mental health SBE for psychiatric trainees to be made on so large a scale. Further evidence is needed, but it is anticipated that this will be an interesting area of educational innovation research given the wider shift to both hybrid higher education and, indeed, workplaces.

Declarations

Acknowledgements

None declared.

Authors’ contributions

DB and OPO’S (joint first authors) conceived the project and analysed the data, in addition to leading on manuscript drafting. These stages were supported by HI. OPO’S, NT, JP, AB and SP were involved in project design, delivery and data acquisition. HI was responsible for project design and final approval of the manuscript.

Funding

This research did not receive any specific grant from funding agencies in the public, commercial or not-for-profit sectors.

Availability of data and materials

None declared.

Ethics approval and consent to participate

Ethics approval: PNM1314173.

Competing interests

The authors have no conflicts of interest to declare.

References

1. McNaughton N, Ravitz P, Wadell A, Hodges B. Psychiatric education and simulation: a review of the literature. The Canadian Journal of Psychiatry. 2008 Feb;53(2):85–93.
2. Dave S. Simulation in psychiatric teaching. Advances in Psychiatric Treatment. 2012 Jul;18(4):292–298.
3. Boulay C, Medway C. The clinical skills resource: a review of current practice. Medical Education. 1999 Mar;33(3):185–191.
4. Ahmed R, Atkinson S, Gable B, Yee J, Gardner A. Coaching from the sidelines. Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare. 2016 Oct;11(5):334–339.
5. Brown D, Wong A, Ahmed R. Evaluation of simulation debriefing methods with interprofessional learning. Journal of Interprofessional Care. 2018 Jul 19;32(6):779–781.
6. Poland S, Frey J, Khobrani A, et al. Telepresent focused assessment with sonography for trauma examination training versus traditional training for medical students: a simulation-based pilot study. Journal of Ultrasound in Medicine. 2018 Feb 1;37(8):1985–1992.
7. Chipps J, Brysiewicz P, Mars M. A systematic review of the effectiveness of videoconference-based tele-education for medical and nursing education. Worldviews on Evidence-Based Nursing. 2012 Mar 12;9(2):78–87.
8. Cobbett S, Snelgrove-Clarke E. Virtual versus face-to-face clinical simulation in relation to student knowledge, anxiety, and self-confidence in maternal-newborn nursing: a randomized controlled trial. Nurse Education Today. 2016 Aug 9;45:179–184.
9. Arrogante O, López-Torre E, Carrión-García L, Polo A, Jiménez-Rodríguez D. High-fidelity virtual objective structured clinical examinations with standardized patients in nursing students: an innovative proposal during the COVID-19 pandemic. Healthcare. 2021 Mar 20;9(3):355.
10. Wiederhold B. Connecting through technology during the coronavirus disease 2019 pandemic: avoiding “Zoom fatigue”. Cyberpsychology, Behavior, and Social Networking. 2020 Jul 10;23(7):437–438.
11. Bailenson J. Nonverbal overload: a theoretical argument for the causes of Zoom fatigue. Technology, Mind, and Behavior. 2021 Feb 23;2(1).
12. Royal College of Psychiatrists. A competency based curriculum for specialist core training in psychiatry. London, UK: Royal College of Psychiatrists. 2017.
13. Kolb DA. Experiential learning: experience as the source of learning and development. Upper Saddle River, NJ: Prentice Hall. 1984.
14. Bansal D, Vega M, Attoe C, Cross S, Parish S. Discovering careers in mental health: a qualitative pilot study of a novel simulation-based education programme. International Journal of Healthcare Simulation. 2022 Dec;1(2):14–18.
15. Billon G, Attoe C, Marshall-Tate K, Riches S, Wheildon J, Cross S. Simulation training to support healthcare professionals to meet the health needs of people with intellectual disabilities. Advances in Mental Health and Intellectual Disabilities. 2016 Sep 5;10(5):284–292.
16. Kowalski C, Attoe C, Ekdawi I, Parry C, Phillips S, Cross S. Interprofessional simulation training to promote working with families and networks in mental health services. Academic Psychiatry. 2017 Nov 2;42(5):605–612.
17. Ortega Vega M, Williams L, Saunders A, Iannelli H, Cross S, Attoe C. Simulation training programme to improve the integrated response of teams in mental health crisis care. BMJ Simulation and Technology Enhanced Learning. 2020 Aug 21;7(2):116–118.
18. Pendleton D, Schofield T, Tate P, Havelock P. The new consultation: developing doctor-patient communication. Oxford: Oxford University Press. 2003.
19. O’Sullivan O, Virk K, Evans G, Iannelli H, Hodgman C, Billon G. PP12 Developing digital simulation: from design and testing to piloting remote delivery. BMJ Simulation and Technology Enhanced Learning. 2020 Nov;6(S1):A20.
20. Reedy G, Lavelle M, Simpson T, Anderson J. Development of the Human Factors Skills for Healthcare Instrument: a valid and reliable tool for assessing interprofessional learning across healthcare practice settings. BMJ Simulation and Technology Enhanced Learning. 2017 Oct 3;3(4):135–141.
21. IBM Corp. IBM SPSS Statistics for Windows, version 27.0. Armonk, NY: IBM Corp. 2020.
22. Rasmussen J. Evaluating outlier identification tests: Mahalanobis D squared and Comrey Dk. Multivariate Behavioral Research. 1988 Apr 1;23(2):189–202.
23. Vera M, Kattan E, Cerda T, et al. Implementation of distance-based simulation training programs for healthcare professionals. Simulation in Healthcare: The Journal of the Society for Simulation in Healthcare. 2021 Dec 1;16(6):401–406.
24. Kim S, Park C, O’Rourke J. Effectiveness of online simulation training: measuring faculty knowledge, perceptions, and intention to adopt. Nurse Education Today. 2017 Mar 6;51:102–107.
25. Boffelli A, Kalchschmidt M, Shtub A. Simulation-based training: from a traditional course to remote learning - the COVID-19 effect. Higher Education Studies. 2020 Nov 29;11(1):8.
26. Wiederhold B. Connecting through technology during the coronavirus disease 2019 pandemic: avoiding “Zoom fatigue”. Cyberpsychology, Behavior, and Social Networking. 2020 Jul 10;23(7):437–438.
27. Vandenberg S, Magnuson M. A comparison of student and faculty attitudes on the use of Zoom, a video conferencing platform: a mixed-methods study. Nurse Education in Practice. 2021 Jun 30;54:103138.
28. Fraser K, Meguerdichian M, Haws J, Grant V, Bajaj K, Cheng A. Cognitive Load Theory for debriefing simulations: implications for faculty development. Advances in Simulation. 2018 Dec 29;3(1).
29. NHS. The NHS Long Term Plan. 2019 [cited 2022 Mar 16]. Available from: https://www.longtermplan.nhs.uk/.
30. NHS Improvement. Interim NHS People Plan. 2019 [cited 2022 Mar 16]. Available from: https://www.longtermplan.nhs.uk/wp-content/uploads/2019/05/Interim-NHS-People-Plan_June2019.pdf.
31. Kolbe M, Eppich W, Rudolph J, et al. Managing psychological safety in debriefings: a dynamic balancing act. BMJ Simulation and Technology Enhanced Learning. 2019 Apr;6(3):164–171.
32. Dunning D. The Dunning–Kruger effect: on being ignorant of one’s own ignorance. In: Olson JM, Zanna MP, editors. Advances in experimental social psychology. Vol. 44. Cambridge, MA: Academic Press; 2011. p. 247–296.