Simulation educator onboarding and longitudinal professional development are challenges for most healthcare simulation programmes. The Simulation Educator Needs Assessment Tool (SENAT) was created to allow individuals to self-assess their knowledge and perceived competency in simulation-based education.
Messick’s unified validity framework guided the validation process. First, the tool underwent two rounds of content validity studies following Lawshe’s method. Participants in both rounds (N = 22) were experienced simulation educators who had achieved Certified Healthcare Simulation Educator – Advanced (CHSE-A) status. Second, internal structure validity (scale reliability) was assessed after 147 simulationists completed the SENAT.
The final SENAT contained 29 items, each with a satisfactory content validity ratio (>0.455). Two subscales showed good to excellent reliability: Self-assessment of Learning Needs (α = 0.90; excellent) and Competence with Simulation Modalities (α = 0.81; good).
The SENAT can improve simulation quality by providing a professional development roadmap for individuals and by supplying the data needed to shape mentoring conversations. Aggregate data from groups of educators can support planning for programmatic professional development.
Simulation literature, including standards of best practice, promotes the use of a needs assessment before developing a simulation-based experience (SBE). Specifically, the Healthcare Simulation Standards of Best Practice (HSSBP) for Simulation Design [1] states that a needs assessment is foundational to a well-designed SBE, outlining several required elements of this process. The HSSBP for Outcomes and Objectives [2] echoes that the needs assessment informs the development of learning objectives. The Association of Standardized Patient Educators (ASPE) refers to these two standards in its own Standards of Best Practice document [3]. These standards also promote the use of trained simulationists for the development, implementation and evaluation of SBE.
Recently, the HSSBP: Professional Development [4] was published to guide simulationists in addressing their training and educational development needs. This standard identifies a needs assessment as the initial step in simulation professional development. Additionally, the need for trained simulationists has been identified in the Accreditation Standards of the Society for Simulation in Healthcare (SSH) [5], by ASPE [3], the Association for Simulated Practice in Healthcare (ASPiH) [6] and the National Simulation Guidelines for Prelicensure Nursing Programs [7], and throughout the HSSBP [1]; there is therefore broad recognition of the need for standardized training of simulationists. The National Council of State Boards of Nursing (NCSBN) longitudinal simulation study published findings supportive of replacing traditional clinical hours with simulation experiences, with certain caveats, one being that faculty were formally trained in simulation pedagogy [8].
The missing link in the literature is a needs assessment for simulationists themselves: a way to determine baseline knowledge of, and perceived competency with, the various concepts of simulation pedagogy. Previous studies have focused on needs assessments for simulation curricula and training resources [9,10], but the researchers’ literature review found no needs assessment of simulationists. This study therefore set out to identify the knowledge and skills required of simulationists, as documented in the literature, and then to develop a needs assessment tool for initial and ongoing assessment of developmental needs.
Various resources were used to identify the areas of knowledge, skills and competencies important to include in this needs assessment. The SSH acknowledges simulation educators’ knowledge and skills through its certification programme. The Certified Healthcare Simulation Educator (CHSE) credential is designed to recognize the competency level achieved after approximately 2 years of experience as a simulation educator. The blueprint for the CHSE exam outlines the domains of importance for competency: professional values and capabilities, healthcare and simulation knowledge/principles, educational principles applied to simulation, and simulation resources and environments [11]. An advanced certification (CHSE-A) is awarded to those with sustained contributions to the art and science of simulation pedagogy [12].
Shortly after the release of findings from the NCSBN study [8], the NCSBN convened an expert panel to create national guidelines for simulation use in undergraduate nursing programmes. This document outlines the programmatic need for ‘qualified lead faculty and sim lab personnel’ and states that faculty must be prepared to lead simulation [7, p.40]. A faculty preparation checklist identifies that faculty should know how to create and facilitate the active learning environment and gives suggestions for professional development opportunities.
The Facilitator Competency Rubric [13] was created as an observational rubric to assess nursing facilitators’ competency in the areas of preparation, prebriefing, facilitation, debriefing and evaluation on a novice-to-expert scale. This rubric was populated through numerous interactive workshops to distinguish behaviours that were different between beginner, competent and expert simulationists.
The SSH Accreditation Program has required standards for teaching and education, with one standard specifically focused on qualified educators [14]. To demonstrate that simulation educators are qualified, the programme needs to describe the individuals and provide a biosketch for each that shows their simulation background, describe an evaluation and feedback process for the educators, and provide supporting documentation that professional development opportunities are provided. Further, the programme needs to show a process of orientation and development for educators, as well as how they are mentored. Various criteria from these documents, the literature and the researchers’ experience led to the development of the Simulation Educator Needs Assessment Tool (SENAT) items.
This tool was developed for those who implement simulation (simulationists) and who most often serve as the simulation facilitators, subsequently referred to as simulation educators. The goal was to create a tool usable by any simulation programme team member who facilitates learning. The term simulation educator is defined in the Healthcare Simulation Dictionary 2.0 as:
(1) Person who uses the modality of simulation to educate learners, utilizing evidence-based strategies. (2) Person who supports healthcare professionals who are learning to manage clinical situations and provide care that is safe, effective, efficient, timely, patient-centered, and equitable. May teach an individual learner or a group of learners practicing to work as a team. [15, p.16]
Based on the literature review, the tool was designed to contain three sections: Section 1 – three questions about the respondent’s simulation background; Section 2 – a set of 13 self-rating items related to concepts important for simulation educator competency; and Section 3 – two questions regarding other topics of interest and prioritization of identified needs. The researchers adopted four-point Likert-type anchors: ‘strongly agree’, ‘agree’, ‘disagree’ and ‘strongly disagree’. This study was approved by the University of Tennessee Health Science Center Institutional Review Board (IRB # 21-08215-XM).
The research team conducted two rounds of content validity study between October 2021 and February 2022. Because the participants had achieved CHSE-A status through sustained contributions to the art and science of simulation pedagogy, they were believed to have the most expertise to analyse the content of the survey items. Eighty-eight simulation educators holding the CHSE-A credential were invited by email to participate, and 22 completed each round of the content validity study. Although each round had 22 respondents, the panels were not identical. The demographic information of the participants for each round is presented in Table 1.
Table 1. Demographic information of the content validity participants

| Round | Demographic | Counts |
|---|---|---|
| Round 1 (N = 22) | Location | U.S. Northeast (5), U.S. South (13), U.S. Midwest (1), U.S. West (3) |
| | Employed simulation environment | Academic centre – multidisciplinary (10), Hospital-based centre (2), Medical school programme (1), Nursing programme (7), Other (2) |
| Round 2 (N = 22) | Location | U.S. Northeast (4), U.S. South (12), U.S. Midwest (1), International* (4), No response (1) |
| | Employed simulation environment | Academic centre – multidisciplinary (9), Hospital-based centre (3), Nursing programme (8), Other (2) |
* The ‘International’ category included ‘Turkey’ (1), ‘United Arab Emirates’ (1), ‘Pakistan’ (1) and ‘Singapore’ (1).
Messick’s Unified Validity Theory [16] guided the validation process. Messick [17] argues that all validity is construct validity, with five main sources of validity evidence: content, response process, internal structure, relations to other variables and consequences. This study focused primarily on evidence of content validity and internal structure validity.
The researchers utilized Lawshe’s quantitative method [18] to systematically examine the consensus of a panel of content experts on whether the content of each item/question is ‘essential’, ‘useful but not essential’ or ‘not necessary’. The content validity ratio (CVR) was then calculated for each item (see the calculation method in Lawshe’s paper) [18]. The researchers used a CVR cut-off of 0.455 to evaluate the experts’ level of agreement on the needs assessment questions [19].
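For readers unfamiliar with the method, Lawshe’s ratio for an item is

$$\mathrm{CVR} = \frac{n_e - N/2}{N/2},$$

where $n_e$ is the number of panellists rating the item ‘essential’ and $N$ is the panel size. As a worked check against this study’s cut-off: with $N = 22$, an item needs at least 16 ‘essential’ ratings, since $(16 - 11)/11 \approx 0.455$.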
After the first round, the research team calculated the CVRs and reviewed all results, paying special attention to results below the a priori cut-off of 0.455, as well as to expert comments. The research team then edited the items to enhance their quality. Details of the first-round SENAT CVRs can be found in Table 2. Following these edits, a second round was conducted and the same process was repeated. After two rounds of content validity evaluation by experts, every item achieved satisfactory content validity (CVR > 0.455) and the final SENAT contained a total of 29 items. Second-round CVR results are also located in Table 2.
Table 2. Content validity ratios (CVRs) for the SENAT items

Round 1

| Question content | Essential | Useful, but not essential | Not necessary | CVR |
|---|---|---|---|---|
Q1. Please provide your primary professional appointment or position: academic educator; administrator; director; emergency responder (pre-hospital); full-time clinical provider (hospital-based); researcher; other: (type in response) | 16 | 6 | 0 | 0.73 |
Q2. Have you previously implemented any simulation-based event or learning activity? No; once or twice; three to five times; more than five times | 19 | 3 | 0 | 0.86 |
Q3. Was simulation-based education utilized in your personal education training? Yes; no | 8 | 12 | 2 | 0.36 |
* Q4. I converse using simulation terminology as identified by the Healthcare Simulation Dictionary. | 15 | 6 | 1 | 0.68 |
Q5. I adhere to the Society for Simulation in Healthcare (SSH) Simulationist Code of Ethics. | 20 | 2 | 0 | 0.91 |
Q6. I implement the Healthcare Simulation Standards of Best Practice from the International Nursing Association for Clinical Simulation and Learning (INACSL). | 18 | 3 | 1 | 0.82 |
Q7. I implement Standardized/Simulated Patient methodology based on the Association of Standardized Patient Educators (ASPE) standards for best practice. | 15 | 6 | 1 | 0.68 |
Q8. When identifying the potential for simulation-based education, I begin with a needs assessment. | 20 | 2 | 0 | 0.91 |
Q9. I am able to design a simulation case or scenario. | 16 | 6 | 0 | 0.73 |
Q10. I conduct comprehensive simulation prebriefs. | 15 | 7 | 0 | 0.68 |
Q11. I debrief simulation or clinical activities using an evidence-based model. | 19 | 3 | 0 | 0.86 |
Q12. I use various debriefing frameworks based on the simulation design and learning objectives. | 17 | 5 | 0 | 0.77 |
Q13. I use reliable assessment tools that are valid with my learner population to assess knowledge, skills, attitudes and/or behaviour changes. | 18 | 4 | 0 | 0.82 |
Q14. I am able to implement manikin-based simulation (if applicable). | 17 | 5 | 0 | 0.77 |
Q15. I am able to implement task-trainer methodology (if applicable). | 17 | 5 | 0 | 0.77 |
Q16. I am able to implement virtual reality methodology (if applicable). | 17 | 5 | 0 | 0.77 |
Q17. Based on your survey responses, please identify the topics you would like to discuss with the Simulation Team/Simulation Mentor, as areas of development and self-improvement. Educational needs assessment; Simulation terminology; SSH Simulationist Code of Ethics; Healthcare Simulation Standards - INACSL; ASPE best practice standards for SP methodology; Prebrief design; Debriefing; Debriefing frameworks; Creating assessment rubrics; Manikin simulation; Task trainer/procedural simulation; Virtual reality simulation; Other | 18 | 3 | 1 | 0.82 |
Q18. From your selected topics in the previous question, please place the topics in order of priority for discussion with the Simulation Team/Mentor. This ordering of topics will allow customization of your professional development/mentoring plan. For example, if you choose ‘Debriefing’ and ‘Code of Ethics’ as your topics, please number the topics in your personal priority of discussion/mentoring, i.e. 1. Debriefing, 2. Code of Ethics. | 14 | 7 | 1 | 0.64 |
Round 2

| Question content | Essential | Useful, but not essential | Not necessary | CVR |
|---|---|---|---|---|
Q1. Please provide your primary role in simulation-based education (specify): | 19 | 2 | 1 | 0.73 |
Q2. How many times have you facilitated a simulation-based activity in the past year? (just enter the number) | 17 | 5 | 0 | 0.55 |
Q3. On a scale of 1–5: in general, how confident are you when you implement simulation-based activities? | 20 | 2 | 0 | 0.82 |
Q4. Have you participated as a learner in any type of simulation activity? | 9 | 11 | 2 | –0.18 |
Q5. What educational resources do you use to learn more about simulation? (select all that apply) books; journals; webinars; podcasts; workshops; courses; conferences; journal club; other (specify). | 19 | 3 | 0 | 0.73 |
** Q6. I use the Healthcare Simulation Dictionary as a reference to provide clear oral and written communication in my professional simulation practice. | 17 | 4 | 1 | 0.55 |
Q7. I adhere to the Society for Simulation in Healthcare (SSH) Simulationist Code of Ethics. | 20 | 2 | 0 | 0.82 |
Q8. I implement the Healthcare Simulation Standards of Best Practice from the International Nursing Association for Clinical Simulation and Learning (INACSL). | 19 | 2 | 1 | 0.73 |
Q9. I implement Standardized/Simulated Patient methodology based on the Association of Standardized Patient Educators (ASPE) Standards of Best Practice (SOBP). | 20 | 1 | 1 | 0.82 |
Q10. I implement the Association for Simulated Practice in Healthcare (ASPiH) Standards for Simulation-Based Education. | 13 | 8 | 1 | 0.18 |
Q11. When considering a new simulation-based educational activity, I begin with a needs assessment. | 20 | 1 | 1 | 0.82 |
Q12. I am able to write measurable learning objectives that are the appropriate level for my learners. | 19 | 2 | 1 | 0.73 |
Q13. I am able to align the simulation modality with the learning objectives. | 22 | 0 | 0 | 1.00 |
Q14. I can design a scenario or case that provides context for the simulation experience. | 21 | 0 | 1 | 0.91 |
Q15. I can develop the appropriate level of fidelity to support the learning objectives of the simulation experience. | 21 | 1 | 0 | 0.91 |
Q16. I facilitate the SBE according to the objectives, level of the learners’ experience and knowledge. | 20 | 2 | 0 | 0.82 |
Q17. I create and deliver simulation prebriefings. | 20 | 2 | 0 | 0.82 |
Q18. I debrief simulation activities using an evidence-based model, framework or theory. | 20 | 2 | 0 | 0.82 |
Q19. I use debriefing models, frameworks or theories based on the simulation design, learner cohort and learning objectives. | 18 | 4 | 0 | 0.64 |
Q20. I am able to differentiate between formative, summative and high-stakes assessment strategies. | 21 | 1 | 0 | 0.91 |
Q21. Before utilizing any evaluation tool in simulation-based education, I consider tool reliability and validity for my learners. | 21 | 1 | 0 | 0.91 |
Q22. In the following simulation modalities, please rate how strongly you agree or disagree with each of the following statements. 1. manikin-based simulation … 9. other | 20 | 2 | 0 | 0.82 |
Q23. Are there any other areas of simulation you would like to know about? (select all that apply) research; operations; accreditation; writing for publication; simulation or educational organizations (e.g. SSH); Other (specify). | 16 | 6 | 0 | 0.45 |
* Round 1: questions 4–16 share the same question stem: ‘Review each statement and provide to what degree you agree with each statement’. Response options for questions 4–16 use a 4-point Likert scale: ‘Strongly Agree’, ‘Agree’, ‘Disagree’ or ‘Strongly Disagree’.
** Round 2: questions 6–20 share the same question stem: ‘Review each statement and provide to what degree you agree with each statement’. Response options for questions 6–20 use the same 4-point Likert scale with an additional anchor, ‘Not Applicable’.
After the content validity study, the SENAT was disseminated to the healthcare simulation community via an online survey platform to assess reliability (as evidence of internal structure validity), using multiple recruitment methods: emails, simulation professional community listservs, hard copies at a simulation conference and social media platforms (Facebook and LinkedIn). From May to July 2022, 239 simulationists responded and 147 completed the SENAT (completion rate: 62%). Cronbach’s α, an index of internal consistency reliability [20], was calculated for the two SENAT subscales. The results showed good to excellent scale reliability: α = 0.90 for the ‘Self-assessment of learning needs’ subscale and α = 0.81 for the ‘Competence with simulation modalities’ subscale. Descriptive statistics on respondents’ primary role in simulation, the number of simulations they facilitated in the past year and the resources they used to learn more about simulation are shown in Table 3. Over 80% of respondents self-identified as ‘Director/Assistant Director’, ‘Educator’ or ‘Facilitator/Coordinator’ in simulation. Eighty-four per cent reported being ‘confident’ or ‘very confident’ when implementing simulation activities (16% reported ‘moderately confident’).
Table 3. Respondents’ primary role, number of simulations facilitated and learning resources (N = 147)

| Primary role | Director/Assistant Director | Educator | Coordinator/Facilitator | Faculty | Administrative | Dean/Assistant Dean | Researcher | Other¹ |
|---|---|---|---|---|---|---|---|---|
| N | 53 | 35 | 31 | 13 | 4 | 3 | 2 | 6 |

| # of simulations facilitated | 0 | 2–10 | 11–20 | 21–50 | 51–100 | 101–200 | >200 | Other² |
|---|---|---|---|---|---|---|---|---|
| N | 5 | 41 | 27 | 37 | 20 | 9 | 6 | 2 |

| Resources | Books | Journals | Webinars | Podcasts | Workshops | Courses | Conferences | Simulation colleagues | Other³ |
|---|---|---|---|---|---|---|---|---|---|
| N | 83 | 127 | 123 | 40 | 106 | 79 | 123 | 131 | 27 |
Notes: Full survey questions: Primary Role: ‘Please specify your primary role in simulation education’; # of Sim: ‘How many times have you facilitated a simulation activity in the past year’; Resources: ‘What resources do you use to learn more about simulation? Select all that apply’.
¹ ‘Other’ includes: ‘Simulation support’, ‘simulationist’, ‘simulation champion’, etc.
² ‘Other’ includes: ‘Many times’ and ‘daily’.
³ ‘Other’ includes: ‘Social media’, ‘Simulation Video Exemplars’, ‘Listservs’ and ‘Trial and error’ (of the 27 respondents who selected ‘Other’, 11 specified what other resources they used).
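The subscale reliabilities above can be reproduced with the standard Cronbach’s α formula. The following sketch is illustrative only: the `cronbach_alpha` function and the small demonstration matrix are hypothetical, not the study’s data.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                              # number of items in the subscale
    item_variances = scores.var(axis=0, ddof=1)      # sample variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical responses: 5 respondents x 4 Likert items scored 1-4
demo = np.array([
    [4, 4, 3, 4],
    [3, 3, 3, 2],
    [2, 2, 1, 2],
    [4, 3, 4, 4],
    [3, 2, 2, 3],
])
print(f"alpha = {cronbach_alpha(demo):.2f}")
```

By convention, values around 0.8 are read as good and around 0.9 as excellent, which is the interpretation applied to the two SENAT subscales above.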
The HSSBP: Professional Development states ‘Initial and ongoing professional development supports the simulationist across their career, allowing the simulationists to stay current with new knowledge, provide high-quality simulation experiences, and meet the educational needs of the learners’ [4, p.5]. The first criterion of this standard supports an educational needs assessment as the basis for an individualized professional development plan for the simulation educator. Establishing the professional development needs of simulation educators should be done in an intentional and collaborative manner.
Discussing the data collected from the SENAT can be the first step in professional development conversations. These discussions may occur between the educator and a mentor, among a group of simulation educators and a mentor/simulation expert, or as an individual self-reflection activity. The SENAT results can identify educational gaps and highlight resources to support the educator’s development. For example, these resources might enhance knowledge, such as exposure to the simulation best practice standards; enhance skills, such as assistance in choosing a template for scenario design; or even document self-reported competency deficits as evidence when seeking funding to attend a debriefing workshop. Ultimately, the SENAT data can be used to design individual or group simulation professional development curricula.
The SENAT can be utilized by anyone serving as a simulation educator, whether that person works in simulation daily within an organization or implements simulation more episodically, perhaps only once or twice a year. The tool was created to assess simulation educators’ individual needs for knowledge and skill improvement, regardless of their location or the size of their simulation programme, and to provide a way to identify educational needs and subsequently initiate a development plan. This can be done in myriad ways, depending on the individual’s situation and the organization’s resources. Simulation educator resources are available in textbooks, on organizational and vendor websites, and through conferences and webinars, to name a few; costs range from free to several thousand dollars.
Mentorship is central to the professional development of the simulation educator. The SENAT includes two open-ended questions. The first asks about previous simulation experiences, whether positive or negative; the responses can give mentors insight when creating professional development activities. For example, if a simulation educator previously experienced debriefings in which participants were made to feel inept, the mentor may choose to focus first on the importance of psychological safety before expanding debriefing knowledge. The second asks whether there are other areas of simulation to discuss, prompting the simulation educator to identify priorities and interests that can be incorporated into the professional development plan, thereby individualizing the plan and making its accomplishment more internally motivating.
Another criterion of the HSSBP: Professional Development calls for ‘re-evaluation of the professional development plan regularly using formative and summative methods’ [4, p.7]. Needs assessment should never be a one-time event but should continue across the trajectory of the simulation educator’s career. Using the SENAT longitudinally not only addresses the person’s developmental needs but also indirectly shows the effectiveness and impact of completed training and educational activities.
Additionally, the SENAT can provide data supportive of the SSH accreditation standards; it offers a way to conduct initial and longitudinal analysis of simulation educators’ knowledge and skills and their identified professional development needs. Implementing the tool speaks to SSH Accreditation Teaching and Education Standards 3.c and 3.d:
3.c. The Simulation Program has a process to assure ongoing development and competence of its simulation educators, annually at a minimum.
3.d. The Simulation Program has a process to assure orientation and development of those who participate in the delivery of educational activities but are not competent simulationists.
The SENAT provides a way to assess the simulation educators in a simulation programme both at onboarding and as an annual process. It can assist in determining the professional development content areas that an institutional cohort identifies, so that simulation professional development content and courses match the educational needs. Data collected annually from the SENAT can provide evidence of progression in simulation skills and knowledge, especially when coupled with assessment of the educator’s performance when implementing simulation.
While CHSE-A achievers served as our expert panel during the validation process, those educators may have experienced their own onboarding several years earlier and may not recall their developmental needs; the use of the literature to develop the survey may have mitigated this. The social media recruitment also created an opportunity for people outside the intended audience to respond and contribute to the data. Finally, a self-reported needs assessment tool is subjective to some extent and lacks an objective criterion for comparison. We view this tool as part of the self-discovery of simulation educational development needs and as instrumental to mentorship and career development.
The SENAT was created based on standards of best practice, accreditation criteria, the CHSE blueprint and NCSBN guidelines, and subsequently underwent validation by CHSE-As worldwide. The SENAT can now be used as a benchmark of the essential knowledge and skills of a simulation educator. It provides multiple options for newly hired educators as well as those experienced in simulation: it can be utilized for individual assessment and subsequent mentoring, or as a self-assessment roadmap for personal development. The SENAT also plays an important role in guiding simulation educators in meeting simulation accreditation requirements. It is a resource that benefits individual simulation educators as well as simulation programmes.
We would like to acknowledge the following individuals for their role as expert reviewers for the SENAT content validity process: M. Anderson, T. Andrighetti, S. Beroz, J. Carey, A. Cowperthwait, J. Craig, M. Elcin, S. Forneris, B. Hallmark, A. Herrington, A. Kleinheksel, S. Koh, Z. Kurji, S. Mascarenhas, J. McCarthy, A. Monachino, P. Nawathe, J. Perretta, T. Roberts, D. Schocken, R. Schondel, C. Shum, J. Victor, P. Watts, J. Wells, L. Wilson, P. Zaveri.
No funding support was received for this publication.
1.
2.
3.
4.
5.
6.
7.
8.
9.
10.
11.
12.
13.
14.
15.
16.
17.
18.
19.
20.