Emergency Preparedness Nursing Education: Learner and Faculty Perspectives

Abstract

Over the past decade, entry-to-practice emergency preparedness competencies have been identified as an essential component of nursing education. In this paper the author reports upon a small Canadian study exploring the perspectives of undergraduate learners and faculty members who participated in and/or facilitated an Emergency Preparedness Simulation (EPS) module during a primary health care praxis course. The central purpose of the study was to explore the related experiences of learners and faculty who participated in or facilitated the EPS module in the 2007/08 academic year, and their perspectives on the effectiveness of the simulation in preparing learners to respond to emergencies in the future. The EPS module included a seminar followed by a mass-casualty simulation experience comprising a “Teddy Bear” triage and an “Explosion” triage. The constructivist data analysis identified four related patterns for both the learner and faculty participants: Strengths (S), Objections (O), Suggestions (S), and Feelings (!) [SOS!]. Three themes were identified within each pattern: relevance, design, and engagement. In comparing the learner and faculty perspectives, there was clear congruence between the strengths identified, the objections identified, and the power of the feelings experienced by both learners and faculty who participated in the emergency preparedness scenarios. Learners and faculty offered different suggestions. Learners suggested more time for developing skills, particularly first aid for individual clients, and recommended that all students begin with the “Teddy Bear” triage. Faculty suggested re-thinking the “Explosion” triage simulation to emphasize community-based emergency preparedness and responsiveness. Such a re-focusing would support the integration of key primary health care principles and values, including equity, social justice, and the social determinants of health. 
Learners and faculty valued the EPS module and recommended it continue to be a learning component of the primary health care course.

Share and Cite:

Macdonald, G.J. (2015) Emergency Preparedness Nursing Education: Learner and Faculty Perspectives. Open Journal of Nursing, 5, 1012-1023. doi: 10.4236/ojn.2015.511108.

1. Introduction

The increasing number of emergencies in the community related to natural, man-made, and technological disasters has prompted new standards in emergency preparedness nursing education locally and globally [1] [2].

Since September 2007, all undergraduate nursing programs in Ontario, Canada have been directed by the College of Nurses of Ontario (CNO) to incorporate emergency preparedness within the undergraduate nursing curriculum [3]. The College of Nurses of Ontario sets the competencies for entry-level Registered Nurse practice in Ontario. By definition, a nursing competency identifies “the knowledge, skill, ability and judgment required for safe and ethical nursing practice” [3]. An entry-level Registered Nurse must be prepared to demonstrate knowledge of emergency preparedness and to participate in emergency preparedness and disaster planning [3]. The Canadian Association of Schools of Nursing (CASN) identifies applying public health sciences in nursing practice as an expected competency, including “health emergency preparedness and disaster response” [4].

Globally, many nursing education programs are incorporating aspects of emergency preparedness into their curricula and accreditation standards. The American Association of Colleges of Nursing (AACN) incorporates emergency preparedness in its competency standards. American accreditation requirements include “disaster and mass-casualty education” [5]. The AACN standards state that graduates are expected to be able to “prepare for and minimize health consequences of emergencies” [6].

The addition of emergency preparedness has created new challenges for faculty. Community nursing faculty members have, unexpectedly, become responsible for implementing effective educational opportunities that meet the new accreditation standards for emergency preparedness. In particular, they have needed to ensure that nursing curricula incorporate “…mass-casualty education” [5].

In this paper the author reports upon a small Canadian study which explores the perspectives of undergraduate nursing learners and faculty members who participated in and/or facilitated an emergency preparedness simulation (EPS) module.

2. Background, EPS Module, & Study Introduction

Senior year nursing students have developed health assessment skills, communication skills, and leadership skills that are valuable during an emergency situation whether it is a pandemic, bio-terrorism, violence, or natural disaster such as ice/wind storms. Nevertheless, most nursing students have little experience or preparation to respond to an emergency.

To meet the challenge of embedding emergency preparedness skills into the curriculum, faculty designed a pilot emergency preparedness simulation module for all senior year students in a second-entry undergraduate nursing program. All senior year students were expected to participate in this EPS module in their primary health care praxis course, with their seminar peers. There were approximately 150 students in the senior year of the BScN program. Half completed the EPS module in the fall term, and half in the winter term. There were five sections of the primary health care course in the fall term, and five in the winter term. Each section was facilitated by a faculty member and normally included fifteen students. All five seminar groups completed the EPS module in the same week. Half of the students in each course section started with the “Teddy Bear” triage, and half with the “Explosion” triage.

The learning objective for the EPS module was outcomes-based, stating that by the completion of the EPS module learners would be prepared to demonstrate the College of Nurses of Ontario expectations of new graduates in terms of emergency preparedness. As previously stated, this meant having the required knowledge and preparation needed to participate in emergency preparedness and disaster planning [3].

The first component of the EPS module was a three-hour Monday morning seminar. The seminar began by reflecting upon local and global disasters prompted by images on PowerPoint slides. This led to a rich discussion. The faculty then facilitated a discussion of the seminar readings that investigated the role of nurses in planning for and responding to emergencies and disasters. More specifically, the readings considered emergency preparedness [7], mass casualty [8], ability and willingness to report to duty [9], and debriefing [10]. Faculty highlighted ethical standards for emergency response during the seminar [11]. At the end of the seminar students were reminded of the time of their simulation experience later in the week, and advised that the simulation would relate to the seminar theme of preparing to participate in emergency preparedness and disaster planning.

The second component of the EPS module was participation in a mass-casualty simulation experience. Each section of the course was booked into a specific two-hour simulation lab time, on a clinical practice day. Students had to travel to the university prior to or after their clinical practice that day and were given credit for a half day of clinical practice. The simulation experience included a welcome and briefing prior to both the “Teddy Bear” mass-casualty triage and the “Explosion” mass-casualty triage. After each simulation exercise that the students completed, the faculty facilitated a thorough and thoughtful debriefing. Students completed evaluations of the simulation experience in the final ten minutes. The simulation experience was guided by the National League for Nursing theoretical framework for simulation design [12]. This theoretical framework was further developed by Jeffries & Rogers [13] and is now referred to as the NLN/Jeffries Simulation Framework [14]. The NLN/Jeffries Simulation Framework identifies effective educational practices, simulation design characteristics, and participant outcomes [14].

An internally funded, descriptive educational research study was designed to provide needed feedback from learners and faculty on the educational effectiveness of the 2007/2008 Emergency Preparedness Simulation (EPS) module. Ethics approval was obtained from the University of Toronto, Office of Research Ethics. Completion of the Survey Monkey questionnaire provided informed consent for the learner participants and ensured participant anonymity. Faculty participants signed a consent form prior to the start of their focus group.

2.1. Purpose

The central purpose of this study was to explore the related experiences of learners and faculty who participated in or facilitated an Emergency Preparedness Simulation (EPS) module in the 2007/08 academic year and their perspectives on the effectiveness of the simulation in preparing learners to respond to emergencies in the future. As well, the study identified learner perspectives on the simulation design, educational practices, and learner satisfaction and confidence. This study was a descriptive educational study, including the collection and analysis of quantitative and qualitative data from learner participants and qualitative data from faculty participants.

This study was anticipated to provide valuable insight into the related experiences and perspectives of the learners and the faculty members involved in the EPS module. It was anticipated that the study findings would guide planning for the EPS module in future years, enhance the educational experience of future BScN learners, and contribute to the safety of Canadians during emergencies.

2.2. Review of Literature

The focus of simulation research in nursing education in Canada, and internationally, has been primarily in undergraduate nursing education [15]-[17], with interest in simulation in the practice environment growing, particularly in relation to emergency preparedness [17] [18]. Emergency preparedness is a particular concern because of the increasing number of emergencies being faced in the community related to violence/terrorism [18], natural disasters [19], pandemic preparedness [20], and technological disasters [21].

Simulation research suggests that key learner outcomes of simulations relate to knowledge development, skill performance, learner satisfaction, critical thinking, and self-confidence [22] . Active learning, high fidelity simulations that closely mimic real practice, and immediate constructive feedback, are proposed to result in more positive student outcomes [22] . Faculty involvement in the simulation is encouraged with faculty taking on roles in the simulation [23] , guiding simulation debriefing [24] [25] , and taking responsibility to assess the ongoing impact of participating in the simulation on the students and faculty members, as participation may be anxiety provoking and/or exhausting [23] .

Prior to the research proposal development, no published studies were located that specifically described faculty perspectives related to the design and/or implementation of simulated learning, or the related experiences/perspectives of learners and faculty involved in an emergency preparedness simulation. Subsequently, Jose & Dufrene [25] completed an integrative review of disaster preparedness in undergraduate nursing education globally. They found that all disaster preparedness nursing education studies incorporated simulations within their design, typically engaging students actively in responding to a disaster scenario [25]. Faculty-facilitated debriefing was an integral component of these studies, with a focus on facilitating reflection upon the simulation experiences of the students [25]. Adams, Canclini, and Frable [21] argue that nursing programs must ensure that there are appropriate learning experiences that facilitate students’ learning about emergency preparedness nursing, including imagining how they will participate in future emergencies as Registered Nurses. Simulation-based learning (SBL) has been confirmed as an appropriate clinical learning approach in pre-licensure nursing programs, one that leads to effective learning and successful passing of the NCLEX-RN registration exam [26].

3. Study Methodology

3.1. Guiding Research Questions

The guiding research questions for the study were:

1) What is the nature of the related experiences of learners who participated in the Emergency Preparedness Simulation (EPS) module and the faculty who facilitated the EPS module?

2) What are the learner and faculty perspectives on the effectiveness of the EPS module as a learning approach to prepare learners to respond to emergencies in the future?

3) What are the learner perspectives on the simulation module design, educational practices, and learner satisfaction and self-confidence?

3.2. Participants

All senior students/learners in the BScN program at the researcher’s university who participated in an Emergency Preparedness Simulation module during their Primary Health Care course (N = 150), in the 2007/2008 academic year, were invited to participate. Learners were invited via an e-mail from the faculty registrar, as the faculty research protocol did not permit the principal investigator (PI) to contact students directly. Learners then had to initiate a direct e-mail to the PI and state that they would like to participate in the study. Following this, the learners’ contact data (e-mail) was uploaded into Survey Monkey by the research assistant (RA).

Nine learners volunteered to participate in the study, a disappointing response rate of 6%. Barriers to participation were thought to be the fact that data collection occurred during the final two weeks of the BScN program, when students were completing a clinical practicum off campus, and the requirement to read the registrar’s e-mail and take the initiative to respond directly to the PI.

All faculty (n = 6) who facilitated a simulation during an EPS module, in the 2007/08 academic year, were invited to participate in the study. They were invited to attend a two-hour focus group in July 2008. Three faculty participated in the study (n = 3), for a response rate of 50%. Two of these faculty also attended an optional focus group to review the preliminary data findings and thus provided face validity for the findings. Focus group tapes and notes were transcribed into a Word document, creating paper transcripts for qualitative analysis. A constructivist analysis was used to identify patterns in the qualitative data and then themes within each of the patterns.

3.3. Learner Methods

Learners (n = 9) in their final year of a second-entry BScN program completed survey questions using the secure, on-line survey tool, Survey Monkey. Survey questions for the learner participants included one demographic question which asked whether they completed the EPS in the fall or winter term of the 2007/08 academic year, two open-ended qualitative survey questions, and fixed-choice questions from three tested and reliable NLN/Laerdal instruments that have established their value in educational research; specifically, “… all three measures are both reliable and valid” [27]. The three instruments are: the Simulation Design Scale (SDS), the Educational Practices Questionnaire (EPQ), and the Student Satisfaction and Self-Confidence in Learning (SCLS) instrument [27]. Data collection was completed in June 2008. Learner participants were not invited to attend the optional focus group to review the preliminary data analysis, as they had graduated and ethical approval had not included contacting these alumni.

Survey Monkey was set to ensure all responses remained anonymous. Learner participants had a two-week window to complete the survey questions after access to the survey questions in Survey Monkey was opened. One reminder was sent to learner participants via Survey Monkey to highlight the deadline date. Once the deadline date was reached, the survey questions on Survey Monkey were removed. Data was stored initially on Survey Monkey, in a secure, password-protected account, accessible only to the research team. Data was then transferred into Word documents stored on the research team members’ secure, password-protected computers. Hard copies were stored in secure, locked cupboards.

3.4. Faculty Methods

Faculty were initially contacted by the PI via e-mail with an attached letter providing a description of the study and an invitation to participate in one focus group. Faculty were also advised that they would have the option to participate in a second focus group to respond to preliminary data analysis, but this was not required to be eligible to participate in the study. Faculty indicated their willingness to participate in the research study by responding to the PI via e-mail. The focus group was held in July 2008, on the university campus, at a time which was convenient for faculty participants.

The PI and the RA attended the focus group along with the faculty participants. It was anticipated that 3 - 6 faculty participants would attend the focus group; three attended (n = 3), two in person and one via telephone conference call. The two-hour focus group was moderated by the PI. Faculty participants were invited to respond to six open-ended questions. During the focus group the RA audio-recorded the discussion, using a tape recorder and a digital voice recorder, and recorded highlights in writing and on a computer. Following the focus group the recorded dialogue and notes were cleaned and transcribed into a Word document available for qualitative analysis. In May 2009 the faculty participants were invited to meet for an additional, optional, focus group discussion to validate the preliminary findings. Two faculty participants attended this optional focus group. Feedback from this additional focus group was recorded via notes taken by the PI and was included in the final data analysis to support faculty face validity for the findings.

4. Data Analysis

Overall, a constructivist analysis guided the data analysis. As Pickard and Dixon [28] confirm, the individual is central in a constructivist analysis, and the challenge for the researcher is to offer an analysis of the individual’s lived experience. The researcher seeks to explicate the meaning of the individual’s experience, which necessitates a holistic approach: one that attends to the broad context of the experience, includes an ecological analysis of the multiple realities experienced by the individual, attends to their complexities and interconnections, and thus arrives at the meaning of the individual’s experience [28]. It is essential to collect data that will provide the rich text needed for a trustworthy analysis [29] [30].

Lincoln [31] argues that constructivist inquiry is consistently relativist. Analysis in constructivist inquiry is responsive to the data, in contrast to much positivist, post-positivist, and critical theory research, which begins the analysis by choosing to examine the data through the lens of a predetermined theoretical model [31] [32]. The patterns that emerge from a constructivist analysis may demand a deeper critical analysis that acknowledges concerning contradictions and, if so, the analysis may include a balance of constructivist and deconstructivist inquiry [32]. What is clear is the importance of attending to the data to inform the analysis, and of attending to the multiple realities of the participants’ experiences. Constructivist researchers value a subjective perspective: researcher and subject form interdependent relationships, and the subjective experiences of the researcher are made explicit and valued [30]. The soundness of an interpretive, constructivist inquiry is judged by its trustworthiness, meaning the establishment of credibility through measures such as triangulation, peer debriefing, and, most importantly, member checks; transferability of the findings; dependability, established by the inquiry audit; and confirmability, meaning that the “…subjective knowledge of the researcher can be traced back to the raw data…” [30]. The PI and RA kept detailed field notes throughout the research process, including recording their own responses to the faculty focus group questions prior to interviewing the faculty participants, to ensure that they had processed their own lived/heuristic experiences. By expressing their own experience first, the researchers were prepared to be open to, and to listen to, the experiences of the faculty and learner participants.

Qualitative data for learners and faculty were analyzed separately and then results were compared as reported in this paper.

4.1. Learner Quantitative Data Analysis

Pickard & Dixon [28] suggest that tensions remain among qualitative researchers about collecting qualitative and quantitative data within one study, and thus mixing paradigms, leading to limitations, but they argue this may be needed to answer central questions in a research study. In this study, the researcher included three valid questionnaires, composed of fixed-response questions, for the learners, along with one demographic question and a number of open-ended questions. This would require integration during data analysis, but was considered the best data collection methodology to provide a response to the final research question: “What are the learner perspectives on the simulation module design, educational practices, and learner satisfaction and self-confidence?” As well, the results were to establish a baseline against which future research might be compared, and to create an opportunity to examine the relationship between the quantitative and qualitative responses. The overall constructivist analysis was therefore intended to include the statistical analysis of the quantitative questions.

Due to the low learner response rate (6%, n = 9), the quantitative data was not considered to be representative of the experience of learners in the class. Guided by the advice of a statistician, the raw data was reviewed using the following numeric values: strongly disagree (−2), disagree (−1), undecided (0), agree (1), strongly agree (2). A positive average indicates learner agreement with a simulation tool statement, and thus a strength of the simulation from the learners’ perspectives, while a negative average indicates disagreement. The demographic question asked learners to identify in which term they had completed the simulation, but due to the low response rate this data was not included in the analysis. The two open-ended qualitative survey questions were analyzed using a constructivist analysis and are reported under the qualitative findings.
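The recoding and averaging described above can be illustrated with a short script. This is a sketch for clarity only: the label-to-value mapping follows the scheme above, but the sample responses are invented, not study data.

```python
# Map the 5-point Likert labels to the numeric values used in the analysis.
LIKERT = {
    "strongly disagree": -2,
    "disagree": -1,
    "undecided": 0,
    "agree": 1,
    "strongly agree": 2,
}

def item_average(responses):
    """Average the recoded responses for one survey item.

    A positive average indicates agreement with the statement (a strength
    of the simulation); a negative average indicates disagreement.
    """
    scores = [LIKERT[r.lower()] for r in responses]
    return sum(scores) / len(scores)

# Hypothetical responses for a single item (illustrative only).
sample = ["agree", "strongly agree", "undecided", "disagree", "agree"]
print(item_average(sample))  # a positive average, i.e. overall agreement
```

With only nine respondents per item, such averages describe this group of participants rather than supporting inference to the full class, which is consistent with the cautions stated above.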

Each of the three instruments used a 5-point scale to rate student participants’ attitudes or beliefs: 1-strongly disagree, 2-disagree, 3-undecided, 4-agree, and 5-strongly agree [27]. The Simulation Design Scale (SDS) (20 items) uses the 5-point scale to assess the quality of simulations relating to objectives/information, learner support, problem-solving, feedback/debriefing, and fidelity/realism [22]. The tool has established content validity and reliability [27]. The Student Satisfaction and Self-Confidence in Learning (SCLS) instrument (13 items) uses the 5-point scale to assess learner satisfaction with simulation learning and self-confidence in learning during the simulation. The tool has established content validity and reliability [27]. The Educational Practices Questionnaire (EPQ) (16 items) uses the 5-point scale to assess the quality of simulations related to educational practices, including active learning opportunities, collaboration with peers, diverse ways of learning, and clarity of expectations. The tool has established validity and reliability [27].

4.2. Learner Quantitative Findings

Overall, all three simulation tools reported positive averages, indicating that the simulation was of high quality from the perspective of the nine participants in the study. The categories reviewed included objectives and information, support, problem solving, feedback/guided reflection, and fidelity (realism) from the SDS; active learning, collaboration, diverse ways of learning, and high expectations from the EPQ; and satisfaction with current learning and self-confidence in learning from the SCLS. Of the 49 subcategories, 44 reported positive averages and 5 reported negative averages. The subcategories reporting negative averages included: the chance to discuss the simulation objectives with my teacher, the availability of the instructor to discuss individual learning needs, the opportunity to set goals during the simulation, and the scenario resembled a real-life situation.

The researcher acknowledges that the timing of the data collection, from four to eight months following the emergency simulation, and the fact that learners may have discussed their experience with their peers, may impact the validity of the responses of the learner participants to the NLN/Laerdal Instruments. Learner participants were advised that if they had trouble responding to any of the questions they did not have to complete them.

The learner quantitative findings provided limited but valued guidance to local faculty planning future emergency preparedness simulations at the study institution. The low response rate precluded further data analysis.

4.3. Learner Qualitative Data Analysis

Qualitative data from the two open-ended learner survey questions were reviewed, and a constructivist analysis was used to identify patterns emerging from the data. The open-ended survey question data, stored in Word documents, were line-numbered and printed in hard copy. Qualitative data analysis included reviewing the data from the open-ended learner questions and identifying emerging broad patterns related to the experiences and perspectives of the learners. Coloured highlighters were used to mark connections and emerging patterns within the data. Once the broad patterns were identified and coded, data was organized within these patterns by identifying themes. Each pattern was explored in depth to establish the meaning of the learner experience.

4.4. Learner Qualitative Results

This qualitative analysis of the learner open-ended questions led to four patterns: Strengths, Objections, Suggestions, and Feelings (SOS!). The pattern of Strengths identified the learners’ perspectives on the positive components of participating in the EPS module. Objections identified the concerns learners shared about participating in the EPS. Learners did have a number of Suggestions for improving the learning experience of students in the EPS. And finally, students shared strong Feelings about participating in the EPS, both positive and concerning. Within each of these four patterns was evidence of three themes: relevance, design, and engagement. Relevance spoke to the valuing of the EPS module by learners, design spoke to the organization of the lab and the flow of the triage simulations, and engagement reflected the learners’ ability to pay attention and create meaning from their experience in the EPS module. The four patterns and three themes are integrated in this SOS! analysis.

In the strengths pattern, learners identified relevance “the idea of having an EPS is very essential” and design features “The teddy bear triage was effective”… “The debriefing was helpful and when we repeated our [explosion] scenario”. Engagement was also a strength “I now know that during an emergency it is not feasible to perform CPR”… “I recognize my potential role as a leader in such a situation because of my knowledge base and experience.”

In the objections pattern, learners identified relevance through concerns that their learning was not optimal: “I still feel under qualified to provide first aid for trauma, bleeds, burns, breaks, dislocations etc.” The design was not optimal either: “During the simulation our group really wasn’t sure what to do so it seemed awkward”… “The set-up was very unrealistic”. And one learner objected to the “Teddy Bear” triage simulation, directly stating that engagement was a concern: “… the scenario itself was not engaging.”

Learner suggestions were focused on the simulation design, identifying changing the order of the simulations and requesting faculty provide demonstrations: “If I had seen the bears first [before the Explosion simulation]” and “time … for actual teaching/demonstrations by the facilitators … what a response should look like given our lack of experience and knowledge …”

And learners identified strong feelings associated with the simulation including feeling engaged prior to the simulations and feeling upset about the [Explosion] simulation design: “I was pretty eager to do this module prior to the day of …” but … “during … I was feeling very ill-prepared … helpless …”

4.5. Learner Discussion

It was clear from the strengths learners identified that the EPS module was a valued and relevant learning experience for most of the learner participants. Emergency preparedness is an engaging topic, and learners were engaged in the EPS module. Learner responses did not mention the seminar; responses related to the two simulations, which they had anticipated with excitement. The “Teddy Bear” triage worked well for most students. It was safe, challenging, and the design flowed well.

Objections addressed the design of the “Explosion” triage simulation. Learners were uncertain, felt unable to take charge of the situation, and did not feel confident in their first aid skills. They also wished their teachers had been more supportive and had given them more guidance. This scenario did not work as well, particularly for learners who completed it first.

Learners suggested that the “Teddy Bear” triage be the first simulation for all students. They also suggested that faculty be more involved in the “Explosion” triage, as they are in the “Teddy Bear” triage.

And finally, learners experienced a range of powerful emotions while completing the EPS module. Learners did not like feeling powerless. They wanted more support from faculty, more role modelling, and more direction.

5. Faculty Perspectives

5.1. Results

The constructivist data analysis revealed four patterns in the faculty perspectives, aligning closely with the learner qualitative patterns: Strengths (S), Objections (O), Suggestions (S), and Feelings (!) or SOS!. Within each of these four patterns were three themes: relevance, design, and engagement. The patterns and themes are integrated into the discussion following the direct faculty quotes.

5.1.1. Strengths

Faculty identified a number of strengths related to the topic and the enhancement of the course by using the simulation lab which the learners valued:

“Okay, prior to it I, I was quite excited because it’s quite close to my heart as you know I am in public health and we are also going through a similar experience and I was really happy to hear that [the university] was a taking it on.”

“I was actually intrigued about how we would use the lab to do something in community. To me that was quite novel and so from that perspective I was really quite excited to see how that would play out.”

5.1.2. Objections

Faculty objections were related to feelings about their own competency to teach the EPS module, their role in the simulations, and how they would support students.

“I didn’t know what my role was until the day before, a couple of days before the actual class. So it was that part that I was a little bit apprehensive.”

“… really not knowing what to expect and having really no trauma experience or experience in this field I was probably anxious about what I would be expected to know and how I would be expected to support the students and that was that was before we actually started…”

“And then when we came to the actual [explosion] mass casualty I was, I thought it was, I thought it left a lot open for the students to feel powerless… partly because the activity was structured around one person who got to be in charge and they held all the information.”

5.1.3. Suggestions

Faculty shared suggestions that would incorporate the values and principles of primary health care, including a focus on community-level analysis:

“… I would say that this curriculum needs to be expanded to include other forms of disasters like―you know it was a bomb at that time but it could be―other kinds of disasters like a train derailment, that kind of thing.”

“… in the fall I remember, putting in a couple of articles around like the critical analysis using race, gender, class and disaster management. Things that lead to disaster, social determinants and I don’t think in the winter term there was an opportunity to do this. I mean I thought that that was an element that was missing and it was not necessarily going to be encompassed by the way this was set up. So I think there has to be or it would be useful for the students in linking that more clearly to okay so what kinds of analysis is also important here. It is more that a how to understanding okay, so who is it that are more vulnerable etc.”

5.1.4. Feelings

Faculty experienced a range of feelings in the lead-up to and during the facilitation of the “Teddy Bear” and “Explosion” simulations.

“…but I don’t really feel as if I have all the knowledge, skills and confidence to do this.”

“Yeah, casualty. It was a bit of feeling that scattered kind of feeling and I was trying to put myself in the real situation when that happens and that was a bit traumatic and the students when they were in that situation they were sort of very disorganized and they were kind of scattered and they took on the attitude of laissez faire… I was a little unsure of what to do as a faculty member and I was asked not to say anything.”

“Now, I seem to recall teddy bear triage going about the same in the fall and the winter. A very positive experience. The [Explosion] casualty experience was a bit different in the fall than the winter. Seems to me in the winter, there was a second chance and so that changed the outcome for the students, and because I was made aware of that before things started, I think I felt a bit different.”

5.2. Discussion

The faculty experience identified strengths relating to the relevance of the emergency preparedness module, the design of the “Teddy Bear” triage, and personal engagement with learning new skills and developing a new competency. Faculty suggested that, in future, the simulations be designed so that all students complete the “Teddy Bear” triage prior to the “Explosion” triage, as students had recommended.

Faculty objections related to the design of the “Explosion” triage, which limited feedback to students and was not considered safe for them, and to its limited relevance to the course learning objectives, as the design focused on responses to individual clients rather than taking the broad community perspective of the course, including critical analysis of PHC principles and values and key SDOH such as race, gender, and class. Concerns about confidence, competence, and preparation for the EPS were also voiced as objections.

Faculty participants contributed invaluable suggestions for future EPS modules, including making the critical analysis a central part of the module so that, in keeping with the course objectives, faculty consider issues of vulnerability and social justice. Faculty also suggested that the design of the “Explosion” triage needed to be clearer and proposed that a setting such as MSF (Doctors Without Borders) or the Red Cross might be a more viable simulated setting in which to conduct the triage.

Faculty reported a range of feelings, from liking the simulation, feeling excited, and feeling happy, to more problematic feelings such as disempowerment stemming from the lack of control over the design of the “Explosion” triage, a design some felt was not safe for the students.

6. Learner and Faculty Interpretations

6.1. Learner Interpretations

Learners identified key strengths of the “Teddy Bear” triage, objections to the design of the “Explosion” triage, the value of including the emergency preparedness simulation in the course, and the emergence of powerful feelings associated with the two simulations, including the feeling that faculty should demonstrate emergency preparedness skills for the learners. They suggested specific changes to improve the learner experience: starting all students in the “Teddy Bear” triage and then proceeding to an “Explosion” triage that was more realistic, focused more on first aid, and included the active participation of faculty, who would then provide role modelling, support, and guidance to the learners.

6.2. Faculty Interpretations

Overall, faculty engaged with the EPS module and became far more invested in the topic of emergency preparedness, a new competency for a number of them. Faculty identified the design of the “Teddy Bear” triage as one that worked well: it was safe, structured, clear, and non-threatening. The design and facilitation of the “Explosion” triage were problematic. Faculty were concerned about students and the lack of supports, beyond the faculty themselves, available to students. Faculty also acknowledged that planning and leading the “Explosion” triage stimulated a range of powerful feelings, some positive but predominantly concern for the safety of students and a perceived lack of congruence between the simulation-based learning (SBL) and the primary health care principles and values guiding the course.

Faculty suggestions focused on continuing to incorporate the EPS module in the Primary Health Care course while shifting the focus of the “Explosion” triage away from a first aid approach toward community-level responses. This confirmed that faculty saw value in preparing students for emergency preparedness in their future practice. Faculty proposed new readings to be integrated into the seminar to ensure a clear link between the EPS learning and the course expectations. As well, faculty recommended considering how to identify the impact of emergencies on the most vulnerable clients, such as people experiencing homelessness, within the context of a community response to emergency preparedness. The SOS! analysis was an effective framework for interpreting both the learner and the faculty perspectives on the emergency preparedness simulation.

7. Study Conclusion: Comparing the SOS! Findings for Learners and Faculty

In comparing the learner and faculty perspectives, there is clear congruence among the strengths identified, the objections identified, and the power of feelings for both learners and faculty who participated in the emergency preparedness scenarios. However, while the learners asked for more demonstration of emergency preparedness skills, particularly around first aid for individual clients, the faculty offered another perspective in their suggestions.

Faculty advocated for a re-thinking of the “Explosion” triage to shift the focus of the learning from individual to community-based preparedness and responsiveness. Such re-focusing would tap into the strengths of the faculty teaching the course, reduce the tension students felt about being prepared to provide care to critically injured clients in an unpredictable environment, and support the integration of key course themes around equity, accessibility, social justice, social determinants of health, race, gender, and class.

The SOS! analysis effectively supported the interpretation of both learner and faculty perspectives in this study. While the sample size was limiting, the study was able to inform future emergency preparedness simulations in the local context. It is less clear whether the study will benefit nurse educators and students in other nursing programs; however, it may provide some guidance for nurse educators who are just beginning to integrate emergency preparedness into their curriculum. There was clear support from both faculty and learners for the value of including emergency preparedness in the BScN curriculum.

Future research is needed to confirm the value of emergency preparedness simulations in the BScN curriculum, to explore learner perspectives, and to continue to develop our understanding of nurse educator perspectives.

8. Limitations

The study response rate was low, related to the timing of recruitment and to university student-recruitment policies, which negatively impacted the value of the quantitative data collected. While few learners participated in the study (n = 9), their qualitative data remain of value. Faculty participation was also small (n = 3), but it represents a response rate of 50%, which is acceptable. The study data were collected seven years ago; however, as the outcomes of the study continue to guide the current EPS module, and as students and faculty continue to evaluate the EPS very positively, the study is considered to be of current value to nurse educators.

Acknowledgements

In memory of Ms. Betty Burcher, a Lecturer at the Bloomberg Faculty of Nursing, University of Toronto, who was the second author of the study proposal. Funding support was provided by the Bloomberg Faculty of Nursing, University of Toronto, NERF Fund ($6810.00).

Conflicts of Interest

The authors declare no conflicts of interest.


Copyright © 2024 by authors and Scientific Research Publishing Inc.

Creative Commons License

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.