Tasman Medical Journal

ISSN: 2652-1881

Simulation exercises increase staff confidence, knowledge and skills in managing mass casualty incidents: a pretest-posttest study

Amy L Sweeney, Yuet L Lui, Nathan Watkins, Peter McNamee, Amanda Samsuddin, Cindy Huang, Scott Henry and Amy J Johnston

ABSTRACT

Introduction: Hospitals are obliged to have agile and scalable response arrangements for managing natural and instigated disasters. While many hospitals have disaster plans, few exercise these plans or test their staff under realistic conditions. Moreover, despite the proliferation of emergency simulation training scenarios, there is little empirical evidence that simulation activities enhance staff clinical responses or the organisational management systems that support disaster management. This study explores changes in the perceived preparedness of multidisciplinary, hospital-wide team members to manage mass-casualty incidents as a result of an interdisciplinary, interdepartmental simulation exercise.

Methods: This is a pretest-posttest study evaluating the effect of a mass casualty simulation on the change in participants’ self-assessed skills, knowledge and confidence in responding to a mass casualty incident. It was carried out at three public teaching hospitals in southeast Queensland, Australia. Three Emergo Train System (ETS) mass-casualty exercises were the basis for this research. Quasi-experimental pre- and post-simulation cohort surveys were administered to capture participants’ self-rated confidence, skill and process knowledge on a 5-point Likert scale. Changes were analysed using paired t-tests. Data were analysed anonymously. The post-exercise survey included free-text comments.

Results: A total of 232 individuals participated in one of the exercises, and of these 129 (55.6%) completed both the pre-exercise and post-exercise surveys. Statistically significant increases in self-assessed confidence, skills and knowledge scores (p < 0.001) were identified. Three themes were evident from the free-text comments: the importance of pre-briefing, communication throughout the exercise, and the need for greater realism in the scenarios.

Conclusions: This study supports the use of simulation using ETS as one way to improve staff’s self-efficacy and perceived preparedness, through increasing self-assessed knowledge, confidence and skills to manage a mass casualty incident at their health facility.

Tasman Medical Journal 2021; 3(4): 112-119


Introduction
Emergency hospital systems are expected to respond effectively to intentional or accidental mass casualty incidents (MCI). Such incidents are largely unpredictable and increasing worldwide.1-3 Mitigation of the harms associated with these disasters often requires appropriate high-quality healthcare processes to be enacted rapidly, safely and without duplication or error.4 To prepare clinicians and health systems, effective training and rehearsal using the multiple systems involved in an actual response are necessary. Response frameworks (for example, the Homeland Security Exercise and Evaluation Program,5 the Disaster Management Indicator Model,6 and the Hospital Incident Command System7) and disaster preparedness courses (such as the Major Incident Medical Management and Support (MIMMS) courses offered by MIMMS Australia8) have been created to address this need. Simulation exercises also aim to model surge capacity response,9 and to test the application of frameworks, plans and learnings. Some systems, such as the Emergo Train System® (ETS), are designed to enable testing, but reviews of the literature reveal surprisingly little evidence of benefit for broad multidisciplinary response teams.10,11 There is little evidence that simulation translates into better preparedness, or that improved preparedness makes a difference to actual responses. Consensus is also lacking on which measures should be used to evaluate responses to infrequent community disasters. Given the critical importance of timely and effective health system responses to incidents, and the significant investment governments and communities make in funding simulation activities, these uncertainties need to be resolved. This paper uses incident simulation as an investigational method. The objective of our study was to assess the impact of simulation on staff perception of preparedness to respond to an MCI.

Method
Study design
This is a pretest-posttest study evaluating the effect of three mass casualty exercises on the change in the self-assessed skills, knowledge and confidence of the participants in responding to one of three MCIs.

Setting
The study was set in three public teaching hospitals, two located in the City of Gold Coast, southeast Queensland, and one located in Brisbane, Australia, with an approximate total base population of 2.9 million in 2017. The first hospital (A) is a tertiary hospital and trauma centre with 750 beds and more than 100,000 emergency department (ED) presentations annually. The second (B) is a 200-bed hospital with 65,000 ED presentations annually. The third hospital (C) is also a tertiary hospital with 85,000 ED presentations annually. Hospitals A and C had disaster management plans in place at the time of the study.

Participants and participant orientation
Participants were full or part-time staff at each site, invited to participate by the exercise organisers. Between 3 and 10 additional external clinicians with a special interest in disaster management were also invited to attend as unpaid observers.

A list of expected staff participants was available, and each participant signed in for the exercise upon arrival. Participating staff who were not on the list were included in relevant denominator values. All participants and observers were given a pretest survey in a briefing area. The coordinators and architects of the exercise were excluded from the study. Participants were front-line nurses, doctors and ancillary staff (administrators, non-clinical ward staff, blood bank technicians, and pharmacists) from the disciplines of emergency medicine, intensive care, trauma, surgery, pathology or radiology, together with hospital executives and observers. Participants did not attend more than one exercise. No real people acted as patients in this exercise, which employed the ETS methodology. Casualties from the incidents were represented by 10 cm high magnetic “patients” (called “gubers” – see Box 1) placed on whiteboards labelled as various areas of the ED, theatre and wards.11

At each exercise, staff were orientated to the simulation structure and rules (but not the actual scenario) during a 20-minute lecture immediately preceding the exercise. The layout of the whiteboards in different rooms, and their representation of the actual beds available within the ED, theatre, wards, intensive care unit (ICU) and imaging areas, were described.

The clinical information content on the guber-patients was explained: general appearance (obvious injuries, consciousness, age bracket) was listed on the front of the guber, and age and vital signs were listed on the back (Box 1). The rules for allocation of resources (including staff, equipment and procedures) to each guber-patient were described, and an exercise moderator well versed in the rules was assigned to one or more areas for assistance. The timing assigned to each clinical procedure, imaging and transport, and the summation of delays on the whiteboards, were explained.
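Although this delay tally is performed by hand on the whiteboards, the underlying arithmetic is simple and can be illustrated with a brief sketch. The step names and durations below are invented for illustration and are not taken from the exercise plans; the point is only that each guber-patient accumulates the sum of the timed steps and waits assigned to it.

# Hypothetical illustration of the whiteboard delay tally for a single guber-patient.
# Step names and durations (in minutes) are invented for this sketch and are not
# taken from the exercise documentation.
timed_steps = {
    "triage and primary survey": 10,
    "X-ray": 15,
    "transfer ED to theatre": 20,
}
waiting_for_resources = 25  # minutes spent waiting for staff or equipment to become free

total_elapsed = sum(timed_steps.values()) + waiting_for_resources
print(f"Total elapsed time recorded for this guber-patient: {total_elapsed} minutes")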

Description of exercises
Each simulation activity aimed to test each hospital’s response to an intentional MCI, to identify systems and processes needing adjustment, and to improve staff response efficacy. For each exercise, a scenario plan was developed including details of the disaster, the number of casualties and, for each casualty, expected treatments, destinations and outcomes. Each exercise was limited to testing the in-hospital response only, and excluded pre-hospital responses. During each exercise, the incident response was overseen and supported by a Health Emergency Operations Centre (HEOC). Each exercise ran for 3–4 h in a set of non-clinical rooms at each hospital. Each exercise had three to four facilitators who rotated between the simulation areas. The facilitators were senior clinicians with experience in using the ETS.

For each exercise, the durations of clinical investigations (X-ray, computerised tomography (CT) scan, blood pathology), transfer times between areas (for example, ED to ICU) and transfer requirements (e.g. having appropriate staff and equipment to accompany the patient) were set out based on hospital policies and statistics of the actual times taken to perform imaging, tests and transfers.

Table 1 below describes the three Emergo Train System mass casualty incident simulation exercises.

Staff available to care for guber-patients were represented by corresponding nurse and doctor gubers, as was equipment (e.g. CT scanners and ventilators). Staffing at the start of the exercise, and the actual patients in each area of interest (ED, ICU and operating theatre (OT)), were based on census and staffing data for each hospital at 6 pm on 1 May 2017 (hospital A), 22 April 2018 (hospital B) and 14 November 2018 (hospital C), to ensure that the exercise reflected real and typical patient flow and bed and staff capacity. A fuller description of how the ETS simulation exercise operates can be found online.12

The Hospital A exercise occurred in May 2017, simulating the response to an intentional vehicular attack at a local mass-gathering event. Victim numbers, injury severities and times of arrival were modelled on the experience of the Pasteur 2 Adult Hospital during the Nice terror attack of 14 July 2016.13 Injury types were modelled on a summary report of intentional vehicle attacks.14

The Hospital B exercise occurred in February 2018 and was modelled on a “bombs and blasts” scenario. Eighty-one guber victims presented to the ED following an accident in which a train loaded with commuters derailed and slammed into a power plant, which exploded. During this exercise, the Hospital B response was supported by the HEOC, which in turn was supported by the State Health Emergency Coordination Centre (SHECC), whose role is to coordinate the provision of wider health system support to local hospitals responding to a crisis. During this exercise, SHECC coordinated the transfer of “patients” to other health services, for example paediatric cases to the Queensland Children’s Hospital in Brisbane. Again, data on staffing, patient lengths of stay in each clinical zone and care processes (such as intubation, X-ray, suturing and requirements for induction agents) were tracked. Movement into each secondary zone was limited by predetermined “bed” spaces and local staffing.

The Hospital C exercise occurred in November 2018 and involved an aviation incident resulting in a “bombs and blasts” set of injury types. The exercise included ED, ICU, ward and theatre staff.

Measures
The three outcome measures of interest for this study were the changes in staff perception of their confidence, knowledge and skills before, and immediately following, participation in an ETS exercise, measured using a survey tool. Each attribute was measured on a 5-point Likert scale ranging from 1 (not confident/knowledgeable/skilled) to 5 (very confident/knowledgeable/skilled), and responses were linked for each individual. Only those individuals who remained at the end of the exercise completed the post-exercise survey. The surveys are available online as Appendices I and II in the HTML version of this paper.

Survey development and administration
A multi-stage survey development process was undertaken.15 The first stage was to establish the most effective underpinning educational theory to inform the survey. Given the focus on experiential learning and on moving staff to a reasonable level of competence in disaster management, a five-stage progression from novice to expert was used to underpin the survey questions.16 These stages were reflected in a five-point Likert scale assessment for each element.15 The survey used the key simulation learning outcome components of confidence, knowledge and skill as the elements of investigation.11 Consultation with participant pools and the available literature determined that the survey needed to be brief, paper-based and administered on-site as part of the simulation exercise to optimise engagement with the components under investigation. The linking code was entered at the time of completion of both the pre- and post-exercise surveys.

Once drafted, the survey went through several phases of refinement, including input from experts in emergency disaster management, clinical education processes and clinical simulation. The wording, phrasing and length of questions were revised to help ensure content and face validity.15 A high level of consistency was maintained between the pre- and post-simulation versions of the survey to best capture the impact of the training. The post-exercise survey at Hospital B was modified to capture information specifically requested by that site; Questions 1–4 remained the same.

Survey process
The participant surveys were administered immediately before and immediately after the exercises. Pre-simulation surveys (Appendix I) were provided to staff at sign-on, prior to the team briefing. To ensure staff responses were immediate and not influenced by senior staff debrief content or overall simulation summaries, the post-simulation survey (Appendix II) was provided at the end of the exercise, prior to the structured debriefing.

Data analysis
Simple descriptive statistics (counts and proportions) were produced in SPSS v24.0. Pre- and post-survey data were linked for each individual staff member, and paired t-test analyses were performed to assess change in skills, confidence and knowledge. A field for open comment on the value of the simulation was also included. These comments were transcribed verbatim and analysed thematically by two authors, using an open-ended interpretative approach.17 This methodology enabled the researchers to capture detailed, contextualised and rich descriptions of participants’ key personal experiences and perceptions.
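For readers who wish to reproduce this style of analysis, the linkage and paired comparison can be sketched as follows. The study analysis was run in SPSS v24.0; the snippet below is an illustrative re-implementation in Python only, and the file names and column names (code, confidence, knowledge, skills) are hypothetical rather than taken from the study instruments.

# Illustrative sketch only: the study analysis was performed in SPSS v24.0.
# File names and column names below are hypothetical.
import pandas as pd
from scipy import stats

# One row per participant: the anonymous linking code plus 5-point Likert ratings
# (1 = not confident/knowledgeable/skilled, 5 = very confident/knowledgeable/skilled).
pre = pd.read_csv("pre_exercise_survey.csv")    # columns: code, confidence, knowledge, skills
post = pd.read_csv("post_exercise_survey.csv")  # same columns, collected after the exercise

# Retain only participants who completed both surveys (n = 129 in this study).
linked = pre.merge(post, on="code", suffixes=("_pre", "_post"))

for attribute in ["confidence", "knowledge", "skills"]:
    before = linked[f"{attribute}_pre"]
    after = linked[f"{attribute}_post"]
    t_stat, p_value = stats.ttest_rel(after, before)  # paired t-test on linked scores
    mean_change = (after - before).mean()
    print(f"{attribute}: mean change = {mean_change:.2f}, t = {t_stat:.2f}, p = {p_value:.4f}")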

Results
Overview of participants
According to participant lists, 232 individuals participated in one of the exercises (88 at A, 71 at B and 73 at C). Of these, 129 (55.6%) completed both pre- and post-exercise surveys. Most participants (69%) were clinical staff: nurses, doctors, radiologists, pharmacists and pathology staff. The non-clinical staff (31%) included executive team members, observers, exercise support and other ancillary members. Characteristics of participants are presented in Table 1.

Survey results
Average self-assessed confidence, skills and knowledge scores increased significantly (p < 0.001) after the exercise (Fig. 1). At the two hospitals that asked whether participants found the exercise valuable, 113 of 116 respondents to the post-exercise survey (97%) marked “yes”.

Open-ended comments from the survey
Comments on the post-exercise survey were received from 29 staff at Hospitals A and B. The themes that emerged were:

1.  Pre-briefing
Many staff expressed the importance of a comprehensive pre-exercise session on how the exercise works. In general, staff participating in the exercise needed a better understanding of how the exercise runs: how to interpret the gubers, how to attach clinical treatments to gubers and assign wait times on the exercise boards, when other specialty teams should be called for consultation and which telephone numbers to use within the exercise, and a clearer sense of the “rules of engagement”, including what was “in-exercise” and what was “out-of-scope”.

2.  Communications
Concern was raised about the inability to reach other key people for decision-making because telephone lines were constantly engaged. A clear hierarchy of communication, and telephone numbers for all team leaders, were required. Instructions were required on how to call back large numbers of staff. The physical presence of certain specialty staff at triage (e.g. surgery, ICU, paediatrics) would have been beneficial. The initial announcement of the MCI was delayed, and communication from the hospital command and control team into the clinical areas was poor. Communications to ancillary departments (pharmacy, social work, blood bank) also needed improvement.

3.  Realism
Some services were not realistically accounted for (e.g. blood bank, pathology, pharmacy and the ordering of controlled medications). The partnership with ambulance services is critical, but pre-hospital care was not part of the exercises. In settings that are highly electronic and computerised, the lack of computers to look up laboratory results or radiology reports was disconcerting. Some participants would have preferred actors over gubers. Some wounds that would have been obvious on a patient were overlooked as text on gubers. At Hospital A, the inability to transfer casualties to other sites was frustrating, as transfers would happen in real life. At Hospital B, transfers occurred and were important, as the hospital ran out of ventilators after 17 “intubations” and six patients had to be ventilated using a bag-valve mask. Many gubers went missing when being transferred to imaging or theatre.

Discussion
This study has demonstrated that a multi-casualty disaster simulation may improve staff self-impressions of efficacy, increasing perceived preparedness by increasing self-assessed knowledge, confidence and skills in the management of a mass casualty incident at their health facility. This is in keeping with other literature that reported the effects of disaster simulation on clinical staff performance.18

Our experience at the study sites suggests that ETS requires less space and fewer human resources than other types of simulation, but suffers from a lack of realism and from enforced timing requirements that reduce the flexibility of responses. Simulation exercises can increase a learner’s sense of knowledge and comfort with disaster topics more than other learning and teaching methods. This may increase staff participation in the event of an actual disaster.19 While some data suggest that increasing realism adds to the learning outcomes of disaster simulation,20 combination activities,21,22 three-dimensional virtual reality23 and video training24 can all enhance engagement with the clinical systems and processes used in a disaster. It would be useful to compare and contrast the costs and benefits of the different types of exercise that can be used in this context. Regardless of the effect on individuals, however, a major strength of exercises such as this lies in identifying where systems would potentially fail before they do so in real life.

This paper adds to the evidence base that MCI exercises increase staff preparedness. However, there is scarce evidence that better preparedness makes any difference to actual responses. Furthermore, measuring performance during a real disaster requires an internationally accepted set of meaningful indicators that are not onerous to capture. Some authors have reported on real-life measures of performance in disasters, and described the requirements25 and/or the response.13,14,26-29 The information provided by clinicians involved in the 2016 Nice disaster allowed the development of scenarios to test other hospitals’ systems, including our own.13 Taking the time to document and share learnings from real incidents is invaluable and should be applauded. Meanwhile, in the absence of agreement on core performance measures, a list of clinical competencies required to respond to disasters should be developed.30

Limitations
This pretest-posttest study is subject to several limitations. Threats to its internal validity include the inability to locate a validated tool to assess knowledge, confidence and skills in simulation; although care was applied to the survey development, the reproducibility and validity of the test against the outcomes is uncertain. Secondly, the post-test occurred immediately after the intervention, with no further test administered at a later time, so these increases in confidence, skills and knowledge may not be retained. Participants were not randomly selected from staff rosters, but rather were “hand-picked” by their management teams, introducing the possibility of selection bias. It is likely that the staff completing these exercises were more experienced and more likely to be in leadership roles, and their improvements might not be transferable to more junior nursing and medical staff in general. Additionally, participation in the follow-up survey was relatively low (56%), and only those participants who stayed for the entire exercise were offered a follow-up survey. This is another potential source of bias, as these participants were self-selected to remain and may have been more interested in the topic. Whilst this study showed improvements in confidence, knowledge and skills, without further research it is not possible to generalise these improvements to staff responses in actual MCIs.

This study had no control group of staff who did not participate in the exercise. A control group might have comprised clinicians who simply took the pre-test survey, spent a few hours doing their usual clinical shift, and then took the post-test survey. Future evaluations of exercises should include this design component as a minimum.

Conclusions
This study lends support to the assumption that the exercises applied have a positive effect on staff attitudes, self-awareness and knowledge, and hence form the basis of an optimal hospital-wide response in practice to a critical incident. However, this remains to be proved. Organisational investment in simulation, and thorough evaluation of disaster simulations, can help identify system opportunities and improve staff knowledge of, skills for and confidence in managing mass casualty incidents. Application of a short survey can help health organisations capture staff perceptions of simulation and valuable learning points which may enhance future simulated and real-life systems responses.

REFERENCES

  1. Leaning J, Guha-Sapir D. Natural disasters, armed conflict, and public health. New Engl J Med. 2013; 369: 1836-1842.
  2. Guha-Sapir D, d’Aoust O. Centre for Research on the Epidemiology of Disasters. Demographic and health consequences of civil conflict. https://openknowledge.worldbank.org/bitstream/handle/10986/9083/WDR2011_0011.pdf (2011). Accessed June 18, 2020.
  3. Lin PI, Fei L, Barzman D, Hossain M. What have we learned from the time trend of mass shootings in the U.S.? PLoS One. 2018;13:e0204722.
  4. Kearns RD, Cairns BA, Cairns CB. Surge capacity and capability. A review of the history and where the science is today regarding surge capacity during a mass-casualty disaster. Front Public Health. 2014; 2: 29.
  5. Homeland Security Exercise and Evaluation Program. Feb 2020. https://www.fema.gov/emergency-managers/national-preparedness/exercises/hseep [Accessed 29/06/21].
  6. Nilsson H, Vikström T, Jonson CO. Performance indicators for initial regional medical response to major incidents: a possible quality control tool.  Scand J Trauma Resusc Emerg Med  2012; 20: 81 https://doi.org/10.1186/1757-7241-20-81.
  7. California Emergency Medical Services Authority. Hospital Incident Command System. 2021. https://emsa.ca.gov/disaster-medical-services-division-hospital-incident-command-system-resources/ [Accessed 29/06/21].
  8. Teaching a systematic approach to disaster medical management. Major Incident Medical Management and Support. https://mimms.org.au. Accessed 30 June 2021.
  9. TariVerdi M, Miller-Hooks E, Kirsch T. Strategies for improved hospital response to mass casualty incidents. Disaster Med Public Health Prep. 2018; 12: 778-90.
  10. Langan JC, Lavin R, Wolgast KA, Veenema TG. Education for Developing and Sustaining a Health Care Workforce for Disaster Readiness. Nurs Admin Quarterly. 2017; 41: 118-127.
  11. Perry RW. Disaster exercise outcomes for professional emergency personnel and citizen volunteers. J Contingencies Crisis Management. 2004; 12: 64-75.
  12. EmergoTrain System. http://www.emergotrain.com. Accessed June 18, 2020.
  13. Carles M, Levraut J, Gonzalez JF, Valli F, Bornard L. Mass casualty events and health organisation: terrorist attack in Nice. Lancet. 2016; 388 (10058): 2349-50.
  14. Almogy G, Kedar A, Bala M. When a vehicle becomes a weapon: intentional vehicular assaults in Israel. Scand J Trauma Resusc Emerg Med. 2016; 24: 149.
  15. DeVellis RF. Scale Development: Theory and Applications. 4th ed. London: Sage Publications, 2016.
  16. Benner P. From novice to expert. Am J Nurs. 1982; 82: 402-7.
  17. Lincoln YS, Guba EG. Naturalistic Inquiry. Beverly Hills, CA, USA: Sage Publications, 1985.
  18. Bartley BH, Stella JB, Walsh LD. What a disaster?! Assessing utility of simulated disaster exercise and educational process for improving hospital preparedness. Prehospital Disaster Med. 2006; 21: 249-55.
  19. Behar S, Upperman JS, Ramirez M, Dorey F, Nager A. Training medical staff for pediatric disaster victims: a comparison of different teaching methods. Amer J Disaster Med. 2008; 3: 189-99.
  20. Brody A, Kashuk JL, Moore EE, Biffl W, Johnson JL et al. Live victim volunteers enhance performance improvement (PI) in mass casualty incident (MCI) drills. J Surg Res. 2010; 158: 253.
  21. Channan P, Price I, Welsford M, Kulasegaram M, Sherbino J. Using a didactic lecture and live field simulation to teach disaster medicine to emergency medicine residents. Can J Emerg Med. 2010; 12: 244.
  22. Aluisio AR, Daniel P, Grock A, Freedman J, Singh A et al. Case-based learning outperformed simulation exercises in disaster preparedness education among nursing trainees in India: a randomized controlled trial. Prehospital Disaster Med. 2016; 31: 516-523.
  23. Farra S, Miller E, Timm N, Schafer J. Improved training for disasters using 3-D virtual reality simulation. Western J Nurs Res. 2013; 35: 655-671.
  24. Curtis H, Chason K, Richardson LD. Competency-based video modules for educating emergency medicine residents in disaster medicine. Acad Emerg Med. 2010; 17: S211.
  25. Ramsey G. Blood transfusions in mass casualty events: recent trends. VOXS 2020; 115: 358-66.
  26. Doughty H, Rackham R. Transfusion emergency preparedness for mass casualty events. ISBT Science Series (VOXS) 2019; 14: 77-83. doi:10.1111/voxs.12448.
  27. Fratta A. Post-9/11 Responses to Mass Casualty Bombings in Europe: Lessons, Trends and Implications for the United States. Studies in Conflict & Terrorism. 2010; 33: 364-385.
  28. Campion EM, Juillard C, Knudson MM, Dicker R, Cohen MJ et al. Reconsidering the resources needed for multiple casualty events: lessons learned from the crash of Asiana Airlines Flight 214. JAMA Surgery 2016; 151: 512-7.
  29. El Sayed M, Chami AF, Hitti E. Developing a Hospital Disaster Preparedness Plan for Mass Casualty Incidents: Lessons Learned From the Downtown Beirut Bombing.  Disaster Med Public Health Prep 2018; 12: 379-85. doi:10.1017/dmp.2017.83.
  30. Murphy JP, Rådestad M, Kurland L, Jirwe M, Djalali A, Rüter A. Emergency department registered nurses’ disaster medicine competencies. An exploratory study utilizing a modified Delphi technique. Int Emerg Nurs. 2019; 43: 84-91. doi: 10.1016/j.ienj.2018.11.003.

APPENDICES


AUTHOR INFORMATION

Amy L Sweeney

OCCUPATION
Nurse and Public Health Practitioner
INSTITUTIONAL AFFILIATIONS
Gold Coast Hospital and Health Service

Griffith University School of Medicine

Yuet L Lui

OCCUPATION
Medical Student
INSTITUTIONAL AFFILIATIONS
Griffith University School of Medicine

Dr Nathan Watkins

OCCUPATION
Emergency Staff Specialist
INSTITUTIONAL AFFILIATIONS
Gold Coast Hospital and Health Service

Peter McNamee

OCCUPATION
Disaster and Emergency Manager
INSTITUTIONAL AFFILIATIONS
Gold Coast Hospital and Health Service

Amanda Samsuddin

OCCUPATION
Medical Student
INSTITUTIONAL AFFILIATIONS
Griffith University School of Medicine

Cindy Huang

OCCUPATION
Medical Student
INSTITUTIONAL AFFILIATIONS
Griffith University School of Medicine

Scott Henry

OCCUPATION
Clinical Nurse
INSTITUTIONAL AFFILIATIONS
Gold Coast Hospital and Health Service

Dr Amy J Johnston

OCCUPATION
Senior Research Fellow
INSTITUTIONAL AFFILIATIONS
Princess Alexandra Hospital
