Between 370,000 and 750,000 cardiopulmonary resuscitations are attempted each year in US hospitals. Approximately 80% of these patients do not survive to discharge. However, for many, pulseless ventricular tachycardia or ventricular fibrillation (VT/VF) is the first monitored arrhythmia, and it is treatable with prompt defibrillation. The American Heart Association recommends defibrillation therapy within 2 minutes of cardiac arrest onset. Yet, for 30% of patients, defibrillation is delayed more than 2 minutes, reducing their chance of survival to hospital discharge by half. Few objectives in caring for a hospitalized patient at risk are more important than prompt response to potentially fatal arrhythmias; however, there is little reliable evidence to guide best practice for this objective. There is a need for a better understanding of the impact of modifiable factors, including monitor watchers' workload, communication pathways, and supportive technologies, on monitoring effectiveness. Simulation of cardiac arrhythmias in real hospital units allows us to measure the effect of these factors on response times without putting patients at risk.

Our objective is to identify and test determinants of effective cardiac monitoring systems. To achieve this objective, we will first compare 10 monitoring systems, including remote telemetry, local monitoring by nurses, and local monitoring with automated notifications. In each setting, we will use data from interviews, observations, and 4 VT/VF simulations to conduct task analyses. The task analyses will allow us to characterize each system according to the sequence and timing of events from the beginning of an arrhythmia until a clinician arrives in the patient's room, and to establish the relevant system constraints.
Using the findings from the task analyses, we will choose the 3 systems that show the greatest potential for consistently rapid response times and conduct detailed observations and 20 VT/VF simulations in each one. Data collected will include the proportion of critical and non-critical alarms responded to and the distribution of time spent on different nursing tasks. These data will inform the development of a computer simulation model of each system that will allow us to identify the most efficient system. The simulation models will also allow us to conduct sensitivity ("what-if") analyses. Finally, in a new patient care unit that was not included in the study, we will implement the monitoring system determined to be the most efficient by the simulation models. We will conduct 40 VT/VF simulations before implementation and 40 additional simulations after implementation. We hypothesize that response times to the critical arrhythmias will be shorter using the new system. The expected outcome of this study is a catalog of monitoring factors that can favorably and unfavorably impact response times to critical arrhythmias. The knowledge gained will inform efforts to develop and study interventions to improve arrhythmia response time, and will ultimately help to develop evidence-based monitoring standards. The application of such standards is expected to improve survival after in-hospital cardiac arrest.
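To illustrate the kind of computer simulation model and "what-if" analysis described above, the sketch below runs a minimal Monte Carlo model of arrhythmia response time. It is not the study's actual model: the three stages (detection, notification, travel to the bedside), the exponential delay distributions, and every parameter value are illustrative assumptions, chosen only to show how a candidate intervention (e.g., halving notification delay) could be compared against the AHA 2-minute goal.

```python
import random

def simulate_system(n_events, detect_mean, notify_mean, travel_mean, seed=0):
    """Monte Carlo sketch of one monitoring system.

    Total response time per simulated VT/VF event is modeled as the sum of
    three delays (detection, notification, clinician travel), each drawn
    from an exponential distribution. All of this is an illustrative
    assumption, not the study's validated model.

    Returns (fraction of events answered within the 120 s AHA goal,
             mean response time in seconds).
    """
    rng = random.Random(seed)
    times = [
        rng.expovariate(1 / detect_mean)
        + rng.expovariate(1 / notify_mean)
        + rng.expovariate(1 / travel_mean)
        for _ in range(n_events)
    ]
    within_goal = sum(t <= 120 for t in times) / n_events
    return within_goal, sum(times) / n_events

# "What-if" comparison: baseline system vs. one with notification delay halved.
# Stage means (in seconds) are hypothetical placeholders.
base = simulate_system(10_000, detect_mean=30, notify_mean=60, travel_mean=45)
fast = simulate_system(10_000, detect_mean=30, notify_mean=30, travel_mean=45)
print(f"baseline: {base[0]:.1%} within 2 min, mean {base[1]:.0f} s")
print(f"faster notification: {fast[0]:.1%} within 2 min, mean {fast[1]:.0f} s")
```

In the proposed study, the stage delays would instead be fitted from the observational and in situ simulation data, and the model would be extended with factors such as monitor-watcher workload and alarm burden before sensitivity analyses are run.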
To increase the potential for timely detection and treatment of cardiac events, hospitals have implemented a number of different cardiorespiratory monitoring systems for patients who meet at-risk criteria. However, decisions regarding how to structure and staff monitoring systems have historically been made with little supporting evidence. We propose to use simulation to identify and test determinants of effective cardiac monitoring. The knowledge to be gained will inform efforts to develop evidence-based monitoring standards.