Tag Archives: ecri

Alarm fatigue hurts patient care, overwhelms nurses

Andrew Van Dam

About Andrew Van Dam

Andrew Van Dam of The Wall Street Journal previously worked at the AHCJ offices while earning his master’s degree at the Missouri School of Journalism.

In the wake of several high-profile incidents, The Boston Globe‘s Liz Kowalczyk has assembled a thorough investigation of alarm fatigue in hospitals. Alarm fatigue, for the record, is the idea that the huge arsenal of patient monitors on any given hospital floor goes off so often that nurses become slower to respond to the alarms. For example, in one 15-bed unit at Johns Hopkins, staff found that, on average, one critical alarm went off every 90 seconds throughout the day.

With the help of ECRI, Kowalczyk has managed to attach some numbers to the issue.

The Globe enlisted the ECRI Institute, a nonprofit health care research and consulting organization based in Pennsylvania, to help it analyze the Food and Drug Administration’s database of adverse events involving medical devices. The institute listed monitor alarms as the number-one health technology hazard for 2009. Its review found 216 deaths nationwide from 2005 to the middle of 2010 in which problems with monitor alarms occurred.

But ECRI, based on its work with hospitals, believes that the health care industry underreports these cases and that the number of deaths is far higher. It found 13 more cases in its own database, which it compiles from incident investigations on behalf of hospital clients and from its own voluntary reporting system.

Kowalczyk also looks at potential solutions to the problem and how some institutions are trying to make changes to eliminate alarm fatigue, including cutting back on unnecessary monitors and having monitor warnings appear on nurses’ pagers or cell phones.

To back up the numbers, Kowalczyk gathered some telling quotes from frustrated nurses.

“Yes, this is real, and, yes, it’s getting worse,’’ said Carol Conley, chief nursing officer for Southcoast Health System, which includes Tobey Hospital. “We want to keep our patients safe and take advantage of all the technology. The unintended consequence is that we have a very over-stimulated environment.’’

“Everyone who walks in the door gets a monitor,’’ said Lisa Sawtelle, a nurse at Boston Medical Center. “We have 17 [types of] alarms that can go off at any time. They all have different pitches and different sounds. You hear alarms all the time. It becomes . . . background.’’

Kowalczyk’s investigation points out that, while alarms do tend to go off when there’s a real problem, they also frequently go off when there isn’t.

Monitors can be so sensitive that alarms go off when patients sit up, turn over, or cough. Some studies have found more than 85 percent of alarms are false, meaning that the patient is not in any danger. Over time this can make nurses less and less likely to respond urgently to the sound.

For more specifics on device design issues, see the final subheading, titled “Looking for solutions.”

For one year, the Joint Commission made routine alarm testing and training part of its accreditation requirements, but it dropped the stipulation in 2004, believing the problem had been solved.


Video, presentations from comparative effectiveness conference available online

Andrew Van Dam


Earlier this month, ECRI’s 17th annual conference tackled the thorniest detail of comparative effectiveness research: it’s rarely a simple matter of A > B, because groups and individuals respond to treatments differently.

With a theme of “Comparative Effectiveness and Personalized Medicine,” the nonprofit and its partners at NIH and Health Affairs, among others, sought to better understand how big research ideas will interface with the person-by-person decisions through which such work will ultimately be implemented.

The conference has a detailed postmortem online, including two days of video (Fair warning: Together, they’re a good 700+ minutes of conference) and slides from a number of the presentations. I strongly recommend using the conference schedule listed on the slides page as a rough guide to finding the most relevant bits of video.


The online Q and A is also interesting, though there are only a handful of answers up at present. The most relevant one so far comes from Vivian Coates (Vice President, Information Services and Health Technology Assessment, ECRI Institute), in response to a query about a central listing of comparative effectiveness projects.

The CER inventory contract was awarded to the Lewin Group Center for Comparative Effectiveness Research (CER) in June 2010. Over the 27-month period of the contract, Lewin will design, build and launch a web-based inventory that catalogs CER outputs and activity, including research studies, relevant research methods, training of researchers, data infrastructure and approaches for dissemination and translation of comparative effectiveness research to health care providers and patients.