The use of virtual assistant devices like Amazon Echo in health care settings has been featured in a number of news stories in recent years. But a lawsuit brought against Amazon this summer by four health care workers — in which they alleged they didn’t realize the devices could record their conversations — indicates that while plenty of people use these devices to check the weather, play music or games, or research information, not everyone understands exactly how they operate.
In a class-action suit filed in Washington state on June 30 (Scott et al. v. Amazon.com, Inc.; case #2:21-cv-00883), the health care workers, including a New Jersey substance use counselor and a Georgia-based health care company customer service representative, alleged that their Amazon smart speaker devices recorded their private conversations about patients without their knowledge or intent, according to a write-up on classaction.org.
Furthermore, they said these records were sent to Amazon, where artificial intelligence and human employees or contractors may have listened to the conversations. According to the case, the plaintiffs felt that Amazon failed to disclose that the devices could activate and begin recording even when users have not uttered the word “Alexa” to activate them. Had the plaintiffs known this, the case said, they would not have purchased the devices.
The suit also alleged that Amazon’s conduct violated federal and Washington state wiretapping, privacy and consumer protection laws.
The company did not respond to inquiries in news reports of the case, but in 2019 it announced an ongoing effort to ensure that transcripts would be deleted from Alexa’s servers after customers deleted voice recordings, according to an article in Healthcare IT News. Amazon executives also noted in 2020 that customers can opt out of human annotation of transcribed data and can set voice recordings to be automatically deleted after three months or 18 months, the article said.
Users should be aware that virtual assistant devices are always listening for their “wake words,” like “Alexa” or “Siri,” depending on the brand or model, said Emre Sezgin, Ph.D., a digital health scientist with Nationwide Children’s Hospital in Columbus, Ohio. But as anyone who owns one of the devices may have observed, the technology is still being fine-tuned, and the devices sometimes activate when they hear similar-sounding words. To avoid the risk of exposing protected health information, any health care worker discussing private information about patients within earshot of the devices can mute or unplug them, Sezgin said, and reactivate them later. While recordings could be audited by Amazon, that auditing is mainly for quality assurance, he said, such as assessing how well the devices answered questions or prompts.
Meanwhile, the devices do offer several potential benefits for health care settings, said Sezgin, co-author of a 2020 commentary in the journal NPJ Digital Medicine on readiness for voice assistants to support health care delivery during crises. One is convenience. Users don’t have to type anything or touch a screen; they can just begin asking questions in their natural language. In this way, it’s more inclusive than other technologies because users don’t have to be literate or have good hand and finger dexterity for typing or swiping. The devices also can involve family members and caregivers in the room, who can hear answers to a user’s health questions.
Virtual assistants have been adopted by health care institutions to respond to health information-seeking users by providing health care tips and guidelines, health news, updates about hospital operations, first aid guidance and other medical communications, Sezgin’s commentary noted. During crises, the commentary said, the devices could facilitate infection control and reduce hands-on documentation burden by assisting physicians with dictating visit notes, ordering tests, and charting or navigating electronic records hands-free. They also could assist nurses in triage by assessing patients’ risk levels through conversational assessments, and voice itself could serve as a digital biomarker for continuous screening and detection of pandemic symptoms, such as respiratory disorders.
Some health systems’ uses of virtual assistants were cited in a recent article in Becker’s Health IT:
- Penn State College of Medicine built an Amazon Alexa program to deliver care interventions to breast cancer patients in their homes. The program, called Addressing Metastatic Individuals Everyday, or Nurse AMIE, interacts with patients via voice to address symptoms such as pain, fatigue, sleep problems, anxiety and depression, and to offer interventions.
- ChristianaCare in Delaware developed an Alexa skill called Home Care Coach. The program is a provider-driven, patient-customized, proactive care plan delivered through Alexa. A web interface allows providers to customize patient care plans. Patients then can ask Alexa questions about their prescribed medicines, exercises and more and get personalized prompts to meet their needs.
- Northwell Health in New York rolled out a program in 2017 to help patients identify nearby urgent care centers and emergency rooms with the shortest wait times.
The COVID-19 pandemic spurred more innovation with the devices. In April 2020, the Mayo Clinic made available its “Mayo Clinic Answers on COVID-19” skill, through which users can ask questions about the coronavirus and receive information from experts at Mayo Clinic and the CDC, according to Becker’s Health IT. The skill included a COVID-19 self-assessment tool. Northwell Health equipped COVID-19 patients’ rooms with Amazon devices featuring two-way video calling that let clinicians check on patients, CNN reported. The tool also helped clinicians ask patients about their health history or how they felt after receiving medications.
There are some caveats, Sezgin said. Voice may not be the best way to communicate in a public space, and while the technology uses artificial intelligence and is continuously being refined to understand voice commands, there are times when the devices may not provide the correct responses, which can lead to frustration. In addition, a stable internet connection is required, and for health systems to use these devices well, they need IT infrastructure that integrates with health records and remains compliant with patient privacy and other regulations.
Good questions for journalists to ask
When writing about health care systems’ use of these devices, go deep on questions about how they will work within hospital regulations and/or overcome any obstacles to encourage usage, Sezgin advised.
“There’s still some work to do, especially in the health care domain,” he said. “But I strongly believe we will get there.”