'Alexa, monitor my heart': Researchers develop first contactless cardiac arrest AI system for smart speakers

The researchers envision a contactless system that works by continuously and passively monitoring the bedroom for an agonal breathing event. If it detects agonal breathing, it can call for help. Credit: Sarah McQuate/University of Washington

People experiencing cardiac arrest will suddenly become unresponsive and either stop breathing or gasp for air, a sign known as agonal breathing. Immediate CPR can double or triple someone's chance of survival, but that requires a bystander to be present.

Cardiac arrests often occur outside of the hospital and in the privacy of someone's home. Recent research suggests that one of the most common locations for an out-of-hospital cardiac arrest is a patient's bedroom, where no one is likely to be around or awake to respond and provide care.

Researchers at the University of Washington have developed a new tool to monitor people for cardiac arrest while they're asleep, without touching them. A new skill for a smart speaker, such as Google Home or Amazon Alexa, or for a smartphone lets the device detect the gasping sound of agonal breathing and call for help.

On average, the proof-of-concept tool, which was developed using real agonal breathing instances captured from 911 calls, detected agonal breathing events 97% of the time from up to 20 feet (or 6 meters) away. The findings are published June 19 in npj Digital Medicine.

“A lot of people have smart speakers in their homes, and these devices have amazing capabilities that we can take advantage of,” said co-corresponding author Shyam Gollakota, an associate professor in the UW's Paul G. Allen School of Computer Science & Engineering.

“We envision a contactless system that works by continuously and passively monitoring the bedroom for an agonal breathing event, and alerts anyone nearby to come provide CPR. And then if there's no response, the device can automatically call 911.”

Agonal breathing is present for about 50% of people who experience cardiac arrests, according to 911 call data, and patients who take agonal breaths often have a better chance of surviving.

“This kind of breathing happens when a patient experiences really low oxygen levels,” said co-corresponding author Dr. Jacob Sunshine, an assistant professor of anesthesiology and pain medicine at the UW School of Medicine. “It's sort of a guttural gasping noise, and its uniqueness makes it a good audio biomarker to use to identify if someone is experiencing a cardiac arrest.”

The researchers gathered sounds of agonal breathing from real 911 calls to Seattle's Emergency Medical Services. Because cardiac arrest patients are often unconscious, bystanders recorded the agonal breathing sounds by putting their phones up to the patient's mouth so that the dispatcher could determine whether the patient needed immediate CPR.

The team collected 162 calls between 2009 and 2017 and extracted 2.5 seconds of audio at the start of each agonal breath to come up with a total of 236 clips. The team captured the recordings on different smart devices — an Amazon Alexa, an iPhone 5s and a Samsung Galaxy S4 — and used various machine learning techniques to boost the dataset to 7,316 positive clips.

“We played these examples at different distances to simulate what it would sound like if the patient was at different places in the bedroom,” said first author Justin Chan, a doctoral student in the Allen School. “We also added different interfering sounds such as sounds of cats and dogs, cars honking, air conditioning, things that you might normally hear in a home.”
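
The article does not spell out the augmentation pipeline beyond playing clips at varying distances and overlaying household noise, but a minimal sketch of that kind of positive-set expansion might look like the Python below. The sampling rate, the inverse-distance attenuation model, and the SNR ranges are illustrative assumptions, not the team's actual parameters.

```python
import numpy as np

SAMPLE_RATE = 16_000  # assumed sampling rate; the paper's actual rate is not given here


def mix_at_snr(clip: np.ndarray, noise: np.ndarray, snr_db: float) -> np.ndarray:
    """Overlay an interfering sound (dog barking, AC hum, traffic) onto a
    2.5-second agonal-breathing clip at a chosen signal-to-noise ratio."""
    noise = np.resize(noise, clip.shape)                 # loop/trim noise to clip length
    clip_power = np.mean(clip ** 2) + 1e-12
    noise_power = np.mean(noise ** 2) + 1e-12
    target_noise_power = clip_power / (10 ** (snr_db / 10))
    return clip + noise * np.sqrt(target_noise_power / noise_power)


def attenuate_for_distance(clip: np.ndarray, distance_m: float) -> np.ndarray:
    """Crudely stand in for playback from farther across the room by scaling
    amplitude with an inverse-distance law (assumed model)."""
    return clip / max(distance_m, 1.0)


def augment(clip: np.ndarray, noises: list, rng: np.random.Generator) -> np.ndarray:
    """Produce one synthetic positive example from a real 2.5-second clip."""
    distance_m = rng.uniform(1.0, 6.0)      # anywhere within ~6 m, per the reported range
    snr_db = rng.uniform(0.0, 20.0)         # light to heavy interference (assumed range)
    noise = noises[rng.integers(len(noises))]
    return mix_at_snr(attenuate_for_distance(clip, distance_m), noise, snr_db)
```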

For the negative dataset, the team used 83 hours of audio data collected during sleep studies, yielding 7,305 sound samples. These clips contained typical sounds that people make in their sleep, such as snoring or obstructive sleep apnea.

From these datasets, the team used machine learning to create a tool that could detect agonal breathing 97% of the time when the smart device was placed up to 6 meters away from a speaker generating the sounds.
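
The article does not name the model the team trained, so the sketch below stands in with a generic scikit-learn pipeline over flattened log-magnitude spectrogram features. The feature choice, classifier, and parameters are assumptions for illustration, not the published method; the positive and negative lists are placeholders for the 7,316 and 7,305 clips described above.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC


def spectrogram_features(clip: np.ndarray, frame: int = 512, hop: int = 256) -> np.ndarray:
    """Turn a fixed-length (2.5 s) waveform into a flattened log-magnitude spectrogram."""
    frames = [clip[i:i + frame] for i in range(0, len(clip) - frame + 1, hop)]
    spec = np.abs(np.fft.rfft(np.stack(frames) * np.hanning(frame), axis=1))
    return np.log1p(spec).ravel()


def train_detector(positive_clips, negative_clips):
    """positive_clips / negative_clips: lists of equal-length waveforms standing in
    for the agonal-breathing and sleep-sound datasets."""
    X = np.stack([spectrogram_features(c) for c in positive_clips + negative_clips])
    y = np.concatenate([np.ones(len(positive_clips)), np.zeros(len(negative_clips))])
    model = make_pipeline(StandardScaler(), SVC(probability=True))
    return model.fit(X, y)
```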

Next the team tested the algorithm to make sure that it wouldn't accidentally classify a different type of breathing, like snoring, as agonal breathing.

“We don't want to alert either emergency services or loved ones unnecessarily, so it's important that we reduce our false positive rate,” Chan said.

For the sleep lab data, the algorithm incorrectly categorized a breathing sound as agonal breathing 0.14% of the time. The false positive rate was about 0.22% for separate audio clips, in which volunteers had recorded themselves while sleeping in their own homes. But when the team had the tool classify something as agonal breathing only when it detected two distinct events at least 10 seconds apart, the false positive rate fell to 0% for both tests.
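
That two-detections-at-least-10-seconds-apart rule is simple to express in code. The sketch below assumes that detections closer together than the threshold are treated as part of the same event, which is one plausible reading of the description above.

```python
class AgonalBreathingAlarm:
    """Fire an alert only after two distinct agonal-breathing detections at least
    min_separation_s seconds apart, mirroring the filtering step described above."""

    def __init__(self, min_separation_s: float = 10.0):
        self.min_separation_s = min_separation_s
        self._first_detection_s = None

    def on_detection(self, timestamp_s: float) -> bool:
        """Call each time the classifier flags a window as agonal breathing.
        Returns True when the alarm should escalate."""
        if self._first_detection_s is None:
            self._first_detection_s = timestamp_s
            return False
        if timestamp_s - self._first_detection_s >= self.min_separation_s:
            return True      # second distinct event, at least 10 s after the first
        return False         # too close to the first flag: treat as the same event
```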

The team envisions this algorithm could function like an app, or a skill for Alexa that runs passively on a smart speaker or smartphone while people sleep.

“This could run locally on the processors contained in the Alexa. It's running in real time, so you don't need to store anything or send anything to the cloud,” Gollakota said.
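
Putting the pieces together, an on-device loop could look roughly like the sketch below, which reuses the spectrogram_features helper and AgonalBreathingAlarm class sketched earlier. Here mic_windows and escalate are hypothetical stand-ins for the device's audio capture and its alert/911 escalation step; they are not part of any real Alexa API.

```python
import time


def monitor(detector, alarm, mic_windows, escalate):
    """Toy on-device loop: mic_windows is assumed to yield successive 2.5-second
    waveform windows from the microphone, and escalate is a callback for the alert
    step (audible prompt, then a 911 call if no one responds). Audio is processed
    in place; nothing is stored or sent off-device."""
    for waveform in mic_windows:
        features = spectrogram_features(waveform).reshape(1, -1)  # helper sketched above
        if detector.predict(features)[0] == 1 and alarm.on_detection(time.monotonic()):
            escalate()
```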

“Right now, this is a good proof of concept using the 911 calls in the Seattle metropolitan area,” he said. “But we need to get access to more 911 calls related to cardiac arrest so that we can improve the accuracy of the algorithm further and ensure that it generalizes across a larger population.”

The researchers plan to commercialize this technology through a UW spinout, Sound Life Sciences, Inc.

“Cardiac arrests are a very common way for people to die, and right now many of them can go unwitnessed,” Sunshine said. “Part of what makes this technology so compelling is that it could help us catch more patients in time for them to be treated.”

###

Dr. Thomas Rea, a professor of general internal medicine at the UW School of Medicine and the medical director of King County Medic One, was also a co-author on this paper. This research was funded by the National Science Foundation.

Photos available (if clicking the link doesn't work, copy and paste it into your browser):

https://drive.google.com/open?id=1m3DfNX-5rmD7z8ZlDpD98zQGxY7RwrJU

MEDIA CONTACTS:

Sarah McQuate (smcquate@uw.edu)
James Urton (jurton@uw.edu)
University of Washington
206-543-2580

For more information, contact the research team at cardiacalert@cs.washington.edu.

http://www.washington.edu/news/ 
