
AI can quickly and accurately analyze heart scans, study says

Researchers find machine learning can classify heart anatomy on an ultrasound scan faster, more accurately and more efficiently than a human.

Abrar Al-Heeti Technology Reporter

Artificial intelligence is already set to affect countless areas of your life, from your job to your health care. New research reveals it could soon be used to analyze your heart.

[Image: abstract of heart activity as shown by electrocardiogram and echocardiogram (Doppler). Caption: AI could soon be used to analyze your heart. Credit: Getty]

A study published Wednesday found that advanced machine learning is faster, more accurate and more efficient than board-certified echocardiographers at classifying heart anatomy shown on an ultrasound scan. The study was conducted by researchers from the University of California, San Francisco; the University of California, Berkeley; and Beth Israel Deaconess Medical Center.

Researchers trained a computer to assess the most common echocardiogram (echo) views using more than 180,000 echo images. They then tested both the computer and human technicians on new samples. The computer was accurate 91.7 to 97.8 percent of the time at classifying the echo images and videos, while humans were accurate only 70.2 to 83.5 percent of the time.

"This is providing a foundational step for analyzing echocardiograms in a comprehensive way," said senior author Dr. Rima Arnaout, a cardiologist at UCSF Medical Center and an assistant professor at the UCSF School of Medicine.

Interpreting echocardiograms can be complex. They consist of several video clips, still images and heart recordings measured from more than a dozen views. There may be only slight differences between some views, making it difficult for humans to offer accurate and standardized analyses.

AI can offer more helpful results. The study states that deep learning has proven to be highly successful at learning image patterns, and is a promising tool for assisting experts with image-based diagnosis in fields such as radiology, pathology and dermatology. AI is also being utilized in several other areas of medicine, from predicting heart disease risk using eye scans to assisting hospitalized patients. In a study published last year, Stanford researchers were able to train a deep learning algorithm to diagnose skin cancer.

But echocardiograms are different, Arnaout said. When it comes to identifying skin cancer, "one skin mole equals one still image, and that's not true for a cardiac ultrasound. For a cardiac ultrasound, one heart equals many videos, many still images and different types of recordings from at least four different angles," she said. "You can't go from a cardiac ultrasound to a diagnosis in just one step. You have to tackle this diagnostic problem step by step." That complexity is part of the reason AI hasn't yet been widely applied to echocardiograms.

The study used over 223,000 randomly selected echo images from 267 UCSF Medical Center patients between the ages of 20 and 96, collected from 2000 to 2017. Researchers built a multilayer neural network and classified 15 standard views using supervised learning. Eighty percent of the images were randomly selected for training, while 20 percent were reserved for validation and testing. The board-certified echocardiographers were given 1,500 randomly chosen images -- 100 of each view -- which were taken from the same test set given to the model.
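The article doesn't spell out the network's exact architecture, but the setup it describes -- a multilayer convolutional network classifying low-resolution echo frames into 15 standard views, trained on roughly 80 percent of the images and evaluated on the held-out 20 percent -- looks broadly like the sketch below. The layer sizes, image dimensions and placeholder data are illustrative assumptions, not the study's actual configuration.

```python
# Minimal sketch of the kind of pipeline described above: a small convolutional
# network trained to classify low-resolution echo frames into 15 view classes,
# with an 80/20 train/holdout split. Layer sizes, image size and the random
# placeholder data are assumptions for illustration only.
import numpy as np
import tensorflow as tf

NUM_VIEWS = 15   # 15 standard echo views, per the study
IMG_SIZE = 60    # assumed low-resolution frame size

def build_view_classifier():
    """A small CNN mapping a single grayscale echo frame to view probabilities."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(IMG_SIZE, IMG_SIZE, 1)),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dropout(0.5),
        tf.keras.layers.Dense(NUM_VIEWS, activation="softmax"),
    ])

# Placeholder arrays standing in for labeled echo frames.
images = np.random.rand(1000, IMG_SIZE, IMG_SIZE, 1).astype("float32")
labels = np.random.randint(0, NUM_VIEWS, size=1000)

# 80 percent of frames for training, 20 percent held out for validation and
# testing, mirroring the split described in the article.
split = int(0.8 * len(images))
model = build_view_classifier()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(images[:split], labels[:split],
          validation_data=(images[split:], labels[split:]),
          epochs=5, batch_size=32)
```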

The computer classified images from 12 video views with 97.8 percent accuracy. The accuracy for single low-resolution images was 91.7 percent. The humans, on the other hand, demonstrated 70.2 to 83.5 percent accuracy.
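The jump from single-image accuracy to the higher video-level figure suggests the model combines predictions across the frames of a clip. The article doesn't describe that step, so the aggregation below -- averaging per-frame probabilities and picking the most likely view -- is a common, assumed approach rather than the study's documented method.

```python
# Assumed video-level aggregation: classify each frame of an echo clip,
# average the per-frame probabilities, and report the most likely view.
import numpy as np

def classify_video(model, frames):
    """frames: array of shape (num_frames, H, W, 1) from one echo clip."""
    per_frame_probs = model.predict(frames, verbose=0)  # (num_frames, NUM_VIEWS)
    clip_probs = per_frame_probs.mean(axis=0)           # average over frames
    return int(np.argmax(clip_probs))                   # predicted view index
```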

One of the biggest drawbacks of convolutional neural networks is that they need a lot of training data, Arnaout said.

"That's fine when you're looking at cat videos and stuff on the internet -- there's many of those," she said. "But in medicine, there are going to be situations where you just won't have a lot of people with that disease, or a lot of hearts with that particular structure or problem. So we need to be able to figure out ways to learn with smaller data sets."

She said the researchers were able to build the view classification model with less than 1 percent of 1 percent of the data available to them.
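The article doesn't say how the group intends to "learn with smaller data sets," but one widely used option in medical imaging is transfer learning: reuse a network pretrained on a large general-purpose image corpus and fit only a small classification head on the limited specialty data. The sketch below illustrates that idea; it is not the study's method, and the pretrained backbone and input size are assumptions.

```python
# Illustrative small-data strategy (not the study's method): freeze a network
# pretrained on ImageNet and train only a lightweight head for the 15 echo views.
import tensorflow as tf

base = tf.keras.applications.MobileNetV2(
    input_shape=(96, 96, 3), include_top=False, weights="imagenet")
base.trainable = False  # keep pretrained features; only the head is trained

small_data_model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(15, activation="softmax"),  # 15 echo views
])
small_data_model.compile(optimizer="adam",
                         loss="sparse_categorical_crossentropy",
                         metrics=["accuracy"])
```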

There's still a long way to go -- and lots of research to be done -- before AI takes center stage with this process in a clinical setting.

"This is the first step," Arnaout said. "It's not the comprehensive diagnosis that your doctor does. But it's encouraging that we're able to achieve a foundational step with very minimal data, so we can move onto the next steps."
