Google's AI tech hopes to catch early signs of cancer
As a doctor, what I care about most is improving patients' lives.
And that means good care and accurate diagnosis.
That's why I was so excited two years ago at I/O when we shared our work in diabetic retinopathy.
This is a complication of diabetes that puts over 400 million people around the world at risk of vision loss.
Since then, we've been piloting this work with patients in clinical settings.
Our partners at Verily recently received European regulatory approval for the machine learning model.
And we have clinical deployments in Thailand and India, where there are already 3,000 patients. In addition to diabetes, one of the other areas where we think AI can help doctors is oncology. Today we'd like to share our work on another project in [UNKNOWN], where AI can help catch lung cancer earlier.
So, lung cancer causes more deaths than any other cancer. It's actually the most common cause of cancer death globally, accounting for 3% of annual mortality.
We know that when cases are diagnosed early patients have a higher chance of survival.
But unfortunately, over 80% of lung cancers are not caught early.
Randomized controlled trials have shown that screening with low-dose CT scans can help reduce mortality.
But there's opportunity to make them more accurate.
So in a paper we are about to publish in Nature Medicine, we describe a deep learning model that can analyze CT scans and predict lung malignancies.
To do it, we trained a neural network with de-identified lung cancer scans from our partners at the National Cancer Institute (NCI) and Northwestern University.
By looking at many examples, the model learns to detect malignancy, with performance that meets or exceeds that of trained radiologists.
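Before a model like the one described above can learn from CT volumes, the raw scans are typically preprocessed. As a minimal illustrative sketch (not Google's actual pipeline, which the talk does not detail), here is one conventional step: clamping Hounsfield units to a window that covers lung tissue and rescaling to [0, 1]. The window bounds here are common illustrative choices, not values from the paper.

```python
def normalize_hu(volume, lo=-1000.0, hi=400.0):
    """Clamp raw CT Hounsfield-unit values to a lung window
    [lo, hi] and linearly rescale them to the range [0, 1].

    `volume` is a nested list of slices -> rows -> voxel values.
    Values below `lo` map to 0.0; values above `hi` map to 1.0.
    """
    span = hi - lo
    return [[[(min(max(v, lo), hi) - lo) / span
              for v in row]
             for row in sl]
            for sl in volume]


# Tiny 1-slice, 2x2 example volume (values in Hounsfield units):
vol = [[[-2000.0, -1000.0],
        [400.0, 1000.0]]]
print(normalize_hu(vol))  # air/bone extremes clamp to 0.0 and 1.0
```

A real system would operate on NumPy arrays or tensors rather than nested lists, but the windowing arithmetic is the same.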
So concretely, how might this help.
Very early-stage cancer is minuscule and can be hard to see, even for seasoned radiologists, which means that many patients with late-stage lung cancer have subtle signs on earlier scans.
So take this case, where an asymptomatic patient with no history of cancer had a CT scan for screening.
This scan was interpreted as normal.
One year later that same patient had another scan.
It picked up a late-stage cancer, one that's much harder to treat.
So we used our AI system to review that initial scan.
So let's be clear, this is a tough case.
We showed this initial scan to other radiologists and five out of six missed this cancer.
But our model was able to detect these early signs one year before the patient was actually diagnosed.
That translates to an increased survival rate of 40% for patients like this.
So clearly, these are promising but early results.
And we're very much looking forward to partnering with the medical community to use technology like this to help improve outcomes for patients.