Curt Langlotz at SIIM 2018: AI's impact will be 'real and profound'

The Society for Imaging Informatics in Medicine (SIIM)’s 2018 annual meeting wrapped up June 2 with a keynote address from Curt Langlotz, MD, PhD, with Stanford University, on the rise of artificial intelligence (AI).

He specifically discussed how computing abilities outside healthcare gave rise to imaging’s growing obsession with AI and why the threat of a technological takeover is beginning to subside.

Health Imaging discussed the address with Langlotz, with the exchange edited for length and clarity.

Health Imaging: Why is AI the hottest topic in medical imaging? Is there anything aside from the advancement of technology itself contributing to this?

Curt Langlotz, MD, PhD: The excitement is appropriate and directly related to the power of the technology itself. Once people saw what was happening in computer vision outside of healthcare, the health applications were obvious. However, individuals outside our field tend to underestimate the complexity of what radiologists do. A news article reporting that a neural network can distinguish tuberculosis (TB) from normal scans might run under a headline such as "AI equivalent to radiologists in reading chest x-rays." There is much more to reading a chest x-ray than a binary TB detection task. But despite all the hype, the technology is real and will have a profound impact on how we practice over the next 10 years.

Where in medical imaging will AI have the greatest impact?

AI will have an impact throughout the life cycle of medical images: image reconstruction, image quality control, image triage, detection of abnormalities, classification of disease and correlation of images with other data types, such as electronic health record (EHR) data and genomics. 

I believe the earliest impact will be before the radiologist even sees the images. Some of the image reconstruction work shows astounding results—the potential to dramatically reduce imaging times and radiation dose. 

Of course, a key problem will be connecting high-performance computing with the installed base of imaging devices, particularly older devices whose computing capabilities are limited.

What are other top barriers keeping AI from clinical use?

The barriers to the clinical use of AI aren't much different from those facing other new diagnostic technologies: AI should undergo rigorous clinical evaluation and earn the appropriate regulatory approvals. The problem with AI algorithms is that there are tens of thousands of diseases to detect and characterize. Another important barrier, which I think will be overcome sooner, is the challenge of bringing AI advice into the clinical workflow. In the long run, we will need new standards, but there are some clear ways to get the information to the radiologist in the short run.

What do imaging professionals need to do to integrate AI into imaging, rather than be replaced by it?

Consider what happened when MRI first became available. At least one MR vendor planned to sell directly to non-radiologists. The theory was that the MR abnormalities were so obvious, a radiologist wouldn't be needed to interpret the study. Of course, we now know that it's essential to understand the physics of MR to use it effectively. Radiologists have owned MR physics education and MR research, and as a result are indispensable to the interpretation of MR studies, the design of new pulse sequences, and the selection of MR imaging protocols.

A similar scenario will play out with AI. Radiologists need to educate themselves about the strengths and weaknesses of AI, so they can spot its flaws and use it as an effective tool. Radiologists won't necessarily be able to build AI algorithms themselves, any more than we know how to build MR scanners. But we will need to learn the underlying principles of AI algorithms, their strengths and weaknesses, and the implications they have for clinical care.

Do you see most AI-based technology coming from vendors in the healthcare industry, outside disruptors or somewhere else entirely?

Right now, the interest is coming from several key vendor groups: There are the cloud computing vendors—such as Google, Amazon and Microsoft—the modality vendors—such as GE, Siemens, Philips, Toshiba—and the PACS vendors—including IBM—and the startups. Nvidia and the cloud vendors will supply much of the computing power. In the next few years, I believe we will see some consolidation, in which the companies with deep pockets accrue people and technology from companies whose business models don't work out as expected.  

There is no question the technology is amazing, but the most common business model, which often involves creating and selling a narrow pipeline of clinical imaging applications, is not so amazing. There are probably some infrastructure companies that will do well selling pickaxes and shovels for the gold rush. But I suspect there will be a correction in expectations in the next few years, which would be healthy. Over the course of my 35 years in AI, I have been through three AI hype cycles and two AI "winters." This time, winter may look more like a Silicon Valley winter rather than the bitter cold ones I remember growing up in Minnesota. The technology is just too compelling to be ignored.