Keeping expectations for AI in radiology realistic

Artificial intelligence (AI) is already changing the way radiologists do their jobs. And while much of the action is in mundane task relief, exciting opportunities are emerging for the technology to push radiology’s role in personalized medicine beyond what it could accomplish with human eyes alone.

That’s one takeaway from a panel discussion on AI in medical imaging held Jan. 18 in Boston. The hosting event, a two-day summit presented by Insight Exchange Network and held at the 110-year-old Harvard Club, covered AI’s inroads into healthcare as a whole.

The session on AI specifically in diagnostic imaging invited panelists to share their thinking on opportunities, challenges and predictions for the future.

Katherine Andriole, PhD, director of research strategy and operations at the Center for Clinical Data Science of Massachusetts General and Brigham & Women’s hospitals, noted that the buzz at the popular level has been over AI as an image interpreter.

“That’s sort of the sexy piece, but in our view, artificial intelligence and data science tools can be used at every point in the imaging chain,” she said. And that means helping with everything “from ordering through protocoling, maybe some [data point] is embedded at the scanner—maybe AI is helping the radiologist with interpretation or it’s helping with radiology reports, acting as clinical decision support, making you more efficient—all the way back to communicating results to the ordering clinician.”

AI can assist with “a lot of operational kinds of things,” Andriole added. “These things may be boring, but the results can be very beneficial in terms of efficiency, cost savings and even safety.”

Chatbot assistants

Kevin Seals, MD, of UCLA Ronald Reagan Medical Center seconded Andriole’s point before describing how machine-learning algorithms can be integrated with chatbot technology to automatically answer, in a human-like way, questions from referring physicians via texting. He gave as an example an AI chatbot that warns an ordering doctor off a contrast agent that might harm a patient’s liver.

“Rather than a clinician needing to take the time to pick up a phone, find the right number, get the right radiologist on the line—a process that could potentially distract a radiologist who is in the middle of looking at a large, complex study with 5,000 images—a machine learning algorithm can help understand that question and can respond” via texting chatbot, Seals said. “This can take care of a lot of the low-level functionality that a human would otherwise provide. That might be an example of [a use of AI] that is a little less sexy, a little more boring, than making a sophisticated imaging diagnosis, but it is a very useful way to apply machine learning.”
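To make the idea concrete, the sketch below shows one way a text-based assistant might triage routine questions before they ever reach a radiologist. It is purely illustrative and not the UCLA system Seals describes: the example questions, intent labels and canned replies are invented for demonstration, and a real deployment would need far more data and clinical validation.

```python
# Illustrative sketch only: a toy intent classifier for routing referring-physician
# questions, loosely in the spirit of the chatbot workflow described above.
# The questions, intents, and canned replies are invented for demonstration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny hypothetical training set: free-text questions labeled with an intent.
questions = [
    "Can my patient with low GFR get IV contrast?",
    "Is gadolinium safe for this patient with renal failure?",
    "Which protocol should I order for suspected pulmonary embolism?",
    "What is the best MRI protocol for a suspected brain tumor?",
    "How long until the CT report is ready?",
    "When will the results of today's ultrasound be available?",
]
intents = [
    "contrast_safety", "contrast_safety",
    "protocol_selection", "protocol_selection",
    "report_status", "report_status",
]

# Canned responses the bot can send without interrupting a radiologist.
responses = {
    "contrast_safety": "Flagging for review: contrast may be contraindicated; see dosing guidance.",
    "protocol_selection": "Suggested protocol attached; a radiologist will confirm if needed.",
    "report_status": "The report is in progress; you will be notified when it is finalized.",
}

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(questions, intents)

def answer(question: str) -> str:
    """Return a templated reply for the predicted intent of an incoming text."""
    intent = model.predict([question])[0]
    return responses[intent]

print(answer("Patient has poor kidney function -- is contrast okay?"))
```

In practice the value is in the routing: low-level questions get an immediate templated answer, while anything the model cannot confidently classify is escalated to a human.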

“Part of our job as radiologists is measuring things very precisely in order to stratify patients, say, to get surgery or not get surgery and so forth,” added Alex Bratt, MD, of Weill Cornell Medicine and New York Presbyterian Hospital, referring to the time- and resource-intensive work of clinical quantification. “Computers could potentially be much better than radiologists at some of these [repetitive] tasks. Things like segmentation, doing flow quantification on cardiac MRI—this is a super-rich field. I think that standardizing quantification for robustness and reproducibility is one of the biggest benefits we should be getting from machine learning in the near term.”
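The appeal of automating such measurements is reproducibility: once a structure is segmented, the arithmetic is deterministic. The minimal sketch below, with entirely synthetic data, shows the kind of standardized volume calculation that would sit downstream of an automated segmentation; the mask, voxel spacing and structure are all made up for illustration.

```python
# Hypothetical example of standardized quantification: given a segmentation
# mask (synthetic here), compute a reproducible volume measurement.
import numpy as np

def structure_volume_ml(mask: np.ndarray, spacing_mm: tuple) -> float:
    """Volume of a binary 3D segmentation mask in milliliters.

    mask        -- boolean array (slices, rows, cols), True inside the structure
    spacing_mm  -- voxel spacing as (slice thickness, row spacing, col spacing) in mm
    """
    voxel_volume_mm3 = float(np.prod(spacing_mm))
    return mask.sum() * voxel_volume_mm3 / 1000.0  # 1 mL = 1000 mm^3

# Synthetic example: a 20-slice volume with a small blob marked as the structure.
mask = np.zeros((20, 256, 256), dtype=bool)
mask[5:15, 100:140, 100:140] = True
print(f"Measured volume: {structure_volume_ml(mask, (8.0, 1.4, 1.4)):.1f} mL")
```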

Patterns in pixels

Tarik Alkasab, MD, PhD, of Massachusetts General Hospital and Harvard Medical School pivoted on that point to consider the role of AI in medical imaging in years to come.

“One of the things that’s going to be really exciting is the amount of information that is contained in the pixels that are obtained as part of an imaging exam,” Alkasab said. Making diagnoses by having an algorithm see patterns in the pixels that are impenetrable to the human eye, he continued, is a “potentially tremendous” application of the technology. 

“I think it’s going to turn out that there is actually a lot more information in the images that are being acquired from patients now than has previously been appreciated or used,” Alkasab added. “This is something that is going to be a really exciting area” of exploration over the longer term.

To this Seals added that one of the “coolest, craziest, most exciting applications of deep learning for medical imaging” will be helping cancer interventionalists select and deliver chemotherapy agents with highly precise predictability over the agents’ effect—or lack thereof—on tumors.

Underscoring Alkasab’s point about pixels, Seals noted that AI can see features in medical images that the human eye could never perceive—“things related to the texture of the tumor and very specific computational, quantitative, interesting characteristics that we cannot perceive but a neural network can perceive.”

“Using special characteristics that are encoded within the image, we can reach a conclusion such as, ‘given these particular features, this particular chemotherapy agent is appropriate,’” and will have the desired effect on the tumor, Seals said. Meanwhile the computation may just as precisely predict the lack of efficacy of another agent on the same tumor.
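The workflow Seals is gesturing at resembles radiomics: compute texture statistics from the tumor's pixels, then relate those numbers to an outcome such as treatment response. The sketch below is illustrative only; the images, labels and any apparent predictive power are entirely synthetic, and nothing here is a validated clinical model.

```python
# Radiomics-style sketch: quantify image texture the eye cannot assess,
# then relate it to treatment response. All data here are synthetic.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def texture_features(roi: np.ndarray) -> list:
    """Gray-level co-occurrence texture features for an 8-bit tumor ROI."""
    glcm = graycomatrix(roi, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return [graycoprops(glcm, p)[0, 0]
            for p in ("contrast", "homogeneity", "energy", "correlation")]

def synthetic_roi(noise: float) -> np.ndarray:
    """Fake 32x32 'tumor' patch: a gradient plus a chosen amount of noise."""
    base = np.tile(np.linspace(60, 180, 32, dtype=float), (32, 1))
    return np.clip(base + rng.normal(0, noise, (32, 32)), 0, 255).astype(np.uint8)

# Smoother patches labeled 1 (responder), noisier patches labeled 0 (non-responder),
# purely to demonstrate the feature-to-prediction pipeline.
X = [texture_features(synthetic_roi(noise)) for noise in ([5] * 20 + [40] * 20)]
y = [1] * 20 + [0] * 20

clf = LogisticRegression(max_iter=1000).fit(X, y)
new_case = texture_features(synthetic_roi(10))
print("Predicted probability of response:", clf.predict_proba([new_case])[0, 1])
```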

“Long-term, this is just incredibly exciting,” Seals said. “It has the potential to just change everything.”

Dave Pearson

Dave P. has worked in journalism, marketing and public relations for more than 30 years, frequently concentrating on hospitals, healthcare technology and Catholic communications. He has also specialized in fundraising communications, ghostwriting for CEOs of local, national and global charities, nonprofits and foundations.
