RLI Summit 2017: What radiology can learn from pathology about AI

Healthcare-provider organizations that fail to turn their exponentially expanding troves of clinical data into actionable intelligence risk getting strangled by their own success at generating and capturing that data. And most health systems already hold enough clinical data to yield more useful primary observations than any group of humans could ever connect the dots on.

The good news is that machines are here to help—and they’re getting better at it all the time.

So suggested David Louis, MD, chief pathologist at Massachusetts General Hospital, in a talk delivered to the American College of Radiology’s 2017 Radiology Leadership Institute (RLI) during the group’s annual summit at Babson College in Wellesley, Mass., Sept. 7 to 10.

“When I thought about what might be of interest to radiologists, I realized that some aspects of our field are extraordinarily primitive relative to yours,” Louis said in introducing his material. “You have had digital data for decades now. We, in at least one of the three major areas we work on, don’t have it in a major way.”

Louis also prefaced his presentation with the clarification that some but not all computer-aided pathology fits under the banner of AI or machine learning. For this reason, he prefers the term “computational pathology.”

He noted that, like radiologists, pathologists serve treating clinicians with diagnostic insights encompassing widely varying levels of complexity. And treating clinicians have mountains of it to weed through. Some 60 percent to 70 percent of the data in most EMRs traces to pathology tests processed through the clinical laboratory, he said, adding that Mass General alone generates 13 million pathology test results a year.

Further, his group recently tallied its five-year EMR input and found that, together with pathology colleagues at Brigham and Women’s Hospital and other Partners HealthCare sites, they’d populated the EMR with more than half a billion data points.

“We are generating a tremendous amount of data,” Louis said. “How do we ensure that the clinician is using that data in an intelligible way?”

Adaptable algorithms

Louis described three types of algorithms pathology is using at Mass General to help clinicians care for patients.

What he called simple algorithms are used, for example, to alert a clinician when a patient’s creatinine reading remains within the normal range yet rises enough within that range to suggest possible acute kidney injury.

“Now you might say, ‘That’s silly. The clinician should be watching the creatinines.’ And they should,” Louis said. “But this is a safety feature that would prevent someone from missing a pretty simple trend in laboratory values.”
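Louis didn’t spell out the logic behind such alerts, but a rule of this kind is simple to express in code. Below is a minimal Python sketch of an in-range-rise check; the reference range, rise threshold and names are illustrative assumptions rather than the actual Mass General rule (real acute kidney injury criteria, such as KDIGO’s, are more nuanced).

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical values for illustration only; real AKI alerting rules
# are more nuanced and site-specific.
NORMAL_RANGE_MG_DL = (0.6, 1.3)   # assumed adult reference range
RISE_ALERT_MG_DL = 0.3            # assumed in-range rise that warrants review

@dataclass
class CreatinineResult:
    drawn_at: datetime
    value_mg_dl: float

def should_flag_in_range_rise(results: list[CreatinineResult]) -> bool:
    """Flag a creatinine trend that stays within the normal range
    but rises enough to suggest possible acute kidney injury."""
    if len(results) < 2:
        return False
    ordered = sorted(results, key=lambda r: r.drawn_at)
    baseline, latest = ordered[0].value_mg_dl, ordered[-1].value_mg_dl
    in_range = all(
        NORMAL_RANGE_MG_DL[0] <= r.value_mg_dl <= NORMAL_RANGE_MG_DL[1]
        for r in ordered
    )
    return in_range and (latest - baseline) >= RISE_ALERT_MG_DL
```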

Up one level are analytic algorithms, which crunch data in ways attending physicians could not do on their own. The example Louis gave here was processing results from genetic sequencing of tumors or patient DNA for germ-line analysis. The lab gets many thousands of DNA sequences off a single tumor, he explained, and the sequences need to be aligned before they can be analyzed.

“There is no way the physician can look at all those raw sequence data reads and put it all together,” he said. “It has to be given to a complex algorithm.”
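To give a flavor of why an algorithm is required, the deliberately toy sketch below aligns each read against a short reference string by exact matching. Production pipelines use dedicated aligners such as BWA or Bowtie2 and handle mismatches, indels and billions of bases; the reference, reads and function name here are invented for illustration.

```python
# Toy reference sequence and reads, invented for this sketch.
REFERENCE = "ACGTTGCATGCCGTAAGCTAGGCTTACGGATC"

def align_reads(reads: list[str], reference: str = REFERENCE) -> dict[str, int]:
    """Return the 0-based reference position of each read that matches
    exactly, or -1 if it does not align anywhere."""
    return {read: reference.find(read) for read in reads}

if __name__ == "__main__":
    sample_reads = ["TGCATGCC", "AGGCTTAC", "TTTTTTTT"]
    for read, pos in align_reads(sample_reads).items():
        status = f"aligned at {pos}" if pos >= 0 else "unaligned"
        print(f"{read}: {status}")
```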

A couple of steps up from there are integration algorithms. These assimilate information not only from throughout the lab but also from the entire clinical realm, including any relevant data stored in the EMR.

Louis exemplified integration algorithms by describing a breast-cancer outcome calculator designed by his Harvard colleague James Michaelson, PhD. The calculator draws on many decades’ worth of cancer records at Mass General, including radiology reports, and it can look through death-certificate databases and Social Security records, which feed into the calculator automatically.

“What [Michaelson] has been able to do is model, for individual patients, the equivalent of a Kaplan-Meier curve,” Louis said. “You can now go in, enter a clinical feature such as age, a gross pathological feature like tumor diameter, number of known positive nodes, the histology, some molecular features and some additional pathology features—and it gives you your own curve showing what is your likely prognosis.”
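The individualized modeling behind Michaelson’s calculator wasn’t detailed in the talk, but the Kaplan-Meier curve it emulates is a standard, easily coded estimator. The Python sketch below computes a population-level Kaplan-Meier curve from toy follow-up data; it shows only the underlying survival-curve concept, not the calculator’s feature-driven, per-patient model.

```python
from collections import Counter

def kaplan_meier(times: list[float], events: list[bool]) -> list[tuple[float, float]]:
    """Kaplan-Meier survival estimate from follow-up times and event flags
    (True = death observed, False = censored). Returns (time, survival
    probability) pairs at each observed event time."""
    deaths = Counter(t for t, e in zip(times, events) if e)
    removed = Counter(times)          # everyone leaves the risk set at their time
    at_risk = len(times)
    curve, survival = [], 1.0
    for t in sorted(set(times)):
        d = deaths.get(t, 0)
        if d and at_risk:
            survival *= 1.0 - d / at_risk
            curve.append((t, survival))
        at_risk -= removed[t]
    return curve

# Toy data: follow-up in years, True = event observed.
print(kaplan_meier([1.0, 2.5, 2.5, 4.0, 5.5], [True, True, False, True, False]))
# -> [(1.0, 0.8), (2.5, 0.6), (4.0, 0.3)]
```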

Louis said about one-fifth of breast cancer patients in the country are now using this calculator online, and Michaelson has modified it for a number of other cancer types as well.

Dynamic data

If clinical diagnosticians are to successfully tap computer-aided techniques, they must make sure their data are updated “in a dynamic way,” Louis said. For example, if a clinical research paper published months after a test shows that a variant has pathological significance that was unknown at the time of reporting, the system needs to incorporate the finding quickly and completely.

Toward that end, Mass General and others have built a module that takes into account such changes in clinical significance.

“A doctor will get a report saying, ‘In the past we told you that this was of unknown significance. We now know, because of additional data that has come out in the genetic databases, that it is a pathogenic mutation,’” Louis said. “So we are not only doing things at a single time point, but [making adjustments] as information changes.”
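Louis didn’t describe the module’s internals, but the core comparison is easy to sketch. The Python below diffs previously reported interpretations against an updated classification source; the database source, variant name and categories are assumptions for illustration, not real clinical data.

```python
# A minimal sketch of database-driven reclassification, assuming a local table
# of interpretations already reported to clinicians and a freshly refreshed
# copy of a public variant database such as ClinVar. The variant name and
# classifications below are placeholders.
def find_reclassified(reported: dict[str, str],
                      current_db: dict[str, str]) -> dict[str, tuple[str, str]]:
    """Return variants whose significance in the latest database release
    differs from what was previously reported, as (old, new) pairs."""
    return {v: (old, current_db[v])
            for v, old in reported.items()
            if v in current_db and current_db[v] != old}

reported = {"GENEX c.123A>G": "uncertain significance"}
current_db = {"GENEX c.123A>G": "pathogenic"}
print(find_reclassified(reported, current_db))
# -> {'GENEX c.123A>G': ('uncertain significance', 'pathogenic')}
```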

What’s more, the changed information doesn’t have to come from the published literature; it could be unique to the patient.

“For example, you might have a polymorphism in someone’s genetic code that doesn’t have a significance unless that person has hypertension or hyperlipidemia,” he explained. “What you want is for the interpretation to happen again the first time that patient has a high cholesterol reading reported in the EMR.”
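A trigger like that can be pictured as a small rule keyed to incoming lab results. The sketch below, again in Python, flags stored variants for reinterpretation when a new LDL value crosses a threshold; the threshold, field names and data model are invented for illustration and were not described in the talk.

```python
from dataclasses import dataclass

# Hypothetical threshold for illustration only.
LDL_THRESHOLD_MG_DL = 160.0

@dataclass
class StoredVariant:
    gene: str
    hgvs: str
    interpretation: str                        # e.g., "no known significance"
    context_dependent_on_hyperlipidemia: bool  # assumed flag on the stored record

def variants_needing_reinterpretation(variants: list[StoredVariant],
                                      new_ldl_mg_dl: float) -> list[StoredVariant]:
    """When a new lipid result lands in the EMR, return any previously
    reported variants whose interpretation depends on hyperlipidemia."""
    if new_ldl_mg_dl < LDL_THRESHOLD_MG_DL:
        return []
    return [v for v in variants if v.context_dependent_on_hyperlipidemia]
```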

It’s this combination of running algorithms and updating them dynamically, so that clinicians can benefit from all the data his group generates, that they’ve termed “computational pathology,” Louis said.

“And again,” he stressed, “not all of this has to do with machine learning or artificial intelligence.”

AI: Eventual job robber?

The fact that some of it does indeed have to do with AI prompted Louis to cover ground many in the audience likely were awaiting with bated breath: Are computers going to put pathologists and radiologists out of business?

“I think the short answer is no,” Louis said. “It’s really hard to introduce these things into clinical practice, let alone get past all of the regulatory hurdles. The whole regulatory landscape is a nightmare.”

Meanwhile, most of the clinical applications for which AI will be appropriate involve yes-or-no kinds of questions, as for screenings, Louis said. “And for those to be done intelligently,” he said, “you have to have radiologists and pathologists posing the questions.”

Nonetheless, change is afoot—and if past is prologue, some of the shifts in standard operating procedures will be nothing short of tectonic, Louis predicted.

“It is us, the clinicians, who need to adapt to [the coming changes] in an intelligent way,” he said. “We can say, ‘Okay, so I’m not doing that [particular task] anymore. But now there are many other things I can do’” based on the time freed up by AI and related technologies.

“I think it’s an exciting time in my field,” Louis concluded. “I hope it’s an exciting time in your field. I would say embrace it and enjoy it.”

Dave Pearson

Dave P. has worked in journalism, marketing and public relations for more than 30 years, frequently concentrating on hospitals, healthcare technology and Catholic communications. He has also specialized in fundraising communications, ghostwriting for CEOs of local, national and global charities, nonprofits and foundations.
