Chest x-ray AI detects diseases as well as radiologists, but faster

A deep learning algorithm screened chest x-rays for diseases with accuracy similar to the interpretations of trained radiologists, but did so in a matter of seconds, according to Stanford University researchers.

Chest radiograph interpretation—a time-consuming process—is important for detecting diseases such as tuberculosis and lung cancer, which affect millions across the globe. Therefore, corresponding author Matthew P. Lungren, MD, of Stanford, and colleagues tested their CheXNeXt algorithm against nine radiologists.

Lungren and colleagues trained their algorithm to detect 14 pathologies using nearly 112,000 x-rays. A separate group of three radiologists reviewed an independent set of 420 radiographs, and their consensus served as the ground truth. Reviewing that set took the radiologists about three hours, while CheXNeXt did so in about 90 seconds, according to a statement from Stanford.

Overall, the algorithm identified 10 diseases as well as radiologists did. For three diseases (cardiomegaly, emphysema and hiatal hernia), the radiologists performed better. CheXNeXt was superior at identifying one disease—atelectasis.

“The results presented in this study demonstrate that deep learning can be used to develop algorithms that can automatically detect and localize many pathologies in chest radiographs at a level comparable to practicing radiologists,” the authors wrote in the study published Nov. 20 in PLOS Medicine.

Lungren et al. pointed out that much of the world’s population doesn’t have access to CT imaging for lung cancer screening and diagnosis, and therefore relies on less-expensive modalities such as chest x-rays. Once clinically validated, an algorithm such as the one presented in their study could change the lives of radiologists and the patients they serve.

“We should be building AI algorithms to be as good or better than the gold standard of human, expert physicians. Now, I’m not expecting AI to replace radiologists any time soon, but we are not truly pushing the limits of this technology if we’re just aiming to enhance existing radiologist workflows,” Lungren said in a statement. “Instead, we need to be thinking about how far we can push these AI models to improve the lives of patients anywhere in the world.”


Matt joined Chicago’s TriMed team in 2018 covering all areas of health imaging after two years reporting on the hospital field. He holds a bachelor’s in English from UIC, and enjoys a good cup of coffee and an interesting documentary.
