Prostate MRI software beats out young radiologists using PI-RADS, but seasoned expert still outperforms

Radiologists are dealing with an increasing number of prostate MRI referrals but not every provider is equipped to read such exams. New evidence suggests artificial intelligence can help, particularly for those with less experience. 

The Prostate Imaging-Reporting and Data System was developed to help guide prostate MRI interpretations, yet PI-RADS scores are often interpreted differently from reader to reader. Residents and early-career rads particularly struggle to get on the same page, researchers explained in the European Journal of Radiology.

With AI algorithms showing promise for diagnosing prostate cancer, the team tested a prototype software trained on biparametric MRIs from seven institutions.

Overall, deep learning’s diagnostic performance landed between resident and expert level. It showed sensitivity similar to, and specificity higher than, radiologists with varying levels of experience. And while it did not beat a subspecialist, it may still be valuable in daily workflows.

“From these results, we believe that [AI] can assist radiologists as a second reader to reduce variability in PI-RADS assessment,” Seo Yeon Youn, a radiologist at Seoul St. Mary’s Hospital, and colleagues explained Aug. 5. “In previous [studies], achieving consensus of deep convolutional neural network and PI-RADS score by radiologist showed better diagnostic performance than those of deep convolutional neural network and PI-RADS score alone.”

For their study, Youn et al. retrospectively enrolled 121 patients who underwent prebiopsy MRI and prostate biopsy. Eight residents, two mid-career radiologists and one prostate imaging expert independently reviewed the bpMRI scans. Those diagnostic results were compared to classifications made by the AI, with pathology reports used as the reference standard.

Clinically significant cancers were spotted in 43 (35.5%) patients. AI performed better (AUROC of 0.83) than second-year residents (0.71), similar to mid-level radiologists, and below the prostate imaging expert (0.91).

Furthermore, the algorithm achieved a sensitivity on par with all readers at a PI-RADS cutoff value of 4 or greater. Its specificity, meanwhile, was “significantly” higher than that of third-year residents and board-certified providers, and comparable to all others at the same PI-RADS cutoff.

“This study provides the first comparison between DLA and radiologists with various levels of experience in PI-RADS classification,” the authors wrote, pointing out that the “moderate agreement between DLA and the expert radiologist seems promising and DLA-based PI-RADS categorization may help to reduce inter-reader variability in clinical practice.”

A handful of authors cited employment with Siemens Healthcare, which developed the software.



Matt joined Chicago’s TriMed team in 2018 covering all areas of health imaging after two years reporting on the hospital field. He holds a bachelor’s in English from UIC, and enjoys a good cup of coffee and an interesting documentary.
