New 2D/3D hybrid neural network can enhance prostate cancer care for the masses

Multiparametric MRI is an invaluable tool for prostate cancer screening, but costly image post-processing steps remain a roadblock to wider adoption. A new study proposes artificial intelligence as a possible remedy.

California and Colorado doctors detailed their hybrid 2D/3D deep learning convolutional neural network approach Wednesday in AJR. Their CNN can automatically segment the prostate on MRI scans with impressive accuracy, eliminating a 5- to 10-minute task typically performed by highly trained radiologists.

First author Alexander Uscinski, MD, with the University of California, Irvine, Department of Radiological Sciences, and colleagues believe their tool can have an outsized impact on this disease, responsible for 26,700 deaths in 2017 alone.

“By reducing the time of segmentation through automation, machine learning techniques may … decrease costs and increase access to both MRI and fusion biopsy,” the authors wrote. “Ultimately, machine learning approaches may increase care equity and improve access to high-quality MRI and fusion biopsy outside of the academic and subspecialty clinical setting.”

To develop their U-Net CNN, Uscinski et al. had two abdominal radiologists manually segment T2-weighted prostate MRI scans and MRI-ultrasound fusion transrectal biopsy images. These manual segmentations served as the ground truth for the study and were used to train and validate the neural network.

In total, 7,774 images taken from 287 patients were used to train the hybrid model. Upon testing, the tool proved highly accurate at detecting and segmenting the prostate gland, achieving a mean Dice score of 0.89. The Dice score measures the overlap between the hand-segmented images and those completed by the neural network.
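For readers unfamiliar with the metric, the Dice score is twice the area of overlap between two segmentation masks divided by the sum of their areas, ranging from 0 (no overlap) to 1 (perfect agreement). Below is a minimal sketch of how it can be computed for binary masks using NumPy; the function name and toy masks are illustrative, not taken from the study.

```python
import numpy as np

def dice_score(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks:
    2 * |A ∩ B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(pred, truth).sum() / denom

# Toy 4x4 masks: model prediction vs. radiologist ground truth
pred = np.array([[0, 1, 1, 0],
                 [0, 1, 1, 0],
                 [0, 1, 1, 0],
                 [0, 0, 0, 0]])
truth = np.array([[0, 1, 1, 0],
                  [0, 1, 1, 0],
                  [0, 1, 1, 1],
                  [0, 0, 0, 0]])
print(round(dice_score(pred, truth), 2))  # prints 0.92
```

A mean score of 0.89 across a test set therefore indicates that, on average, the network's prostate masks overlapped very closely with the radiologists' hand-drawn ones.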

Discussing their results, the authors cited a number of strengths specific to their research, including the large dataset used to train their hybrid CNN. They did note that segmentation is “only a portion” of the post-processing work needed to prep for biopsy and that future efforts should be geared toward lesion detection and risk stratification.

“Given the direct clinical necessity of prostate MRI segmentation and volumetric analysis for surgical planning and fusion biopsy, a 3D/2D U-Net CNN serves as a promising model for future clinical implementation of deep learning to clinical imaging,” the authors concluded.