Natural language processing could be key to unlocking decision support potential

A new system designed to extract detailed imaging observations from narrative text reports using natural language processing (NLP) and integrate them with clinical decision support systems (DSS) has proven effective on unstructured mammography reports, according to the results of a study published in the April issue of the Journal of the American Medical Informatics Association.

The traditional, unstructured method of radiology reporting—a narrative flow of multiple lesion descriptions, characteristics and anatomical locations—can hinder efforts to mine report details and input usable data into decision support systems. This is especially true of mammography reporting, according to lead author Selen Bozkurt of Akdeniz University Faculty of Medicine in Antalya, Turkey. “The interpretation of mammography images is challenged by variability among radiologists in their assessment of the likelihood of malignancy given the abnormalities seen in mammography reports, and methods to improve radiologist performance are needed,” wrote Bozkurt et al. “Although preliminary work to develop DSS for mammography using standardized vocabulary to describe the imaging features is promising, few DSS for mammography have been adopted in clinical practice, likely due to the challenge of interfacing DSS with the clinical workflow.”

Bozkurt and her team conducted the study to evaluate an NLP-based system they developed to automatically extract information about lesions and their related imaging features from free-text mammography reports. They randomly selected 300 mammography reports from a hospital report database on which to test the system, evaluating the completeness and correctness of information extraction by calculating precision, recall, and F measure.

The team’s NLP system detected 815 lesions (780 true positives, 35 false positives, and 17 false negatives), compared with the 797 lesions contained in the gold standard data set, which had been analyzed using a structured reporting application. The system’s precision in detecting all imaging observations was 95 percent, recall was 91 percent, and the F measure was 93 percent.
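For readers less familiar with these evaluation metrics, the minimal Python sketch below shows how precision, recall, and F measure are conventionally computed from true-positive, false-positive, and false-negative counts. The lesion-detection counts above are used purely for illustration; the study's 95/91/93 percent figures refer to the extraction of imaging observations rather than lesion detection alone, so this sketch will not reproduce them exactly.

```python
# Standard precision / recall / F-measure formulas, illustrated with the
# lesion-detection counts reported in the study. Note: the article's
# 95/91/93 percent figures describe imaging-observation extraction, so
# these counts are illustrative only.

def precision_recall_f1(tp, fp, fn):
    """Return (precision, recall, F1) from TP, FP, and FN counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

if __name__ == "__main__":
    p, r, f = precision_recall_f1(tp=780, fp=35, fn=17)
    print(f"precision={p:.2%}  recall={r:.2%}  F1={f:.2%}")
```

As a check on the reported figures, plugging the published precision (0.95) and recall (0.91) into the harmonic-mean formula yields approximately 0.93, consistent with the F measure the authors report.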

The researchers hope that, once implemented, the system will be effective in providing inputs to DSS from free-text reports and could ultimately help reduce variability in mammography interpretation and other radiology reporting. “There is a tremendous amount of data in unstructured free-text mammography reports, and extracting the structured content about the imaging observations and characteristics of each lesion reported could be useful to enable decision support,” wrote Bozkurt et al. “Although our application focuses on the domain of mammography, we believe our approach can generalize to other domains and may narrow the gap between unstructured clinical report text and structured information extraction needed for data mining and decision support.”

John Hocter,

Digital Editor

