Vague language plagues radiology reporting, with chest and inpatient imaging the top offenders

Ideally, a radiologist’s report should be as clear as possible to help guide patient care. But new research has found substantial variation in how the specialty conveys its uncertainties in these documents, which may lead to negative downstream consequences.

Researchers from institutions in California and Colorado collected more than 600,000 reports over a five-year period to reach their conclusions, published Aug. 19 in the Journal of Digital Imaging. They found marked reporting differences among individual practitioners and subspecialties regarding diagnostic uncertainties. 

For example, vague terms such as “likely” and “nonspecific” were used more often in chest imaging reports than in musculoskeletal documents. Similar variation held true across patient admission status and anatomic imaging subsections, the authors noted.

Ambiguous terms can result in poor patient care and overutilization of resources, and may open radiologists up to potential litigation, according to Andrew L. Callen, with University of Colorado Anschutz Medical Campus in Denver, and colleagues. The group suggested moving toward more basic, universal language to remedy the problem.

“On whole, these data suggest substantial heterogeneity with regard to reporting uncertainty among radiologists,” Callen et al. added Wednesday. “Simplified ‘uncertainty’ lexicons may address this variability. More so, these data pose an important challenge and opportunity for medical educators to articulate best practices.”

The authors included 642,569 radiology report impressions from 171 rads for their research, covering 2011 through 2015. They used natural language processing to assess and count the uncertainty terms within these reports, keying in on phrases such as “likely,” “could,” “borderline,” “nonspecific” and “suggestive,” among others.
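The kind of term counting described above can be sketched with simple keyword matching. This is a minimal illustration only, not the authors’ actual NLP pipeline; the term list is a subset drawn from the article, and the sample impression text is hypothetical:

```python
import re

# Illustrative subset of the uncertainty terms the study searched for
UNCERTAINTY_TERMS = ["likely", "could", "versus", "borderline",
                     "nonspecific", "suggestive"]

def find_uncertainty_terms(impression: str) -> list[str]:
    """Return each uncertainty term that appears in a report impression."""
    text = impression.lower()
    # Whole-word matching so "could" does not match inside other words
    return [t for t in UNCERTAINTY_TERMS
            if re.search(r"\b" + re.escape(t) + r"\b", text)]

# Hypothetical impression text for illustration only
impression = ("Opacity in the right lower lobe, likely atelectasis "
              "versus early pneumonia.")
print(find_uncertainty_terms(impression))  # ['likely', 'versus']
```

A real system would also need to handle negation, abbreviations and multi-word hedging phrases, but even this crude matcher shows how a report can be flagged as containing at least one uncertainty term.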

Below are some key findings:

  • At least one uncertainty term was found in 33.8% of reports, with “likely” leading the bunch, followed by “could,” “versus,” and “nonspecific.”
  • Thoracic and abdominal reports used vague terminology most often at 55% and 54.1%, respectively. General rad documents, meanwhile, used them sparingly (29.5%).
  • In terms of admission status, inpatient reports topped all others, with uncertainty in 40.3% of communications. Emergency room reports came in second (34.6%), followed by outpatient (28%).
  • Among the more than 170 radiologists, vague terminology use varied from 20% to 55% of reports.
  • Documents with uncertainty were “significantly” longer than those without, the group noted, but report length did not vary between subspecialties or modalities.
  • Reader experience didn’t impact report clarity one way or another, nor did modality.

Callen and colleagues also used their NLP algorithm to test whether radiologists reading report impressions could tell if a document was expressing uncertainty or “hedging,” as they called it.

While phrases indicating a specific diagnosis are often misunderstood by clinicians, this test found that radiologists typically agreed on the “meaning and intent” of uncertain language.

The investigators noted that these widespread reporting differences may reflect more nuanced pathologies and present a chance to standardize reporting language.

“In this light, uncertainty represents a substantial opportunity to identify pathologies among radiology subspecialties that may warrant dedicated lexica,” they concluded. “This notion is supported by the measured improvements in diagnostic accuracy subsequent to structured lexica such as BI-RADS, TI-RADS, and LI-RADS.”