Like much of healthcare, radiology is in a state of flux. From reimbursement cuts to the adoption of EMRs, day-to-day operations are being transformed. Dictation software in radiology is evolving as natural language processing (NLP) is developed to turn dictated free text into manageable, usable reports.
What’s in a name?
NLP is a broad term, according to Daniel L. Rubin, an associate professor in the department of radiology at Stanford University in Palo Alto, Calif., who conducts research in NLP. From a 10,000-foot-view, NLP is the “application of computer methods to understand and comprehend free text,” Rubin says, adding that the “holy grail is to have a computer understand the meaning of text equivalently to how a human reading that same text understands it.”
Because the major applications of NLP in radiology are highly task-specific, Rubin describes four computer processing tasks, each falling under the broader concept of NLP, that can assist radiology workflow:
- Text Classification: A text report or sentences in the report are run through a classifier to label the reports, such as for automated ICD coding.
- Named Entity Recognition: Recognizes findings, diseases, devices and diagnoses in reports.
- Information Extraction: Pulls out from reports particular types of factual statements (such as the anatomic location of an imaging finding) or recommendations; these statements convey elemental facts for future tabulation.
- Information Retrieval: Searches a large database of text reports for those that match certain query criteria.
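As a rough illustration of the information extraction task described above, the sketch below uses a simple regular expression to pull a finding and its anatomic location out of a free-text report sentence. This is a toy example under assumed vocabulary (the finding terms and pattern are invented for illustration), not any vendor's actual NLP engine:

```python
import re

# Toy extractor: pulls an imaging finding and its anatomic location from
# free-text report sentences. The small vocabulary of findings here is an
# illustrative assumption, not a clinical terminology.
FINDING_PATTERN = re.compile(
    r"(?P<finding>nodule|mass|opacity|effusion)\s+in\s+the\s+"
    r"(?P<location>[\w ]+?)(?:\.|,|$)",
    re.IGNORECASE,
)

def extract_findings(report_text):
    """Return (finding, location) pairs stated in the report text."""
    return [
        (m.group("finding").lower(), m.group("location").strip().lower())
        for m in FINDING_PATTERN.finditer(report_text)
    ]

report = "There is a 6 mm nodule in the right upper lobe. No pleural effusion."
print(extract_findings(report))  # [('nodule', 'right upper lobe')]
```

Real systems replace the regular expression with statistical parsers and controlled vocabularies such as RadLex, but the output is the same kind of elemental fact suitable for later tabulation.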
Because radiology uses a large vocabulary of terms and since reporting styles tend to vary among individual readers, there are challenges to achieving fully structured reporting, says Keith Dreyer, DO, PhD, vice chairman of radiology and informatics at Massachusetts General Hospital (MGH) in Boston. “We feel that NLP is a powerful interim solution until we achieve full structured reporting. That said, I think we’ll utilize NLP for a very long time, particularly as we become more accountable for our outcomes.”
“All of our historical reports and medical literature are in free text and unstructured, so NLP methods will always be needed, at least to extract and help radiologists access, leverage and use these past reports,” Rubin says. “[This] is important not just for research or educational purposes, but also for clinical care, especially when a clinician is confronted with an unknown case.”
“You can’t improve anything you can’t measure,” says John Mattison, MD, assistant medical director, CMIO, at Kaiser Permanente Southern California. Mattison—who has been working on integrating NLP technologies (powered by Nuance) with evaluation and management (E/M) coding metrics for Kaiser’s HealthConnect EHR—stresses the need to be able to disambiguate data and normalize the construction of free text data across a patient’s entire record.
According to Mattison, when those data are coded consistently into the EHR, clinicians such as radiologists could apply rule-based evidence to pick out previous information on patient encounters, a patient’s history, the course of events leading up to an event and the history of analysis. He says that while it has taken a while to validate the NLP technologies and E/M coding metrics because they are so complicated, Kaiser Permanente is about a year away from integrating the two systems.
Time to shine, mine, slice, dice and parse language
“It’s no secret that radiologists who have been practicing for 10 to 20 years, for the most part, hate voice recognition because they feel it will slow them down and they don’t want to become editors,” says David J. Marichal, RT, CIO and COO at Radiology & Imaging Specialists in Lakeland, Fla. However, since adopting a new reporting tool, GE’s Centricity Precision Reporting (powered by M*Modal), in November 2008, Marichal says the group’s radiologists couldn’t be happier using NLP to quickly complete reports in the group’s multi-modality imaging centers.
“[The reporting system] simultaneously launches when viewing images and the RIS auto-populates report data so the radiologist doesn’t have to fish or hunt for information,” says Marichal. The reporting system is adaptive, so that it learns on the back-end the patterns of the clinicians using it. This means that the more radiologists use the system, the better it gets at recognizing often-used words and phrases.
“NLP can figure out from