NLP Speaks Volumes
Reaching patient safety and quality goals requires a razor-sharp handle on data. Yet data reside in an array of systems, software and resources, often maintained in an unstructured format that is difficult to access and analyze. Natural language processing (NLP), a sort of data management on steroids, comprises a robust toolset that can address these challenges.

NLP translates words, or unstructured inputs, into data through a set of technologies that apply statistical algorithms to match components of language. Several pioneers are building on NLP’s coding roots to develop a roadmap for future applications in healthcare.

NLP-driven documentation

In August 2011, the University of Pittsburgh Medical Center (UPMC) signed a 10-year joint development agreement with an NLP vendor, aiming to organize and mine mountains of internal and external data. “When we looked at where we aspired to be—an accountable care organization that leverages the power of information—we realized we needed a good handle on the buckets of information at our disposal,” says Rasu B. Shrestha, MD, MBA, vice president of medical information technology.

The initiative focuses on documentation from three perspectives: creation, storage and mining. It addresses key drivers in healthcare, specifically patient safety and cost. To date, other healthcare organizations have applied NLP in back-end processes, such as dictation and coding.

“We are going against the flow. Our vision is that medical intelligence will guide the physician to include the most thorough and accurate information in the notes. Incorporating NLP at the front-end of the process can have a tremendous impact,” says Shrestha. By shifting the emphasis to the front end, UPMC lets NLP’s capabilities flow through the rest of the clinical process.

At the document creation level, Shrestha sees an increasing role for voice recognition in inputting data into the EMR.

NLP also may address a more fundamental need. Currently, UPMC uses a transcription model. “We realized if we invested in modernizing our transcription service [through NLP], we could achieve cost-savings to the tune of several million dollars annually … and that’s even with zero change in physician behavior. Incorporating front-end speech recognition will enable additional efficiencies.”

Other organizations, such as Adventist Health System (AHS), an Orlando, Fla.-headquartered system, have tapped into NLP to optimize coding. Since integrating NLP at 24 sites beginning in February 2010, AHS coders have brought average processing times to eight minutes faster than the national average.

The NLP-driven process is more efficient, more complete and more accurate, says Migdalia Hernandez, RHIT, corporate health information management director. The NLP system interfaces with the EMR and pulls relevant data, such as ulcers or arrhythmias, for presentation to coders.

AHS also worked with its vendor to develop rules that teach the system to review each of the more than 20 terms physicians use as a heading for the diagnostic impression. (No matter which term is used, the impression represents a wealth of coding data and must be reviewed.)
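
As a rough sketch of how such rules can work (the heading list here is a hypothetical, abbreviated stand-in for the 20-plus variants AHS handles, and the regular-expression approach is illustrative rather than the vendor’s actual method), a few lines of Python can find whichever heading a physician used and pull out the impression text for coder review:

```python
import re

# Hypothetical, abbreviated list of heading variants; the real rule set
# covers the more than 20 terms physicians actually use.
IMPRESSION_HEADINGS = [
    "impression", "clinical impression", "diagnostic impression",
    "assessment", "final diagnosis",
]

# Match any heading at the start of a line, then capture the text that
# follows until the next heading-style line or the end of the note.
_heading_pattern = re.compile(
    r"^(?:%s)\s*:\s*(?P<body>.+?)(?=^\S[^\n]*:|\Z)"
    % "|".join(re.escape(h) for h in IMPRESSION_HEADINGS),
    re.IGNORECASE | re.MULTILINE | re.DOTALL,
)

def extract_impression(note_text: str) -> str | None:
    """Return the diagnostic-impression section of a note, if one is found."""
    match = _heading_pattern.search(note_text)
    return match.group("body").strip() if match else None

note = """History: 72-year-old male admitted with chest pain.
Clinical Impression: Atrial fibrillation with rapid ventricular response.
Stage II sacral pressure ulcer noted on exam.
Plan: Rate control, wound care consult."""

print(extract_impression(note))
```

Run on the sample note, the sketch returns the impression text, including the pressure ulcer and arrhythmia mentions a coder would need to see.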

The value proposition

The most potent NLP applications may help physicians apply research in practice. UPMC aims to use NLP to mine its systems and records for information and evidence tied to a disease process or metrics. NLP, says Shrestha, organizes those data in a more user-friendly manner. “We realized a single technology stack could help us meet our goals. With a better starting point for information, the end results have improved.”

Seton Healthcare Family, an Austin, Texas-based hospital system, is tackling a related challenge as it shifts from hospital-based care to a broader healthcare delivery model. “We are trying to increase the value of the care we deliver. That means connecting the delivery system and pulling together data from entities that aren’t normally connected,” says Ryan Leslie, vice president of analytics and health economics. That shift requires providers to examine data in the aggregate, as well as on a case-by-case basis.

Like UPMC, Seton is challenged by unstructured data. “Structured data only get us so far in trying to identify high-risk patient populations or gaps in care,” says Leslie. The Seton team recognizes that data that could lead to better outcomes exist in physician notes and other unstructured sources. “An individual clinician might find the relevant data, but there is no easy way to analyze them in the aggregate.” A physician might recognize that a single patient lacks a home-based caregiver, but identifying the entire group of at-risk patients is beyond the capacity of individual providers or traditional IT systems.

In general, 20 percent of EMR data are structured and 80 percent are unstructured. Although it’s easier to mine structured data, such as medications, the “golden nuggets” of information, such as ejection fraction, are often hidden away in unstructured clinical notes.
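
To make that concrete, a bare-bones sketch (an illustrative regular expression, not any vendor’s engine; real clinical NLP handles far more phrasings, negation and context) shows how a free-text ejection fraction mention can be turned into a structured value:

```python
import re

# Illustrative pattern only; a production NLP engine handles many more
# phrasings, negation and context than this sketch does.
EF_PATTERN = re.compile(
    r"(?:ejection\s+fraction|LVEF|EF)\s*(?:of|is|was|[:=])?\s*"
    r"(?P<low>\d{1,2})\s*(?:-|to)?\s*(?P<high>\d{1,2})?\s*%",
    re.IGNORECASE,
)

def extract_ejection_fraction(note_text: str) -> float | None:
    """Turn a free-text ejection fraction mention into a structured value."""
    match = EF_PATTERN.search(note_text)
    if not match:
        return None
    low = float(match.group("low"))
    high = float(match.group("high")) if match.group("high") else low
    return (low + high) / 2  # midpoint when a range such as "35-40%" is charted

print(extract_ejection_fraction("Echo today: LVEF 35-40%, mild MR."))  # 37.5
print(extract_ejection_fraction("Ejection fraction is 55%."))          # 55.0
print(extract_ejection_fraction("No echo performed this admission."))  # None
```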

The problem is that traditional data analytics tools—aggregate views and trend reporting—don’t work with unstructured data. Researchers need a system to mine unstructured and structured data. “When you have both data aggregation and data mining together, you can start making sense of structured and unstructured data to determine predictors of problems such as 30-day readmissions,” says Leslie.

Seton plans to use NLP to expedite the research and learning process, identify patients at the highest risk for readmission and test published models in the local patient population.
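
A toy sketch of that idea, with invented field names and data rather than Seton’s actual systems, shows how structured EMR fields and flags mined from note text can feed a single readmission-risk model (here an off-the-shelf logistic regression from scikit-learn):

```python
from sklearn.linear_model import LogisticRegression

def build_features(patient):
    """Merge structured EMR fields with flags mined from free-text notes."""
    note = patient["note_text"].lower()
    return [
        patient["age"],                      # structured field
        patient["prior_admissions"],         # structured field
        int("lives alone" in note),          # flag mined from unstructured text
        int("no caregiver" in note),         # flag mined from unstructured text
    ]

# Toy records; field names and values are invented for illustration.
patients = [
    {"age": 82, "prior_admissions": 3, "readmitted_30d": 1,
     "note_text": "Lives alone, no caregiver identified at discharge."},
    {"age": 64, "prior_admissions": 0, "readmitted_30d": 0,
     "note_text": "Discharged home with spouse."},
    {"age": 77, "prior_admissions": 2, "readmitted_30d": 1,
     "note_text": "Lives alone; daughter visits weekly."},
    {"age": 58, "prior_admissions": 1, "readmitted_30d": 0,
     "note_text": "Supportive family at home."},
]

X = [build_features(p) for p in patients]
y = [p["readmitted_30d"] for p in patients]

model = LogisticRegression().fit(X, y)

# Estimated 30-day readmission risk for the first patient.
print(model.predict_proba([build_features(patients[0])])[0][1])
```

The point is the feature table, not the particular model: once note-derived flags sit alongside structured fields, published risk models can be tested against the local population.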

One example of information locked in unstructured data is the psychosocial environment, says Leslie. A clinician may discharge a patient with an optimal care plan, but psychosocial factors, such as the patient’s living situation or caregiver status, play an important role in the patient’s ability to comply with that plan. Seton hopes to leverage NLP to identify and address common psychosocial barriers among hundreds or thousands of patients. Key data will be summarized and presented to clinicians so the findings can be incorporated into the patient’s care plan.
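
In the same illustrative spirit, a short sketch (with a hypothetical barrier lexicon; production systems rely on trained models and negation handling rather than simple phrase matching) shows how flags pulled from individual notes can be rolled up across a patient population:

```python
from collections import Counter

# Hypothetical barrier lexicon; production systems use trained models and
# negation handling rather than simple phrase matching.
PSYCHOSOCIAL_BARRIERS = {
    "no_home_caregiver":  ["lives alone", "no caregiver", "no family support"],
    "transportation":     ["no transportation", "unable to travel"],
    "housing_insecurity": ["homeless", "unstable housing"],
}

def flag_barriers(note_text: str) -> set[str]:
    """Return the psychosocial barriers mentioned in a single note."""
    text = note_text.lower()
    return {
        barrier
        for barrier, phrases in PSYCHOSOCIAL_BARRIERS.items()
        if any(phrase in text for phrase in phrases)
    }

notes = [
    "Patient lives alone and has no transportation to follow-up visits.",
    "Discharged to daughter's home; reliable transportation arranged.",
    "Currently homeless, no caregiver identified.",
]

# Roll individual-note flags up to the population level for clinicians.
population_counts = Counter(b for note in notes for b in flag_barriers(note))
print(population_counts.most_common())
# no_home_caregiver is flagged twice; the other two barriers once each
```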

Meanwhile, UPMC aims to link NLP with decision support and use the technology to wade through medical literature and feed evidence to clinicians at the point of care. High-quality contextually relevant evidence, if presented intelligently at the point of care, can push the envelope further in providing more informed and more appropriate care related to a number of parameters, including perhaps diagnoses, treatment options, tests to order and guidelines, says Shrestha.

Whether it’s streamlining coding, capturing data or re-engineering workflow, NLP is poised to translate into better, more efficient, more effective healthcare.