The radiology report is the heart of what radiologists do and the core of the value they provide to referrers and patients. Despite this essential role, radiologists put millions of dollars in reimbursement at risk each year, and open themselves up to potential embarrassment, by allowing errors to slip into their reporting or omitting key details. Now, practices are looking to technology and structured templates to dot every i, cross every t and optimize every report.
Reporting errors can undermine otherwise diagnostically sound interpretations by omitting or muddling important facts. Errors of laterality—for example, when an x-ray of the right hand has a report that mistakenly says left hand—can erode confidence from referrers and patients.
“The patients might be thinking, ‘Did he even look at the right study? Did he even look at my x-ray and not somebody else’s x-ray?,’” says Woojin Kim, MD, assistant professor of radiology at the Hospital of the University of Pennsylvania (UPenn) in Philadelphia. He adds that misstating patient age or sex is the error that seems to upset patients the most, judging from the calls he received during a stint as interim chief of the Division of Musculoskeletal Imaging.
Kim likens a report to an online dating profile: you can graduate from an Ivy League school but look fairly dumb if your profile is riddled with typographical and grammatical errors. Likewise, a report with mistakes doesn’t convey professionalism or expertise.
Failing to properly document key items can cost more than embarrassment; it can cut into reimbursement. Radiologists may forget to note the administration of intravenous contrast or may incorrectly describe the number of radiographic views. In cases where a complete abdominal ultrasound is ordered, all relevant structures must be included in the report even if findings are normal or a body part was obscured or is missing. These mistakes can lead to overcoding or undercoding errors.
So how common are reporting errors? A few studies shed light on the topic. A study of laterality errors from the department of radiology at Massachusetts General Hospital in Boston looked at more than a million radiology reports produced over the course of a year and demonstrated that while such errors occurred at a rate of just 0.00008 (0.008 percent), this still translated to dozens of laterality errors and was higher than self-reported estimates (Am J Roent 2009;192:W239-W244). Another study, from the University of Michigan Health System in Ann Arbor, took a broader view and looked at all significant report errors when using automatic speech recognition technology, including wrong-word substitutions and nonsense phrases, and found that more than 20 percent of the reports evaluated contained potentially confusing errors (J Am Coll Radiol 2008;5:1196-99).
Another study involving nearly 13 million abdominal ultrasound reports from 37 practices revealed incomplete physician documentation in 9.3 to 20.2 percent of cases (J Am Coll Radiol 2012;9:403-8). Of the exams titled complete, only 87.4 percent actually fulfilled complete CPT criteria, and 60.6 percent of exam titles were clearly erroneous or too ambiguous to code. The bottom line, according to the authors, is that incomplete physician documentation in the reports they analyzed resulted in a loss of professional income of 2.5 to 5.5 percent.
To err is human, to fix errors gets technical
An awareness of the type and frequency of reporting errors being made is the first step toward curtailing them. Systems have been developed at individual practices and by vendors that can automatically scan reports for completeness and common errors. Kim says that when Penn Presbyterian Medical Center began implementing laterality error tracking software, its error rate dropped by 48 percent within one month, with no other significant changes to practice. Simply telling radiologists that errors are occurring and are being tracked is enough to curb some mistakes.
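As a rough illustration of how such automated scanning can work, a minimal laterality check could compare the side named in the exam title against the sides mentioned in the report body. The function name and regular expression below are assumptions for the sketch, not taken from any vendor’s product:

```python
import re

# Hypothetical minimal laterality-mismatch checker, assuming plain-text
# reports with a separate exam title and findings body.
LATERALITY = re.compile(r"\b(left|right)\b", re.IGNORECASE)

def laterality_mismatch(exam_title: str, report_body: str) -> bool:
    """Return True if the report body mentions a side that contradicts
    the side stated in the exam title."""
    title_sides = {m.lower() for m in LATERALITY.findall(exam_title)}
    body_sides = {m.lower() for m in LATERALITY.findall(report_body)}
    # Only flag when the title commits to exactly one side and the
    # body mentions the opposite side.
    if len(title_sides) == 1:
        opposite = {"left": "right", "right": "left"}[next(iter(title_sides))]
        return opposite in body_sides
    return False
```

Restricting the flag to titles that name exactly one side keeps the sketch from raising alarms on bilateral studies, where mentioning both sides is expected.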
In many cases, a small number of individuals are responsible for a majority of the reporting errors, so quality improvement efforts can be more focused. “There’s nothing that changes one’s behavior more than saying to somebody, ‘Here’s a chart. Here’s everybody else. Here’s you,’” says Kim. “Seeing that visually has a powerful effect on one’s behavior.”
Rather than rely on software to catch errors after they’ve already been made, structured reporting can prompt radiologists to include essential information as a report is being created.
These structured reporting templates should be crafted to fit individual practice needs. Charles E. Kahn, Jr., MD, MS, of the department of radiology at the Medical College of Wisconsin in Milwaukee, and chair of the Radiological Society of North America (RSNA) Reporting Committee, says his department has templates for most ultrasound exams and a number of the body CT and MRI exams performed. Some are fairly general and some are more specialized—more specific templates include chest CT for pulmonary embolism and abdomen/pelvis CT for pancreatic cancer staging—which makes sense given that each type of scan is looking for different things.
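One way to picture an indication-specific template is as a checklist of required sections that can be verified before a report is signed. The section names below are illustrative placeholders, not drawn from any published RSNA or departmental template:

```python
# Hypothetical checklist for a chest CT pulmonary embolism template;
# section names are illustrative assumptions.
PE_CHEST_CT_TEMPLATE = [
    "Clinical indication",
    "Technique",           # e.g., contrast administration, a common coding item
    "Pulmonary arteries",  # the key structure for a PE study
    "Lungs and pleura",
    "Heart and mediastinum",
    "Impression",
]

def missing_sections(report_sections: set, template: list) -> list:
    """List required template sections absent from a drafted report."""
    return [s for s in template if s not in report_sections]
```

A check like this is what lets a template prompt the radiologist for essential content at drafting time rather than after the report is finalized.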
“Focusing the reporting template to the indication for the exam actually helps you give information that’s more pertinent and more useful for referring physicians,” says Kahn.
The challenge is creating well-crafted templates that are useful without slowing down a radiologist’s reporting. Kahn says a structured report should be at least as quick to complete as any other report and should not have an overabundance of fields to fill in. It also should follow the radiologist’s typical visual search pattern and match the way a person would naturally talk through the procedure.
“The hope is if you design the reporting templates well, they can actually improve radiologist productivity and help reduce errors,” says Kahn.
To help practices make better reports, RSNA has created a template library to provide a jumping off point. These reports are not intended to be a standard to adopt, but rather examples of quality templates that departments can modify to fit their own unique needs. The template library, which can be found at radreport.org, recently surpassed 1 million downloads, according to Curtis P. Langlotz, MD, PhD, of the department of radiology at UPenn and former chair of the RSNA Reporting Committee.
Langlotz adds that several efforts to standardize reporting are gaining traction by following the success of the fine-grained descriptors outlined by BI-RADS (Breast Imaging-Reporting and Data System). LI-RADS has been developed for the liver, and the American College of Radiology also is developing PI-RADS for the prostate, LUNG-RADS for the lung, HI-RADS for head injury and T-RADS for thyroid disease.
A recent study of structured reporting for coronary CT angiography results showed that structured impressions improved agreement among clinicians interpreting results with regard to the number of vessels with significant stenosis, and also decreased clinicians’ tendency to overestimate nonsignificant stenosis (J Am Coll Radiol 2013;10:432-8).
Both clinicians and radiologists find structured reports to contain better content and greater clarity than conventional reports, according to a study authored by Lawrence H. Schwartz, MD, of Columbia University Medical Center in New York City, and colleagues (Radiology 2011;260:174-81). On a scale of 1 to 10, mean content satisfaction ratings were 7.61 for conventional reports and 8.33 for structured reports; clarity satisfaction ratings were 7.45 and 8.25 for conventional and structured reports, respectively.
An added bonus
Schwartz sees structured reporting as a win-win-win scenario; in addition to improving communication with referrers and patients, structured reporting makes report data reusable and available for data mining applications, he says.
Even the simplest form of standardization—using common macros to document clinician notifications—can help practices meet results communication goals established by the Joint Commission, according to Langlotz.
“Standard macros also can facilitate compliance with federal incentive programs, such as the physician quality reporting system (PQRS), which establishes content requirements for certain radiology reports,” he says. PQRS measures, including fluoroscopy time and mention of relevant priors, can become a focus of standardized templates and can be scanned easily to determine compliance rates.
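The kind of automated compliance scan described above can be sketched as a simple pattern search over report text. The phrase pattern below is an assumption for illustration; a real system would need to accommodate each practice’s actual macro wording:

```python
import re

# Hypothetical pattern for one compliance item named in the article:
# documentation of fluoroscopy time. The wording is an assumption,
# not a published PQRS specification.
FLUORO_TIME = re.compile(
    r"fluoroscopy time[:\s]+[\d.]+\s*(seconds|minutes|min|sec)",
    re.IGNORECASE,
)

def compliance_rate(reports: list) -> float:
    """Fraction of report texts that document fluoroscopy time."""
    if not reports:
        return 0.0
    documented = sum(1 for r in reports if FLUORO_TIME.search(r))
    return documented / len(reports)
```

Because a standard macro produces the same phrasing every time, a single pattern per measure is usually enough; free-text dictation is exactly what makes this kind of scan unreliable without standardization.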
A change in how patients view their reports also is bringing a renewed focus to reducing reporting errors. Radiologists have always been sensitive to patients viewing reports, but as patient portals become more commonly used, an increasing number of patients are receiving direct access to reports online soon after a procedure. “That requires a different level of interaction and different level of concern for detail related to diagnostic output,” says Schwartz.
Optimizing reports is a goal for radiology as a whole, and the efforts of the RSNA and other professional associations may spur innovation, but the heavy lifting will have to be done at the local level to make sure each department and practice produces reports that guarantee payment and project professionalism.