WASHINGTON, D.C.—IT has enabled radiologists to read a larger volume of studies from just about anywhere, but in the process, radiologists have often become more distanced from residents, who may render inaccurate preliminary reports without ever learning of their mistakes. Although technology has given rise to this rift in training, IT may also deliver the solution, according to a June 2 presentation at the meeting of the Society for Imaging Informatics in Medicine (SIIM).
Although residents' readings are preliminary, early clinical decisions are commonly made on the basis of resident-issued reports. Yet faculty radiologists often find mistakes or insufficient information, requiring revisions or amendments before the report is finalized.
PACS’ remote capabilities allow faculty to sign off on reports from almost any computer, including remote systems, noted Richard E. Sharpe Jr., MD, MBA, of Thomas Jefferson University Hospital in Philadelphia. What is more, larger workloads, more restrictive resident duty hours and the growing size of residency programs have all widened the gap between faculty and residents in many programs.
“The problem is, many trainees are often unaware of changes made to their radiology reports,” said Sharpe, the chief radiology resident at Thomas Jefferson. He stressed how important it is for trainees to see their mistakes and to understand larger trends in residents’ misreads.
In an attempt to bridge the gap, Sharpe and colleagues developed the radiology report comparator, a system that displays a resident’s preliminary report side by side with the faculty member’s final report.
The comparator allows trainees to log in and view their reports, with colored track changes and word-count differences highlighting the revisions made by faculty.
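At its core, a comparison of this kind amounts to a word-level diff plus a word-count delta. Below is a minimal sketch of that idea using Python's standard difflib; the function name `compare_reports` and the sample report text are illustrative assumptions, not details of the Jefferson system.

```python
import difflib

def compare_reports(preliminary: str, final: str) -> dict:
    """Compare a preliminary report with the faculty final report.

    Returns word-level additions/removals (the "track changes") and the
    word-count difference, the two signals described for the comparator.
    This is an illustrative sketch, not the actual comparator's logic.
    """
    prelim_words = preliminary.split()
    final_words = final.split()
    matcher = difflib.SequenceMatcher(None, prelim_words, final_words)

    changes = []
    for op, i1, i2, j1, j2 in matcher.get_opcodes():
        if op in ("delete", "replace"):
            changes.append(("removed", " ".join(prelim_words[i1:i2])))
        if op in ("insert", "replace"):
            changes.append(("added", " ".join(final_words[j1:j2])))

    return {
        "changes": changes,
        "word_count_delta": len(final_words) - len(prelim_words),
    }

# Hypothetical sample pair: a faculty revision that narrows and extends
# a resident's preliminary impression.
report = compare_reports(
    "No acute intracranial abnormality.",
    "No acute intracranial hemorrhage. Chronic small vessel ischemic changes.",
)
```

In a real system the `changes` list would drive the colored rendering, while `word_count_delta` gives a quick sense of how much the faculty member added or trimmed.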
Over the last eight months, the system has logged 993 distinct logins, an average of 16 per trainee, Sharpe noted. The comparator initially displays a trainee’s 100 most recent report pairs; 35 percent of residents review the changes on a daily or weekly basis.
A survey of the facility’s residents, completed by 76 percent of trainees, found that large majorities believed the radiology report comparator provides educational value, offers interesting insights and improves the quality of reports.
According to the survey, the most common edits to trainees’ reports were stylistic changes and notes on changes to make in future reports. Trainees said they strongly liked the system and reported that it substantially improved their clinical understanding, Sharpe stated.
“The radiology report comparator offers latent educational opportunities that otherwise would be missed,” Sharpe contended. Trainees also reported comparing their reports with the radiologists’ final versions more frequently as a result of the software.
Sharpe and colleagues plan to expand the department’s use of the system, to track productivity changes attributable to the software, and to use natural language processing to triage the most significant revisions.
“The system provides an efficient mechanism to improve reporting skills,” Sharpe concluded. “Trainees learn from comparing the reports and do so more often with the software.”