AJR: Comment tool delivers affordable, successful peer review
Peer review programs have become more prominent over the last decade, thanks largely to the American College of Radiology and The Joint Commission’s accreditation requirements, explained Jonathan O. Swanson, MD, and colleagues from the department of radiology at Seattle Children’s Hospital.
Compliance rates for these programs, however, had not been studied in the literature, so the authors reviewed use of peer review software at their facility; the software allows users to add comments to reports in the database in real time. Data were collected on 5,278 radiologic studies from 15 radiologists over a 12-month study period.
The average compliance rate for the entire study period was 52 percent, but the rate trended upward over the course of the year. By the last month of the study period, the compliance rate had risen to 76 percent.
The authors also tracked discrepancy rates and comment usage, and found a 3.6 percent discrepancy rate between the original interpretation and peer review. In nondiscrepant peer reviews, comments were voluntarily included in 7.3 percent of cases.
On the peer review system used by the researchers, studies are assigned a score of 1 to 4, with 1 being a nondiscrepant peer review. Comments were mandatory on reviews with a score of 2 through 4, and optional on reviews with a score of 1. The authors pointed out that 62 percent of all comments were on peer reviews with a score of 1, indicating the comment functionality was easy to integrate into workflow.
“The number and type of comments strongly suggest that the participating radiologists found the comment tool to be an effective means of communication with their peers,” wrote Swanson et al.
That comments were so readily used supports previous assessments of peer review software in the medical literature, which emphasized the diagnostic value of a review system that includes comments rather than a scoring system alone.
“Our comment-enriched peer review program attempts to accentuate the potential of peer review to improve performance through feedback and learning,” wrote the authors. “An error-reporting system can either attempt to learn from past mistakes to prevent future harm or can focus on deciding which individuals are making the most mistakes. We have targeted the former approach by emphasizing and facilitating comments in peer review.”
The researchers did not use any financial incentives or penalties to motivate peer review compliance, but theorized that such motivators could improve it. Since not every institution will be able to offer financial incentives, they suggested an alternative: frequent peer review conferences to emphasize the importance of peer review, plus a monthly report for each individual radiologist.
“The instantaneous feedback is reinforced by individual monthly reports summarizing all the prior month’s comments for each radiologist,” the authors wrote. “During a quarterly peer review conference, clinically important interpretative disagreements, recurrent errors, and excellent diagnostic interpretations are highlighted as learning opportunities for the entire department.”