Utilization-management program points out positives of radiologist involvement

When radiologists collaborate with referring physicians to proactively manage imaging utilization, the radiologist's participation tips the scale toward success more than the referrer's specialty does. And the rad's input has the greatest impact on primary care physicians who are heavy orderers of imaging exams.

David Friedman, MD, and Nancy Smith of Thomas Jefferson University Hospital found as much upon reviewing a utilization-management (UM) program administered across 168,915 clinical cases over a five-year period (July 2009 to June 2014).

Their research report appears in the July issue of the American Journal of Roentgenology.

Friedman and Smith describe their work evaluating a UM program directed by a radiology benefit management company that developed detailed, evidence-based rule sets and provided peer-to-peer decision support for providers ordering advanced outpatient imaging studies.

A customer service rep, trained on the evidence-based guidelines and subject to quality audits, screened outpatient imaging orders for appropriateness, consulted with the provider's staff and, where deemed necessary, forwarded the order to a nurse serving as clinical coordinator for further screening.

Meanwhile, all participating radiologists (40 to 50 in number, depending on the year) went through one-on-one training with an experienced administrator in the program's methods, approach and documentation requirements.

If the indication for the study did not meet the rule sets after review by the clinical coordinator, or if no criteria had been established, a board-certified radiologist performed a final evaluation.

The radiologist could approve the study on the basis of an electronic chart evaluation and, if necessary, call the provider's office for further information.

“The determination of appropriateness was then made,” the authors write. “If a suitable individual was not available to take the radiologist's call, and there was subsequently no callback from the provider’s office within two business days, the study was administratively cancelled.”

The radiologist also had the option of sending evidence-based educational materials to the provider’s office.
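
For readers who want the escalation path at a glance, here is a schematic sketch of the workflow described above. The tiers, the possible outcomes and the two-business-day cancellation rule come from the article; the function and parameter names are hypothetical, not the benefit manager's actual system.

```python
# Schematic sketch of the three-tier review path described above. The tiers,
# outcomes and two-business-day rule come from the article; the names here
# are illustrative only.

def review_order(meets_rule_sets: bool, criteria_exist: bool,
                 chart_supports_study: bool, provider_called_back: bool) -> str:
    """Return the disposition of one advanced outpatient imaging order."""
    # Tiers 1 and 2: the customer service rep, then the nurse clinical
    # coordinator, screen the order against the evidence-based rule sets.
    if criteria_exist and meets_rule_sets:
        return "approved"
    # Tier 3: a board-certified radiologist makes the final evaluation,
    # first via electronic chart review ...
    if chart_supports_study:
        return "approved"
    # ... then, if necessary, by calling the provider's office. No callback
    # within two business days means administrative cancellation.
    if not provider_called_back:
        return "administratively cancelled"
    # Otherwise peer-to-peer discussion decides: approve, change or withdraw.
    return "resolved by peer-to-peer discussion"

# Example: no established criteria, chart inconclusive, office never calls back.
print(review_order(False, False, False, False))  # administratively cancelled
```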

Rad impact defined

Friedman and Smith report that, overall, 58.6 percent of orders were approved, 6.8 percent were changed and 13.5 percent were withdrawn by consensus.

Additionally, 6 percent were approved without consensus and 15.2 percent were withdrawn because of no callback. In all, 35.5 percent of studies were not performed as originally ordered, the sum of the changed (6.8 percent), withdrawn-by-consensus (13.5 percent) and no-callback (15.2 percent) rates.

Defining radiologist impact as the combined rate of studies changed or withdrawn by consensus and studies withdrawn owing to no callback, the authors recorded impact by specialty as follows:

  • Family practice (25.3 percent) and internal medicine (23.8 percent) had the highest aggregated rates of study changed or withdrawn by consensus.
  • Thoracic surgery (13.3 percent), neurosurgery (11.2 percent) and orthopedic surgery (9.3 percent) had the lowest rates.
  • Internal medicine (18.0 percent), neurology (17.7 percent) and family practice (17.4 percent) had the highest rates of study withdrawn owing to no callback.
  • Pediatrics (7.1 percent) and ophthalmology (7.3 percent) had the lowest rates.

The authors note that overall impact was greatest for family practice (42.7 percent), internal medicine (41.8 percent) and neurology (33.4 percent), while impact was slightest on orthopedic surgery (22.8 percent) and neurosurgery (24.0 percent).
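
For readers tracking the arithmetic, the overall impact figures are simply the sum of the two component rates listed above. A minimal illustration, using only the rates reported in the study:

```python
# Illustrative arithmetic only: overall impact is the sum of the two
# component rates reported in the study (all values in percent).
consensus_change_or_withdrawal = {"family practice": 25.3, "internal medicine": 23.8}
no_callback_withdrawal = {"family practice": 17.4, "internal medicine": 18.0}

for specialty, consensus_rate in consensus_change_or_withdrawal.items():
    overall = consensus_rate + no_callback_withdrawal[specialty]
    print(f"{specialty}: {overall:.1f} percent")  # 42.7 and 41.8, as reported
```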

Money saved, dose decreased

In their discussion, Friedman and Smith assume “a modest average global reimbursement rate” of $250 for all types of advanced imaging studies. From this, they estimate the dollar value of the studies not reordered in their UM program to be at least $8.75 million.
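
That figure invites a simple back-of-envelope check, sketched below under the authors' own $250 assumption (this is an inference from the reported numbers, not a calculation the authors spell out): $8.75 million corresponds to roughly 35,000 studies, or about 20.7 percent of the 168,915 cases, which lines up with the approximately 20 percent changed-or-not-performed rate the authors cite.

```python
# Back-of-envelope check of the savings estimate, using only figures from
# the study; the authors do not show this calculation explicitly.
total_cases = 168_915            # clinical cases over five years
avg_reimbursement = 250          # dollars, the authors' assumed average
estimated_savings = 8_750_000    # dollars, the authors' estimate

studies_avoided = estimated_savings / avg_reimbursement  # 35,000 studies
share_of_cases = studies_avoided / total_cases           # about 20.7 percent
print(f"{studies_avoided:,.0f} studies, {share_of_cases:.1%} of all cases")
```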

In cases in which the appropriateness of a request was in question, the collaborative UM program “had a substantial impact on the performance of advanced outpatient imaging studies, regardless of referring provider specialty,” the authors write. “In aggregate, approximately 20 percent of studies were changed or not performed (i.e., the study originally ordered was not done) as a result of a radiologist’s discussing the case with the referring provider or representative.”

They underscore that an additional 15 percent or so were not performed because of no callback and that the sentinel effect (the tendency of people to perform better when they know they are being observed) likely loomed large.

“Importantly, these results were achieved without the use of denials,” write Friedman and Smith. “Such is the power of peer-to-peer collaboration and the sentinel effect. An important corollary of decreased utilization is decreased radiation exposure of the population.”

Dave Pearson

Dave P. has worked in journalism, marketing and public relations for more than 30 years, frequently concentrating on hospitals, healthcare technology and Catholic communications. He has also specialized in fundraising communications, ghostwriting for CEOs of local, national and global charities, nonprofits and foundations.
