Clinical decision support (CDS) tools help trainee physicians in the emergency department order advanced imaging more appropriately. However, experienced physicians using the tools achieved about the same appropriateness scores as the interns, residents and fellows.
The findings come from researchers at the Icahn School of Medicine at Mount Sinai in New York. Their study appears in the April issue of the American Journal of Roentgenology.
Senior author David Mendelson, MD, and colleagues reviewed emergency department orders for CT and MRI placed over a three-year period at the 1,171-bed Mount Sinai Hospital. The study window spanned pre- and post-implementation periods, the latter using the ACR Select CDS tool. The team scored all orders against the ACR’s Appropriateness Criteria, on which a score of 1 represents definitely inappropriate and 9 definitely appropriate.
Noting that the three most common indications for emergency CT and MRI were abdominal pain, headache and suspected pulmonary embolism, the researchers reported an overall increase in the appropriateness of advanced imaging test ordering.
However, it took some time to get there. The mean appropriateness score in the pre-CDS period was 6.2, unchanged in an initial post-CDS window, but it rose to 6.7 in a later post-CDS period (“CDS 2”).
Further, in a segmented regression analysis, mean scores increased significantly from the pre-CDS period to the post-CDS 2 period for both trainee and experienced physicians, with no significant difference between the cohorts.
“Importantly, in this first study to examine the differential effects of CDS based on provider status, there is no significant difference in appropriateness of advanced imaging use between house staff [trainees] and non-house staff providers [up to attending physicians],” the authors wrote.
In their discussion, Mendelson and colleagues commented that their investigation “reiterates the positive effect of CDS tools in increasing the rate of appropriate imaging use. Although our study revealed no significant difference in the effect of CDS on house staff physicians compared with non–house staff physicians, the results likely reflect the multifactorial influences on ordering behavior in our health system. CDS tools may thus serve as an equalizer in directing quality patient care from physicians across the training spectrum with varying familiarity with Appropriateness Criteria.”
They called for additional research drawing data from other hospitals, with more granularity on variables such as trainees’ specialty and year of residency.