Nonradiology trainees prefer, excel with simulated chest radiography

Figure: Graphical display of the suggested lung scan pattern. Source: JACR 2015;12:1215–1222

Just as computer simulation helps pilots learn to fly and lets flight instructors gauge trainee progress, so too can it aid the teaching and assessment of nonradiologist healthcare trainees in diagnostic image interpretation.

The November issue of JACR includes a report detailing a pilot study on the potential of such simulation.

William Auffermann, MD, PhD, of Emory University and colleagues randomly divided 30 subjects (14 women, 16 men) into a control group and an experimental group.

While most subjects, 22 of 30, were medical students (11 in each group), the pool also included internal medicine residents and fellows, physician-assistant students, and nurse practitioner residents.

All subjects underwent training and skills assessment involving identification of pulmonary nodules, or lack thereof, on chest radiographs at simulated PACS workstations.

The simulated nodules were generated by software written in MATLAB (The MathWorks, Natick, Mass.) and inserted into 50 percent of the x-rays.
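The study's nodule-insertion software was written in MATLAB and its exact method is not described here; as a rough illustration of the general idea, a simulated nodule can be blended into a grayscale image as a soft Gaussian blob. The function name, parameters, and blob model below are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def insert_simulated_nodule(image, center, radius=8.0, peak=0.15):
    """Blend a soft Gaussian 'nodule' into a grayscale image (values in [0, 1]).

    center: (row, col) position of the nodule; radius controls apparent size;
    peak controls conspicuity (smaller = subtler finding).
    NOTE: hypothetical sketch, not the MATLAB method used in the study.
    """
    rows, cols = np.indices(image.shape)
    r0, c0 = center
    # 2-D Gaussian intensity profile centered at (r0, c0)
    blob = peak * np.exp(-((rows - r0) ** 2 + (cols - c0) ** 2) / (2 * radius ** 2))
    return np.clip(image + blob, 0.0, 1.0)

# Insert a nodule into 50 percent of a batch of synthetic "radiographs",
# mirroring the study's half-positive case mix.
rng = np.random.default_rng(0)
images = rng.uniform(0.3, 0.7, size=(4, 128, 128))
has_nodule = [i % 2 == 0 for i in range(len(images))]
processed = [
    insert_simulated_nodule(img, center=(64, 64)) if flag else img
    for img, flag in zip(images, has_nodule)
]
```

In practice, simulated lesions for perception research are usually matched to local background texture so they resemble real findings; the fixed Gaussian profile here is only a placeholder for that step.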

The simulated PACS workstations consisted of standard 1.7-megapixel computers augmented by ViewDEX software (Sahlgrenska University Hospital, Goteborg, Sweden), which provides PACS-like image presentation and viewing controls.

After completing an initial set of cases, the experimental group received search pattern training (SPT) and the control group did not.

The SPT consisted of static electronic educational slides, without an audio component, presented at the computer viewing station for about 10 minutes.

The control group instead took a break from the protocol for a comparable amount of time.

The SPT group demonstrated statistically significant improvement in nodule identification after training at a simulated radiology workstation (change in area under the curve, 0.1079).

In addition, in a post-study questionnaire, subjects indicated their preference for simulated radiology workstations over conventional training methods.

Auffermann et al. concluded that, because diagnostic images are often interpreted on computer workstations, “training to perceive and interpret abnormalities on medical images accurately lends itself well to computer simulation.”

The authors call for further study to identify “the specific educational scenarios in which simulation would be most useful and delineate the structure of simulated environments that would best facilitate learning.”