Poor marks: CME questions in major radiology journals flawed, study finds

Nearly half of all multiple-choice continuing medical education (CME) questions published in leading radiology journals could be flawed, according to results of a study published in the April issue of the American Journal of Roentgenology (AJR).

Recent studies have shown that tests ranging from medical school examinations to CME regularly contain questions that are flawed according to universally accepted question-writing standards. This spells trouble for students and radiology professionals alike, say lead author David DiSantis, MD, and his colleagues at the University of Kentucky Medical Center in Lexington. “Flawed items can be as much as 15 percentage points more difficult to answer correctly than questions that adhere to proper guidelines,” wrote DiSantis et al. “Tests that include them have a failure rate elevated as much as 25 percent.”

DiSantis and his team set out to determine how many CME questions published in major radiology journals were potentially flawed according to accepted principles for question creation. To do so, they evaluated a total of 181 test questions published in the January 2013 issues of AJR, RadioGraphics and Radiology. Three reviewers analyzed each question against seven distinct sets of guidelines meant to prevent flaws in the multiple-choice question-writing process.

Their results showed that of the 181 questions studied, 78 (43 percent) had at least one flaw, with one question containing as many as four different flaws. “Specific flaws varied widely in prevalence, as follows: unfocused stem, 39; negative stem, 2; window dressing, 1; unequal option length, 23; negative options, 2; clues to correct answer, 13; heterogeneous options, 38,” the authors reported.

Despite their results, DiSantis and his colleagues note that radiology CME questions compare favorably with those in journals across the healthcare profession in general, where recent studies have found flawed-question rates ranging from 65 to 100 percent. “In comparison with other studies of flawed CME questions in medical journals, our evaluation showed radiology acquitting itself fairly well,” wrote DiSantis et al. “Still, 43 percent of CME questions in leading radiology journals violated standard (multiple-choice question) item-writing principles.”

The researchers believe practical measures such as Web-based flaw-assessment software, training for question writers and committee review processes could reduce the number of flawed questions in radiology journals and improve CME for radiology professionals.

John Hocter, Digital Editor

With nearly a decade of experience in print and digital publishing, John serves as Content Marketing Manager. His professional skill set includes feature writing, content marketing and social media strategy. A graduate of The Ohio State University, John enjoys spending time with his wife and daughter, along with a number of surprisingly mischievous indoor cacti.
