SCCT Feature: Experience is trump card for passing cardiac CT exam
The clinical competence criteria for cardiac CT specify the training necessary to achieve minimum technical competence in the modality. By teasing out data from the first cardiac CT board exam, administered last September, researchers found that increased clinical experience correlated with a higher pass rate.

The results, which were presented at the 2009 Society of Cardiovascular Computed Tomography meeting in Orlando, Fla., validate both the criteria as a meaningful standard of competence and the exam as a valid measure of that competence, according to lead author Allen Taylor, MD, co-director of noninvasive imaging at Washington Hospital Center in Washington, D.C.

Taylor and colleagues from various institutions compared results from the Certification Board of Cardiovascular CT (CBCCT) exam with examinees' self-reported training (type and amount) and experience (years in practice, cases read) in cardiac CT.

The exam pass rate among those who responded (451 examinees) was 85 percent, similar to the overall exam pass rate of 81 percent.

Researchers observed high pass rates in those trained via a formal CT imaging fellowship (96 percent), those with more than one month of cardiac CT during clinical fellowships (100 percent), and self-trained, legacy candidates (93 percent). Those receiving training in "hands-on" courses (299 examinees) had an 81 percent exam pass rate.

Increased real-world experience was associated with higher passing rates. Taylor said there was a direct bivariate relationship between exam performance and increasing numbers of cases in which a candidate performed the official clinical interpretation.

Level 3 candidates had a statistically higher pass rate (89 percent) than Level 2 candidates (78 percent). Increased time in active clinical practice was significantly associated with progressively higher pass rates. The number of cases reviewed on workstations in CME courses or workshops did not correlate with CBCCT exam results.

"This was a unique opportunity to look at the training standards," Taylor said in an interview. "By all accounts, we got it right."

The data show that the exam is sensitive to expertise, knowledge and superior performance, while also certifying a basic level of proficiency. "People who had met Level 2 criteria did very well," Taylor said. "The more cases they did in practice, the longer they were in practice, the better they did."

Exam participants who had not yet entered clinical practice did not fare as well. In addition, whether these participants completed 50 or 1,250 workstation cases, the pass rate was the same.

Co-author Suhny Abbara, MD, an assistant professor of radiology at Harvard Medical School in Boston, said that when the criteria were developed, the number of cases candidates were required to interpret had no scientific backing. "We found out that they are meaningful metrics, as those with Level 3 training performed statistically better than those with Level 2 training," he said.

Interestingly, the few radiologists who took the exam, approximately 10 percent of the total, performed slightly better than the cardiologist examinees, Abbara noted.

"We don't really know why that is, but we can speculate a number of reasons including that perhaps only the top radiologists considered taking the test," he said.