Image quality in mammography faces a Goldilocks problem: because of concerns about radiation dose, radiologists want the best image possible while also minimizing risk to the patient.
Researchers from the Radiologic Sciences Department in the Faculty of Allied Health Sciences at the University of Kuwait published a study in Radiography on May 29 that explores the effect of exposure factors on image quality in screening mammography units in Kuwait.
"The primary objective of our study was to evaluate the difference in image quality between the two target-filter combinations used in screening mammography," said the study's lead author, K. Alkhalifah, MD.
The mammography units screened in the study used tungsten targets with rhodium and silver filters. Optimizing radiographic technique, according to the study, is important because of the radiation doses to which asymptomatic patients are exposed, especially in breast cancer screening and diagnostic imaging. Although radiation levels in mammography screening should remain as low as possible, Alkhalifah and his colleagues asserted that image quality should remain as high as possible.
"The purpose of optimizing radiographic techniques is to establish standardized imaging protocols and balance image quality with patient radiation dose," Alkhalifah said.
Alkhalifah and his colleagues analyzed the visibility of fibers and specks at different kVp values with tungsten-rhodium and tungsten-silver target-filter combinations. Overall, they concluded that the tungsten-rhodium combination provides better screening image quality than the tungsten-silver combination.
"Comparing the two target/filter combinations at 30 kVp, there were significant differences in the visibility of fibers, specks, and overall image score," Alkhalifah said. "The results of this study will provide useful information for the proper selection of optimal exposure techniques for specific digital mammography units."