Doing more with less. It is the mantra of healthcare today, as providers across the U.S. strive to offer better care at a reduced cost. In this environment, the necessity of comparative effectiveness research becomes increasingly apparent. Can challenges specific to radiology be overcome to cogently conduct comparative effectiveness research, making practice more patient-centered in the process?
Comparative effectiveness research (CER) has secured its spot in analyzing and evaluating the performance of procedures, diagnostic tests, and treatments in relation to one another. It aims to identify the safest, most effective, and in some instances most cost-effective approach among a variety of procedures and strategies, thereby eliminating those that do not improve patient management or outcomes. CER also strives to improve the decision making of the multiple stakeholders involved in healthcare—providers, patients, payers, researchers, policymakers, and manufacturers—on both an individual and population basis.
Complexity and challenges behind cogent CER
CER holds much promise for radiology by demonstrating the value of imaging. This specialized research traces the path from diagnosis to patient outcome, showing patients whether their scans and other procedures are purposeful—or not.
Despite the benefits of CER, execution remains a challenge. “The outcomes of a comparison study between different imaging strategies may vary depending on patient populations,” says James Rawson, MD, FACR, of Georgia Regents University Augusta. According to Rawson, small subpopulations that benefit from clinical services must be identified in CER.
“To demonstrate radiology’s value and possibly head off further reimbursement cuts to the specialty, CER has to show a linkage between the actual radiology intervention and the outcome for the patient,” adds Rawson.
That link is another challenge, says Alfred Berg, MD, MPH, of the University of Washington School of Medicine in Seattle and member of the Patient Centered Outcomes Research Institute (PCORI) Methodology Committee. Based in Washington, D.C., PCORI aims to improve healthcare decision making, delivery, and outcomes through the production and promotion of high-quality, evidence-based information. To date, PCORI has approved 279 awards totaling more than $464.4 million to fund patient-centered CER projects. “Interventional radiology is at the core of a lot of issues related to CER. It’s often easy to have research that shows that one practice is better than not doing anything, but there is a struggle to find research that directly compares two different modes,” says Berg.
“It’s hard to find research for CER that examines different classes of treatment,” Berg continues. “While a series of reports is needed to create questions that make this type of research possible, the demand for time, patient resources, and funding makes it more difficult. However, this is the kind of information we need, especially in radiology.”
Another obstacle to conducting CER in radiology is rapidly changing technology. Large-scale clinical trials attempt to compare mature diagnostic imaging techniques, but the constant evolution of those techniques makes this task difficult, says Rawson.
Workflow and exam ordering present another major challenge for radiology compared with other medical specialties. While CER in radiology focuses on radiologic exams, radiologists do not order them. Consequently, the ordering behavior of referring physicians needs to change, explains Rawson. “If we want to alter behavior, we need to look at all of the things that can be done to ensure that the desired behavior is the easiest pathway for referring physicians,” he remarks.
A study published in the British Medical Journal by Bessen et al exemplifies this issue. Though the Ottawa ankle rules (OAR) are widely accepted as a highly accurate clinical decision tool for identifying ankle injuries that require radiography, the application of these rules in emergency departments varied considerably. The study’s authors found that a multifaceted change strategy that included education, updated problem-specific radiography request forms, reminders, audit and feedback, and “gatekeeping” by radiographers improved clinicians’ adherence to the OAR.
“Checklists, computerized order entry sets, clinical pathways, and decision support software are all ways to make standardization