Radiology departments are increasingly shopping around for artificial intelligence to bolster the quality of care for patients. But evaluating an AI investment is much like evaluating any other health technology purchase, and it must be done carefully.
That’s according to imaging experts from MedStar Georgetown University Hospital and the University of California San Francisco, who caution against purchasing a solution solely because AI is a hot topic. The process, they wrote, begins with defining a problem and analyzing the surrounding environment, including workflow and costs.
“If a tangible problem is identified and AI tools targeting that problem are commercially available, then one can begin to evaluate whether purchasing an AI tool makes sense,” they wrote Tuesday in Academic Radiology.
A top consideration should be how AI will work in practice, the authors noted. This involves understanding the research behind the tool.
For example, performance is typically conveyed using a receiver operating characteristic (ROC) curve, along with sensitivity and specificity data. Evidence from AI tested in real clinical settings is more valuable than results from controlled research environments.
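A minimal sketch of what those metrics mean, using invented labels and model scores (the data and threshold below are illustrative assumptions, not figures from the guidelines):

```python
# Hypothetical example: sensitivity, specificity, and ROC AUC for a binary
# AI classifier, computed from made-up ground-truth labels and model scores.

def sensitivity_specificity(labels, scores, threshold):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP) at one cutoff."""
    tp = sum(1 for y, s in zip(labels, scores) if y == 1 and s >= threshold)
    fn = sum(1 for y, s in zip(labels, scores) if y == 1 and s < threshold)
    tn = sum(1 for y, s in zip(labels, scores) if y == 0 and s < threshold)
    fp = sum(1 for y, s in zip(labels, scores) if y == 0 and s >= threshold)
    return tp / (tp + fn), tn / (tn + fp)

def roc_auc(labels, scores):
    """Probability a random positive case outscores a random negative one
    (ties count half) -- equivalent to the area under the ROC curve."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0, 1, 0]                   # 1 = disease present
scores = [0.9, 0.8, 0.4, 0.3, 0.2, 0.6, 0.7, 0.1]   # model confidence

sens, spec = sensitivity_specificity(labels, scores, threshold=0.5)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} "
      f"auc={roc_auc(labels, scores):.2f}")
# prints: sensitivity=0.75 specificity=0.75 auc=0.94
```

The ROC curve itself is just these sensitivity/specificity pairs traced out over every possible threshold, which is why a single AUC number can hide how a tool behaves at the operating point a department would actually use.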
Analyzing the data used to develop an algorithm is also important. Has it been trained and tested on patients that match your institution’s population? Small data changes can significantly hamper AI models, the authors explained.
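A toy sketch of why that matters, assuming a hypothetical model with a fixed decision threshold and entirely invented scores: a cutoff tuned on the vendor's development population can quietly lose sensitivity on a local population whose score distribution differs only slightly.

```python
# Toy illustration (made-up numbers): a fixed threshold chosen on the
# vendor's development data performs worse when the local population's
# score distribution shifts modestly.

THRESHOLD = 0.5  # decision cutoff chosen on the development data

def sensitivity(positive_scores, threshold=THRESHOLD):
    """Fraction of confirmed-positive exams the model flags."""
    flagged = sum(1 for s in positive_scores if s >= threshold)
    return flagged / len(positive_scores)

# Scores the model assigns to confirmed-positive exams in each population.
dev_population   = [0.9, 0.8, 0.7, 0.6, 0.55, 0.52, 0.9, 0.85]   # vendor test set
local_population = [0.6, 0.55, 0.48, 0.45, 0.7, 0.4, 0.52, 0.9]  # your institution

print(f"dev sensitivity:   {sensitivity(dev_population):.2f}")    # 1.00
print(f"local sensitivity: {sensitivity(local_population):.2f}")  # 0.62
```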
Some practices may want to independently validate performance—if it hasn’t already been done by the vendor. This can be labor-intensive, the authors cautioned. Vendors or consultants can help those without the required experience or resources for the job.
How a tool fits into existing workflows is critical, according to Ross W. Filice, MD, chief of imaging informatics at MedStar, and colleagues. Radiologists should ask how results are displayed. How easy are they to understand? Can the AI explain its recommendations? And will it work alongside other systems?
The tool must function in tandem with radiologists, “without hampering their efficiency or capability,” the authors wrote.
Researching the vendor itself, including its existing customers, financial health and product road map, is also informative. Many companies are new to healthcare and underestimate the time required for integration.
Purchasers can choose between two implementation strategies: "best of breed" or a platform approach. The former gives rads flexibility to choose any vendor or tools, but is less scalable. On the flip side, departments may opt for a vendor that offers a platform through which outside companies deploy their own tools. This can save costs but will limit the "universe" of tools and customization options.
Regardless of choice, the institution’s IT group must be involved before selecting a vendor.
“Successful purchasers will engage with their IT department before purchase so that infrastructure requirements are considered in the total cost of ownership,” the authors wrote.
Radiology departments can choose among many business models for deploying AI, from large one-time purchases to ongoing subscription fees. Most vendors, the authors noted, tie fees to per-click usage or exam volume.
Quality improvement is the best reason for purchasing AI but can be hard to measure. Practices should consider whether they want to demonstrate these gains and whether the vendor will help.
Quality and Safety
Informatics and quality and safety leaders must partner to design a program that monitors the AI, Filice et al. noted.
Whether a tool reinforces existing healthcare disparities can be difficult to detect. AI buyers can evaluate the gold-standard data used to train algorithms and continually monitor trends for possible quality issues.
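One simple form such monitoring might take, sketched below with a hypothetical subgroup label and invented records, is tracking the model's sensitivity separately per patient subgroup and flagging gaps:

```python
# Hypothetical monitoring sketch: compare a model's sensitivity across
# patient subgroups to surface possible disparities. Records are made up;
# in practice they would come from a reviewed gold-standard sample.
from collections import defaultdict

records = [
    # (subgroup, confirmed_positive, model_flagged)
    ("A", True, True), ("A", True, True), ("A", True, False), ("A", True, True),
    ("B", True, True), ("B", True, False), ("B", True, False), ("B", True, True),
]

def sensitivity_by_group(records):
    """Per-subgroup fraction of confirmed-positive cases the model flagged."""
    flagged = defaultdict(int)
    total = defaultdict(int)
    for group, is_positive, was_flagged in records:
        if is_positive:
            total[group] += 1
            flagged[group] += was_flagged
    return {g: flagged[g] / total[g] for g in total}

print(sensitivity_by_group(records))  # {'A': 0.75, 'B': 0.5}
```

A persistent gap like the one above would be a trigger for review, not proof of bias on its own, since subgroup samples are often small.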
You can read the guidelines in their entirety here, including an AI purchasing checklist.