At Your Service: Will Service Oriented Architecture Add Interoperability to Imaging?
Service Oriented Architecture (SOA) is not a new concept, but the medical field has been slow to adopt it. This could change as more and more radiologists, administrators and IT leaders recognize its value in a field that increasingly demands interoperability.

While service-oriented architecture may be the “flavor of the month” among radiologists, says Paul Chang, MD, medical director of enterprise imaging and SOA infrastructure at the University of Chicago Hospitals, it’s based on a concept that’s been around for years. As we often lament, healthcare tends to lag behind nearly every other business when it comes to implementing information technology, and SOA is no exception, according to Chang. But, he believes SOA is a concept whose time has come.

“In medicine, we need to be able to support a complex workflow,” says Chang. “And in order to achieve that, we need information from various kinds of systems and databases. I need to look at images from PACS, clinical information from EMRs, pathology reports. I have to talk to billing systems. So, in order to support that complex workflow, we need access to information from systems that don’t natively talk to each other. SOA does that. SOA attempts to address interoperability.”

Chang’s favorite example of SOA success is Amazon. During a single customer transaction of just a few minutes, Amazon’s backend interacts with dozens of different databases (warehouses, banks, delivery companies), he notes. It is seamlessly interoperable, and lightning quick to the user.
But, Chang adds, if Amazon operated the way most PACS do, a customer transaction wouldn’t be so seamless, and would require the customer to log in and out of systems in order to buy the correct book, access the funds necessary to purchase the book, and make the necessary arrangements to ship the product.

From a business perspective, the results for Amazon would be catastrophic, Chang notes, but in radiology, “that’s what we do every day, hundreds of times a day. But when we go to Amazon, it’s all done seamlessly. What’s the difference? It’s SOA.”

In health IT today, the only way to consume information is to use specific applications. “So, if I want to look at images, I have to look at the PACS application,” he points out. SOA adds a level of sophistication and interoperability through “loose coupling.”

“We disengage the relationship between the content—which is the valuable stuff—and the presentation or display of the content,” says Chang. “That is decoupled, so now instead of only being able to access the content through a specific application, you create a middle layer—a SOA layer called an enterprise service bus—that can talk to various databases, get the content, and deliver it in a form that’s universally usable.”
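The loose coupling Chang describes can be sketched in a few lines. In this illustrative example, adapters wrap the back-end systems and a minimal "service bus" routes requests to them, returning content as plain, presentation-neutral data that any front end could render. All class and field names here are hypothetical, not a real PACS or EMR API.

```python
class PacsAdapter:
    """Wraps a (simulated) PACS database."""
    def get_images(self, patient_id):
        return [{"patient_id": patient_id, "modality": "MR", "series": 3}]

class EmrAdapter:
    """Wraps a (simulated) EMR database."""
    def get_vitals(self, patient_id):
        return {"patient_id": patient_id, "temp_c": 38.4}

class ServiceBus:
    """Minimal 'enterprise service bus': routes named requests to
    registered adapters and returns presentation-neutral content."""
    def __init__(self):
        self._services = {}

    def register(self, name, handler):
        self._services[name] = handler

    def request(self, name, **kwargs):
        return self._services[name](**kwargs)

bus = ServiceBus()
bus.register("images", PacsAdapter().get_images)
bus.register("vitals", EmrAdapter().get_vitals)

# Any client (PACS viewer, portal, reporting tool) consumes the same
# content through the bus, decoupled from the source systems.
print(bus.request("vitals", patient_id="P123")["temp_c"])
```

The point of the pattern is that the viewer never talks to the EMR or PACS directly; swapping out a back end only requires registering a different adapter.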

Here’s how SOA works at the University of Chicago. A physician, for example, may look at images on a PACS and at the same time wonder whether the patient has a fever. In most cases, that would require the physician to log separately into the EMR. Instead, with a click of a button, Chang can have that patient information, namely the patient’s recent temperature, come up automatically within his PACS application.

“It’s the same content as the EMR,” says Chang, “but it’s orchestrated to my appropriately idiosyncratic requirements where I don’t have to be the integration agent. SOA does the integration for me and that’s a big advantage.”

In addition to the advantages SOA provides for quality patient care and workflow efficiency, Chang also makes a business case for SOA.

“Patients are no longer passive health consumers,” he says. “They shop around. If, God forbid, a patient has cancer, he wants a one-stop shop. He doesn’t want the traditional routine where he gets a CT exam one day and because there is a delay in delivery he has to come back the next day to see an oncologist. A radiology department that can offer same-day service adds value and has given itself a competitive business advantage,” says Chang.

To do this, the first step is a “really intelligent” PACS worklist, says Chang. A radiology information system only knows what has been scheduled to be imaged and where the order originated, he explains; it doesn’t know whether the patient has also been scheduled to be seen by another scheduling system.

“But all we had to do was build a SOA scheduling service that basically says that for every patient who is scheduled to be seen, when a study comes in through DICOM in my PACS, my SOA intelligent agent can recognize that the patient is scheduled to be seen in the clinic,” says Chang. “I will prioritize this and signal to the PACS worklist that this needs to be at the top of the list. It’s a very powerful tool.”
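The prioritization logic Chang describes can be sketched simply: when a study arrives via DICOM, an agent consults a clinic-schedule service and, if the patient has a same-day visit, bumps the study to the top of the worklist. The data structures and names below are illustrative assumptions, not the University of Chicago implementation.

```python
clinic_schedule = {"P123"}   # patient IDs scheduled to be seen in clinic today
worklist = []                # PACS reading worklist; index 0 is the top

def on_study_received(study):
    """Called by the SOA agent whenever a new study arrives via DICOM."""
    if study["patient_id"] in clinic_schedule:
        # Patient has a same-day clinic appointment: prioritize the study.
        worklist.insert(0, study)
    else:
        worklist.append(study)

on_study_received({"patient_id": "P999", "accession": "A1"})
on_study_received({"patient_id": "P123", "accession": "A2"})
print(worklist[0]["accession"])  # the clinic patient's study reads first
```

In a real deployment the schedule lookup would be a call to a separate scheduling service rather than an in-memory set, which is exactly the kind of cross-system query SOA is meant to make routine.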

SOA speeds up algorithms

The Medical Imaging Informatics Innovation Center (MI3C) is a collaboration of the Mayo Clinic and IBM that started three years ago thanks to a video game console. The genesis of the collaboration was the release of the gaming platform PlayStation 3, says Brad Erickson, MD, PhD, co-director of MI3C and head of Mayo’s Radiology Informatics Lab.

 “A lot of mathematical calculations are done in computer graphics and gaming that are similar to the kinds of calculations we use in medical imaging,” he notes. “And the thought was that learning how to use high performance computing tools and applying them to medical imaging informatics would be a good source of collaboration.”
Mayo and IBM have, according to Erickson, built complex, advanced analytic algorithms that align and analyze images quickly and accurately. They decided to use these advanced algorithms in tandem with a SOA-based workflow solution (called AMIS—Advanced Medical Information Systems) to improve the image comparison and analysis process executed by radiologists.

The first project targeted the process of identifying brain aneurysms with MR angiography. In the MI3C project, once images were acquired, they were automatically routed to MI3C servers, where algorithms aligned the images properly and then analyzed them to find and visually mark potential aneurysms. It was a clear and quick success, according to MI3C researchers: this automatic method of detecting aneurysms resulted in a 95 percent accuracy rate, compared with a 70 percent accuracy rate for manual interpretation.
Where SOA comes in, says Erickson, is that while MI3C may have devised an algorithm for finding aneurysms in MR angiography images, Mayo does countless MRIs, and not all of them are of the head. Those of the head aren’t necessarily MR angiograms, and this particular algorithm only works on MRA sequences. Enter AMIS. It looks at every exam that’s been done on an MRI scanner at Mayo and examines the properties of those exams. And when it finds an appropriate exam, according to Erickson, “it says, ‘Aha! This is one of those MR angiograms of the head.’ It proceeds to fire off the dataset to the algorithm, collects the results once the data have been analyzed, and then sends the data on to a radiologist.”

The way SOA works in this process, says Rick Stevens, the lead developer of AMIS, is that the first order of business is to define a set of business rules that let the researchers detect when they have received an imaging series they can perform their analytics on. The business rules define the criteria for recognizing the desired imaging series, which can then be “bound” to an appropriately detailed and described workflow that determines what happens to those images.
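The rule-to-workflow binding Stevens describes might look like the following sketch: a predicate recognizes a qualifying series, and a router dispatches matching series to the workflow bound to that rule. The rule fields, series attributes, and workflow steps are illustrative assumptions, not the AMIS implementation.

```python
def is_head_mra(series):
    """Business rule: only head MR angiograms qualify for the
    (hypothetical) aneurysm-detection algorithm."""
    return (series["modality"] == "MR"
            and series["body_part"] == "HEAD"
            and "ANGIO" in series["description"].upper())

def aneurysm_workflow(series):
    """Workflow bound to the rule: run the analytics, then package
    results for routing back to a DICOM node the radiologist can view."""
    return {"series": series["uid"], "findings": "marked aneurysm candidates"}

def route(series, bindings):
    """Dispatch the series to every workflow whose rule matches it."""
    return [workflow(series) for rule, workflow in bindings if rule(series)]

bindings = [(is_head_mra, aneurysm_workflow)]
series = {"uid": "1.2.3", "modality": "MR", "body_part": "HEAD",
          "description": "MR Angio Circle of Willis"}
print(route(series, bindings)[0]["series"])
```

Adding a new analytic, such as the thin-slice lung CT project mentioned later, would then mean appending one more (rule, workflow) pair rather than modifying the router.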

“In the aneurysm case, we’re routing those results back to a DICOM system that the radiologist has access to so he can actually view the results of the analysis,” says Stevens. “We’re also able to do a separation of the three-vessel region of the brain and route that back to the scanner so that the technologist running the exam can view those results and include those in the study. So it really starts with a basic workflow that includes storing images and capturing metadata, but we also are able to add additional ‘service’ calls in specific cases where we want to do analytics.”

What this workflow engine did in the brain aneurysm project, says Erickson, was to handle the task of checking all the MR scans that came into it, “properly dole out the MRA exams to computing processes that are trying to find aneurysms, collect the results and post them to the radiologist. It is a very modular, flexible way to add other computing algorithms.”

And that flexibility is what is supposed to distinguish SOA, says Bill Rapp, chief technical officer of healthcare and life sciences for IBM and co-director of the Medical Imaging Informatics Innovation Center. “What’s particularly useful about this approach,” says Rapp, “is that it is extendable by the customer, so that if others were to use this they can add their own services, extend the workflow, modify the services, or replace the service we provide with one of their own choosing. It’s a very pluggable, customizable type of architecture, which SOA proponents have been touting for a long time as one of its advantages.”

AMIS also is very fast, and very accurate. In seeking out aneurysms, technologists must trace out the three-vessel region of the brain. In the aneurysm detection project, Stevens explained, Mayo was able to identify those vessel regions and automatically create renderings. “So this workflow engine also doles out the MRA to a separate process that identifies those vessel regions and produces renderings, collects them and sends them back to the radiologist,” says Erickson.
The mapping takes the computer four minutes to complete, compared with the 12 to 15 minutes it takes a technologist. And because a busy technologist must fit the tracing in between imaging patients, the exam is often delayed an hour or more before the radiologist can read it. So the algorithm detection process is not only about accuracy, but about speed and efficiency as well, says Erickson.

M13C has other projects in the pipeline as well. One of the major goals of the brain aneurysm project was “to prove out the flexibility of the underlying SOA platform to allow us to plug in new stops, workflows and analytics easily and in a non-disruptive manner,” says Stevens. “And we’ve seen that to be true.” According to Rapp, one new project involves thin-slice lung CT, which has been “fairly easy to add to the mix, because once the algorithm was written, Rick [Stevens] was able to modify the workflow fairly quickly.”

Globus MEDICUS project

The Globus MEDICUS (Medical Imaging and Computing for Unified Information Sharing) project was created in 2003 with funding from the National Institutes of Health (NIH) to eventually connect more than 40 international medical centers, allowing the Children’s Oncology Group (COG) to participate in multi-center clinical trials while linking them to the Image Data Center at the University of Southern California (USC). The project is now expanding to the more than 200 centers within the COG.

The problem at the time, says Stephan Erberich, co-director of medical information systems at the Information Sciences Institute of the USC, was that these clinical trials had imaging endpoints, “and it was a very cumbersome process to get these images into standardized review, which is a very important component of clinical reviews: having two independent reviewers who actually judge and review images independently from the institutional reviewer and get an unbiased combined result.”

Erberich and other founders of the project took a look at the technologies available at the time and decided to take a “new and fresh” approach by looking at SOA using the Globus toolkit. MEDICUS created an abstract layer between data, metadata and users, linking DICOM storage service providers and registries. Data, such as images, could be accessed on this layer, decoupled from the specific health providers’ databases, networks and IT resources, such as PACS.
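The decoupling in that abstract layer can be sketched as a registry that maps study identifiers to storage providers, so a reviewer fetches images through the layer without knowing which site holds them. The class and function names below are illustrative assumptions, not the Globus toolkit API.

```python
registry = {}  # study UID -> the storage provider that holds it

class StorageProvider:
    """A (simulated) DICOM storage service at one trial site."""
    def __init__(self, name):
        self.name = name
        self._studies = {}

    def store(self, uid, data):
        self._studies[uid] = data
        registry[uid] = self  # advertise the study in the shared registry

    def fetch(self, uid):
        return self._studies[uid]

def retrieve(uid):
    """Reviewer-facing call: location-transparent access via the registry."""
    return registry[uid].fetch(uid)

site_a = StorageProvider("COG-Site-A")
site_a.store("1.2.840.999", b"<dicom bytes>")
image = retrieve("1.2.840.999")  # works regardless of which site stored it
```

This is the “cloud”-like experience Erberich describes below: the reviewer asks for a study and the layer resolves where it lives.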

From a clinical trials perspective, says Douglas Reaman, MD, chair of the Children’s Oncology Group, the big benefit is that independent clinical reviews can now be done “in real time. Radiology reviewers can actually review the films in real time at their desktops at home or at work,” he points out. “They don’t have to travel to some central facility where the films are kept and archived.”

“From a user perspective, it was like a ‘cloud’ concept,” says Erberich. “You go to Google and execute a search, but as to how that search is executed, you really don’t know anything about it. That’s the way it looks from the user perspective. The reviewer gets the images that are relevant to do specific tasks, but in a seamless way.”

What was particularly valuable from an institutional perspective, says Erberich, was the ease of implementation. The technology needed to implement MEDICUS at each of the institutions was minimal, “so there was no need for IT people to install it or configure it. This was important considering that for many of these trial sites there wasn’t a lot of IT support available, so the fact we were able to bring in our own hardware actually speeded up the process of bringing in those images.”

Is SOA the future of radiology?

As one of SOA’s strongest advocates, Chang notes that the concept has been embraced at the University of Chicago, and that it is used by every other industry, from banking and manufacturing to retail and the military, “without exception.” He asks, “If it’s good for everything else in the world to create this seamless integration without requiring the human user to be the integrating agent—which just adds fatigue, error and inefficiency—why don’t we just use the same technology in medicine?” Stay tuned.