Peace of Mind: Long-term Archives that Deliver


When it comes to long-term data storage in today’s healthcare imaging environment, it seems as though there’s never enough. The advent of digital imaging modalities and PACS products has compelled IT departments to scale up archive space to meet the increased storage needs of their users.

For many institutions, near-term image archives, generally understood to hold six months to three years of a facility’s most current studies, are reaching capacity ever more quickly. The increasing utilization of diagnostic imaging by a widening variety of medical specialties, coupled with an aging population that requires more healthcare services, suggests that image archive strategies may need to be revisited to handle even greater quantities of image data.

A variety of strategies are available to address long-term archive implementation, depending on a facility’s needs and resources.


Downey moves outside the box



Marjorie Parsons, director of imaging services for Downey Regional Medical Center in Downey, Calif., was a key force in the 199-bed community hospital’s recent conversion to a full digital radiology workflow.

Her department conducts more than 90,000 imaging exams annually. Within the past year, Parsons has deployed a DR Systems PACS, and slashed the utilization of film.

“As part of our new PACS, we installed an 8-terabyte (TB) on-site archive for our study storage,” she says. “The hardware was selected with the help of our PACS vendor and was installed as part of that system. It holds about two years’ worth of exams.”

In addition to implementing the near-term archive, Parsons thoroughly investigated long-term archive strategies before installing PACS. Business continuity and disaster recovery were the prime drivers in her research.

“We wanted complete business continuity to keep the department functioning in the event of a disaster,” she says.

The facility considered purchasing and installing its own on-site backup archive, but physical space is at a premium in the hospital, Parsons says. She also looked at leasing an off-site facility to house a backup archive, but this, too, was not an acceptable option: it would have required Downey’s IT staff to manage a system in a remote location, sending personnel off-site on a frequent basis.

The third alternative, which Parsons selected, was to contract with a managed archive service provider, InSiteOne, to handle Downey’s off-site storage and disaster recovery. With the service, once an imaging study is complete, images are transmitted via a digital connection set up by the vendor to its primary storage facility in Connecticut, and then mirrored to a secondary site in Arizona.

Parsons says the application service provider (ASP) model was chosen because it shifted the expense of a long-term archive from the capital equipment budget to the operating budget. In addition, moving the service to an outsourced provider freed her from concerns about scalability and technology obsolescence.

“We don’t need a crystal ball to predict our future exam volume to purchase archiving hardware,” she says. “This also frees us from the expense and time of keeping our storage system current as technologies change.”


CHW sets sights on standards


Standards compliance is of paramount importance for the healthcare informatics strategy of multi-state healthcare provider Catholic Healthcare West (CHW), says Phoenix-based Steve Hight, CHW director of strategic technology projects. CHW, the sixth largest hospital chain in the U.S., has more than 50,000 employees and 8,000 physicians treating more than five million patients per year.

“Adhering to strict standards mitigates the risks associated with complex systems, which is critical when dealing with the volume of data acquisition that we do,” Hight says.

He is responsible for CHW’s IT architecture strategy and planning as well as for overseeing strategic IT projects. During his tenure with CHW, Hight has led major projects in areas such as data center consolidation, open source technologies, and HIPAA compliance for clinical systems and other critical systems.

All information systems in the CHW enterprise are thoroughly compliant with CHW’s standards, which are specified in detail during the selection process for new best-of-breed technologies, Hight says.

For example, CHW diagnostic imaging feeds a heterogeneous mix of PACS products from vendors such as DR Systems, Emageon, and Fujifilm Medical Systems USA, among others, into redundant long-term archives in Rancho Cordova, Calif., and Phoenix, built on Network Appliance technology. Near-term archives, holding approximately a year’s worth of data, are deployed at individual CHW healthcare sites.

“Right now, our growth rate [in enterprise digital imaging] is about 75 TB a year,” Hight notes. “And we expect that to double year-over-year for the foreseeable future as we have an increasing number of specialties contributing images.”

In addition to the increase in non-radiology medical subspecialties generating image data, CHW brings new facilities into its network on an ongoing basis, which impacts the organization’s long-term archive strategy.

CHW’s long-term archive is entirely on serial ATA (SATA) spinning disk, which is mirrored in separate locations in two states. Hight says this strategy allows for speedy and efficient recovery with no latency in the event of a system outage, and ensures that CHW caregivers and patients will have access to critical medical data across its network in the case of catastrophic disaster.


Metadata points the way at Cleveland Clinic


Robert Cecil, PhD, Cleveland Clinic’s network director, began laying the groundwork for the Cleveland Clinic’s long-term archive in 1996. Working with Sun Microsystems, the facility implemented Storage Archive Manager with QFS (SAM-QFS) storage-management software.

“SAM-QFS is a key technology that lets us do some very critical things,” Cecil says. “It lets us put all the pointers [to our data] on separate media. This lets us independently back up the pointers, which are very small. So, if you lose a disk, you no longer have to recover the entire file system; you only have to recover the data on that one disk.”

According to Cecil, this provides Cleveland Clinic a tremendous advantage in terms of data recovery and data availability. In addition, it allows the archive file system to be easily extended, he says. This last element is crucial for the institution, which has numerous facilities in the U.S. and Canada and is in the process of bringing on a site in Abu Dhabi, capital of the United Arab Emirates.

“For example, if you pull out a 200 gigabyte (GB) disk and put in a 1 TB disk, you don’t have to rebuild the file system because it has no pointers on it,” he says. “You just have to let the system know you have a bigger disk, and it will automatically fill it to the specified capacity.”

Another critical capability of the application is that it allows a single piece of metadata to concurrently point to multiple copies of the data, Cecil says. Because the system is media independent, it allows flexibility in acquiring storage media, which keeps the cost down. It also mitigates availability loss when a disk is lost.
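The design Cecil describes can be illustrated with a toy model. This is a hypothetical sketch, not SAM-QFS code or its actual data structures: a small index keeps metadata pointers separate from the data, records multiple copies of each study on different media, and falls back to a surviving copy when a disk is lost.

```python
class MetadataIndex:
    """Illustrative stand-in for a pointer-based archive index.

    Metadata (the pointers) lives apart from the payload data, and each
    entry can reference several copies on independent devices.
    """

    def __init__(self):
        self._entries = {}   # study id -> list of (device, path) copies
        self._devices = {}   # device name -> {path: payload bytes}

    def add_device(self, name):
        self._devices[name] = {}

    def store(self, study_id, payload, devices):
        # Write one copy per device; the metadata entry records every copy.
        copies = []
        for dev in devices:
            path = f"/{dev}/{study_id}"
            self._devices[dev][path] = payload
            copies.append((dev, path))
        self._entries[study_id] = copies

    def fail_device(self, name):
        # Simulate losing a disk: its payloads vanish, the pointers survive.
        self._devices[name] = {}

    def read(self, study_id):
        # Try each recorded copy in turn; any surviving copy serves the read.
        for dev, path in self._entries[study_id]:
            payload = self._devices.get(dev, {}).get(path)
            if payload is not None:
                return payload
        raise IOError(f"all copies of {study_id} lost")


idx = MetadataIndex()
idx.add_device("disk_a")
idx.add_device("disk_b")
idx.store("study1", b"pixel data", ["disk_a", "disk_b"])
idx.fail_device("disk_a")           # lose one disk
recovered = idx.read("study1")      # read is served from the other copy
```

Because the pointers never live on the data disks themselves, losing a device (or swapping it for a larger one) leaves the index intact, which is the property Cecil credits for fast recovery and easy extension.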

“We have more than 300 disks in a single file system,” Cecil says, “so you do lose one every now and then.”

However, data loss at the institution is so small that it can’t be measured, he says. Part of the reason for this is that any archive medium is burned in for a period of 30 to 90 days and must demonstrate zero loss of data before it is brought onto the enterprise.

Although each site has its own near-term archive capabilities, Cleveland Clinic continuously backs up data from all its facilities to its long-term archive in almost real time, Cecil says. This allows healthcare professionals throughout the enterprise to access diagnostic images from any site worldwide as soon as the exam is completed.

The multinational institution is currently archiving 10 TB of image data per week, Cecil says. Of note, almost 20 TB of image data are being accessed on a read basis during the same time period.

“Long-term archive is sort of the wrong terminology,” he says. “It’s really a lending library, but every time a book comes in it goes out twice as often. It’s not really a backup scheme; it’s a living, breathing, used-every-day system.”