Business Continuity: Facilities Tackle Terabytes

Healthcare facilities are constantly increasing the amount of data they have to manage and store securely. From rising patient volume to more and more modalities that create larger datasets, double-digit growth year over year is not unusual. But a wide range of solutions for high data availability and redundancy offer ways to keep downtime to a minimum, if not avoid it altogether.

In the summer of 2006, Iowa Health System experienced a complete failure of its PACS archive environment. The archive was corrupted beyond repair, says Bob Thompson, director of governance. It took weeks to recover the images and months to confirm that the recovery was complete. “That was fairly painful and it was only a 10 TB recovery,” he says.

Thompson was already in the process of looking for a new storage architecture based on new modalities that were in the planning stages and growth of about 15 TB a year. The health system was just getting started with PACS and Thompson saw a huge demand for the PACS environment within the state of Iowa. Cardiology PACS means huge image sets, and he knew three new dual-source scanners were coming this year. “You rack up the storage really fast, regardless of how you pare down the permanent images.” That demand for storage—and that’s just for PACS—meant they needed a substantial storage solution, Thompson says. Their choice was IBM’s Grid Medical Archive Solution (GMAS).


Constant connection



Iowa Health System uses the IBM GMAS as its storage solution.

Thompson has dozens of clinics he needs to connect and rural hospitals and non-affiliates that would like to connect to the system. Installation went very well, even with Thompson insisting that his team perform the acceptance testing. “That meant a great deal to us because some of the problems we’ve had is that our storage team has a lot to do. This really automates a lot of that.” That automation results in less dependence on the staff’s expertise and less risk of human error.

Thompson and his team installed primary and secondary systems from Bycast with an initial size of 30 TB each. The solution was “one of the easiest sells I’ve ever done” to Iowa Health System’s administration, he says. “One of the biggest selling points was that the cost was no more than we were already paying. If you counted what you were actually getting, the cost was less.” Plus, Thompson could show that the system was easily expandable, an important concern for the future. Thompson already has expanded GMAS to three major applications and expects storage for each of those to grow substantially.


Drive to digital


Since signing on with InSiteOne more than four years ago, Tower Diagnostic Centers in Tampa Bay, Fla., has gone from almost all analog imaging to almost all digital imaging. The organization was drawn to InSiteOne’s ability to index data in addition to storing it, says Chief Information Officer Don Fulk. Because the organization is not locked into a proprietary system, switching to a new vendor would not mean migration and conversion fees. Fulk also appreciated InSiteOne’s offsite redundancy in two locations, with no hardware to purchase and no need to dedicate an employee to the system.

Facilities have to decide what data are most important, Fulk says. “We are almost a 24/7 shop. All of our data are critical to us.”


Routine replication


Norton Healthcare in Louisville, Ky., has done more than talk about it when it comes to storage and business continuity planning. “Earlier this year we replaced our two main storage arrays with a DMX3 from EMC Corporation,” says Sean O’Mahoney, manager of client/server information systems. Norton had an 8730 Symmetrix as well as a DMX800. O’Mahoney replaced both and combined them into one DMX3. The DMX800 went to a remote site for replication. Norton also has two CLARiiON CX700s, two CX300s that are directly attached to servers at some of their hospitals and four Centeras.

All HIS data are replicated at the facility’s remote site. The Centeras archive radiology and cardiology PACS data, which is a switch from an optical platform. “We’ve got our archive and we’re able to use Centera’s native replication to send offsite,” says O’Mahoney. “We don’t have to spin it off to optical or magnetic tape, take it offsite and store it to have a third copy of data. We’re very pleased with it.”
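Whatever the archive platform, the payoff of native replication is that the offsite copy exists, and can be checked, without anyone handling tape. As a generic illustration only (this is not Centera’s interface, which works through its own SDK; the mount points and the idea of hashing files directly are assumptions made for the sketch), a replica audit might look like this:

```python
# Generic sketch of verifying an offsite replica against a primary archive.
# NOT the Centera SDK; paths and direct file hashing are assumptions.
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    """Hash a file in chunks so large image objects don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_replica(primary: Path, replica: Path) -> list[str]:
    """Return relative paths that are missing or differ on the replica."""
    problems = []
    for obj in primary.rglob("*"):
        if not obj.is_file():
            continue
        rel = obj.relative_to(primary)
        copy = replica / rel
        if not copy.exists():
            problems.append(f"missing: {rel}")
        elif sha256(obj) != sha256(copy):
            problems.append(f"mismatch: {rel}")
    return problems

if __name__ == "__main__":
    # Hypothetical mount points for the onsite archive and the offsite replica.
    issues = verify_replica(Path("/archive/primary"), Path("/mnt/offsite_replica"))
    print(f"{len(issues)} objects need attention")
```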

Over the next 12 to 18 months, Norton is investing in a disaster recovery data center that they will either lease or construct. “In some cases, we’re looking at clustering, active-active or active-passive set-ups so that we can run some systems from that data center,” says O’Mahoney. He will move assets from the current replication site to the new center and continue to replicate to those, while having plenty of space and power. The project, however, has been “on and off the books” for three of the more than four years O’Mahoney has been at Norton. “The mindset has changed over time, both due to new perspectives as well as growing realization of the dependence that we have as an organization on the data we have on the computers and network.” With electronic medical records, “it is vital for legal as well as safety reasons that we do everything we can to preserve that data.”
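In an active-passive setup of the kind O’Mahoney describes, a standby node at the recovery center watches the production node and takes over only when it stops responding. A minimal sketch of that monitoring loop follows, with the host name, health URL and promote_self() step as hypothetical placeholders rather than anything Norton runs:

```python
# Minimal active-passive failover loop: the standby polls the active node's
# health endpoint and promotes itself after repeated failures.
# The URL and promote_self() are hypothetical placeholders.
import time
import urllib.request

ACTIVE_HEALTH_URL = "http://active-node.example.org/health"  # assumption
CHECK_INTERVAL_S = 10
FAILURES_BEFORE_FAILOVER = 3

def active_is_healthy() -> bool:
    try:
        with urllib.request.urlopen(ACTIVE_HEALTH_URL, timeout=5) as resp:
            return resp.status == 200
    except OSError:  # covers connection errors and timeouts
        return False

def promote_self() -> None:
    # Placeholder: a real cluster manager would mount shared or replicated
    # storage, start the application services and claim the virtual IP.
    print("Promoting standby node to active")

def main() -> None:
    failures = 0
    while True:
        failures = 0 if active_is_healthy() else failures + 1
        if failures >= FAILURES_BEFORE_FAILOVER:
            promote_self()
            break
        time.sleep(CHECK_INTERVAL_S)

if __name__ == "__main__":
    main()
```

In practice this logic lives inside cluster software rather than a hand-rolled script; the point is only to show what “active-passive” implies.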

O’Mahoney says that he has a better handle on future growth these days. Now, Norton is building a new hospital and planning for digital mammography—two developments that will demand a lot of storage. The system also bought an imaging center which has seven different pieces of imaging equipment sending data to PACS.


Gwinnett’s growth


Gwinnett Hospital System in Lawrenceville, Ga., has all primary storage on EVA8000 and EVA5000 storage arrays from HP. “When Gwinnett bought the arrays five years ago, they brought in the system with 1.5 TB capacity,” says Rick Allen, service line director, information systems. “Our plan was for that to last three years. Nine months later, we brought in another 1.5 TB and then 10 TB.” Over five years, three more EVAs were completely filled. Now the system has grown to more than 100 TB of data.

That growth is due to a healthy mix of new modalities and volume, Allen says. Volume jumped from 250,000 procedures a year to about 400,000, and the system has added multidetector CT among other equipment. Allen has tried to make it easy to increase storage capacity. Since the system will be opening more outpatient centers, imaging volume will only increase.
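Numbers like Gwinnett’s lend themselves to simple back-of-the-envelope capacity projections. The sketch below shows the arithmetic; the average study size and the annual growth rate are assumptions chosen for illustration, not figures from Allen:

```python
# Back-of-the-envelope storage projection. The 80 MB average study size and
# 10 percent annual volume growth are illustrative assumptions only.
def project_storage_tb(current_tb: float,
                       procedures_per_year: int,
                       avg_study_gb: float,
                       annual_growth: float,
                       years: int) -> list[float]:
    """Return projected total storage (TB) at the end of each year."""
    totals = []
    total = current_tb
    volume = procedures_per_year
    for _ in range(years):
        total += volume * avg_study_gb / 1024  # convert GB added to TB
        volume = int(volume * (1 + annual_growth))
        totals.append(round(total, 1))
    return totals

# Example: 100 TB today, 400,000 procedures a year, 80 MB (0.08 GB) per study.
print(project_storage_tb(100, 400_000, 0.08, 0.10, 5))
```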

“I got here just at the cusp of senior management understanding that there was a need and that a major investment would be made,” he says. Allen still competes against investments such as another CT scanner for IT’s share of the budget, but he has been able to make just as compelling a case in terms of need.

Allen has engineered a solution around the goal of keeping downtime as brief as possible. “We’re building and scaling around ‘care critical’ data. Every piece of information needed to deliver care is online somewhere.” There are two copies of every image in one data center and two more copies of that same image in a secondary data center.
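A policy of two copies of every image in each of two data centers is straightforward to audit. A minimal sketch, assuming a hypothetical manifest that maps each image to the sites holding a copy:

```python
# Sketch of a copy-count audit for a "two copies per data center" policy.
# The manifest format (image ID -> list of site names) is an assumption.
from collections import Counter

REQUIRED_COPIES_PER_SITE = 2
SITES = ("datacenter_a", "datacenter_b")

def audit(manifest: dict[str, list[str]]) -> list[str]:
    """Return image IDs that fall short of the policy at any site."""
    shortfalls = []
    for image_id, locations in manifest.items():
        counts = Counter(locations)
        if any(counts[site] < REQUIRED_COPIES_PER_SITE for site in SITES):
            shortfalls.append(image_id)
    return shortfalls

# Example manifest: the second study is missing one copy at the secondary site.
example = {
    "CT-0001": ["datacenter_a", "datacenter_a", "datacenter_b", "datacenter_b"],
    "MR-0042": ["datacenter_a", "datacenter_a", "datacenter_b"],
}
print(audit(example))  # ['MR-0042']
```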


Mirrored images


Sun Health, with two hospitals and numerous clinics in Phoenix, Ariz., has experienced 18 percent growth in imaging procedures over the past few years and expects to top 300,000 annual procedures in about two years. To accommodate that growth, the organization worked with Network Appliance to create a mirror system, says PACS Administrator Micha Ronen. The two hospitals are seven miles apart. Each facility incorporates 16 servers, eight of which support the live system. The other eight are dark and are turned on when a failover is initiated. The servers are configured in clusters that provide load balancing and redundancy.

Should one server in a pair become unavailable, the second server ensures continuous operation. Storage filers total 19.5 TB and comprise two types: a 2 TB high-speed Fibre Channel filer for the applications and databases, and a 17.5 TB lower-speed SATA filer that stores the images.

When the project began in late 2004, the system’s IS department could not manage or provide PACS connectivity, Ronen says. So, he and his team built a subnetwork just for PACS, independent of the Sun Health backbone. Finding remote storage too costly and setting a goal of high availability led to the design. It was cost-prohibitive to have a totally automated mirror system with no intervention, Ronen says. “So we ended up building a system where we run several scripts. It takes about 15 minutes to be back online and be fully operational.” The system has been successfully tested. The connection between the two facilities runs in the background, and does not interfere with normal usage.
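A scripted failover of the kind Ronen describes usually amounts to a fixed sequence of steps that stops on the first failure so an administrator can intervene. A hedged sketch of such a runbook follows; the step names and script paths are hypothetical placeholders, not Sun Health’s actual scripts or NetApp commands:

```python
# Hypothetical failover runbook: run each step in order, stop on failure.
# Step names and script paths are placeholders for illustration only.
import subprocess
import sys

FAILOVER_STEPS = [
    ("Break the mirror and make secondary storage writable", ["/opt/pacs/bin/break_mirror.sh"]),
    ("Power on the eight dark standby servers", ["/opt/pacs/bin/start_standby_servers.sh"]),
    ("Repoint the PACS alias to the secondary site", ["/opt/pacs/bin/repoint_alias.sh"]),
    ("Run smoke tests against the PACS application", ["/opt/pacs/bin/smoke_test.sh"]),
]

def run_failover() -> None:
    for description, command in FAILOVER_STEPS:
        print(f"==> {description}")
        result = subprocess.run(command)
        if result.returncode != 0:
            sys.exit(f"Step failed: {description}. Stopping for manual intervention.")
    print("Failover complete; secondary site is live.")

if __name__ == "__main__":
    run_failover()
```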


Future forces


O’Mahoney anticipates changes coming to the field of data availability and storage. He sees some sort of federal regulation on email retention in the making. With email having “become a huge enterprise application for us” and the ambiguity of the HIPAA regulations when it comes to email, further guidelines in the next couple of years might be necessary. One of the most challenging aspects, says O’Mahoney, is not only retaining data, but retaining it in a manner that allows it to be reproduced in a reasonable amount of time.

Another challenge is managing the amount of data that continues to be generated. The volume “almost dictates that our archive strategy be, as we have constructed it, nearline Centera type of storage, replicated offsite, as opposed to backing up tapes at night.” Establishing that type of system probably will be a struggle for smaller facilities, O’Mahoney says. “There are, and will continue to be, [health] systems that can only afford to back up data on tape and store them offsite.” 

With more companies in the news experiencing data breaches, it is vital that patient-related information be secure. One step Norton has taken to prevent unauthorized access is to securely send data that need to be archived offsite through the archive platform itself rather than through a third party, where the data could potentially fall into the wrong hands.

 

Business Continuity: Lessons Learned
Any facility that is considering a significant investment is wise to consider the advice of those who have already traveled down that road. In fact, that’s exactly the advice of Rick Allen, service line director, information systems for Gwinnett Health System in Lawrenceville, Ga. “Partner with a company experienced in several industries,” he says. “Healthcare is in its infancy in adopting these kinds of things.” A company experienced in other industries can leverage what it has already done elsewhere.

Allen also recommends building a very scalable system. “When you grow 100 times when you expected your data to double, when your needs far exceed your expectations, you need something easy to add to, to keep up with demand.” Build for availability and reliability, he says, and make sure you’ve got good copies of your data. “All those core things that go around running a good IS operation. You’ve got to make sure you’ve got those in place.”

Know your requirements before you sit down with a vendor, says Bob Thompson, director of governance for Iowa Health System. “Everybody sells what they’ve got regardless of the quality or the fit. The better firms will shy away if they are not a good fit.” The buyer needs to beware. Beyond the top tier of vendors, there are lots of relatively inexpensive solutions available “that I would be very scared of,” Thompson says. “To get the value we see from our SATA drives, it requires some substance behind the equipment and a good organization behind that architecture.” If those components aren’t there, you could end up with a low-cost solution that destroys your entire archive within three months.

Don Fulk, chief information officer for Tower Diagnostic Centers in Tampa Bay, Fla., recommends purchasing a flexible system to accommodate long-term growth. “PACS vendors change,” he points out. “If you’re locked into a certain vendor, you need to be able to move from one to another and not worry about how to migrate seven years of data.”

Pay special attention to how your organization will grow and to its long-term plans.
Beth Walsh, Editor

Beth earned a bachelor’s degree in journalism and a master’s in health communication. She has worked in hospital, academic and publishing settings over the past 20 years. Beth joined TriMed in 2005 as editor of CMIO and Clinical Innovation + Technology. When not covering all things related to health IT, she spends time with her husband and three children.
