As healthcare facilities experience explosive growth in both the number of studies they perform and the size of those exams, the accompanying mountains of patient data need a place to reside. The information must be stored securely and accurately, and often must be available at a moment's notice. Vendors are stepping up to the challenge.
When the five-site University of New Mexico Hospitals organization, headquartered in Albuquerque, selected CommVault back in 2003, one of the key factors was its “futurability” — its ability to respond to growth and change. That has been put to the test and will continue to be tested in the future. CommVault Galaxy data protection software backs up the entire hospital system, which already holds 6 to 8 terabytes (TB) and is growing by 4 to 5 TB a year.
The organization switched to CommVault to eliminate the database problems it had with its previous vendor and to move from departmental backup to an enterprise backup solution, says Mike Campbell, director of PC systems and support. Before installing CommVault, UNM Hospitals was backing up only about 60 of its 200 servers. Now, “it’s very easy to do backups for different environments — Novell, Windows and Linux. You use all the same management tools, look at the same windows in the same place.” It now takes one person less time to back up all 200 servers than it once took to back up 60.
Backups are written to tape; the original tapes are duplicated using CommVault AuxCopy, and the copies are sent offsite for disaster recovery. Two months’ worth of original backup tapes are kept in the library for restores.
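The rotation UNM Hospitals describes — originals retained in the library for two months, duplicates shipped offsite — can be sketched generically. This is an illustrative model only, not CommVault's actual API; the `Tape` class and its fields are hypothetical:

```python
from dataclasses import dataclass
from datetime import date, timedelta

RETENTION = timedelta(days=60)  # roughly two months of originals kept onsite

@dataclass
class Tape:
    label: str
    written: date
    offsite_copy: bool = False  # set once the duplicate has shipped offsite

def recyclable(tape: Tape, today: date) -> bool:
    """An original tape may be reused only after its retention window
    has expired AND an offsite duplicate exists."""
    expired = today - tape.written > RETENTION
    return expired and tape.offsite_copy

today = date(2007, 6, 1)
tapes = [
    Tape("WK01", date(2007, 3, 1), offsite_copy=True),   # past retention, copied
    Tape("WK10", date(2007, 5, 20), offsite_copy=True),  # still inside window
    Tape("WK02", date(2007, 3, 2), offsite_copy=False),  # old, but no copy yet
]
reusable = [t.label for t in tapes if recyclable(t, today)]
print(reusable)  # ['WK01']
```

The key invariant is that age alone never frees a tape: without the offsite duplicate, recycling the original would leave no disaster-recovery copy.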
“We are trying to move toward tiered storage,” says Campbell. “We’re not there yet but we recognize that different data types live in different storage.”
The other problem Campbell wanted to avoid was index problems in the backup software. With the previous product, each incident meant someone had to read every tape and then rebuild the index. “It literally took a month or two to rebuild a database,” he says. Part of the problem was that the organization had outgrown the software. So, when evaluating CommVault, Campbell made sure it had a robust database and indexing scheme.
Many vendors, one solution
Al Jeffcoat, systems manager III at Orlando Regional Medical Center in Florida, uses several IBM disk storage systems. The IBM SAN Volume Controller (SVC) is designed to pool storage capacity from multiple vendors into a single reservoir that can be managed from a central point, helping increase utilization by giving host applications more flexible access to capacity.
Jeffcoat has seen improved performance, particularly when managing back-end storage. He can now move data from controller to controller with no downtime. “Before, to do the same things, we had outages for each individual system.”
Preparing for growth is an ongoing challenge, he says. The facility has experienced a 2,500 percent growth rate in image volume over the last five years, so “it’s hard to predict what we’re going to need in six months or a year from now.” In the past, Jeffcoat’s plan has been to buy in bulk so that additional storage is on hand when it’s needed. The growth rate requires him to move quickly, and with the IBM system, he says, he can snap in components fairly quickly.
Baylor College of Medicine in Waco, Texas, has been a NetApp customer since 2001, says Director of Enterprise Services Mike Layton. A couple of years ago, Baylor was struggling to scale its storage to meet demand. “We had plenty of direct attached storage and plenty of physical disk storage available as standalone, but we couldn’t aggregate or make use of it for the greater good,” says Layton. “We had plenty of capacity but only 20 to 25 percent maximum utilization of that capacity.”
So, Layton set an objective of improving backup and recovery performance by moving from a single-tier to a two-tier storage platform. He also wanted to be in a position to replicate the most important data for disaster recovery. “We quickly found out we needed both SAN [storage area network] and NAS [network attached storage].” Baylor had 100 to 200 file servers spread across many departments, each separately serving up and storing its own data. Layton began collapsing that storage into a single, clustered solution.
Now, the real challenge, he says, is that storage demand is growing so quickly that until he can build adequate retention policies and expunge data no longer needed, everything has to be kept. So he continues to look at other ways to tier storage and lower the price point. “We’re pleased but we can’t get too comfortable.” With a new 250-bed hospital in the works for 2008 or 2009, Layton keeps looking at technologies that will let him defer the need to classify data. The new facility will require storage that can serve up the exact same file to multiple locations if need be. So, Layton will continue working toward a modified, tiered storage structure wherever the savings justify it.
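The classification Layton is deferring — deciding which tier each file belongs on so older data can move to cheaper storage — can be sketched as a simple age-based policy. The thresholds and tier names below are hypothetical illustrations, not Baylor's actual rules:

```python
from datetime import date, timedelta

# Hypothetical tiering rules: thresholds and tier names are illustrative only.
TIER_RULES = [
    (timedelta(days=90),  "tier1-primary"),   # recently accessed: fast SAN disk
    (timedelta(days=365), "tier2-nearline"),  # aging data: cheaper NAS disk
]
ARCHIVE_TIER = "tier3-archive"                # older still: tape or archive

def classify(last_accessed: date, today: date) -> str:
    """Map a file's last-access date to a storage tier."""
    age = today - last_accessed
    for limit, tier in TIER_RULES:
        if age <= limit:
            return tier
    return ARCHIVE_TIER

today = date(2007, 6, 1)
print(classify(date(2007, 5, 15), today))  # tier1-primary
print(classify(date(2006, 9, 1), today))   # tier2-nearline
print(classify(date(2004, 1, 1), today))   # tier3-archive
```

Until retention policies like this exist, every file effectively defaults to the most expensive tier, which is exactly the cost pressure Layton describes.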