IT Challenges for fMRI


After unlocking secrets of brain function at research sites for more than a decade, functional MRI is gaining value in clinical practice, particularly in managing patients with brain tumors and arteriovenous malformations. At the same time, the large data files these systems create, the growing number of scans and the complex image processing involved are challenging IT professionals to manage this mass of images practically.

Since its introduction in the early 1990s, functional magnetic resonance imaging (fMRI) has gained widespread acceptance among neuroscientists as a tool for detecting regional changes in cerebral metabolism related to cognitive, perceptual, behavioral and emotional function. The most widely adopted technique uses blood oxygenation level dependent (BOLD) contrast to exploit the different magnetic properties of oxygenated and deoxygenated blood. The resultant activity maps are superimposed as an overlay on the anatomic images produced by the MR scan.

Functional MRI is on the move from the research center to the clinical setting in the management of patients with brain tumors and AVMs (arteriovenous malformations). At the same time, post-acquisition image processing and the management of large image and data files require sophisticated network configuration by IT professionals to keep workflow moving.

Keith Thulborn, MD, PhD, professor of radiology, physiology and biophysics in the Center for Magnetic Resonance Research at the University of Illinois in Chicago, explains that fMRI studies are by their very nature extremely complex. They include not only the images an MR scanner produces, but also non-image data such as the patient's behavioral response to a task and changes produced by patient movement or underlying physiologic functions such as breathing and heart rate, all of which must be correlated with the image data in real time.

For example, a patient may be asked to read several sentences and answer a question relating to each sentence to test for comprehension. That task performance will be analyzed for accuracy and response time. The patient who reads three sentences in 30 seconds and answers all questions correctly is different from the patient who reads five sentences but has a comprehension accuracy rate of 50 percent, suggesting probable guesses on some questions.
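The kind of scoring described above is simple to express in code. The sketch below is illustrative only; the trial record format, timings and the `score_task` helper are invented for this example, not taken from any vendor's system.

```python
# Sketch: scoring a comprehension-task log (hypothetical record format).
# Each trial records the response time in seconds and whether the answer
# was correct; accuracy and mean response time qualify the activation map.

def score_task(trials):
    """trials: list of (response_time_s, correct) tuples."""
    n = len(trials)
    accuracy = sum(1 for _, ok in trials if ok) / n
    mean_rt = sum(rt for rt, _ in trials) / n
    return accuracy, mean_rt

# Patient A: three sentences in 30 seconds, all answered correctly.
acc_a, rt_a = score_task([(10.0, True), (10.0, True), (10.0, True)])

# Patient B: five faster sentences, but several answered incorrectly.
acc_b, rt_b = score_task([(6.0, True), (6.0, False), (6.0, True),
                          (6.0, False), (6.0, True)])
```

Two patients with the same images but different (accuracy, response time) pairs get materially different interpretations, which is Thulborn's point about needing performance data alongside the activation map.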

"If you don't know the performance, you can't evaluate the activation map," says Thulborn. "Someone with an accuracy rate of 50 percent will have a different activation map than someone who is reading and accurately answering questions."


SCANNERS RISE TO THE REAL-TIME CHALLENGE

Two components impact the ability to accomplish fMRI studies in real time - the speed of the scanners, and post-processing tasks required to analyze and correlate all of the data produced during the study.

Kyle Salem, PhD, manager of MR research collaborations for Siemens Medical Solutions, explains that although some researchers and clinicians use a 1.5 Tesla magnet, Siemens' Magnetom ALLEGRA 3T short-bore, superconducting scanner has been optimized for neuroscience, with gradients of 40 mT/m and a slew rate of 400 T/m/s to offer maximum BOLD signal and enable extremely fast studies. Ninety percent of the market for the ALLEGRA involves university research departments that conduct studies in neuropsychology and other neurosciences.

"For IT, the No. 1 issue is the huge volume of imaging studies," explains Salem. "They image the entire brain every 2 to 3 seconds, and the total study takes about 10 minutes. So if you do 30 slices every two seconds, that means 900 images every minute, times a 10 minute study equals 9K images."

With 128 x 128 matrices, each image is about 64 KB, creating roughly 576 MB in 10 minutes. With eight channels, about 4.6 GB is received in those 10 minutes, Salem says. What does this mean per week? A one-hour study, including about four functional runs, acquires 25 GB of data or more. At 10 hours of imaging each day (often more in many facilities), that accumulates to about 250 GB of data, which is often networked among various servers for processing in individual labs. Thus, more than 1 terabyte of images needs to be routed, processed and stored each week.
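Salem's figures can be checked with simple arithmetic; the constants below come straight from the numbers he quotes.

```python
# Back-of-envelope fMRI data volumes, using the figures quoted above.
slices_per_volume = 30        # whole brain covered in 30 slices
volume_interval_s = 2         # one whole-brain volume every 2 seconds
study_minutes = 10

# 30 slices * 30 volumes/minute * 10 minutes = 9,000 images per study
images_per_study = slices_per_volume * (60 // volume_interval_s) * study_minutes

bytes_per_image = 64 * 1024   # 128 x 128 matrix, about 64 KB per image
study_bytes = images_per_study * bytes_per_image      # roughly 576 MB
eight_channel_bytes = study_bytes * 8                 # roughly 4.6 GB

# One-hour studies at ~25 GB each, ten per day, five days a week
weekly_tb = (10 * 25 * 5) / 1000                      # > 1 TB per week
```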

Siemens customers use the MR neuro task card in syngo to facilitate productivity of real-time fMRI studies, including 3D image renderings.

"The advent of parallel imaging is both good and bad," Salem concludes. "Better information is generated, but you have to deal with massive quantities of data."

James Pekar, PhD, manager of the F.M. Kirby Research Center for functional brain imaging at the Kennedy Krieger Institute and associate professor of radiology at Johns Hopkins University, is using both the Philips Medical Systems Intera 1.5 Tesla and 3.0 T MR scanners to support a large number of research applications. The Kennedy Krieger Institute serves as an international resource for children with a variety of brain-related disorders from mild learning disabilities to rare genetic disorders.

Pekar believes that their 3T scanner holds advantages for most of the protocols their investigators are engaged in performing. However, managing the data requires sufficient computational resources.

"With these studies, you can easily acquire a gigabyte or half a gig in a single session," says Pekar. "A typical reconstructed matrix size might be 128 x 128 x 40 slices, and you might have that for a couple hundred time points, and then you may have five or six of those in a session. So you'll need a computer with 2 GB of RAM (random access memory) to handle it, because although you acquire the data a slice at a time, you want to analyze each voxel's time course."
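Pekar's sizes follow directly from the matrix dimensions he cites. The 2-bytes-per-voxel figure below is an assumption (16-bit values); the rest is his arithmetic.

```python
# Why a whole run must fit in memory for voxelwise analysis.
nx, ny, nslices, ntimes = 128, 128, 40, 200   # matrix size, time points
bytes_per_voxel = 2                           # assuming 16-bit values

# Data arrive a slice at a time, but analysis reads each voxel's time
# course, which cuts across every volume in the run.
run_bytes = nx * ny * nslices * ntimes * bytes_per_voxel   # ~262 MB

# Five or six runs put a single session in gigabyte territory.
session_bytes = run_bytes * 6
```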

While the raw data are acquired four dimensionally (space plus time), Pekar says that an increasingly popular technique is to accomplish 3D renderings not in native 3D space, but rather on an inflated and flattened cortical surface.

Once data are captured, investigators may choose different modes of managing that data. They can take raw data on a CD to their own analysis systems, they can send data via a network to another site, they can use the computational resources at the Institute for data analysis, or they can use a combination of techniques. Pekar says their center is dedicated to ensuring that the investigators have an efficient and effective way to accomplish their research goals.

The University of British Columbia High Field Magnetic Resonance Imaging Center uses the Philips 3T Intera scanner for research protocols related to psychiatry as well as other brain research studies.

Clinical scientist Burkhard Maedler, PhD, configured a DICOM server to manage the images, but backed away from that approach for fMRI scans because managing complex fMRI studies in DICOM format proved cumbersome. Instead, they use Philips' proprietary PAR/REC binary file format, which incorporates all of the images into a single file for automatic analysis. SENSE (SENSitivity Encoding) technology increases acquisition speed and improves reconstruction time.

Alex Mackay, D.Phil, a professor in radiology, physics and astronomy at the University of British Columbia, serves as the director of the High Field Magnetic Imaging Center.

"There is a lot of advanced processing going on," says Mackay. "You want to see the brain every two seconds over 20 minutes while the person is doing some paradigm. If the patient moves his head, it will be hard to follow one structure, so all of the images have to be registered with all other time points." Couple that dynamic with the fact that every person's brain is unique in shape and structure, and the challenges become daunting.
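The registration Mackay describes can be illustrated with a deliberately tiny example: a one-dimensional "image" profile is shifted, and an exhaustive search finds the translation that best realigns it with the reference. Real fMRI packages perform full 3D rigid-body registration; this toy sketch only shows the underlying idea of minimizing a mismatch cost over candidate motions.

```python
# Toy sketch of motion correction: find the integer shift that best
# realigns a moved 1-D profile with a reference, by minimizing the
# sum of squared differences (SSD). Real tools do this in 3-D with
# rotations and subvoxel interpolation.

def ssd(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def best_shift(reference, moved, max_shift=3):
    """Exhaustive search over circular integer shifts."""
    costs = {}
    for s in range(-max_shift, max_shift + 1):
        shifted = moved[s:] + moved[:s]
        costs[s] = ssd(reference, shifted)
    return min(costs, key=costs.get)

ref = [0, 0, 5, 9, 5, 0, 0, 0]        # a "structure" in the reference frame
mov = ref[-1:] + ref[:-1]             # same profile moved one sample over
```

Applying the same estimate-and-correct step to every time point is what keeps one structure trackable across a 20-minute run.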

GE Healthcare has focused attention on moving fMRI from the research laboratory to the clinical setting with the development of the EXCITE data pipeline for both the Signa Infinity 1.5 Tesla and 3T scanners.

"We saw the need for a completely new imaging chain that would manage a very data-intensive procedure such as fMRI," says GEMS' Lindsey Carver, global manager of advanced MRI applications. Besides rapidly acquiring images and additional functional information, the system moves the data for real-time analysis and reconstruction at a rate of 25 frames per second.

Using a software package called BrainWave, which automates many of the processes these studies require, clinical technicians are able to manage the data files.

"Typically, what they would do with the BrainWave software is to take all of those thousands of images and process them into a fused image," says Carver. "Normally that three dimensional data set would be what was sent to the PACS." At that point, the images are configured in DICOM format, and can be distributed to RIS, PACS or any other information system that reads DICOM data.
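The fusion Carver describes, folding thousands of functional images down to one overlay on the anatomy, can be sketched in miniature. The toy nested lists, threshold scale and `fuse` helper below are invented for illustration; BrainWave's actual processing is far more involved.

```python
# Sketch: fusing a thresholded activation map onto an anatomic image.
# Voxels whose activation statistic exceeds the threshold are marked;
# everywhere else the anatomic gray value is kept. The 2x2 "images"
# and the threshold scale are hypothetical.

def fuse(anatomy, activation, threshold, marker=255):
    fused = []
    for row_a, row_t in zip(anatomy, activation):
        fused.append([marker if t > threshold else a
                      for a, t in zip(row_a, row_t)])
    return fused

anat = [[10, 20], [30, 40]]           # anatomic gray values
act = [[0.5, 3.2], [1.0, 4.1]]        # activation statistics
fused = fuse(anat, act, threshold=3.0)
```

Only the fused result, a single 3D data set rather than thousands of raw frames, is what travels to the PACS in DICOM form.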

Currently, the primary clinical applications for fMRI involve pre-surgical planning for management of some patients with brain lesions, and assessment of treatment in patients with some forms of mental illness, such as schizophrenia.


CASE STUDIES

From an IT perspective, data analysis, management and storage assume mission-critical significance. While research and clinical applications may occur in the same setting, fMRI has until recently resided primarily in the research domain. Different challenges arise depending on the purpose of an fMRI study.

Kennedy Krieger Institute is a research facility that accomplishes these studies for a number of investigators. Joseph Gillen is the research associate with the responsibility of managing the system that enables these research protocols to be completed.

"There's a technique on the scanner to export your data without using the DICOM system, which is the usual way to move data off," says Gillen. "This basically gives us the entire scan as a single file. So we take that 4D file and copy it onto a 73 GB Ultra-SCSI [small computer system interface] drive provided by Philips as part of the console. That contains not only data but also a header file. We then use FTP [File Transfer Protocol] to move that off the scanner to our server."
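A minimal sketch of that export-and-transfer step might look like the following. The host name, directory layout and `.par`/`.rec` file pairing are assumptions based on the Philips single-file export Gillen describes, not details from his actual scripts; the transfer uses Python's standard `ftplib`.

```python
# Sketch: move a single-file 4-D export plus its header off the scanner
# console to the file server with FTP. Paths and host are hypothetical.
import os
from ftplib import FTP

def export_files(names):
    """Pick the single-file 4-D data export and its header from a listing."""
    return [n for n in sorted(names) if n.endswith((".rec", ".par"))]

def transfer_scan(scan_dir, host, user, password, dest="/data/incoming"):
    files = export_files(os.listdir(scan_dir))
    ftp = FTP(host)
    ftp.login(user, password)
    ftp.cwd(dest)
    for name in files:
        with open(os.path.join(scan_dir, name), "rb") as fh:
            ftp.storbinary(f"STOR {name}", fh)   # binary-safe transfer
    ftp.quit()
    return files
```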

For the server, they use an eight-CPU Sun Microsystems V880, which handles data processing in addition to serving files. There are two 100 GB RAID (Redundant Array of Independent Disks) arrays; the first, a Sun Microsystems A1000, is used as a disk cache for the tape library. Behind that is a 4 TB tape system, a Qualstar TLS4480 with four Sony AIT2 tape drives, holding 80 AIT2 tapes of 50 GB each uncompressed, where the data are stored in an HFS (hierarchical file system). The system has four differential SCSI interfaces for connecting these devices, one for each RAID array and one for each of two tape drives.

Gillen has configured the system so that when disks become 80 percent filled, an automatic program running in the background deletes the oldest and largest files, although the directory entry for the file remains. If someone needs to access one of these files that has been moved to tape storage, it takes approximately 30 seconds to retrieve the file.
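Gillen's watermark policy can be sketched as a small prioritization routine. The `(path, mtime, size)` record format and the exact tie-breaking rule below are assumptions for illustration; his actual background program may order candidates differently.

```python
# Sketch of an 80-percent watermark cleanup: when the disk crosses the
# threshold, delete the oldest files first, preferring larger ones on
# ties. Deleted files stay listed in the directory and remain on tape.

def deletion_order(files):
    """files: list of (path, mtime, size). Oldest first, largest on ties."""
    return [p for p, _, _ in sorted(files, key=lambda f: (f[1], -f[2]))]

def pick_victims(files, bytes_needed):
    """Choose just enough files, in deletion order, to free the space."""
    freed, victims = 0, []
    for path, _, size in sorted(files, key=lambda f: (f[1], -f[2])):
        if freed >= bytes_needed:
            break
        victims.append(path)
        freed += size
    return victims
```

The payoff of keeping the directory entry is exactly the behavior described: a transparent, roughly 30-second recall from tape when someone opens an evicted file.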

The speed of connections is important to the process. The Philips scanner runs 100 Mbps Ethernet, and Gillen says they are looking to upgrade that link. The V880 server runs Gigabit Ethernet to the network hub and the various computers connected to it.
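The practical effect of those link speeds is easy to estimate. The calculation below uses raw line rates (real throughput is lower) and the ~4.6 GB eight-channel study figure quoted earlier.

```python
# Transfer time for one ~4.6 GB eight-channel study at raw line rate.
study_bits = 4.6e9 * 8            # bytes to bits

t_fast_ethernet_s = study_bits / 100e6   # 100 Mbps: about six minutes
t_gigabit_s = study_bits / 1e9           # Gigabit: well under a minute
```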

Joseph Maldjian, MD, associate professor of radiology and director of the advanced neuroscience imaging research laboratory at Wake Forest University School of Medicine, is using the GEMS 1.5 Tesla scanner for fMRI studies in research applications. The bulk of their investigators are exploring dyslexia, pain, rehabilitation and the aging brain.

To manage their images, Maldjian has created a system in which they scan in raw data mode; the data are then reconstructed and processed offline.

As they do the scanning, they burn the data to CD at the scanner while simultaneously transferring the data in the background to an offline RAID that can be expanded to 65 TB. After the data are processed in the background using a software package known as SPM99, they are fused with the anatomic images and sent from the hard drive to the PACS.
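The hands-off flow just described can be sketched as an ordered sequence of stages. The stage names below are invented labels for this article's steps; in the real system the CD burn and RAID copy run in parallel and SPM99 performs the statistics.

```python
# Sketch of an automated, hands-off run pipeline (stage names invented).
STAGES = ["burn_cd", "copy_to_raid", "process_spm", "fuse_anatomic",
          "send_to_pacs"]

def run_pipeline(run_ids):
    """Carry each run through every stage in order; return an audit trail."""
    trail = []
    for run in run_ids:
        for stage in STAGES:
            trail.append((run, stage))   # a real system would act here
    return trail
```

Keeping the sequence in code rather than in people's hands is the point Maldjian makes next: automation protects data integrity.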

"There is a lot of IT involved in the need to move data, and I've worked to automate it all so no one has to interact with the system," explains Maldjian. "That ensures the integrity of the data, because there are many potential problems with loss of data if you have people trying to do those transfers themselves." Maldjian describes his system in an article in a September 2003 issue of AJNR (American Journal of Neuroradiology).

Thulborn at the University of Illinois in Chicago is using a 3T long-bore whole body system on the VHI-LX platform from GEMS for clinical fMRI.

"A physician at UIC can order an fMRI in the same way that they would order a chest x-ray, and they'll get a report within an hour of the patient leaving the service," says Thulborn. These tests are used for pre-surgical planning and to evaluate the response of some psychiatric diseases to medications.

Thulborn Associates, Inc. has developed a turnkey integrated software and hardware solution (marketed as MRIx; www.mrixtechnologies.com) that takes DICOM images, strips the headers, places the images into a single time series, and then integrates the behavioral and physiologic data, synchronized with the imaging data.
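The assembly step just described can be sketched in outline. The record format below (a dict with an acquisition time and pixel payload) is invented for illustration; the actual product works on real DICOM headers and richer physiologic streams.

```python
# Sketch: strip per-image headers, order images into one time series,
# and align behavioral events to the volume acquired at that moment.
# Record format and the fixed TR are hypothetical.

def assemble_series(images):
    """images: list of dicts with 'acq_time' (seconds) and 'pixels'."""
    ordered = sorted(images, key=lambda im: im["acq_time"])
    return [im["pixels"] for im in ordered]      # headers dropped here

def tag_events(images, event_times, tr=2.0):
    """Map each behavioral event time to the index of its volume."""
    t0 = min(im["acq_time"] for im in images)
    return [int((t - t0) // tr) for t in event_times]

images = [{"acq_time": 4.0, "pixels": "C"},
          {"acq_time": 0.0, "pixels": "A"},
          {"acq_time": 2.0, "pixels": "B"}]
```

Once behavioral and physiologic events carry volume indices, the whole study can be reduced to a single annotated activation map, the object that actually moves to the PACS.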

"Getting the data from the scanner to the radiologist and into the electronic medical record is a really big issue," says Thulborn. Once the study has been reduced to an activation map, becoming the synthesis of all of the data, it can be moved and stored on a PACS. "The goal is to provide a complete turnkey solution for a department that wants to provide functional imaging on a routine basis."


CONCLUSION

Whether used in a research or clinical setting, the IT perspective on fMRI requires sophisticated solutions to data management. Because these images and attendant information from other parameters must be integrated, analyzed, moved, reported and stored, careful configuration of the network is an essential component to success.