High-tech Modeling: Computer Mockups Home in on Human Physiology

An exploration of recent projects at Argonne National Laboratory and Johns Hopkins University traverses the numerical and graphical landscape of cerebral blood flow, the pathology of malaria, brain aneurysm, sickle cell anemia and even cancer. Computer modeling provides ultra-specific information about how humans respond to disease and drug toxicity, making it not only a supplement to animal studies but, in some cases, perhaps one day an alternative to them.

Just a few decades ago, the computation we take for granted at the touch of a mouse or trackpad required vast rooms filled with hardware. Now that those functions have been miniaturized into ever-smaller housings, the world’s supercomputers are hard at work on some of life’s most complex questions; for instance, what does it take to model the human brain? Supercomputers like Mira, the IBM Blue Gene/Q at Argonne National Laboratory in Lemont, Ill., just outside Chicago, are being put to the task of answering precise questions about our physiology.

For years, Argonne computer scientists ran simulations on Mira’s predecessor, a Blue Gene/P, mapping discrete neural networks and applying functions of disease to gain a better understanding of the interactions between neurons. Significant work has been done in the area of epilepsy, but another effort, to develop a multi-scale computational model of blood flow within the brain, aims to connect everything from individual molecules, proteins and cells to arteries and dynamic circulation in the next generation of human modeling. These models are already being applied to a range of clinical problems. Leading the project is George E.M. Karniadakis, PhD, a professor of applied mathematics at Brown University in Providence, R.I., and a research scientist at the Massachusetts Institute of Technology in Cambridge, Mass.

At Argonne, Karniadakis models hematological disorders at the cellular level. He has modeled individual blood cells deformed by parasites in malaria infection and by genetic variation in hemoglobin in sickle cell anemia, all in an effort to better understand these mechanisms of disease. Additionally, Karniadakis and his colleagues are looking into how brain aneurysms develop and rupture as a result of basilar artery occlusion.

“In this case, we are trying to create predictive models that will inform us of when the aneurysm might rupture,” explains Karniadakis. This information could one day be modeled to predict when a specific patient’s aneurysm is likely to rupture for better staging of disease and clinical decision-making. “The idea is to begin to understand some of the observations that are impossible to understand with imaging alone or with laboratory experiments.”

Imaging cannot match this level of predictive modeling because of inherent limits in resolution and an inability to capture dynamic events at the micron scale and over fractions of a second.

“It’s very difficult to visualize with sufficient accuracy exactly what is going on,” adds Karniadakis.

Some of the most experimental modeling research at the Argonne Leadership Computing Facility (ALCF), which houses Mira, involves the mapping of neurovascular networks that quantify how neurons interact with brain vessels and how the modulation of blood flow affects individual astrocytes and glial cells.

“We actually quantify these interactions and the interesting thing is that the interaction is bidirectional,” says Karniadakis. Neurons not only take up oxygen and nutrients from the blood but also modulate blood flow. “This type of modeling is not very detailed at this point, because we don’t have a lot of data. We don’t yet need a supercomputer to do this work.”

As computation develops, “lumped models,” in which large numbers of physiological parameters are collapsed into functions within each individual model, help bridge these scales, but limitations remain. Scientists have not yet fully modeled these networks. “These neural networks are running parallel, but we’ve never been able to actually connect them,” says Karniadakis.
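To make the idea of a lumped model concrete, here is a minimal sketch assuming a simple two-element Windkessel compartment, in which a single compliance C and resistance R stand in for the many parameters a full model would carry; the numbers and the pulsatile inflow waveform are illustrative assumptions, not values from the Argonne simulations.

# Minimal sketch of a lumped-parameter ("Windkessel") blood flow model.
# C (compliance) and R (resistance) stand in for the many physiological
# parameters a real model would lump together; values are illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

C = 8.0e-3   # arterial compliance, mL/Pa (assumed)
R = 130.0    # peripheral resistance, Pa*s/mL (assumed)

def inflow(t):
    """Idealized pulsatile inflow from the heart, mL/s (assumed waveform, ~1 Hz)."""
    return max(0.0, 400.0 * np.sin(2 * np.pi * t))

def dp_dt(t, p):
    """Two-element Windkessel: C * dp/dt = Q_in(t) - p / R."""
    return (inflow(t) - p[0] / R) / C

# Integrate arterial pressure over several cardiac cycles.
sol = solve_ivp(dp_dt, (0.0, 10.0), [1.0e4], max_step=1e-3)
print(f"pressure after 10 s: {sol.y[0, -1]:.0f} Pa")

A full multi-scale model would replace this single compartment with many coupled compartments and feed them with cell- and molecule-level detail, which is where the supercomputing demand comes from.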

The Blue Brain Project, an international effort, aims to model the brain in full and relies on the power of supercomputers and other powerhouses of computation to connect disparate networks. The project is starting small by simulating a cortical column of a rat brain, which typically comprises about 10,000 neurons. In humans, such a column would connect 100,000 neurons. To put that in perspective, the human brain is home to approximately two million of these cortical columns.
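Taking the figures quoted above at face value, a quick back-of-the-envelope calculation shows the scale a full human cortical model implies; the arithmetic below simply multiplies the article’s numbers and is illustrative only.

# Rough scale estimate from the figures quoted above (illustrative arithmetic only).
neurons_per_human_column = 100_000
human_cortical_columns = 2_000_000

total_cortical_neurons = neurons_per_human_column * human_cortical_columns
print(f"~{total_cortical_neurons:.1e} neurons across human cortical columns")
# -> ~2.0e+11, which is why whole-brain simulation leans on supercomputers.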

Karniadakis expects that complete multiscale models could be finished within five years. Increasingly complex algorithms are being applied to handle the vast amounts of reference data involved. The results could be used clinically to test certain drugs, for instance in sickle cell anemia or brain aneurysm. Drug evaluations using these models could be possible within the next decade.

“Some of this research could be useful even now, clinically,” Karniadakis says. “For example, in the case of a brain aneurysm, it’s a matter of a few years before we can predict accurately the course of rupture. This is of almost immediate use.”

Visualizing research

Michael E. Papka, PhD, director of the ALCF and deputy associate laboratory director of the Computing, Environment and Life Sciences (CELS) Directorate, works with Karniadakis to create stunning visualizations, both still and dynamic, to demonstrate the research.

“My team is interested in helping them understand their results more effectively and efficiently,” says Papka. “Hence, our main contributions to the project are the data analysis and visualization components … our research helps to interpret the massive amounts of data that George’s team produces with every simulation run.”

One of the major challenges these researchers face is managing rapidly growing datasets. The more closely a simulation approximates actual physiology, the greater the demand on computing resources.

“The more realistic the simulation is, the more data it produces,” adds Papka. “And scientists will only continue to produce higher and higher fidelity simulations. Another challenge is how to build and maintain sufficient infrastructure to accommodate the widely divergent research goals of individual users.”

Vascular trackers

Meanwhile, at the systems biology laboratory at Johns Hopkins University, Aleksander S. Popel, PhD, and colleagues similarly use computational modeling to gain knowledge about the function of the blood vasculature as it relates to cardiovascular disease and cancer. The objective is to identify new molecular targets for treating a range of diseases. Two such applications would be to encourage new vasculature in ischemic cardiovascular disease and to inhibit angiogenesis in cancerous tumors, which cannot grow without the development of new blood vessels.

“The computational modeling approaches include formulation of kinetic equations for multiple molecular factors, e.g., interactions of growth factor ligands with their cell surface receptors, resulting in hundreds of ordinary or partial differential equations that are then solved numerically,” explains Popel.
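As a rough illustration of the kind of kinetic equation Popel describes, the sketch below writes reversible binding of a growth factor ligand to its cell surface receptor as a small ODE system and solves it numerically; the rate constants and concentrations are placeholder assumptions, and a real model would couple hundreds of such equations across many molecular species.

# Sketch of a ligand-receptor binding kinetic model:
#   L + R <-> LR   with association rate k_on and dissociation rate k_off.
# Values are illustrative placeholders; real models couple hundreds of
# such ODEs (and PDEs for spatial transport) across molecular species.
import numpy as np
from scipy.integrate import solve_ivp

k_on = 1.0e-3   # association rate, 1/(nM*s)  (assumed)
k_off = 1.0e-2  # dissociation rate, 1/s      (assumed)

def kinetics(t, y):
    L, R, LR = y                      # free ligand, free receptor, complex (nM)
    rate = k_on * L * R - k_off * LR  # net binding flux
    return [-rate, -rate, rate]

y0 = [10.0, 5.0, 0.0]                 # initial concentrations, nM (assumed)
sol = solve_ivp(kinetics, (0.0, 600.0), y0, max_step=1.0)
print(f"bound complex at t=600 s: {sol.y[2, -1]:.2f} nM")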

As is the case for researchers at Argonne, Popel’s team strives to develop accurate parameters within each model that represent specific organs, tissues and organisms. Models must then be validated, and researchers use experimental data to put them to the test. If these models are one day translated to clinical use, they would come as close to personalized medicine as anything developed to date.
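A toy version of that fit-then-validate loop might look like the following, in which a single rate constant is estimated from a few hypothetical measurements and the fitted curve is then checked against held-out points; the data and the one-parameter model are assumptions for illustration, not the Hopkins group’s workflow.

# Toy fit-then-validate loop (hypothetical data; not the Hopkins workflow).
import numpy as np
from scipy.optimize import curve_fit

def bound_fraction(t, k):
    """Simple first-order approach to equilibrium with rate k (assumed form)."""
    return 1.0 - np.exp(-k * t)

t = np.array([0.0, 30.0, 60.0, 120.0, 240.0, 480.0])       # seconds
measured = np.array([0.0, 0.52, 0.78, 0.95, 0.99, 1.0])    # hypothetical data

# Fit on the first four points, hold out the last two for validation.
(k_fit,), _ = curve_fit(bound_fraction, t[:4], measured[:4], p0=[0.01])
held_out_error = np.abs(bound_fraction(t[4:], k_fit) - measured[4:]).max()
print(f"fitted k = {k_fit:.4f} 1/s, worst held-out error = {held_out_error:.3f}")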

Could the animal model become obsolete?

While these scientists have no agenda to render other research models null and void, some animal experiments may one day be deemed unnecessary, or at least comparatively imprecise, as both mouse and human computational models grow increasingly sophisticated and true to form.

“Within the next 10 years, we could replace some of these animal experiments,” suggests Karniadakis. He and his team, however, will leave that decision to biomedical researchers, once they are ready to take up these new computational tools and apply them to their own work.

Six years ago, Robert Matthews, a visiting professor of science at Aston University in Birmingham, United Kingdom, published a paper in the Journal of the Royal Society of Medicine that called into question the very assertion that “virtually every medical achievement of the last century has depended directly or indirectly on research with animals.” Neither a proponent nor an opponent of animal research, Matthews suggests that any failing of the animal model stems from researchers conducting a limited number of studies, which hampers the clinical value of what takes place in the lab.

“This has led to a cavalier use of the data from animal experiments, with a lot of testing being done, ironically, on too few animals to give adequate statistical power, and with too little work being done to verify the predictive value of animal models,” he says.

Matthews urges researchers to take a closer look at not just true positives in animal models, but also false positives, which he says are often not considered until late-phase clinical trials. 

“It is a basic fact of scientific inference that without both it is impossible to gauge the evidential weight provided by any would-be predictive model,” remarks Matthews. “I certainly think computational modeling will help speed the move away from unreliable animal models. My feeling is that those involved in such modeling tend to come from a background with a tradition of back-testing, hold-out samples and validation methods. In contrast, animal models date back to a time when researchers knew little and cared less about such issues.”
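To make Matthews’s point about evidential weight concrete, the toy calculation below, with made-up rates, shows how the same true positive rate can yield very different predictive value depending on the false positive rate.

# Toy illustration of Matthews's point (made-up numbers): a model's true positive
# rate alone says little; its predictive value also depends on the false positive
# rate and on how often candidate drugs truly work.
def positive_predictive_value(tpr, fpr, prevalence):
    """P(drug truly works | animal model says it works), via Bayes' rule."""
    true_pos = tpr * prevalence
    false_pos = fpr * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

for fpr in (0.1, 0.5):
    ppv = positive_predictive_value(tpr=0.9, fpr=fpr, prevalence=0.1)
    print(f"TPR=0.9, FPR={fpr}: P(works | model positive) = {ppv:.2f}")
# FPR=0.1 -> 0.50, FPR=0.5 -> 0.17 with these assumed numbers.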

Pressure to improve

Countless drug candidates pass animal testing only to fail in humans, and computer modeling may be a means of improving those odds, says Matthews. In the future, pharmaceutical companies may even pressure the industry to move away from unvalidated experiments toward computational models that more closely resemble human physiology.

“I might add that I’ve long been surprised that shareholders haven’t made more of the possibility that poor animal models have a big part to play in the costly attrition rate. Maybe they hold the key to the adoption of better, validated methods like computational modeling,” Matthews says.

The majority of computational modeling of human anatomy and physiology is taking place at elite research institutions. The researchers have done the math, but time will tell if these models can be translated to clinical environments for more personalized medicine and superior drug development protocols.
