A group at Vanderbilt University in Nashville is developing a helmet that uses ultrasound to produce real-time images of the brain, with potential applications in surgical guidance and robotics, according to a university press release.
“The goal is to create a brain-machine interface using an ultrasound helmet and [electroencephalogram],” said Brett Byram, an assistant professor of biomedical engineering at Vanderbilt. “A lot of the technology we’re using now wasn’t available when people were working on this 20 or 30 years ago. Deep neural networks and machine learning have become popular, and our group is the first to show how you can use those for ultrasound beam forming.”
With a recent $550,000 National Science Foundation grant, Byram and his team plan to use machine learning to gather workable images from the helmet, eventually integrating EEG to visualize brain perfusion and areas associated with movement and emotion.
Byram said there are numerous applications. In theory, a person with limited mobility, such as an Alzheimer’s disease patient, could think about a glass of water, and a robotic arm could execute the action based on the blood-flow and EEG information detected by the helmet.