Ultrasound helmet would make live images, brain-machine interface possible

May 8, 2018

Ultrasound technology for the brain could mean real-time images during surgery, a better idea of which areas get stimulated by certain feelings or actions and, ultimately, an effective way for people to control software and robotics by thinking about it.

Medical doctors and scientists have spent decades hoping for such an advance, but it was impossible before now, said Brett Byram, assistant professor of biomedical engineering at Vanderbilt University. Ultrasound beams bounced around inside the skull, so no useful imagery could make it out.

With his new $550,000 National Science Foundation grant, Byram plans to use machine learning that will gradually be able to account for distortion and deliver workable images. What’s more, he wants to integrate electroencephalogram technology so doctors could see not only brain perfusion—how blood flow correlates to changes in thought—but also areas of stimulation related to movement and emotion.

“The goal is to create a brain-machine interface using an ultrasound helmet and EEG,” Byram said. “A lot of the technology we’re using now wasn’t available when people were working on this 20 or 30 years ago. Deep neural networks and machine learning have become popular, and our group is the first to show how you can use those for ultrasound beamforming.”
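
To make the beamforming idea concrete, here is a minimal, hypothetical sketch of a deep neural network that maps per-pixel ultrasound channel data (the delayed echoes from each transducer element) to a beamformed pixel value, in place of conventional delay-and-sum. The channel count, layer sizes, and training data below are illustrative assumptions, not the Vanderbilt group's actual model.

```python
# Hypothetical sketch: a small network that learns to beamform
# skull-distorted ultrasound channel data. All shapes and data
# are placeholder assumptions for illustration only.
import torch
import torch.nn as nn

N_CHANNELS = 64          # assumed number of transducer elements

beamformer = nn.Sequential(
    nn.Linear(N_CHANNELS, 128),
    nn.ReLU(),
    nn.Linear(128, 128),
    nn.ReLU(),
    nn.Linear(128, 1),   # one beamformed sample per image pixel
)

# Training would pair aberrated channel data with clean targets
# (e.g., simulations without skull distortion); random stand-ins here.
distorted = torch.randn(1024, N_CHANNELS)   # placeholder channel data
clean = torch.randn(1024, 1)                # placeholder target values

optimizer = torch.optim.Adam(beamformer.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for _ in range(100):                        # toy training loop
    optimizer.zero_grad()
    loss = loss_fn(beamformer(distorted), clean)
    loss.backward()
    optimizer.step()
```

In practice, the network would be trained on data where the skull's distortion is known or simulated, so it learns to undo the aberration that makes conventional transcranial imaging unusable.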

The applications, he said, are endless. At the most basic level, it could produce images of the brain as clear as, or clearer than, those doctors are accustomed to seeing of the heart or womb.

Going forward, a person with limited movement due to ALS could think about wanting a glass of water, and a robotic arm could retrieve one because the helmet detected the blood-flow and EEG signals associated with that thought. A student reading a paper might feel stress over a passage that isn't properly sourced, and the computer could mark that spot for later editing.
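
As a rough illustration of that interface idea, the sketch below concatenates blood-flow (perfusion) features from an ultrasound helmet with EEG features and trains a simple classifier to distinguish an intent such as "reach for water" from rest. The feature sizes, labels, and classifier choice are assumptions made for the example, not a description of the actual system.

```python
# Hypothetical sketch: fuse ultrasound perfusion features with EEG
# features and classify user intent. Data here is random placeholder
# material; a real system would use measured signals.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

n_trials = 200
perfusion_feats = rng.normal(size=(n_trials, 16))   # placeholder blood-flow features
eeg_feats = rng.normal(size=(n_trials, 32))         # placeholder EEG band-power features
intent = rng.integers(0, 2, size=n_trials)          # 0 = rest, 1 = reach

X = np.hstack([perfusion_feats, eeg_feats])         # simple feature fusion
clf = LogisticRegression(max_iter=1000).fit(X, intent)

# A detected "reach" intent could then be forwarded to a robotic arm controller.
print("training accuracy:", clf.score(X, intent))
```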

Byram, whose award is a Faculty Early Career Development grant, said he’s working with Leon Bellan, assistant professor of mechanical engineering and biomedical engineering, and Michael Miga, Harvie Branscomb Professor and professor of biomedical engineering, radiology and neurological surgery, to develop the helmet. He plans to invite medical center doctors to the team as their work progresses.

Newswise has the release
