Scientists from the Swiss Federal Institute of Technology Lausanne (EPFL) have developed a new machine-learning algorithm to translate brain signals into videos.
The algorithm, called CEBRA (pronounced "zebra"), has the potential to reveal the hidden structure in data recorded from the brain and predict complex information.
In an experiment, researchers successfully reconstructed a film seen by a mouse using the novel technique.
"We asked the question: could we actually reconstruct what the animal was watching just purely from the neural data?" said Mackenzie Mathis, a neuroscientist at EPFL.
"We used our new algorithm CEBRA to build this latent representation of the embedding space. And then you can take this embedding space and essentially use that as the basis for a neural decoding algorithm and then predict exactly the sequence of frames the mouse was watching".
The research team used CEBRA to map brain signals to movie features, drawing on brain data recorded at the Allen Institute in Seattle, in the US state of Washington.
The Washington researchers had shown mice a black-and-white movie clip of a man running to a car and opening the trunk.
The mice's brain signals were measured with electrode probes inserted into the visual cortex. For genetically engineered mice, whose neurons glow green when transmitting information, optical probes were used instead; in both cases the animals passively watched the film.
Experts say such genetic engineering is a common procedure amid growing interest in brain research.
"There's a technology called optogenetics, where you use genetic markers which you have bred into the mouse (...) so it doesn't have any effect on the mouse," said Dr Nadia Rosenthal, scientific director and professor at the Jackson Laboratory for Mammalian Genetics.
"The mouse doesn't even know it's there. It allows you to follow when a nerve is firing or not. So you can actually watch the firing network of nerves in the brain," she told Euronews Next.
CEBRA boasts a high degree of accuracy – the movie reconstructed by the AI matched the original almost completely, with only slight distortions.
"With this algorithm, we could do this with over 95 per cent accuracy on these movies. So we think this is sort of a first demonstration that it's actually possible to do this brain-machine interface style decoding," said Mathis.
Researchers elsewhere in the world have recently made breakthroughs decoding brain signals using AI. Just last week, a team in Austin, Texas unveiled a system able to translate someone’s brain activity into a continuous stream of text.
Another study in March, by a team at Osaka University in Japan, revealed how AI could read brain scans to recreate images a person has seen.
It is not yet possible to fully reconstruct what a human sees based on brain signals alone, but its developers believe CEBRA could have clinical applications beyond neuroscience.
“It could be used for things like visual neuroprosthetics, potentially restoring vision or doing arm movements. So those patients that are paralysed or want to restore or even enhancement in this way," Mathis added.
Rosenthal agrees that technologies like CEBRA have great potential.
“There is an extraordinary amount of new technology that we now have and would allow us to watch what's going on in the mouse's brain,” Rosenthal said.
“It helps a lot that we can engineer these little markers that allow us to follow things as they happen without having to kill the mouse to open up the brain,” she added.