WHY THIS MATTERS IN BRIEF
This is the first time scientists have used ultrasound to read the minds of primates.
First, a quick recap … Over time all technologies, including sensors, get more performant – better, cheaper, and smaller. Think of the CMOS sensor in your smartphone's camera, which lets you take better quality photos, or the optics and sensors in satellites that now let you track individuals from space and capture radio signals from illegal fishing vessels.
And why am I telling you this? Because in time the technology in this article might very well let people, or companies, read minds from a distance – wirelessly, non-intrusively, and without the subject ever knowing. Think, for example, of this tech in a future airport scanning travellers' minds for signs of terrorist intent and you start getting the picture.
Brain-Machine Interfaces (BMIs) are one of those incredible ideas that were once the reserve of science fiction, and they come in two forms – invasive ones that need surgery, like Elon Musk's Neuralink technology, and non-invasive ones, like those being developed by Facebook. From internet-connected pigs to amputees, paralysed people, and quadriplegics who can now use the technology to control exosuits, neuroprosthetic limbs, and fleets of fighter jets, the field is advancing faster than at any other time in history.
Now, researchers at Caltech in the US have demonstrated a minimally invasive brain-machine interface using functional ultrasound (fUS) technology. The landmark proof-of-concept study describes an ultrasound technique that records brain activity in monkeys and then uses that data to predict their subsequent motor movements.
The preliminary research used non-human primates to explore whether ultrasound recordings could predict behaviour. Mikhail Shapiro, one of the authors on the new study, says the first question the researchers asked was whether high-resolution blood flow dynamics in the brain, as measured by ultrasound, could be associated with animal behaviour.
“The answer is yes,” Shapiro says. “This technique produced detailed images of the dynamics of neural signals in our target region that could not be seen with other non-invasive techniques like fMRI. We produced a level of detail approaching electrophysiology, but with a far less invasive procedure.”
Focusing on activity in the posterior parietal cortex, a brain region known to help plan and co-ordinate movement, the researchers recorded ultrasound readings of blood flow as the animals prepared to act. A machine learning algorithm was then tasked with correlating that ultrasound data with the animals' subsequent physical movements.
The results revealed the system could effectively predict whether an animal was about to move its eyes left or right with 78 percent accuracy and whether an animal was about to reach out to its left or right with 89 percent accuracy.
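To make that pipeline concrete, here's a minimal sketch of how such a decoder can be built. It is not the Caltech team's actual code – the data below is synthetic, and the image size, trial count, and classifier choice (PCA followed by linear discriminant analysis, a common recipe for hemodynamic imaging data) are all assumptions – but it shows the general idea: flatten each fUS activity image into a feature vector and train a linear classifier to separate "left" trials from "right" ones.

```python
# Illustrative sketch only: synthetic stand-in for fUS decoding, NOT the
# study's actual pipeline. All shapes, names, and numbers are assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

n_trials, height, width = 200, 64, 64        # hypothetical fUS frame size
labels = rng.integers(0, 2, n_trials)        # 0 = left, 1 = right

# Fake trial-averaged fUS frames: noise plus a weak direction-dependent
# signal, so the decoder has a real (if artificial) pattern to find.
images = rng.normal(size=(n_trials, height, width))
images[labels == 1, :, : width // 2] += 0.15  # fake "rightward" signature

X = images.reshape(n_trials, -1)              # flatten each frame to a vector

# Dimensionality reduction + linear discriminant: with far more pixels
# than trials, a simple linear pipeline is the sensible default.
decoder = make_pipeline(PCA(n_components=20), LinearDiscriminantAnalysis())
accuracy = cross_val_score(decoder, X, labels, cv=5).mean()
print(f"Cross-validated left/right decoding accuracy: {accuracy:.2f}")
```

The deliberately simple linear model is the point: when every trial is an image with thousands of pixels, heavier models tend to overfit, which is why dimensionality reduction plus a linear classifier is a standard first approach for decoding this kind of data.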
“We pushed the limits of ultrasound neuroimaging and were thrilled that it could predict movement,” explains Sumner Norman, co-first-author on the study. “What’s most exciting is that fUS is a young technique with huge potential – this is just our first step in bringing high performance, less invasive BMI to more people.”
Perhaps the most apparent limitation raised by this preliminary research is latency. The system that was tested needed around two seconds of data to predict the animals' movements, but the researchers suggest this delay could be reduced in the future through a variety of technological improvements.
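To see why that two-second figure matters, here's a small illustrative sketch of a streaming decoder – the frame rate, window length, and placeholder decode function are all assumptions, not details from the paper. The key point is structural: a decoder that needs a full window of frames can only commit to its first prediction once that window has filled, so the window length sets a hard floor on latency.

```python
# Illustrative sketch of decoder latency: a streaming decoder can only
# predict once its analysis window is full, so a 2-second window implies
# roughly a 2-second delay. Numbers and names here are assumptions.
import numpy as np
from collections import deque

FRAME_RATE_HZ = 10            # hypothetical fUS frame rate
WINDOW_SECONDS = 2.0          # window the tested system reportedly needed
WINDOW_FRAMES = int(FRAME_RATE_HZ * WINDOW_SECONDS)

buffer = deque(maxlen=WINDOW_FRAMES)

def decode(window):
    """Placeholder for a trained decoder; here it just thresholds the mean."""
    return "right" if np.mean(window) > 0 else "left"

rng = np.random.default_rng(1)
for t, frame in enumerate(rng.normal(size=(40, 64, 64))):  # simulated stream
    buffer.append(frame)
    if len(buffer) == WINDOW_FRAMES:                        # window is full
        prediction = decode(np.stack(buffer))
        latency_s = WINDOW_FRAMES / FRAME_RATE_HZ
        print(f"t={t / FRAME_RATE_HZ:.1f}s  prediction={prediction}  "
              f"(earliest possible after {latency_s:.1f}s of data)")
        break  # one prediction is enough for the sketch
```

Shrinking the window, raising the frame rate, or predicting from partially filled windows are the kinds of improvements that would chip away at that delay.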
So now we can add functional ultrasound to the list of experimental brain-machine interface techniques being tested, hopefully offering a future option for those keen on controlling machines with their minds but not so keen on having electrodes implanted in their brains.
And as for how the technology will develop, you can already imagine how this innovation could be used at airports to predict people's next actions. As the resolution improves, it's very difficult to see how, like some of its other cousins, it won't evolve to the point where it can be used to read people's thoughts from a distance – much in the same way that this one, for example, can stream your thoughts to YouTube … Sci fi is lame compared to sci fact.
The new study was published in the journal Neuron.