WHY THIS MATTERS IN BRIEF
Using AI to translate human languages is nothing new, but now we’re using the technology to help us decipher animal languages.
Dolphins are getting in on the act, and now so are mice: their vocalisations are being translated into something humans can understand using Artificial Intelligence (AI). It is part of an increasingly interesting field that uses AI to create a new breed of universal translators, ones that help us decode and translate not just other human languages but animal languages too. Doctor Dolittle has competition…
In what may well be the cutest science story of the new year so far, scientists at the University of Washington (UW) have announced a new AI system for decoding mouse squeaks.
Dubbed DeepSqueak, the software can analyse rodent vocalisations and then pattern-match the audio to behaviours observed in laboratory settings. As such, it can be used to partially decode the language of mice and other rodents, and researchers hope the technology will prove useful in a broad range of medical and psychological studies.
Published this week in the journal Neuropsychopharmacology, the study is based around a novel use of sonogram technology, which transforms an audio signal into an image or series of graphs.
The DeepSqueak program turns recordings of mouse chatter into visual output, which is then analysed using advanced machine learning algorithms. In fact, the algorithms are in the same family as those used by self-driving cars to “see” their environment. The technology represents the first use of deep learning in rodent vocalisation research, said study co-author Russell Marx in a statement.
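The pipeline the article describes can be sketched in a few lines: convert audio into a spectrogram image, then look for regions of ultrasonic energy that stand out from the noise floor. This is a minimal illustrative sketch only, using synthetic audio and an assumed energy threshold; DeepSqueak's actual detector is a deep learning model, not the simple thresholding shown here.

```python
# Sketch of the first stage of a DeepSqueak-style pipeline:
# audio -> spectrogram "image" -> flag time slices with ultrasonic energy.
# Synthetic signal and threshold values are illustrative assumptions.
import numpy as np
from scipy.signal import spectrogram

fs = 250_000  # sample rate in Hz; mouse calls sit well above human hearing
t = np.linspace(0, 1.0, fs, endpoint=False)

# Synthetic "call": a 60 kHz tone present only from 0.4 s to 0.6 s,
# buried in low-level broadband noise.
rng = np.random.default_rng(0)
signal = 0.01 * rng.standard_normal(fs)
call = (t >= 0.4) & (t < 0.6)
signal[call] += np.sin(2 * np.pi * 60_000 * t[call])

# Spectrogram: the image-like representation a vision-style detector "sees".
freqs, times, sxx = spectrogram(signal, fs=fs, nperseg=1024)

# Sum the energy above 20 kHz (ultrasonic band) in each time slice.
band_energy = sxx[freqs > 20_000].sum(axis=0)

# Flag slices whose band energy is well above the median noise floor.
detected = times[band_energy > 10 * np.median(band_energy)]
print(f"call detected between {detected.min():.2f}s and {detected.max():.2f}s")
```

In the real system, the spectrogram image is fed to object-detection networks of the kind used in self-driving cars, which localise and classify call shapes rather than just thresholding energy.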
One critical advantage of the DeepSqueak system is that it can “hear” vocalisations that are otherwise inaudible to human ears.
“As it turns out, rats and mice have this rich vocal communication, but it’s way above our hearing range, so it’s been really hard to detect and analyze these calls,” Marx says in a project demo video. “Our software allows us to visualize all those calls, look at their shape and structure and categorize them.”
Marx and DeepSqueak co-creator Kevin Coffey, who both study addiction and psychology at UW, have already made some interesting discoveries. Their initial efforts have focused on discerning calls of happiness or distress when working with mice in addiction experiments.
“The animals have a rich repertoire of calls, around 20 kinds,” said Coffey. “With drugs of abuse, you see both positive and negative calls.”
The mice appear to be happiest when they are anticipating a reward such as sugar, Coffey noted, but they also make happy calls in certain social situations. The researchers also observed that male mice make the same calls over and over when they’re around other males, but switch to more complex vocalisations when females are nearby.
There’s probably an entire shelf full of future sociological research in that one observation, but for now the team’s goal is to use the new technology for advancing addiction research.
John Neumaier, professor of psychiatry and behavioural sciences and associate director of the Alcohol and Drug Abuse Institute, said that DeepSqueak should help his lab make better and faster progress in the field by making vocalisation analysis convenient, efficient, and affordable.
“If scientists can understand better how drugs change brain activity to cause pleasure or unpleasant feelings,” he said, “we could devise better treatments for addiction.”