Matthew Griffin, described as “The Adviser behind the Advisers” and a “Young Kurzweil,” is the founder and CEO of the World Futures Forum and the 311 Institute, a global Futures and Deep Futures consultancy working between 2020 and 2070, and is an award-winning futurist and author of the “Codex of the Future” series. Regularly featured in the global media, including AP, BBC, Bloomberg, CNBC, Discovery, RT, Viacom, and WIRED, Matthew’s ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry, and society is unparalleled. Recognised for the past six years as one of the world’s foremost futurists and innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multinationals, and regulators around the world envision, build, and lead an inclusive, sustainable future. A rare talent, Matthew’s recent work includes mentoring Lunar XPrize teams, re-envisioning global education and training with the G20, and helping the world’s largest organisations envision and ideate the future of their products and services, industries, and countries. Matthew's clients include three Prime Ministers and several governments, including the G7, Accenture, Aon, Bain & Co, BCG, Credit Suisse, Dell EMC, Dentons, Deloitte, E&Y, GEMS, Huawei, JPMorgan Chase, KPMG, Lego, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, T-Mobile, and many more.
WHY THIS MATTERS IN BRIEF
Animal noises are just another language, so researchers are now using AI to create the equivalent of a Doctor Dolittle app that can understand them.
I know what you’re wondering: when will we be able to talk to aliens using a universal translator like the ones in the movies? Bearing in mind we’ve already created at least one using synthetic biology, the answer might be sooner than you think, as Artificial Intelligence (AI) wraps its electronic head around translating both human languages and “other-worldly” animal languages, including those of dolphins, mice, and now pigs.
Never mind trouncing humans at video games and the ancient pursuits of chess and Go. Researchers have now harnessed the power of AI to infer how pigs are feeling on the basis of their grunts.
Scientists believe that the AI pig translator – which turns oinks, snuffles, grunts and squeals into emotions – could be used to automatically monitor animal wellbeing and pave the way for better livestock treatment on farms and elsewhere in the future.
“We have trained the algorithm to decode pig grunts,” said Dr Elodie Briefer, an expert in animal communication who co-led the work at the University of Copenhagen. “Now we need someone who wants to develop the algorithm into an app that farmers can use to improve the welfare of their animals.”
Working with an international team of colleagues, Briefer trained a neural network to learn whether pigs were experiencing positive emotions, such as happiness or excitement, or negative emotions, such as fear and distress, using audio recordings and behavioural data from pigs in different situations, from birth through to death.
Writing in the journal Scientific Reports, the researchers describe how they used the AI to analyse the acoustic signatures of 7,414 pig calls recorded from more than 400 animals. While most of the recordings came from farms and other commercial settings, others came from experimental enclosures where pigs were given toys, food and unfamiliar objects to nose around and explore.
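The paper does not publish its code, but the general recipe the researchers describe — fixed-length acoustic features fed into a supervised classifier — can be sketched. Below is a minimal, illustrative Python example using synthetic waveforms, a crude log-spectrum feature, and a hand-rolled logistic regression; none of these names or parameters come from the study itself:

```python
import numpy as np

rng = np.random.default_rng(0)
SR = 8000  # sample rate in Hz (assumed, for illustration only)

def synth_call(freq_hz, dur_s):
    """Synthesize a noisy tone standing in for a pig call."""
    t = np.arange(int(SR * dur_s)) / SR
    return np.sin(2 * np.pi * freq_hz * t) + 0.1 * rng.standard_normal(t.size)

def log_spectrum(wave, n_bins=32):
    """Crude fixed-length feature: log power summed over n_bins frequency bands."""
    spec = np.abs(np.fft.rfft(wave, n=4096)) ** 2
    bands = np.array_split(spec, n_bins)
    return np.log1p(np.array([b.sum() for b in bands]))

# Toy dataset mirroring the study's broad finding: "negative" calls are
# long, high-pitched squeals; "positive" calls are short, low-pitched grunts.
X, y = [], []
for _ in range(100):
    X.append(log_spectrum(synth_call(rng.uniform(1500, 3000), 0.8))); y.append(1)  # negative
    X.append(log_spectrum(synth_call(rng.uniform(100, 400), 0.2))); y.append(0)    # positive
X, y = np.array(X), np.array(y)

# Hand-rolled logistic regression trained by gradient descent.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    z = np.clip(X @ w + b, -30, 30)  # clip to avoid overflow in exp
    p = 1 / (1 + np.exp(-z))
    grad = p - y
    w -= 0.1 * X.T @ grad / len(y)
    b -= 0.1 * grad.mean()

acc = ((1 / (1 + np.exp(-np.clip(X @ w + b, -30, 30))) > 0.5) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

The real system used a neural network trained on thousands of recorded calls rather than a linear model on synthetic tones, but the workflow — extract spectral features, fit a binary positive/negative classifier — is the same shape.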
The scientists used the algorithm to distinguish calls linked to positive emotions from those linked to negative emotions. The noises spanned the emotional spectrum, from positive situations, such as huddling with littermates, suckling their mothers, running about and being reunited with the family, to negative ones, including piglet fights, crushing, castration and waiting in the abattoir.
The researchers found that there were more high-pitched squeals in negative situations. Meanwhile, low-pitched grunts and barks were heard across the board, regardless of the pigs’ predicament. Short grunts, however, were generally a good sign of porcine contentment.
“There are clear differences in pig calls when we look at positive and negative situations,” Briefer said. “In the positive situations, the calls are far shorter, with minor fluctuations in amplitude. Grunts, more specifically, begin high and gradually go lower in frequency.” According to the researchers, the algorithm correctly classified 92% of the calls as positive or negative emotions. With more recordings, the pig translator may be able to learn to distinguish a broader repertoire of emotions and shed light on the mental wellbeing of other animals.
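The acoustic cues Briefer describes — call duration, pitch, and amplitude fluctuation — can each be measured with standard signal processing. As a rough, illustrative sketch (synthetic signals and hand-picked thresholds, not the study's actual method or values):

```python
import numpy as np

SR = 8000  # assumed sample rate in Hz

def dominant_freq(wave):
    """Dominant frequency via the peak of the magnitude spectrum."""
    spec = np.abs(np.fft.rfft(wave))
    freqs = np.fft.rfftfreq(wave.size, d=1 / SR)
    return freqs[spec.argmax()]

def classify_call(wave):
    """Toy rule mirroring the reported cues: short + low-pitched -> positive."""
    dur_s = wave.size / SR
    if dur_s < 0.5 and dominant_freq(wave) < 800:
        return "positive"
    return "negative"

# Synthetic stand-ins: a short low-pitched grunt vs. a long high-pitched squeal.
t_short = np.arange(int(0.2 * SR)) / SR
t_long = np.arange(int(1.0 * SR)) / SR
grunt = np.sin(2 * np.pi * 250 * t_short)
squeal = np.sin(2 * np.pi * 2000 * t_long)

print(classify_call(grunt))   # -> positive
print(classify_call(squeal))  # -> negative
```

The study's 92% figure came from a trained neural network, not a two-threshold rule like this; the sketch only shows that the cues the researchers highlight are straightforwardly measurable from raw audio.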
While farmers tend to recognise that the mental health of animals is important for their wellbeing, the majority of animal welfare efforts focus on physical health. Briefer and her colleagues believe their algorithm can pave the way for new automated systems in the livestock industry that monitor sounds on farms and other sites to assess the animals’ psychological wellbeing.