When we think of biometrics most people think fingerprints and facial recognition, but the way you walk is a rich source of information too.



Recently I discussed how Artificial Intelligence (AI) is helping analyse and diagnose everything from cancer and depression to dementia and PTSD, and, among many other things, even assess a person’s personality and criminal intent using nothing more than a clever app. Now, in a next step, excuse the pun, a team of researchers has figured out how to use AI to categorise people’s emotions from the way they walk. The tech could be used to gauge everything from the mood of shoppers to the emotional state and mental health of an entire population. It’s also not the only tech that can do this – you might be surprised to learn that AI can also turn your home Wi-Fi router into a radar spy that can analyse the state of your emotions and your health.




We all know that the way you walk says a lot about how you’re feeling at any given moment. When you’re downtrodden or depressed, for example, you’re more likely to slump your shoulders than when you’re content or upbeat. Leveraging this somatic lexicon, researchers at the University of North Carolina at Chapel Hill and the University of Maryland recently investigated a machine learning method that can identify a person’s perceived emotion, valence (negative or positive), and arousal (calm or energetic) from their gait alone. The researchers claim this approach — which they believe is the first of its kind — achieved 80.07% accuracy in preliminary experiments.

“Emotions play a large role in our lives, defining our experiences and shaping how we view the world and interact with other humans,” wrote the co-authors. “Because of the importance of perceived emotion in everyday life, automatic emotion recognition is a critical problem in many fields, such as games and entertainment, security and law enforcement, shopping, human-computer interaction, and human-robot interaction.”




The researchers selected four emotions — happy, sad, angry, and neutral — for their tendency to “last an extended period” and their “abundance” in walking activity. Then they extracted gaits from multiple walking video corpora to identify affective features and extracted poses using a 3D pose estimation technique. Finally, they tapped a long short-term memory (LSTM) model — capable of learning long-term dependencies — to obtain features from pose sequences, which they combined with a random forest classifier, which outputs the mean prediction of several individual decision trees, to classify examples into the aforementioned four emotion categories.
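The pipeline above — pose sequences in, deep features plus hand-crafted affective features out, then a classifier over the combined vector — can be sketched as follows. This is an illustrative stand-in, not the authors’ code: the summary-statistic “deep features” merely take the place of the paper’s LSTM, and all names, shapes, and joint counts are assumptions.

```python
# Hypothetical sketch of the described pipeline: a 3D pose sequence is
# turned into (a) a stand-in for the LSTM's learned features and
# (b) simple movement-based affective features, then concatenated into
# one vector that a random-forest-style classifier would consume.
import numpy as np

EMOTIONS = ["happy", "sad", "angry", "neutral"]  # the four target classes

def deep_features(poses: np.ndarray) -> np.ndarray:
    """Stand-in for the LSTM features: per-coordinate mean and std over
    time. `poses` has shape (frames, joints, 3)."""
    flat = poses.reshape(len(poses), -1)          # (frames, joints * 3)
    return np.concatenate([flat.mean(axis=0), flat.std(axis=0)])

def affective_features(poses: np.ndarray) -> np.ndarray:
    """Stand-in for posture/movement features: overall joint speed."""
    velocity = np.diff(poses, axis=0)             # frame-to-frame motion
    speed = np.linalg.norm(velocity, axis=2)      # (frames - 1, joints)
    return np.array([speed.mean(), speed.max()])

def gait_feature_vector(poses: np.ndarray) -> np.ndarray:
    """Combined vector the classifier would be trained on."""
    return np.concatenate([deep_features(poses), affective_features(poses)])

# One synthetic 48-frame walk with 16 joints:
rng = np.random.default_rng(0)
poses = rng.normal(size=(48, 16, 3))
vec = gait_feature_vector(poses)
print(vec.shape)  # 16 joints * 3 coords * 2 stats + 2 affective = (98,)
```

In the paper’s actual setup the combined vector feeds a random forest, whose output is the mean prediction of many individual decision trees over the four emotion classes.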

The features included things like shoulder posture, the distance between consecutive steps, and the area between the hands and neck. Head tilt angle was used to distinguish between happy and sad emotions, while more compact postures and “body expansion” identified positive and negative emotions, respectively. As for arousal, which the scientists note tends to correspond to increased movements, the model considered the magnitude of velocity, acceleration, and “movement jerks” of hands, feet, and head joints.




The AI system processed samples from Emotion Walk, or EWalk, a novel data set containing 1,384 gaits extracted from videos of 24 subjects walking around a university campus, both indoors and outdoors. Roughly 700 participants from Amazon Mechanical Turk labelled emotions, and the researchers used these labels to determine valence and arousal level.
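Going from crowd-sourced emotion labels to a valence and arousal level can be sketched as below. The mapping follows the common circumplex view of emotion (happy = positive/high-arousal, sad = negative/low-arousal, and so on); the paper’s exact coding and vote-aggregation scheme are not given here, so treat both as assumptions.

```python
from collections import Counter

# Assumed valence/arousal coding for the four emotion labels:
VALENCE_AROUSAL = {
    "happy":   ("positive", "high"),
    "angry":   ("negative", "high"),
    "sad":     ("negative", "low"),
    "neutral": ("neutral",  "low"),
}

def label_from_votes(votes):
    """Majority emotion from a list of crowd labels for one gait clip,
    plus the valence and arousal level that emotion implies."""
    emotion, _ = Counter(votes).most_common(1)[0]
    return emotion, VALENCE_AROUSAL[emotion]

print(label_from_votes(["happy", "happy", "sad"]))
# -> ('happy', ('positive', 'high'))
```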

In tests, the team reports that their emotion detection approach offered a 13.85% improvement over state-of-the-art algorithms and a 24.60% improvement over “vanilla” LSTMs that don’t consider affective features. That isn’t to say it’s foolproof — its accuracy is largely dependent on the precision of the 3D human pose estimation and gait extraction. But despite these limitations, the team believes their method will provide a strong foundation for studies involving additional activities and other emotion identification algorithms.

“Our approach is also the first approach to provide a real-time pipeline for emotion identification from walking videos by leveraging state-of-the-art 3D human pose estimation,” wrote the co-authors. “As part of future work, we would like to collect more data sets and address [limitations].”

About author

Matthew Griffin

Matthew Griffin, described as “The Adviser behind the Advisers” and a “Young Kurzweil,” is the founder and CEO of the World Futures Forum and the 311 Institute, a global Futures and Deep Futures consultancy working between the dates of 2020 to 2070, and is an award-winning futurist and author of the “Codex of the Future” series. Regularly featured in the global media, including AP, BBC, Bloomberg, CNBC, Discovery, RT, Viacom, and WIRED, Matthew’s ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry, and society is unparalleled. Recognised for the past six years as one of the world’s foremost futurists, innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multinationals, and regulators around the world envision, build, and lead an inclusive, sustainable future. A rare talent, Matthew’s recent work includes mentoring Lunar XPrize teams, re-envisioning global education and training with the G20, and helping the world’s largest organisations envision and ideate the future of their products and services, industries, and countries. Matthew’s clients include three Prime Ministers and several governments, including the G7, as well as Accenture, Aon, Bain & Co, BCG, Credit Suisse, Dell EMC, Dentons, Deloitte, E&Y, GEMS, Huawei, JPMorgan Chase, KPMG, Lego, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, T-Mobile, and many more.
