
WHY THIS MATTERS IN BRIEF

Ears are often overlooked as a place to put a computer, but as computing and sensor technologies become smaller and more sophisticated, the researchers behind "earable computing" have a good point.

 

Love the Exponential Future? Join our XPotential Community, future proof yourself with courses from XPotential University, connect, watch a keynote, or browse my blog.

As a regular reader of my blog you'll have heard of wearable computing, and you've most definitely heard about biological, chemical, DNA, liquid, and quantum computing – obviously. But have you heard about Earable Computing? Probably not, because even though the concept makes sense, up until this week I hadn't heard of it either – although I did just write an article on a set of new earbuds that can hack and zap your brain, but for now that's a separate thing …

 


Now, though, it seems to be a thing after CSL's Systems and Networking Research Group (SyNRG) defined a new sub-area of mobile technology that they call, you guessed it, "Earable Computing." The team believes that earphones will be the next significant milestone in wearable devices, and that new hardware, software, and apps will all run on this platform.

“The leap from today’s earphones to ‘earables’ would mimic the transformation that we had seen from basic phones to smartphones,” said Romit Roy Choudhury, professor in electrical and computer engineering (ECE). “Today’s smartphones are hardly a calling device anymore, much like how tomorrow’s earables will hardly be a smartphone accessory.”

Instead, the group believes tomorrow’s earphones will continuously sense human behaviour, run acoustic augmented reality, have Alexa and Siri whisper just-in-time information, track user motion and health, and offer seamless security, among many other capabilities.

 


The research questions that underlie earable computing draw from a wide range of fields, including sensing, signal processing, embedded systems, communications, and machine learning. The SyNRG team is on the forefront of developing new algorithms while also experimenting with them on real earphone platforms with live users.

Computer science PhD student Zhijian Yang and other members of the SyNRG group, including his fellow students Yu-Lin Wei and Liz Li, are leading the way. They have published a series of papers in this area, starting with one on the topic of noise cancellation that was published at ACM SIGCOMM 2018. Recently, the group had three papers published at the 26th Annual International Conference on Mobile Computing and Networking (ACM MobiCom) on three different aspects of earables research: facial motion sensing, acoustic augmented reality, and voice localization for earphones.

 


The first of these papers deals with acoustic augmented reality. "If you want to find a store in a mall," says Zhijian, "the earphone could estimate the relative location of the store and play a 3D voice that simply says 'Follow me.' In your ears, the sound would appear to come from the direction in which you should walk, as if it's a voice escort."
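To get a feel for how a "voice escort" could be made to appear to come from a particular direction, here's a minimal sketch of binaural panning using the classic Woodworth interaural-time-difference model. This is purely illustrative – the constants and function names are my own assumptions, not SyNRG's actual algorithm, and a real system would also apply head-related transfer function (HRTF) filtering:

```python
import math

SPEED_OF_SOUND = 343.0   # m/s, in air at room temperature
HEAD_RADIUS = 0.0875     # m, a commonly assumed average head radius
SAMPLE_RATE = 44100      # Hz

def interaural_delay_samples(azimuth_deg):
    """Approximate interaural time difference (Woodworth model) for a
    source at azimuth_deg (0 = straight ahead, +90 = hard right),
    returned as a whole-sample offset; positive means the right ear
    hears the sound first."""
    theta = math.radians(azimuth_deg)
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (math.sin(theta) + theta)
    return int(round(itd * SAMPLE_RATE))

def pan_mono_cue(mono, azimuth_deg):
    """Render a mono cue (a list of samples) as a (left, right) channel
    pair by delaying the far ear -- a crude directional cue with no
    HRTF filtering or level difference."""
    d = interaural_delay_samples(azimuth_deg)
    pad = [0.0] * abs(d)
    if d >= 0:                        # source to the right: left ear lags
        return pad + mono, mono + pad
    return mono + pad, pad + mono     # source to the left: right ear lags
```

Delaying one ear's copy of the cue by a fraction of a millisecond is enough for the brain to perceive a left/right direction, which is the basic trick behind the "Follow me" effect described above.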

The second paper, EarSense: Earphones as a Teeth Activity Sensor, looks at how earphones could sense facial and in-mouth activities such as teeth movements and taps – something that resembles MIT's recent "mind reading" gadget that lets you use your jawbone to search Google. This would enable hands-free communication with smartphones, and possibly, for all you privacy geeks, finally let you talk silently in public while the gadget translates your mouth movements into sound that the person on the other end of the call could hear.
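The core signal-processing idea – picking out tooth taps as sharp transients in a vibration stream sensed at the ear – can be sketched in a few lines. This is a toy illustration under my own assumptions (fixed threshold, plain Python list as the signal), not EarSense's published pipeline, which would need per-user calibration and filtering to reject chewing, speech, and footsteps:

```python
def detect_taps(signal, threshold=0.5, refractory=20):
    """Flag tooth-tap events as sample indices where the absolute
    amplitude crosses `threshold`, then ignore further crossings for
    `refractory` samples so the ringing of one tap isn't counted as
    several. Both parameters are illustrative placeholders."""
    taps = []
    last = -refractory
    for i, x in enumerate(signal):
        if abs(x) >= threshold and i - last >= refractory:
            taps.append(i)
            last = i
    return taps
```

A sequence of detected taps (one tap vs. two quick taps, left-side vs. right-side teeth) could then be mapped to commands, giving the hands-free, voice-free input channel the paper describes.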

 


Moreover, various medical conditions manifest in teeth chatter, and the proposed technology would make it possible to identify them simply by wearing earphones during the day. In the future, the team plans to analyze facial muscle movements and emotions with earphone sensors.

The third publication, Voice Localization Using Nearby Wall Reflections, investigates the use of algorithms to detect the direction of a sound. This means that if Alice and Bob are having a conversation, Bob’s earphones would be able to tune into the direction Alice’s voice is coming from.
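The standard building block for this kind of direction finding is estimating the time difference of arrival (TDOA) between two microphones – here is a brute-force cross-correlation sketch of that idea. The function and its conventions are my own illustrative assumptions, not the wall-reflection algorithm from the paper, which goes further by exploiting echoes to resolve direction with the tiny mic spacing of an earphone:

```python
def tdoa_lag(left, right, max_lag):
    """Estimate the time difference of arrival (in samples) between two
    microphone channels by brute-force cross-correlation: the lag with
    the highest correlation indicates the direction of the voice.
    A positive result means the left channel lags (source nearer the
    right mic); negative means the source is nearer the left mic."""
    n = len(left)
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        # Correlate left[i] against right shifted by `lag` samples.
        score = sum(left[i] * right[i - lag]
                    for i in range(max(0, lag), min(n, n + lag)))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag
```

Multiplying the winning lag by the speed of sound and dividing by the mic spacing gives an angle of arrival – the geometric core of letting Bob's earphones "tune into" the direction Alice's voice is coming from.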

“We’ve been working on mobile sensing and computing for 10 years,” said Wei. “We have a lot of experience to define this emerging landscape of earable computing.”

 


So will earable computing be a thing in the future? It's actually highly likely, given that today you can already buy earphones with built-in heart rate sensors and so on, and as technology and sensors continue to get smaller and more sophisticated we'll be able to embed more and more tech into the tiny things we put in our ears …

About author

Matthew Griffin

Matthew Griffin, described as “The Adviser behind the Advisers” and a “Young Kurzweil,” is the founder and CEO of the World Futures Forum and the 311 Institute, a global Futures and Deep Futures consultancy working between the dates of 2020 to 2070, and is an award winning futurist, and author of “Codex of the Future” series. Regularly featured in the global media, including AP, BBC, Bloomberg, CNBC, Discovery, RT, Viacom, and WIRED, Matthew’s ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry and society, is unparalleled. Recognised for the past six years as one of the world’s foremost futurists, innovation and strategy experts Matthew is an international speaker who helps governments, investors, multi-nationals and regulators around the world envision, build and lead an inclusive, sustainable future. A rare talent Matthew’s recent work includes mentoring Lunar XPrize teams, re-envisioning global education and training with the G20, and helping the world’s largest organisations envision and ideate the future of their products and services, industries, and countries. Matthew's clients include three Prime Ministers and several governments, including the G7, Accenture, Aon, Bain & Co, BCG, Credit Suisse, Dell EMC, Dentons, Deloitte, E&Y, GEMS, Huawei, JPMorgan Chase, KPMG, Lego, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, T-Mobile, and many more.
