Matthew Griffin, award-winning Futurist working between the dates of 2020 and 2070, is described as “The Adviser behind the Advisers” and a “Young Kurzweil.” Regularly featured in the global press, including the BBC, CNBC, Discovery and RT, Matthew’s ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry and society is unparalleled. Recognised for the past six years as one of the world’s foremost futurists and innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multi-nationals and regulators around the world envision, build and lead an inclusive, sustainable future. A rare talent, Matthew sits on several boards, and his recent work includes mentoring Lunar XPrize teams, building the first generation of biological computers, re-envisioning global education with the G20, and helping the world’s largest manufacturers ideate the next 20 years of intelligent devices and machines. Matthew's clients include three Prime Ministers and several governments, including the G7, Accenture, Bain & Co, BCG, BOA, Blackrock, Bentley, Credit Suisse, Dell EMC, Dentons, Deloitte, Du Pont, E&Y, HPE, Huawei, JPMorgan Chase, KPMG, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, UBS, and many more.
WHY THIS MATTERS IN BRIEF
Today’s computer interfaces are still limiting, even with the advent of voice and touch, so HyperSurfaces is setting them free.
In the future one thing is certain: as technology proliferates around us, the keyboard and mouse will slowly die out, and we will all interface with technology in new ways – whether telepathically, via neural interfaces like the ones Facebook and friends are developing, or through simpler means such as gestures and voice, an approach known as behavioural computing. But as technology finds its way into everything from windows to clothing, I can’t help feeling that the interfaces we have in the labs today are still, to some degree, missing the point. After all, wouldn’t it be nice to interact with the devices and gadgets around us, from our self-driving cars to our smart home hi-fis, using whatever interface is most convenient – whether it’s a book, a wall, or even a piece of furniture?
And this is precisely what a new innovation from Bruno Zamborlin and his company HyperSurfaces delivers. HyperSurfaces taps into the power of Artificial Intelligence (AI) and machine learning to “turn any object of any material, shape and size” into a user interface.
See it in action
Imagine a wooden kitchen table that can be used to control lighting or room temperature, a floor that can determine whether the intruder in your house is just the cat or a would-be thief, the surface of a door transformed into one big interface, or the inner surface of a car door acting as a button-free control panel. These are some of the examples offered by the HyperSurfaces system.
The company’s technology works by combining vibration sensing with neural network algorithms running on dedicated microchips.
“Every time we interact with an object, we create a distinctive vibration pattern, which dedicated sensors, coupled with our patented algorithms, can transform into digital commands,” said Zamborlin, who heads an international development team split between London and Los Angeles.
All of the data processing happens in real time on the chip itself, meaning that once a use case model is loaded onto the system-on-chip, it works without needing to access external systems such as cloud-based data processing.
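HyperSurfaces hasn’t published the details of its patented algorithms, but the pipeline described above – sense a vibration, extract features, run a small neural network on-device, map the result to a command – can be sketched in miniature. The sketch below is purely illustrative: the command labels, the spectral-band features, and the fixed random weights are all my own assumptions, standing in for a model that would in reality be trained on labelled taps and knocks.

```python
import numpy as np

# Hypothetical command vocabulary -- not from HyperSurfaces.
COMMANDS = ["lights_toggle", "temp_up", "temp_down"]

def extract_features(signal, n_bands=8):
    """Reduce a raw vibration waveform to coarse spectral-band energies."""
    spectrum = np.abs(np.fft.rfft(signal))
    bands = np.array_split(spectrum, n_bands)
    feats = np.array([band.mean() for band in bands])
    return feats / (feats.sum() + 1e-9)  # normalise to unit sum

def classify(features, weights, biases):
    """One dense layer plus argmax -- a minimal stand-in for the real model."""
    logits = features @ weights + biases
    return COMMANDS[int(np.argmax(logits))]

# Toy fixed weights; a real system would learn these from training data.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, len(COMMANDS)))
b = np.zeros(len(COMMANDS))

# Simulate a tap on a surface: a decaying 200 Hz oscillation, 1 kHz sampling.
t = np.linspace(0, 0.1, 100)
tap = np.exp(-30 * t) * np.sin(2 * np.pi * 200 * t)

print(classify(extract_features(tap), W, b))
```

The point of keeping the model this small is the same one the article makes: a single dense layer over a handful of spectral features is cheap enough to run entirely on a microcontroller-class chip, with no round trip to the cloud.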
“HyperSurfaces aims to revolutionise the way we live, blending the data world within any object around us,” the company stated in a press release. “Consumer electronics, IoT, retail, transportation, augmented reality, smart facilities, all these domains can potentially be changed forever.”
Development of the system continues, but you can see what their new interface has to offer, and its potential, in the video above.