Matthew Griffin, described as “The Adviser behind the Advisers” and a “Young Kurzweil,” is the founder and CEO of the World Futures Forum and the 311 Institute, a global Futures and Deep Futures consultancy working between 2020 and 2070, and is an award-winning futurist and author of the “Codex of the Future” series. Regularly featured in the global media, including AP, BBC, Bloomberg, CNBC, Discovery, RT, Viacom, and WIRED, Matthew’s ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry and society is unparalleled. Recognised for the past six years as one of the world’s foremost futurists and innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multinationals and regulators around the world envision, build and lead an inclusive, sustainable future. A rare talent, Matthew’s recent work includes mentoring Lunar XPrize teams, re-envisioning global education and training with the G20, and helping the world’s largest organisations envision and ideate the future of their products and services, industries, and countries. Matthew’s clients include three Prime Ministers and several governments, including the G7, as well as Accenture, Aon, Bain & Co, BCG, Credit Suisse, Dell EMC, Dentons, Deloitte, E&Y, GEMS, Huawei, JPMorgan Chase, KPMG, Lego, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, T-Mobile, and many more.
WHY THIS MATTERS IN BRIEF
As the digital and physical worlds collide and merge, companies are trying to find new ways to extend the physical experience.
Imagine holding hands with a loved one on the other side of the world. Or feeling a pat on the back from a teammate in the online game “Fortnite.” Well, today we have haptic gloves, jackets, and suits that can do that, and even let you rugby tackle people hundreds of miles away, but they’re bulky and you have to put them on in the first place.
Now, though, a team at Northwestern University in the US has developed a less bulky version of the technology – a new thin, wireless system, a synthetic skin if you will, that adds a sense of touch, and eventually heat, to any Virtual Reality (VR) or virtual experience.
Referred to as an “Epidermal VR” system, the team’s device communicates virtual touch through “a fast, programmable array of miniature vibrating actuators embedded into a thin, soft, flexible material,” and the 15-centimeter-by-15-centimeter sheet-like prototypes fit comfortably on the skin without the need for bulky batteries or cumbersome wires.
“People have contemplated this overall concept in the past, but without a clear basis for a realistic technology with the right set of characteristics or the proper form of scalability. Past designs like the haptic suits involve manual assemblies of actuators, wires, batteries and combined internal and external control hardware,” said Northwestern’s John Rogers, a bioelectronics pioneer. “We leveraged our knowledge in flexible electronics and wireless power transfer to put together a superior collection of components, including miniaturised actuators, in an advanced architecture designed as a skin-interface wearable device – with almost no encumbrances on the user. We feel that it’s a good starting point that will scale naturally to full-body systems and hundreds or thousands of discrete, programmable actuators.”
“We are expanding the boundaries and capabilities of virtual and augmented reality,” added Yonggang Huang, who co-led the research with Rogers. “By comparison to the eyes and the ears, the skin is a relatively underexplored sensory interface that could significantly enhance experiences.”
The research was published in the journal Nature.
Rogers and co’s most sophisticated device incorporates a distributed array of 32 individually programmable, millimeter-scale actuators, each of which generates a discrete sense of touch at a corresponding location on the skin. Each actuator resonates most intensely at 200 cycles per second – the point at which skin exhibits its maximum sensitivity.
“We can adjust the frequency and amplitude of each actuator quickly and on-the-fly through our graphical user interface,” Rogers said. “We tailored the designs to maximise the sensory perception of the vibratory force delivered to the skin.”
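To make the idea of per-actuator control concrete, here is a minimal sketch of what a drive table for a 32-element array might look like. The `ActuatorArray` class, its method names, and the normalised amplitude range are all illustrative assumptions, not the Northwestern team’s actual software; only the actuator count and the 200 Hz resonance figure come from the article.

```python
# Hypothetical sketch of per-actuator control for a 32-element haptic array.
# The API and data layout are illustrative assumptions.

SKIN_RESONANCE_HZ = 200  # frequency of maximum skin sensitivity (from the article)

class ActuatorArray:
    def __init__(self, count=32):
        # Each actuator holds a (frequency, amplitude) pair;
        # amplitude is normalised to the range 0.0-1.0.
        self.channels = [{"freq_hz": SKIN_RESONANCE_HZ, "amp": 0.0}
                         for _ in range(count)]

    def set_channel(self, index, freq_hz=None, amp=None):
        """Adjust one actuator's drive parameters on the fly."""
        ch = self.channels[index]
        if freq_hz is not None:
            ch["freq_hz"] = freq_hz
        if amp is not None:
            ch["amp"] = max(0.0, min(1.0, amp))  # clamp to the valid range

array = ActuatorArray()
array.set_channel(5, amp=0.8)                 # strong vibration at resonance
array.set_channel(12, freq_hz=150, amp=0.3)   # softer, off-resonance buzz
```

The point of the sketch is simply that each of the 32 channels is independently addressable, which is what lets the system paint arbitrary patterns of touch across the patch.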
The patch wirelessly connects to a touchscreen interface on a smartphone or tablet. When a user touches the touchscreen, that pattern of touch is transmitted to the patch. If the user draws an “X” on the touchscreen, for example, the synthetic skin produces the corresponding sensory feeling on the user’s own skin, simultaneously and in real time.
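Conceptually, relaying a touchscreen stroke to the patch comes down to quantising each touch point onto the actuator grid. The sketch below assumes a hypothetical 4 × 8 layout for the 32 actuators – the article doesn’t specify the real arrangement – and maps normalised screen coordinates to the nearest actuator index:

```python
# Map a normalised touchscreen coordinate (0-1 in x and y) to the index of
# the nearest actuator in a hypothetical 4 x 8 grid covering the 15 cm patch.
ROWS, COLS = 4, 8  # illustrative layout for 32 actuators

def touch_to_actuator(x, y):
    """Quantise a screen touch point to one actuator index (row-major)."""
    col = min(int(x * COLS), COLS - 1)  # clamp so x = 1.0 stays on the grid
    row = min(int(y * ROWS), ROWS - 1)
    return row * COLS + col

# A stroke is just a sequence of touch points; each point activates one actuator.
stroke = [(0.1, 0.1), (0.5, 0.5), (0.9, 0.9)]  # one diagonal of an "X"
active = [touch_to_actuator(x, y) for x, y in stroke]  # -> [0, 20, 31]
```

Streaming such indices (plus per-point amplitude) over the wireless link is, in outline, all that is needed to reproduce the drawn pattern on the skin.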
“You could imagine that sensing virtual touch while on a video call with your family may become ubiquitous in the foreseeable future,” Huang said.
The actuators are embedded into an intrinsically soft and slightly tacky silicone polymer that adheres to the skin without tape or straps. Wireless and battery-free, the device communicates through Near-Field Communication (NFC) protocols, the same technology used in smartphones for electronic payments.
“With this wireless power delivery scheme, we completely avoid the need for batteries, with their weight, size, bulk and limited operating lifetimes,” Rogers said. “The result is a thin, lightweight system that can be worn and used without constraint, indefinitely.”
While everyone can imagine how this type of technology could be combined with a VR headset to create more interactive and immersive gaming or entertainment experiences, for US Army veteran Garrett Anderson the new tech might provide a much-needed solution to a real-life problem.
At 4 a.m. on Oct. 15, 2005, Anderson was ambushed during his deployment in the Iraq War and lost his right arm just below the elbow.
“A bomb exploded under my truck,” Anderson said. “It blew the entire engine out of the vehicle. Then shrapnel came through the vehicle and severed my arm, which was hanging on by tendons.”
Anderson recently tried Northwestern’s system, integrated with his prosthetic arm. When wearing the patch on his upper arm, Anderson could feel sensations from his prosthetic fingertips transmitted to his arm. The vibrations felt more or less intense, depending on the firmness of his grip.
“Say that I’m grabbing an egg or something fragile,” said Anderson, who is now the outreach coordinator at the University of Illinois’ Chez Veterans Center. “If I can’t adjust my grip, then I might crush the egg. I need to know the amount of grip that I’m applying, so that I don’t hurt something or someone.”
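The grip-feedback loop Anderson describes can be caricatured as a simple proportional mapping from prosthetic fingertip force to vibration amplitude on the upper arm. The force range and the linear mapping below are made-up assumptions for illustration; the article only establishes that a firmer grip produces a more intense vibration.

```python
# Toy mapping from prosthetic fingertip force (newtons) to vibration amplitude.
# MAX_FORCE_N and the linear relationship are illustrative assumptions.
MAX_FORCE_N = 20.0  # assumed saturation point of the fingertip force sensor

def grip_feedback(force_n):
    """Return a 0.0-1.0 vibration amplitude proportional to grip force."""
    return max(0.0, min(force_n / MAX_FORCE_N, 1.0))

grip_feedback(2.0)   # light touch -> gentle buzz (0.1)
grip_feedback(20.0)  # firm grip   -> full-strength vibration (1.0)
```

Even a crude mapping like this closes the loop Anderson is missing: without it, he has no way of knowing whether he is cradling the egg or crushing it.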
“Users develop an ability to sense touch at the fingertips of their prosthetics through the sensory inputs on the upper arm,” Rogers explained. “Over time, your brain can convert the sensation on your arm to a surrogate sense of feeling in your fingertips. It adds a sensory channel to reproduce the sense of touch.”
Anderson believes this device could potentially “trick” his brain in a way that relieves phantom pain. He also imagines that it could allow him to interact with his children in a new way.
“I lost my arm 15 years ago,” he said. “My kids are 13 and 10, so I have never felt them with my right arm. I don’t know what it’s like when they grab my right hand and this tech could help.”
Huang views the current device as a starting point. “This is our first attempt at a system of this type,” he said. “It could be very powerful for social interactions, clinical medicine and applications that we cannot conceive of today, beyond the obvious opportunities in gaming and entertainment.”
Huang then added that he and his team are already working to make the current device slimmer and lighter, and that they also plan to exploit different types of actuators, including those that can produce heating and stretching sensations. With thermal inputs, for example, a person might be able to sense how hot a cup of coffee is through prosthetic fingertips, so it’ll be interesting to see how they get on.