Matthew Griffin, described as “The Adviser behind the Advisers” and a “Young Kurzweil,” is the founder and CEO of the World Futures Forum and the 311 Institute, a global Futures and Deep Futures consultancy working between 2020 and 2070, and an award-winning futurist and author of the “Codex of the Future” series. Regularly featured in the global media, including AP, BBC, Bloomberg, CNBC, Discovery, RT, Viacom, and WIRED, Matthew’s ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry, and society is unparalleled. Recognised for the past six years as one of the world’s foremost futurists and innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multinationals, and regulators around the world envision, build, and lead an inclusive, sustainable future. A rare talent, Matthew’s recent work includes mentoring Lunar XPrize teams, re-envisioning global education and training with the G20, and helping the world’s largest organisations envision and ideate the future of their products and services, industries, and countries. Matthew’s clients include three Prime Ministers and several governments, including the G7, as well as Accenture, Aon, Bain & Co, BCG, Credit Suisse, Dell EMC, Dentons, Deloitte, E&Y, GEMS, Huawei, JPMorgan Chase, KPMG, Lego, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, T-Mobile, and many more.
WHY THIS MATTERS IN BRIEF
You might not care too much about your privacy today, but what about your children’s privacy tomorrow? The technologies to track and analyse our every thought and move have already been assembled; the next step is deployment.
Not only can a new Artificial Intelligence (AI) security system for airports from Hitachi pick out every little detail about you as you walk around, it can also, perhaps more interestingly, follow you through a crowd.
The new system can classify people based on more than a hundred characteristics, including your gender, what you’re wearing, what you’re carrying, your face, your gait, your hairstyle, your age, and much more besides, and it’s been developed to help make you safer, says Hitachi. However, it can also be combined with new, so-called pre-crime and “touchless” security systems, such as platforms that can capture your facial ID and fingerprints, read your lips, and take retinal scans of your eyes from a hundred or so metres away, as well as other new forms of AI that can, for example, single your conversation out in a crowd and identify you even when you’re wearing a mask, or figure out your emotional state using just your WiFi router at home. Put together, the tech becomes part of a much larger, much more comprehensive “surveillance and security” apparatus.
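To make the idea concrete, attribute-based tracking of this kind can be thought of as matching every detected person against a search profile of classified characteristics, frame after frame. The sketch below is purely illustrative — the attribute names and matching logic are hypothetical, not Hitachi’s actual system.

```python
# Illustrative sketch only: a toy attribute-based matcher, NOT Hitachi's system.
# Each detected person is reduced to a dictionary of classified attributes
# (gender, clothing, what they're carrying, etc. -- hypothetical labels).

def matches_profile(person: dict, profile: dict) -> bool:
    """Return True if every attribute in the search profile matches the person."""
    return all(person.get(key) == value for key, value in profile.items())

def track_through_crowd(frames, profile):
    """Yield (frame_index, person) for every profile match across video frames."""
    for i, people in enumerate(frames):
        for person in people:
            if matches_profile(person, profile):
                yield i, person

# Toy data: two "frames", each holding a list of classified detections.
frames = [
    [{"gender": "male", "jacket": "red", "carrying": "backpack"},
     {"gender": "female", "jacket": "blue", "carrying": None}],
    [{"gender": "male", "jacket": "red", "carrying": "backpack"}],
]

hits = list(track_through_crowd(frames, {"jacket": "red", "carrying": "backpack"}))
print(len(hits))  # the red-jacketed person is found in both frames
```

The point of the toy example is that “following someone through a crowd” reduces to repeatedly re-matching a profile, which is also exactly what makes such systems easy to repurpose for profiling.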
“In Japan, the demand for such technology is increasing because of the Tokyo 2020 Olympics,” said senior researcher Tomokazu Murakami, who recently demonstrated the new software, “but for us, we’re developing it in a way so that it can be utilised in many different places, such as train stations, stadiums, and even shopping malls.”
The AI can also be used to flag and monitor suspicious behaviour, and even help find missing children, which I think is perhaps one of its best applications. But, in a development that will do little to reassure privacy campaigners, the software can also be instructed to track down people with certain characteristics, even as they move through busy crowds.
It’s no wonder then that privacy advocates see a couple of problems with such software.
“It erodes your right to privacy in a public space,” says Privacy International’s policy officer, Frederike Kaltheuner, “and, importantly, it opens the door to profiling, because even though it might appear to be a neutral machine, someone has to tell it what to look for.”
“The way these systems are programmed,” adds Kaltheuner, “there’s no way for them to be neutral, because suspicious behaviour is not a scientific fact. It’s something you have to define when you build this system.”
And that can lead to problems.
“You can employ and develop these systems with the best intentions in mind, and then they’re used by people who want to identify alcoholics, or who want to identify people of a certain race.”
The software is expected to be rolled out to corporate customers within two years, and yes, you really should start thinking now about how much you value your and your family’s privacy, because you won’t have it for very much longer… Did I mention the new brainwave-reading tech that can pull your secrets out of your head, or the new persistent surveillance systems that can watch you 24/7 from orbit? No? Silly me. Never mind, they’re probably not that important anyway… Hah.