Matthew Griffin, described as “The Adviser behind the Advisers” and a “Young Kurzweil,” is the founder and CEO of the World Futures Forum and the 311 Institute, a global Futures and Deep Futures consultancy working between the dates of 2020 to 2070, and an award-winning futurist and author of the “Codex of the Future” series. Regularly featured in the global media, including AP, BBC, CNBC, Discovery, RT, and Viacom, Matthew’s ability to identify, track, and explain the impacts of hundreds of revolutionary emerging technologies on global culture, industry, and society is unparalleled. Recognised for the past six years as one of the world’s foremost futurists and innovation and strategy experts, Matthew is an international speaker who helps governments, investors, multi-nationals, and regulators around the world envision, build, and lead an inclusive, sustainable future. A rare talent, Matthew’s recent work includes mentoring Lunar XPrize teams, re-envisioning global education and training with the G20, and helping the world’s largest organisations envision and ideate the future of their products and services, industries, and countries. Matthew’s clients include three Prime Ministers and several governments, including the G7, as well as Accenture, Bain & Co, BCG, Credit Suisse, Dell EMC, Dentons, Deloitte, E&Y, GEMS, Huawei, JPMorgan Chase, KPMG, Lego, McKinsey, PWC, Qualcomm, SAP, Samsung, Sopra Steria, T-Mobile, and many more.
WHY THIS MATTERS IN BRIEF
New brain wave monitoring devices are making it easier than ever for companies, governments and institutions to pull your darkest secrets from your head.
In a world where biometric security is often seen as the de facto consumer and enterprise security technology, and where touchless security authentication – authentication technology that can identify you from a distance – is increasingly becoming the norm, we all know far too well that fingerprints can be stolen and 3D printed, iris scans can be spoofed, and facial recognition software can be fooled.
As a result, particularly for individuals and organisations operating at the highest levels of security – whether within airport environments or in the darkest corners of the NSA – it has become increasingly challenging to unassailably authenticate a person’s identity. Researchers around the world are therefore now focusing on identifying individuals using just their brain waves, and if you Google EEG headsets it’s likely that all you’ll see are people wearing these devices with big grins on their faces. But little do they know that that device is a gateway into their mind.
As is the nature of any competition, many of the groups in this field are looking to outdo one another, boasting about how accurately and accessibly they can verify a person’s identity using electroencephalography (EEG) data. In April, for example, a team in New York achieved 100 percent accuracy at identifying individuals using a skullcap with 30 electrodes, and just last week a company called Neurosky reported that it had managed to use a simple set of earbuds to achieve an 80 percent accuracy rate.
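Neither team’s actual pipeline is described here, but the general shape of EEG-based authentication can be sketched. Everything in the snippet below – the band choices, sampling rate, distance threshold, and the simulated “users” – is an assumption for illustration, not either system’s real design: enroll a user by averaging spectral band-power features over a few recordings, then accept a login attempt only if a new recording’s features land close to that stored template.

```python
import numpy as np

# Hypothetical illustration only: band choices, sampling rate, and the
# acceptance threshold are assumptions, not any published system's values.
RNG = np.random.default_rng(0)
FS = 128  # assumed EEG sampling rate in Hz

def band_power(signal, fs, lo, hi):
    """Mean spectral power of one EEG channel within a frequency band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return spectrum[mask].mean()

def features(eeg, fs=FS):
    """Per-channel power in the classic delta/theta/alpha/beta bands."""
    bands = [(1, 4), (4, 8), (8, 13), (13, 30)]
    return np.array([band_power(ch, fs, lo, hi) for ch in eeg for lo, hi in bands])

def enroll(recordings):
    """Average several recordings into a per-user feature template."""
    return np.mean([features(r) for r in recordings], axis=0)

def authenticate(template, eeg, threshold=0.5):
    """Accept if the new recording's features are close to the template."""
    dist = np.linalg.norm(features(eeg) - template)
    return dist / (np.linalg.norm(template) + 1e-9) < threshold

# Toy demo: each "user" is simulated as noise plus a characteristic
# 10 Hz (alpha-band) tone whose strength differs between people.
def fake_user(alpha_amp, n_channels=4, seconds=2):
    t = np.arange(seconds * FS) / FS
    return np.array([alpha_amp * np.sin(2 * np.pi * 10 * t)
                     + RNG.normal(0, 0.3, t.size) for _ in range(n_channels)])

alice = enroll([fake_user(2.0) for _ in range(5)])
print(authenticate(alice, fake_user(2.0)))   # genuine attempt
print(authenticate(alice, fake_user(0.2)))   # impostor attempt
```

The key design point the article turns on next is that this template stores raw band powers – far more signal than a simple match actually needs.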
Our brains don’t produce a single, clear signal that can be checked like a fingerprint, though. Rather, they emit a messy, vibrant symphony of personal information, including one’s emotional state, learning ability, and personality traits. And as EEG headsets and technology become cheaper, more portable, and more ubiquitous – not only for identity authentication, but also in applications from companies such as Aware, which uses the technology to monitor customers’ relaxation levels, or NeuroSky (again), which uses it to enhance game play – there’s a growing likelihood that someone will learn to tap into that concerto of information and use it for malicious purposes.
“If you have these apps, then the reality of the matter is that you don’t know what the app is reading from your brain or what the app’s creators are going to use that information for. But what you do, or should, know is that they’re going to have a lot of information,” says Abdul Serwadda, a cyber security researcher at Texas Tech University.
Serwadda and graduate student Richard Matovu recently played devil’s advocate to see if they could glean sensitive personal information from brain data captured by two popular EEG-based authentication systems – and, surprise, surprise, they did.
Serwadda presented the results earlier this week at the 8th IEEE International Conference on Biometrics in Buffalo, New York.
The systems the pair examined were EEG-based authentication systems that have claimed high levels of authentication accuracy: one from John Chuang and colleagues at the University of California, Berkeley, and the other adapted from the work of a research team at Binghamton University and the University at Buffalo.
These EEG-based authentication systems use specific features, or markers, of brain activity to identify a person – the equivalent of isolating the melody of a specific instrument in an orchestra to identify a song.
Serwadda and Matovu wanted to see if those markers also contained sensitive personal information – in this case, a tendency for alcoholism. First, they put each system to work analysing a medical dataset of EEG scans from a group of alcoholics and non-alcoholics. Then, in a blind trial, they used a machine learning classifier – an algorithm trained to recognise the brain patterns associated with alcoholism – on the brain wave data from the two EEG authentication systems, and accurately identified 25 percent of the alcoholics in the sample.
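The study’s actual features, models, and medical data are not reproduced here, but the shape of the experiment can be sketched. Everything below is a stand-in assumption – a synthetic “medical dataset”, a trait simulated as a theta-band power shift, and a simple nearest-centroid classifier in place of whatever model the researchers used:

```python
import numpy as np

# Illustrative only: synthetic data and a toy classifier standing in for
# the real EEG systems and medical dataset described in the study.
rng = np.random.default_rng(1)

def band_features(n, theta_shift):
    """Fake per-subject band-power features (delta/theta/alpha/beta);
    the trait is simulated as a shift in the theta column."""
    x = rng.normal(1.0, 0.2, size=(n, 4))
    x[:, 1] += theta_shift
    return x

# "Medical dataset": labeled recordings used to train the classifier.
X_train = np.vstack([band_features(50, 0.8), band_features(50, 0.0)])
y_train = np.array([1] * 50 + [0] * 50)   # 1 = simulated alcoholic

# Nearest-centroid classifier: assign the label of the closest class mean.
centroids = {c: X_train[y_train == c].mean(axis=0) for c in (0, 1)}

def predict(x):
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

# "Authentication-system data": features captured only to log people in,
# now mined for a medical trait instead.
X_auth = np.vstack([band_features(10, 0.8), band_features(10, 0.0)])
preds = [predict(x) for x in X_auth]
print(sum(preds[:10]), "of the 10 simulated alcoholics flagged")
```

The point of the sketch is that nothing about the authentication step prevents this second use: the same stored features serve both purposes.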
That’s 25 percent of people who just lost their privacy.
“We weren’t surprised, because we know the brain signal is so rich in information,” says Serwadda.
“But it is scary. Wearable brain measurement is an application that’s just about to go mainstream, and you can infer a lot of information about users.”
That information isn’t limited to just alcohol use. Malicious third parties could mine brain data to make inferences about learning disabilities, mental illnesses, and more, says Serwadda.
“Imagine if you made these things public, and insurance companies became aware of them,” he adds.
“It would be terrible.”
Unfortunately, the researchers do not yet have a solution for securing such information, but during the study, compromising a little on authentication accuracy did reduce the ability to detect who was an alcoholic. Serwadda hopes other research teams will now take privacy, and not just accuracy, into account when building and designing such systems.
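The trade-off described above can be illustrated with a toy model. The features, the theta-band “trait marker”, and the masking strategy below are all assumptions rather than the researchers’ actual method: suppressing the band that leaks the trait blinds a trait classifier trained on what remains, at the cost of one fewer dimension for the authentication match.

```python
import numpy as np

# Hypothetical sketch of a privacy mitigation: mask the trait-revealing
# band before any classifier sees the features. Data and model are toys.
rng = np.random.default_rng(2)

def features(n, trait_shift):
    x = rng.normal(1.0, 0.2, size=(n, 4))   # delta/theta/alpha/beta powers
    x[:, 1] += trait_shift                   # the trait leaks through theta
    return x

X = np.vstack([features(50, 0.8), features(50, 0.0)])
y = np.array([1] * 50 + [0] * 50)

def trait_accuracy(mask):
    """Nearest-centroid trait accuracy using only the unmasked bands."""
    Xm = X[:, mask]
    c1, c0 = Xm[y == 1].mean(axis=0), Xm[y == 0].mean(axis=0)
    preds = [1 if np.linalg.norm(x - c1) < np.linalg.norm(x - c0) else 0
             for x in Xm]
    return float(np.mean(np.array(preds) == y))

full = np.array([True, True, True, True])
masked = np.array([True, False, True, True])   # theta suppressed
print(trait_accuracy(full), trait_accuracy(masked))
```

With the theta band masked, the trait classifier falls back toward chance, which mirrors the study’s observation that giving up a little authentication accuracy reduced trait leakage.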
That is especially important with functional near-infrared spectroscopy (fNIRS) right around the corner. Compared to EEG, fNIRS measures brain activity with a significantly higher signal-to-noise ratio, and though it is still relatively expensive, the price of systems based on the technology has already begun to drop.
“EEG pales in comparison to what will be seen once fNIRS hits the road,” adds Serwadda, “and we have to prepare for the days when brain wave technology and assessments become mainstream.”