
Researchers have developed an AI that can identify masked individuals

WHY THIS MATTERS IN BRIEF

Technology is increasingly helping governments and multi-nationals strip away our anonymity and privacy, and the pros and cons depend on which side of the fence you sit on, but one thing is certain: in the future there will be no such thing as privacy, online or offline.

 

Technologies that “denude” us of our offline and online privacy, especially facial recognition technologies, are fast becoming pervasive within our society, whether they’re identifying you at the airport, in a shopping mall, or at a riot. But ask anyone how to prevent facial recognition from identifying you and they’ll say it’s easy: wear a mask.

 

That might have been true last year, but this year it’s less so, and the Artificial Intelligence (AI) algorithms behind the technology to unmask us all are getting better with every passing day. The result is that for those of you who want to stay anonymous, or, to put it another way, who value your privacy, whether your reasons are benign, such as avoiding being analysed, catalogued and tracked in advertisers’ databases as you move innocently around a city, or more nefarious, staying anonymous, both online and offline, is getting harder all the time.

Of course, using AI to unmask people isn’t anything new; national security agencies and the police have been developing tools to do just that for years. And while many people will point out that the accuracy of these systems isn’t that high, that ignores the fact that technology always improves.

 

That said though, as I always say, when these new individual technologies are combined with other powerful camera and sensor technologies, such as new AI multi-spectral cameras that can cut through any disguise using IR, UV and other bands of the electromagnetic spectrum, and “touchless” biometric technologies that can analyse your fingerprints, gait, posture and voice, and even read your lips, from hundreds of feet away, then all of a sudden, despite what anyone says, it will become easier than ever for tomorrow’s surveillance systems to identify everyone with very high degrees of accuracy.

Now let me ratchet the dial up even further: all of these advanced biometric security technologies will likely become moot when a new remote brainwave-reading biometric technology that I’ve written about before, one that can uniquely identify you and pull secrets out of your head, eventually emerges from the lab.

 

Is it any wonder, then, that privacy advocates are increasingly freaking out?

Using these new technologies, and some of the increasingly freaky pre-crime technologies that I’ve written about, to find and arrest criminals is a cause for celebration, but with every good use of a technology comes a negative one, and in this case it’s the potential to bring on a dystopian future where authoritarian regimes, of which there are plenty, could use all these new technologies, online and offline, to control their populations and unmask dissidents like never before.

Anyway, back to our story.

In this case, using AI to unmask individuals started trending again last week when a piece of research describing how the technology has advanced was published on the pre-print server arXiv.

 

Using Deep Learning and a dataset of pictures of people wearing various disguises, researchers were able to train a neural network that managed to identify masked faces with some reliability. Academic and sociologist Zeynep Tufekci then shared the work on Twitter, noting that such technology could become a tool of oppression, with authoritarian states using it to identify anonymous protestors and stifle dissent. For now, though, the paper itself needs to be taken with a pinch of salt.
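
The paper’s exact architecture isn’t described here, but the general recipe it follows, training a convolutional network on labelled images of people in disguises, can be sketched roughly as below. This is an illustrative sketch only, not the authors’ actual model or data; the dataset layout, image size and backbone are all assumptions.

```python
# Rough, illustrative sketch: fine-tune a CNN to classify disguised-face
# images by identity. Not the authors' actual model, data, or settings.
import torch
import torch.nn as nn
from torchvision import datasets, transforms, models
from torch.utils.data import DataLoader

# Assumed folder layout: disguised_faces/train/<person_id>/<image>.jpg
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
train_set = datasets.ImageFolder("disguised_faces/train", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Pretrained backbone with a new classification head, one class per identity.
model = models.resnet18(weights="IMAGENET1K_V1")
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```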

Its results weren’t overly accurate. When someone was wearing a cap, sunglasses, and a scarf all together, I’ll note, the system could only identify them 55 percent of the time, which I think is still quite good given how young the technology is. It also used only a small dataset, and some experts in the field have criticised its methodology.

 

“It doesn’t strike me as a particularly convincing paper,” said Patrik Huber, a researcher at the University of Surrey who specializes in face tracking and analysis, pointing out that the system doesn’t actually match disguised faces to mugshots or portraits, but instead uses something called “facial keypoints”, that is, the distances between facial features like eyes, noses, lips and so on, as a proxy for someone’s identity.
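
To make Huber’s point concrete, here is a minimal, purely illustrative sketch, not the paper’s actual pipeline, of how pairwise distances between facial keypoints can be turned into a crude identity signature and compared between two images. The keypoint names, coordinates and threshold are all invented for illustration.

```python
# Illustrative sketch only: a crude "keypoint distance" identity signature.
# Keypoint coordinates would normally come from a landmark detector; here
# they are hard-coded dummy values.
import numpy as np
from itertools import combinations

def keypoint_signature(points):
    """Build a scale-normalised vector of pairwise distances between keypoints.

    points: dict mapping keypoint name -> (x, y) pixel coordinates.
    """
    names = sorted(points)
    coords = np.array([points[n] for n in names], dtype=float)
    # All pairwise Euclidean distances between keypoints.
    dists = np.array([np.linalg.norm(coords[i] - coords[j])
                      for i, j in combinations(range(len(names)), 2)])
    # Normalise so the signature doesn't depend on image scale.
    return dists / dists.max()

def same_person(sig_a, sig_b, threshold=0.05):
    """Very naive comparison: mean absolute difference between signatures."""
    return float(np.mean(np.abs(sig_a - sig_b))) < threshold

# Hypothetical keypoints detected in two images (values are made up).
face_1 = {"left_eye": (120, 90), "right_eye": (180, 92), "nose_tip": (150, 130),
          "mouth_left": (130, 165), "mouth_right": (172, 166)}
face_2 = {"left_eye": (60, 45), "right_eye": (90, 46), "nose_tip": (75, 65),
          "mouth_left": (65, 82), "mouth_right": (86, 83)}

print(same_person(keypoint_signature(face_1), keypoint_signature(face_2)))
```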

However, although the paper has its flaws, the challenge of recognising people when their faces are covered is one that plenty of teams around the world are working on. So while today, like many other emerging technologies, it’s not that accurate, tomorrow it could soon be pushing 90 percent and above, and lest we forget, there are plenty of other ways to identify people as well, like the ones I’ve mentioned above.

 

Facebook, for example, has trained neural networks that can recognise people based on characteristics like hair, body shape, and posture, and when these are brought into the mix and used to augment the team’s work, then all of a sudden we could see the system’s accuracy leap ahead a hundredfold or more.

Facial recognition systems that work on portions of the face have also been developed in the national security industry, for obvious reasons, and there are other, more exotic methods to identify people.

AI-powered gait analysis, for example, which identifies people from the way they walk, can recognise individuals with a high degree of accuracy, and it even works with low resolution footage like the type you get from a regular CCTV camera.
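
Production gait recognition systems differ in their details, but the basic idea, condensing a sequence of per-frame body measurements into a fixed-length signature and matching it against enrolled people, can be sketched roughly like this. The features, dimensions and enrolled data below are invented placeholders, not any real system’s.

```python
# Toy sketch of gait matching: summarise per-frame features over a walking
# sequence into a signature, then find the closest enrolled person.
# Real systems extract silhouettes or pose keypoints per frame; the numbers
# here are random placeholders.
import numpy as np

def gait_signature(frames):
    """frames: array of shape (n_frames, n_features), e.g. joint angles and
    stride/cadence estimates per frame."""
    frames = np.asarray(frames, dtype=float)
    # Mean and variability of each feature over the sequence.
    return np.concatenate([frames.mean(axis=0), frames.std(axis=0)])

def identify(signature, enrolled):
    """Nearest-neighbour match against enrolled signatures."""
    best_name, best_dist = None, float("inf")
    for name, ref in enrolled.items():
        dist = float(np.linalg.norm(signature - ref))
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name, best_dist

# Hypothetical enrolled signatures and a new observation (60 frames, 4 features).
rng = np.random.default_rng(0)
enrolled = {
    "person_a": gait_signature(rng.random((60, 4))),
    "person_b": gait_signature(rng.random((60, 4))),
}
observation = gait_signature(rng.random((60, 4)))
print(identify(observation, enrolled))
```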

 

Elsewhere, another system for identifying masked individuals, which was developed at the University of Basel in Switzerland, recreates a 3D model of the target’s face based on what it can see. Bernhard Egger, one of the scientists behind the work, said he expected “lots of development” in this area in the near future, but he similarly thought that there would always be ways to fool the machine.

“Maybe machines will outperform humans on very specific tasks with partial occlusions,” said Egger, “but, I believe, it will still be possible to not be recognised if you want to avoid this.”

Wearing a rigid mask that covers the whole face, for example, would give current facial recognition systems nothing to go on, and other researchers have developed patterned glasses and special fabrics that are specially designed to trick and confuse AI facial recognition systems. Getting clear pictures is also difficult, although that too will improve dramatically over the coming years thanks to new AI sharpening tools and better optical systems. Egger also points out that we’re used to facial recognition performing quickly and accurately, but that’s in situations where the subject is compliant, scanning their face with a phone, for example, or at a border checkpoint. But, let’s face it, as cameras get better that too will be a challenge that vanishes.
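
The patterned glasses and fabrics mentioned above exploit the same weakness as classic adversarial examples: a small, carefully crafted change to the input that pushes a model’s prediction the wrong way. As a rough sketch of that underlying idea, and not of any specific product, here is a minimal FGSM-style perturbation against a generic pretrained image classifier standing in for a face recogniser; the model choice and the random input are assumptions.

```python
# Minimal FGSM-style adversarial perturbation, illustrating the general idea
# behind accessories designed to fool recognition systems. A generic ImageNet
# classifier stands in for a face recogniser; the input is a random tensor.
import torch
import torch.nn.functional as F
from torchvision import models

model = models.resnet18(weights="IMAGENET1K_V1").eval()

# Placeholder input (a real attack would start from a photo of a face).
image = torch.rand(1, 3, 224, 224, requires_grad=True)
original_class = model(image).argmax(dim=1)

# Gradient of the loss for the current prediction w.r.t. the input pixels.
loss = F.cross_entropy(model(image), original_class)
loss.backward()

# Nudge every pixel slightly in the direction that increases the loss.
epsilon = 0.05
adversarial = (image + epsilon * image.grad.sign()).clamp(0, 1).detach()

# On this toy input the prediction may or may not flip; against a trained
# recogniser and a real face, perturbations like this are what the patterned
# accessories are designed to produce physically.
print("before:", original_class.item())
print("after: ", model(adversarial).argmax(dim=1).item())
```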

 

Privacy advocates, though, say even if these systems have flaws, they’re still likely to be embraced by law enforcement. A few months ago, for example, police in London used real-time facial recognition to scan people attending the annual Notting Hill Carnival. Before the event they assembled a “bespoke dataset” with images of more than 500 people who were either banned from attending or wanted for arrest, and then set up cameras at one of the carnival’s main thoroughfares. In the end only one attendee was successfully identified using the system, and even then his arrest warrant was out of date, while there were also 35 false positives. Despite that, though, the police still deemed it a success.
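
The internals of that system weren’t published, so purely as an assumption-laden illustration: live watchlist matching generally compares a face embedding from the camera feed against enrolled embeddings and raises an alert above a similarity threshold, and a threshold set too permissively against a 500-strong watchlist is one way false positives like those 35 pile up. Every name, vector and threshold below is invented.

```python
# Toy watchlist matcher: cosine similarity between a "live" face embedding and
# enrolled watchlist embeddings. All vectors are random placeholders.
import numpy as np

rng = np.random.default_rng(0)
watchlist = {f"suspect_{i}": rng.normal(size=128) for i in range(500)}

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(live_embedding, watchlist, threshold=0.6):
    """Return (name, similarity) if any watchlist entry exceeds the threshold."""
    name, sim = max(((n, cosine(live_embedding, v)) for n, v in watchlist.items()),
                    key=lambda pair: pair[1])
    return (name, sim) if sim >= threshold else (None, sim)

# A passer-by who is not on the watchlist: with a permissive threshold, chance
# similarity spikes across 500 entries are likely to trigger a false positive.
print(best_match(rng.normal(size=128), watchlist, threshold=0.2))
```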

If you combine this attitude with the increasing adoption of police body cameras, the growth of facial recognition databases, and new AI techniques for analysing data, it seems clear that public anonymity is, and will continue to be, undermined, and in the current political climate, where protests are becoming more common and more violent, that is potentially very dangerous. And as Tufekci noted on Twitter, this new technology is often developed without considering the uses it might be put to.

 

Amarjot Singh, the lead researcher behind the recent paper published on arXiv, said he thought the systems themselves were neutral, and that whether they have a harmful effect on society depends on how they’re deployed.

“There are more benefits to this technology than harm,” he said, “everything can be used in a good way and a negative way. Even a car.”

He added that he and his colleagues were working to get funding to improve their system, and that they might eventually commercialize it.

“To expand the dataset we might try and make a product out of it,” he said, “we’re not very sure of that yet, but we will definitely be expanding the dataset.”
