The FBI warns people are using DeepFakes to apply for remote tech jobs

WHY THIS MATTERS IN BRIEF

When you hire someone you give them money, in the form of a wage, as well as access to your systems, so what happens when those people aren’t who they seem to be?

 

Love the Exponential Future? Join our XPotential Community, subscribe to the podcast, future proof yourself with courses from XPotential University, read about exponential tech and trends, connect, watch a keynote, or browse my blog.

DeepFakes, which can be broadly described as the use of Artificial Intelligence (AI) to alter and manipulate video in order to make people appear to say things they never said, can be fun, but increasingly they’re being used by criminals to scam people out of Bitcoin, and by governments to spread lies, discredit people, and falsify GIS data, among many other things. But one thing you might not have expected is to find deepfakes being used to scam companies for jobs.

 

As a result of this new threat, companies hiring for open positions might need to do more than scrutinise how prospective employees react to the question “What is your worst quality?” If the prospective hire sneezes or coughs without moving their lips, then their worst quality might be that they’re not actually real.

The FBI announced via its Internet Crime Complaint Center (IC3) on Tuesday that it has received multiple complaints of people using stolen information and deepfaked video and voice to apply for remote tech jobs.

 

According to the FBI’s announcement, more companies have been reporting people applying for jobs using video, images, or recordings that are manipulated to look and sound like somebody else. These fakers are also using personally identifiable information from other people, in other words stolen identities, to apply for jobs at IT, programming, database, and software firms. The report noted that many of these open positions had access to sensitive customer or employee data, as well as financial and proprietary company information, implying that the imposters could be out to steal sensitive information as well as to collect a fraudulent paycheck.

 

What isn’t clear is how many of these fake attempts at getting a job were successful versus how many were caught and reported. Or, in a more nefarious hypothetical, whether someone secured an offer, took a paycheck or stole confidential company data and IP, and then got caught.

Either way, these applicants were apparently using voice spoofing techniques during online interviews, with lip movements that did not match what was being said on the video calls, according to the announcement. Apparently, the jig was up in some of these cases when the interviewee coughed or sneezed and the movement wasn’t picked up by the video spoofing software.
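For the technically curious, here’s a minimal sketch of the kind of audio-visual sync check that this failure mode points at. It’s illustrative only: it assumes you’ve already extracted a per-frame mouth-openness signal, for example from a facial landmark detector, and an audio loudness envelope resampled to the same frame rate, and the function name and threshold are hypothetical rather than taken from any real detection product.

```python
# A minimal, illustrative audio-visual sync check. Assumes two pre-extracted,
# frame-aligned signals: per-frame mouth openness (e.g. from a facial
# landmark detector) and the audio loudness envelope at the same frame rate.
# Real detectors (SyncNet-style models and the like) are far more robust.
import numpy as np

def sync_score(mouth_openness: np.ndarray, audio_envelope: np.ndarray) -> float:
    """Pearson correlation between mouth movement and speech loudness.

    Scores near 1.0 suggest the lips track the audio; scores near zero or
    below are one crude hint that the audio may not belong to the face.
    """
    m = (mouth_openness - mouth_openness.mean()) / (mouth_openness.std() + 1e-8)
    a = (audio_envelope - audio_envelope.mean()) / (audio_envelope.std() + 1e-8)
    return float(np.mean(m * a))

# Hypothetical usage with two frame-aligned signals:
# score = sync_score(openness_per_frame, loudness_per_frame)
# if score < 0.2:  # illustrative threshold, not a validated cutoff
#     print("Possible audio/video mismatch - flag this interview for review")
```

A cough or sneeze that shows up in the audio but not in the mouth movement is exactly the kind of event that drags a score like this down, which mirrors how human interviewers caught it in the cases the FBI describes.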

In May the FBI was among several federal agencies to warn companies about individuals working for the North Korean government applying for remote positions in IT and other tech jobs. In those cases, the fake workers often bid on remote contract work through sites like Upwork or Fiverr using fake documentation and references.

 

In cases like those described in the federal agencies’ May report, some fake operators worked through several layers of shell companies, making it much harder to discern their identity.

Though the technology has come a long way, some of the more amateur attempts at deepfakes often result in faked voices that barely match up with the speakers’ mouths. Other, professionally produced deepfakes can do a much better job of generating a realistic-seeming human.

And it’s not as easy to detect a fake video as you might think, especially if you’re not looking for one. AI meant to detect altered video can range in accuracy from 30 to 97%, according to a recent report by researchers from Carnegie Mellon University. There are ways for humans to detect fake video, especially once they’re trained to watch for certain visual glitches, such as shadows that don’t behave as they should or skin texture that doesn’t seem accurate, but for now the FBI is asking companies who suspect a fake applicant to report it to the complaint center site.
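As a rough illustration of one reason those accuracy numbers swing so widely: many detection tools score individual frames and then aggregate those scores into a single per-video verdict, and that aggregation step alone changes the results. The sketch below is a hedged example, assuming a hypothetical per-frame classifier has already produced scores; the averaging strategy and the threshold are illustrative choices, not a description of any specific detector the CMU researchers tested.

```python
# Illustrative only: aggregating hypothetical per-frame deepfake scores
# (each in [0, 1], higher = more likely fake) into a per-video decision.
import numpy as np

def video_verdict(frame_scores: np.ndarray, threshold: float = 0.5) -> bool:
    """Return True if the video is flagged as likely fake.

    Averaging smooths over single-frame glitches; a handful of odd frames
    shouldn't condemn a whole video, and one clean frame shouldn't clear it.
    """
    return float(np.mean(frame_scores)) > threshold

# Hypothetical usage:
# scores = np.array([0.1, 0.7, 0.8, 0.9, 0.6])  # from some frame classifier
# print(video_verdict(scores))                   # True (mean is 0.62)
```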
