
For most people, the increase in remote work opportunities has been a welcome change. We’re saving on gas, on clothes, on childcare, and even on eating out.

There are always some people who want to spoil good things for the rest of us, though, so let me tell you about the people using deepfakes to apply for that job you might want for yourself.


Deepfakes can be hard to spot, but according to the FBI, people conducting interviews for remote positions need to analyze more than the answers to their questions – they need to analyze whether they’re speaking to a real person at all.

The FBI’s Internet Crime Complaint Center (IC3) warned on Tuesday that it has received multiple complaints about people using stolen information and deepfaked video and voice to apply to remote jobs, mostly in the tech industry.

The fake applicants are using stolen identities to apply to IT, programming, database, and software firms all over the country. Many of the open positions would have given them access to sensitive customer or employee data, as well as company financial information, leading officials to believe the intent was as much to steal sensitive information as to cash a fraudulent paycheck.

The FBI is unsure how many of these deepfaked applicants may have actually landed jobs rather than being uncovered.


If you’re doing interviews online, experts say to watch for lip movements that don’t sync with what’s being said, or for an applicant sneezing or coughing without their face or lips moving to match.
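If you’re the hands-on type, here’s a rough illustration of what that lip-sync mismatch looks like in code – a minimal sketch, not a real detector. It assumes a hypothetical interview.mp4 plus its audio track pre-extracted to interview.wav, and simply checks whether movement in the mouth region of the frame roughly tracks the loudness of the audio.

```python
# A minimal lip-sync sanity check -- a rough sketch, NOT a production
# deepfake detector. Assumes hypothetical files interview.mp4 and its
# audio pre-extracted to interview.wav
# (e.g. with: ffmpeg -i interview.mp4 interview.wav).
import cv2
import numpy as np
from scipy.io import wavfile

VIDEO, AUDIO = "interview.mp4", "interview.wav"  # hypothetical filenames

# Per-frame "mouth movement": average pixel change in the lower third
# of the detected face box, using OpenCV's bundled Haar face cascade.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(VIDEO)
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0  # fall back if FPS is unreadable
motion, prev = [], None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        motion.append(0.0)
        prev = None
        continue
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest face
    mouth = cv2.resize(gray[y + 2 * h // 3 : y + h, x : x + w], (64, 32))
    if prev is None:
        motion.append(0.0)
    else:
        motion.append(float(np.mean(np.abs(mouth.astype(int) - prev.astype(int)))))
    prev = mouth
cap.release()

# Per-frame audio loudness (RMS), binned to match the video frame rate.
rate, samples = wavfile.read(AUDIO)
if samples.ndim > 1:
    samples = samples.mean(axis=1)  # mix stereo down to mono
hop = int(rate / fps)
n = min(len(motion), len(samples) // hop)
rms = [np.sqrt(np.mean(samples[i * hop:(i + 1) * hop].astype(float) ** 2))
       for i in range(n)]

# If mouth movement barely correlates with loudness, something may be off.
r = np.corrcoef(motion[:n], rms)[0, 1]
print(f"mouth-motion vs. loudness correlation: {r:.2f}")
print("suspicious" if r < 0.2 else "plausible")  # arbitrary threshold
```

Serious detection research uses learned audio-visual models rather than raw pixel motion, but even a crude correlation like this captures the signal the FBI wants interviewers to watch for with their own eyes.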

Several other federal agencies have also warned that individuals working for the North Korean government are applying for remote positions, typically through sites like Upwork and Fiverr, using fake documentation and references.

Working backward, authorities learned that operators were working through several layers of shell companies, which made them hard to identify. Luckily, many of the attempts are amateurish enough that the speakers’ mouths don’t match the audio; professionally produced fakes are typically much harder to spot.

Even artificial intelligence built to detect altered video gets it right only somewhere between 30% and 97% of the time, so humans are going to have to pay close attention to spot the fakes.


If you think you have encountered one of these deepfaked interviewees, the FBI asks that you report it through the IC3 complaint site (ic3.gov).

Would you be able to spot one of these?

Tell us why you think yes (or no) down in the comments!