Companies might have a harder time vetting candidates now that deepfakes are getting involved. The FBI warns that employers have interviewed people who used the face-altering technology to impersonate someone else and who passed along stolen personal information as their own.
The people using deepfakes -- a technology that taps artificial intelligence to make it look like a person is doing or saying things they actually aren't -- were interviewing for remote or work-from-home jobs in information technology, programming, databases and other software-related roles, according to the FBI's public service announcement. Employers noticed some telltale signs of digital trickery when lip movements and facial actions didn't match up with the audio of the person being interviewed, especially when they coughed or sneezed.
The interviewees also submitted personally identifiable information stolen from other people in an attempt to clear background checks.
This is the latest use of deepfakes, which entered the mainstream in 2019 amid the political upheaval the technology could cause by simulating other people's faces and voices and placing victims in embarrassing situations, such as pornography. Hobbyists have used deepfakes for more benign stunts since then, like cleaning up de-aging effects in films or swapping out an ultra-serious Caped Crusader for a more jovial one.
But the threat of using deepfakes for political ends remains, as when Facebook took down a deepfake video of Ukrainian President Volodymyr Zelenskyy back in March. The EU just strengthened its disinformation rules to address deepfakes, but their use in situations as mundane as job interviews shows how easy the deception tech is to get your hands on and use.