Scammers have been exploiting deepfake technology to impersonate job candidates during interviews for remote positions, according to the FBI.
The agency has recently seen an increase in complaints about the scam, it said in a public advisory on Tuesday. Fraudsters have been using both deepfakes and personal identifying information stolen from victims to dupe employers into hiring them for remote jobs.
Deepfakes involve using AI-powered programs to create realistic but phony media of a person. In the video realm, the technology can swap a celebrity’s face onto someone else's body. On the audio front, the programs can clone a person’s voice, which can then be made to say whatever its creator wants.
The technology is already being used in YouTube videos to entertaining effect. However, the FBI’s advisory shows deepfakes are also fueling identity theft schemes. “Complaints report the use of voice spoofing, or potentially voice deepfakes, during online interviews of the potential applicants,” the FBI says.
The scammers have been using the technology to apply for remote or work-from-home jobs at IT companies. The FBI didn’t clearly state the scammers' end goal, but the agency noted that “some reported positions include access to customer PII (personal identifying information), financial data, corporate IT databases and/or proprietary information.”
Such information could help scammers steal valuable data from companies and commit further identity fraud schemes. But in some good news, the FBI says there’s a way employers can detect the deepfakery. To secure the jobs, scammers have been participating in video interviews with prospective employers, and the fakes don’t always hold up: the agency notes that the lip movements of the person on camera don’t fully line up with the audio of the person speaking.
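For illustration only, and not a method the FBI prescribes: below is a minimal sketch of how that kind of audio-visual sync check might be scripted, assuming MediaPipe Face Mesh for lip landmarks, OpenCV for reading video frames, librosa for the audio loudness envelope, and a recording whose audio track has already been extracted to a WAV file. The file names and the threshold are hypothetical.

```python
# Illustrative sketch: flag recordings where lip movement and audio are out of sync.
# Assumes opencv-python, mediapipe, librosa, and numpy are installed, and that the
# interview's audio has been extracted to a WAV file (e.g., with ffmpeg). Not a
# production deepfake detector; the correlation threshold below is arbitrary.
import cv2
import librosa
import mediapipe as mp
import numpy as np


def mouth_openness(video_path: str) -> np.ndarray:
    """Return a per-frame lip gap (distance between upper and lower inner lip)."""
    cap = cv2.VideoCapture(video_path)
    gaps = []
    with mp.solutions.face_mesh.FaceMesh(max_num_faces=1) as mesh:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            result = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.multi_face_landmarks:
                lm = result.multi_face_landmarks[0].landmark
                gaps.append(abs(lm[13].y - lm[14].y))  # inner-lip landmarks
            else:
                gaps.append(0.0)  # no face found in this frame
    cap.release()
    return np.array(gaps)


def audio_energy(wav_path: str, n_frames: int) -> np.ndarray:
    """Return the audio RMS (loudness) envelope, resampled to one value per video frame."""
    y, _ = librosa.load(wav_path, sr=16000)
    rms = librosa.feature.rms(y=y, frame_length=1024, hop_length=512)[0]
    # Stretch or squeeze the envelope so it lines up with the video frames.
    return np.interp(np.linspace(0, 1, n_frames), np.linspace(0, 1, len(rms)), rms)


def sync_score(video_path: str, wav_path: str) -> float:
    """Correlation between lip movement and speech loudness; higher means better sync."""
    gaps = mouth_openness(video_path)
    energy = audio_energy(wav_path, len(gaps))
    return float(np.corrcoef(gaps, energy)[0, 1])


if __name__ == "__main__":
    score = sync_score("interview.mp4", "interview.wav")  # hypothetical file names
    print(f"Lip/audio sync correlation: {score:.2f}")
    if score < 0.3:  # arbitrary cutoff for illustration
        print("Low sync: worth a closer look (could be a deepfake, or just a bad connection).")
```

A real screening tool would need to account for much more, since frozen frames, face-boundary artifacts, and ordinary network lag can all skew a call, but the basic idea mirrors the tell the FBI describes: speech and lip motion that don’t move together.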
Read more on pcmag.com