The FBI is warning that scammers are using AI technology to create sexually explicit deepfake photos and videos of people in a bid to extort money from them, a scheme known as “sextortion.”
The threat is particularly disturbing because it exploits the benign photos people post on their social media accounts, which are often public. Thanks to advances in image- and video-editing software, a bad actor can take those photos and use them to create AI-generated pornography featuring the victim’s face.
“The FBI continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content,” the agency said in the alert. “The photos or videos are then publicly circulated on social media or pornographic websites, for the purpose of harassing victims or sextortion schemes.”
As a result, the FBI is warning the public about the danger of posting photos and videos of themselves online. “Although seemingly innocuous when posted or shared, the images and videos can provide malicious actors an abundant supply of content to exploit for criminal activity.”
The FBI did not say how many complaints it has received. But the agency issued the alert amid thousands of sextortion schemes targeting minors, which can involve an online predator posing as an attractive girl to dupe a teenage boy into sending nude photos. The scammer then threatens to post the images online unless the victim pays.
In today’s alert, the FBI noted that recent sextortion schemes have also involved the use of deepfakes. “As of April 2023, the FBI has observed an uptick in sextortion victims reporting the use of fake images or videos created from content posted on their social media accounts.”