Today, the FBI warned against a new, even more disgusting type of sextortion. Previously, these schemes involved coerced or stolen digital material, but now some criminals are using deepfake technology to create explicit content from innocent images or videos found online. The warning comes from today's alert by the FBI's Internet Crime Complaint Center (IC3).
According to the FBI, deepfake scams have the same goal as classic sextortion schemes: the scammer demands payment to prevent the release of compromising material or uses the material to coerce the victim into providing more explicit content. However, with deepfakes, the victim may appear in a realistic image or video without their knowledge or consent.
“The FBI continues to receive reports from victims, including minor children and non-consenting adults, whose photos or videos were altered into explicit content,” the bureau said. “The photos or videos are then publicly circulated on social media or pornographic websites, for the purpose of harassing victims or sextortion schemes.”
In this blog we have warned for years about deepfakes, also known as "synthetic media," which are created with artificial intelligence or machine learning tools. The FBI has noticed an increase in reports from victims who have appeared in explicit content created from raw material obtained from social media sites, web postings, or video chats.
For victims under 18, the FBI says the Take It Down service from the National Center for Missing and Exploited Children can provide free help. The bureau also reported that sextortion, which it tracks as a subset of romance scams, is responsible for millions of dollars in losses for Americans. To add insult to injury, some victims get scammed twice when they contact criminal “assistance” organizations that pledge to help but take the money and run.
Here is the FBI Alert: https://www.ic3.gov/Media/Y2023/PSA230605
Don't let this happen to you, your family, your friends or co-workers.