Rapidly advancing AI technologies are making it easier for scammers to extort victims, including children, by doctoring innocent photos into fake pornographic content, experts and police say.

The warnings coincide with a broader "explosion" of "sextortion" schemes targeting children and teens, which the FBI says have been linked to more than a dozen suicides.

The National Center for Missing & Exploited Children has recently received reports of manipulated images of victims being shared on social media and other platforms, says John Shehan, a senior vice president at the organization.

"Right now, it can feel a bit like the Wild West," Shehan told Axios. "These technologies could spiral very quickly out of control."

Typical sextortion schemes involve scammers coercing victims into sending explicit images, then demanding payment to keep the images private or delete them from the web.

But with AI, malicious actors can pull benign photos or videos from social media and create explicit content using open-source image-generation tools.

So-called "deepfakes" and the threats they pose have been around for years, but the tools to create them have recently become much more powerful and user-friendly, said John Wilson, a senior fellow at cybersecurity firm Fortra.

The FBI said earlier this month that it has received reports from victims — including minors — that innocuous images of them had been altered using AI tools to create “true-to-life” explicit content, then shared on social media platforms or porn sites. “Once circulated, victims can face significant challenges in preventing the continued sharing of the manipulated content or removal from the internet,” the FBI said.

Last year, the FBI received 7,000 reports of financial sextortion against minors that resulted in at least 3,000 victims — primarily boys — according to a December public safety alert.
