The AI-generated nightmare of child abuse is here

Experts warn that a terrifying new era of ultra-realistic, AI-generated images of child sexual abuse is now underway. Perpetrators are using downloadable, open source generative AI models to produce these images, with devastating consequences. The technology is being used to create hundreds of new images of children who were previously abused. Offenders are sharing datasets of abuse images that can be used to customize AI models, and they are beginning to sell monthly subscriptions to AI-generated child sexual abuse material (CSAM).

The details of how the technology is being misused appear in a new, comprehensive report published by the Internet Watch Foundation (IWF), a UK-based nonprofit that screens and removes abusive content from the internet. In June, the IWF said it had found seven URLs on the open web containing suspected AI-generated material. Now, an investigation of a dark web CSAM forum, which provides a snapshot of how AI is being used, has found almost 3,000 AI-generated images that the IWF considers illegal under UK law.

According to the IWF study, the AI-generated images depict the rape of babies and young children, the abuse of famous teenage children, and BDSM content involving teenagers. “We have seen requests, discussions, and actual examples of child abuse material featuring celebrities,” says Dan Sexton, the IWF’s chief technology officer. Sometimes, Sexton says, celebrities are de-aged to make them look like children. In other cases, adult celebrities are portrayed abusing children.

While reports of AI-generated CSAM are still dwarfed by the number of real abuse images and videos found online, Sexton is alarmed by the speed of development and the potential it creates for new kinds of abusive imagery. The findings are consistent with those of other groups studying the spread of CSAM online. In one shared database, investigators around the world have flagged 13,500 AI-generated images of child sexual abuse and exploitation, Lloyd Richardson, the director of information technology at the Canadian Centre for Child Protection, tells WIRED. “This is just the tip of the iceberg,” says Richardson.

A realistic nightmare

Capable of producing compelling art, realistic photographs, and edgy designs, current AI image generators offer a new kind of creativity and promise to change art forever. They have also been used to create convincing fakes, such as the “Balenciaga pope” and fabricated images of Donald Trump’s arrest. The systems are trained on huge volumes of existing images, often scraped from the web without permission, and allow new images to be created from simple text prompts. Ask for an “elephant wearing a hat,” and that’s exactly what you’ll get.

It is no surprise that criminals creating CSAM have adopted image-generation tools. “The way these images are being generated is, typically, through the use of openly available software,” says Sexton. Offenders the IWF has observed frequently mention Stable Diffusion, an AI model released by UK-based firm Stability AI. The company did not respond to WIRED’s request for comment. In the second version of its software, released late last year, the company changed its model to make it harder for people to create CSAM and other nude images.

Sexton says criminals are using older versions of AI models and fine-tuning them to create illegal material featuring children. A model is fed existing abuse images or photos of people’s faces, allowing the AI to create images of specific individuals. “We’re seeing fine-tuned models generating new images of existing victims,” Sexton says. Perpetrators “exchange hundreds of new pictures of existing victims” and make requests about individuals, he says. Some threads on dark web forums share sets of victims’ faces, the study found, and one thread was titled “photo resources for AI and deepfaking specific girls.”
