Generative AI is exacerbating the problem of child sexual abuse material (CSAM) online, with watchdog groups reporting a surge in deepfake content that mimics images of real victims.
A report published by the UK’s Internet Watch Foundation (IWF) documented a significant increase in digitally altered and fully synthetic sexually explicit images of children, with one forum seeing 3,512 such images and videos shared in a 30-day period, most of them depicting young girls. The report also documented offenders sharing advice with one another, as well as AI models derived from images of real victims.
“Without proper oversight, generative AI tools provide a playground for online predators to realise their most perverse and disturbing fantasies,” wrote IWF CEO Suzie Hargreaves OBE. “Even now, IWF is beginning to see an increasing number of these types of material being shared and sold on commercial child sexual abuse websites across the internet.”
The IWF’s snapshot study found a 17 percent increase in AI-altered CSAM online since fall 2023, as well as an alarming rise in material depicting extreme and explicit sexual acts. That material includes adult pornography altered to show a child’s face, as well as existing child sexual abuse content digitally edited to overlay the likeness of another child.
“The report also highlights how rapidly the technology to generate fully synthetic AI videos of CSAM is improving,” the IWF wrote. “While this type of video is not yet sophisticated enough to pass for actual child sexual abuse videos, analysts say this is the ‘worst’ that fully synthetic video will ever be. Advances in AI may soon enable the generation of more realistic videos, in the same way that still images have become photorealistic.”
In a review of 12,000 new AI-generated images posted to a dark web forum over a one-month period, IWF analysts said 90 percent were realistic enough to be assessed as actual CSAM under existing laws.
Another report, published by a British watchdog and shared with the Guardian, alleges that Apple has significantly underreported the amount of child sexual abuse material shared through its products, raising concerns about how the company moderates content created by generative AI. In its investigation, the National Society for the Prevention of Cruelty to Children (NSPCC) compared official figures released by Apple with figures gathered through Freedom of Information requests.
Apple reported 267 cases of CSAM worldwide to the National Center for Missing & Exploited Children (NCMEC) in 2023, but the NSPCC alleges the company was implicated in 337 child abuse image offenses in England and Wales alone, and those figures covered only the period from April 2022 to March 2023.
Reached for comment by the Guardian, Apple pointed the publication to its previous decision not to scan iCloud Photo Library for CSAM, citing its prioritization of user security and privacy. Mashable has also reached out to Apple and will update this article if it responds.
US law requires US-based technology companies to report cases of CSAM to the NCMEC. For comparison, Google reported more than 1.47 million cases to the NCMEC in 2023, and Facebook removed 14.4 million pieces of child sexual exploitation content between January and March of this year. While the company says reports of child nudity and abuse have dropped significantly over the past five years, watchdog groups remain alarmed.
Online child exploitation is notoriously difficult to combat, as abusers frequently exploit loopholes in social media platforms to continue engaging with minors online. Now, with the power of generative AI in the hands of bad actors, the battle is only set to intensify.
If intimate images of you have been shared without your consent, please call the Cyber Civil Rights Initiative’s 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also provides useful information and a list of international resources.