Leading social media platforms, including TikTok and Snapchat, have signed a pledge to tackle child sexual abuse images generated through artificial intelligence (AI).
At a gathering hosted on Monday by UK Home Secretary Suella Braverman, representatives of leading tech companies, Australian and UK government officials, academics, and nonprofit organisations came together to address the spread of AI-generated images depicting child sexual abuse.
The stakeholders also discussed the potential risks AI-generated imagery poses to children.
The gathering was held after the Internet Watch Foundation (IWF), a nonprofit that supports child victims by tracking and removing sexual abuse material online, warned that generative AI tools could become a major driver of abuse against children on the internet.
Deepfake images could accelerate the proliferation of child sexual abuse material, the IWF said, calling on governments and tech companies to take immediate preventive measures.
“Child sexual abuse images generated by AI are an online scourge,” said Braverman. “This is why tech giants must work alongside law enforcement to clamp down on their spread. The pictures are computer-generated but they often show real people – it’s depraved and damages lives.”
The home secretary remarked that AI-generated images depicting child sexual abuse have spread at a “shocking” pace. “We cannot let this go on unchecked.”
According to the IWF’s research, child sexual abuse images can be found in large volumes on the dark web. The images, which are realistic enough to be treated as real abuse imagery under UK law, can complicate efforts to identify genuine instances of child sexual abuse online.
“The realism of these images is astounding, and improving all the time,” said Susie Hargreaves, chief executive at the IWF. “The majority of what we’re seeing is now so real, and so serious, it would need to be treated exactly as though it were real imagery under UK law.”
She stressed the need to counter technology-facilitated child abuse “before it has a chance to fully take root”.