Thousands of AI-generated images depicting the sexual abuse of children are threatening to “overwhelm” the internet, according to the Internet Watch Foundation (IWF).
The charity said its “worst nightmares” are coming true as predators are now able to make AI images of real child victims.
The UK organisation, which is responsible for detecting and removing harmful images, said that most AI child abuse imagery identified by IWF analysts is now realistic enough to be treated as real imagery under UK law.
The IWF also warned that technology is being abused to “nudify” children whose clothed images have been uploaded online for legitimate reasons, with this content now being commercialised by online predators. It added that criminals are also using AI technology to create imagery of celebrities who have been “de-aged” and depicted as children in sexual abuse scenarios.
Over the course of a month, the IWF investigated 11,108 AI images which had been shared on a dark web child abuse forum. It said 2,978 of these were confirmed as images which breach UK law – meaning they depicted child sexual abuse.
One in five of these images were classified as Category A, the most serious kind of imagery, and 1,372 images depicted primary school-aged children.
Commenting on the news, Susie Hargreaves OBE, chief executive of the IWF, said: “Our worst nightmares have come true. Earlier this year, we warned AI imagery could soon become indistinguishable from real pictures of children suffering sexual abuse, and that we could start to see this imagery proliferating in much greater numbers. We have now passed that point.”
She added: “Chillingly, we are seeing criminals deliberately training their AI on real victims’ images who have already suffered abuse. Children who have been raped in the past are now being incorporated into new scenarios because someone, somewhere, wants to see it.”
In September, the US and the UK made a joint statement promising to find solutions to halt the spread of AI-generated child sexual abuse images and encouraged other countries to join them in their pledge to prevent a further rise in this type of content.