Children creating AI nudes and sharing among friends
SCHOOLCHILDREN are downloading AI apps specifically designed to create nudes, a government report has warned.
They are also sharing the images illegally with their friends, experts say, while organised crime gangs are using the apps to blackmail children with computer-made naked images.
A report from the Department for Science, Innovation and Technology (DSIT) was authored by the UK Council for Internet Safety — a body that includes tech giants, charities, government departments and regulators — to advise teachers on how to deal with pupils sharing nudes or semi-nudes.
The document was recently updated to include AI-generated images, deepfakes and the topic of “sextortion”.
Education workers are told to deal with AI-made images in the same way as ordinary nudes, which includes not looking at them, not deleting them, calling the police, and not immediately informing parents.
The guidance has been welcomed by campaigners and experts, but there are calls for Ofcom, the regulator of the Online Safety Act 2023, to be more “proactive and comprehensive” in cracking down on AI nudes. The Government is also being encouraged to change the law so that it is illegal to make an AI nude or deepfake porn.
It is an offence to share deepfake porn or AI nude images of an adult without consent. It is also an offence to threaten to do so.
However, creating an AI nude of an adult is not itself illegal.
This loophole has led to high-profile women, such as Taylor Swift, having deepfake nudes shared online and the topic has been raised in the Commons and in the House of Lords.
However, it is unlawful to create, own or share nude or sexual images of children, which includes AI photos, as it constitutes child sexual abuse.
The law has criminalised the consensual sharing of nudes between two 17-year-olds, and also makes it illegal for under-18s to make and share AI nudes of their classmates.
“A young boy using a nudify app to create a nude is committing an offence of creating/possessing child sexual abuse imagery, even if of himself,” said Durham University’s Prof Clare McGlynn, an expert in the regulation of pornography.
Experts are warning that the ease with which people can access AI nude-making apps online is driving a rise in virtual sexual exploitation of women and girls.