A third of child porn images on adult sites are selfies
ALMOST one in three reported child abuse images are now taken by the children themselves, new figures from the Internet Watch Foundation show.
Children are increasingly filming or photographing themselves in explicit situations and sharing the footage, which then ends up on adult pornography sites or shared by paedophiles, the foundation’s annual report found.
The number of cases that involved “self-generated” content rose from 349 in January 2017 to 1,717 one year later, rising from six per cent of the total to 26 per cent. The average proportion of images discovered between November and February that were self-generated was 31 per cent, rising to a high of 40 per cent in December, according to figures seen by The Daily Telegraph.
“We increasingly see more imagery of 11- to 15-year-olds in what is termed ‘self-produced’ content created using webcams and then shared online. This can have serious repercussions for young people and we take this trend very seriously,” the report said.
Experts said 11- to 15-year-olds were copying celebrity idols who were frequently sharing naked selfies.
Fred Langford, the deputy chief executive of the Internet Watch Foundation (IWF), said: “These are people they aspire to be in the future. It’s a very difficult thing to say ‘don’t act like your heroes’. The normalisation of this in the adult population is just being mirrored by kids.” In many cases, children were taking the images to share with friends and they were then being aggregated by paedophiles, who added them to dedicated websites, he added.
The report also found that paedophiles were increasingly technologically savvy, with rising numbers of images hidden in disguised websites, only accessible via a specific pathway.
Category A content, which includes the rape and sexual torture of children, accounted for a growing share of the images and videos found, rising from 28 per cent of all content to 33 per cent.
A spokesman for the NSPCC said: “The sheer scale and complexity of the problem is evolving rapidly in line with technology, so it’s impossible to simply police our way out of the problem. We need a comprehensive strategy to stop potential offenders in their tracks.”
The IWF is a charity which searches for and identifies child abuse images online to get them taken down.