Use AI to curb abuse instead of for adverts, tech firms told
THE head of the National Crime Agency has challenged social media giants to explain why they can develop artificial intelligence to target adverts at users but cannot create AI capable of protecting children from abuse.
Lynne Owens said the failure of social media firms to stop paedophiles on the internet was distracting the agency and police from hunting the “worst offenders” on the dark web.
There are 2.9 million accounts registered on the worst child abuse sites on the dark web – 140,000 from the UK.
The number of referrals of child abuse images to the agency from the open web has risen 1,000 per cent since 2013, to 113,948 in 2018, according to data provided to The Sunday Telegraph. Each referral can include multiple images of abused children.
Ms Owens said: “This is my frustration. If you can apply AI to deliver targeted adverts to an individual site, I don’t understand why it’s so difficult to develop AI to think of all the different ways in which people might choose to abuse children online.”
If tech giants were more active, investigators would be able to “interrogate the dark web so we can arrest the worst offenders and look to take down dark web servers”, she added.
She also backed demands for law enforcement agencies to have a “key” to unlock encrypted communications to investigate child abuse. As it stood, she said, tighter encryption – as planned for Facebook Messenger – would thwart investigations and give paedophiles unseen channels to prey on children.
Social media firms say they have developed AI to weed out child abuse and terrorism, and that they aim to improve it.