Computers to sift abuse images to ease distress of officers
AI programmes will take over disturbing task of examining indecent pictures
COMPUTERS will be used to identify the severity level of child abuse images on suspects’ devices to stop police officers suffering psychological trauma, the Metropolitan Police has said.
Its digital forensics department, which last year trawled through 53,000 different devices for incriminating evidence, expects artificial intelligence to take over the work of officers within three years.
The department already uses image recognition software but it is not sophisticated enough to spot indecent images and video, Mark Stokes, the Met’s head of digital and electronics forensics, told The Daily Telegraph.
“We have to grade indecent images for different sentencing, and that has to be done by human beings right now, but machine learning takes that away from humans,” he said.
“You can imagine that doing that year on year is very disturbing.”
The force is drawing up an ambitious plan to move its sensitive data to cloud providers such as Amazon Web Services, Google or Microsoft, Mr Stokes said.
This would allow specialists to harness the technology companies’ massive computing power for analytics.
The Met uses a London data centre, but the sheer volume of images, along with the popularity of high-resolution video, is putting pressure on resources. With the help of Silicon Valley providers, AI could be trained to detect abusive images “within two to three years”, Mr Stokes said.
The Met’s digital forensics team uses bespoke software that can identify drugs, guns and money while scanning a computer or phone. But it has proven problematic when searching for nudity.
“Sometimes it comes up with a desert and it thinks it’s an indecent image,” Mr Stokes said. “Lots of people have screensavers of deserts and it picks it up thinking it is skin colour.”
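The desert mix-up has a simple explanation: a detector that counts skin-toned pixels will also count sand, which falls in much the same colour range. The sketch below is purely illustrative and is not the Met's actual software; the colour rule and the flagging threshold are assumptions chosen to show how such a false positive can arise.

```python
def looks_like_skin(r, g, b):
    # Crude rule-of-thumb skin-colour test on RGB values:
    # reddish tones that dominate the green and blue channels.
    return r > 95 and g > 40 and b > 20 and r > g and r > b and (r - min(g, b)) > 15

def flag_image(pixels, threshold=0.5):
    """Flag the image if more than `threshold` of its pixels look skin-toned."""
    skin = sum(1 for p in pixels if looks_like_skin(*p))
    return skin / len(pixels) >= threshold

# A sandy desert colour (roughly RGB 210, 180, 140) passes the skin test,
# so an image that is mostly sand gets flagged despite containing nothing
# indecent, while a blue sky does not.
desert = [(210, 180, 140)] * 100
sky = [(80, 130, 200)] * 100
print(flag_image(desert))  # True: the desert is misflagged
print(flag_image(sky))     # False
```

A modern classifier learns far richer features than raw colour, which is why Mr Stokes expects trained AI to outperform this kind of heuristic.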
The mammoth task of moving the Met’s data into the cloud is a legal minefield due to the sensitive nature of the files the force stores.
Police staff are granted permission by the courts to store criminal images, but it is an offence for anyone else – including Amazon, Microsoft or any other cloud provider – to do so. Providers would be taking on a considerable risk by storing this material.
Storing data in the cloud is a controversial move following a series of high-profile hacks, including a breach of Apple’s iCloud service in which several celebrities’ personal photos were stolen and distributed on the web.
Mr Stokes said that, despite such concerns, the likes of Google and Amazon might be best placed to keep police information watertight, thanks to their huge profits, which can be invested in security talent, expertise and the most advanced technology.
Mr Stokes said: “We have been working on the terms and conditions with cloud providers, and we think we have it covered.”