Stay out of trouble: Robocop will be ready to go on the beat by 2027
Advances in computer technology will allow police forces to replace the bobby on our streets
ROBOCOP technology is set to replace bobbies on the beat within a decade as artificial intelligence becomes widespread in the fight against crime.
Thames Valley Police said AI computers, which can make “human” decisions, could be used to answer 999 calls, detect crimes and identify offenders. It has already been used by Scotland Yard to pick out faces at the Notting Hill Carnival and Durham police plan to use it to help make decisions over who to keep in custody.
But concerns have been raised that there might be a risk of “bias” in the computer software and that AI “might be unable to reason with a human”.
Ministers this weekend are preparing to publish a review into how AI will change Britain in the coming decades.
In a submission to a parliamentary inquiry into the “Implications of Artificial Intelligence”, Thames Valley said that “even at the lowest level AI could perform many of the process-driven tasks that take place in the police” by linking databases, risk-assessing offenders, analysing devices, transcribing and scrutinising CCTV, running security checks and automating administrative tasks. Thames Valley said a 999 caller might even be able to describe an incident and be understood by AI.
“Speech analysis categorises the type of incident and detects indicators of stress from the caller,” the submission says. “The date, time, location and offence details are recorded automatically.” CCTV would then monitor the situation and identify suspects.
Police would be dispatched and arrests could be made; statements would be taken, uploaded, transcribed and attached to a crime report instantly.
“Solvability factors are calculated on the quality of the available data. The risk assessment provides a recommendation for officers on the next steps for the offender and also an appropriate support package for the victim.”
Thames Valley Police acknowledged there needed to be “a high level of human oversight and clear justification”.
It said that “recent tests of AI in policing indicate there is a risk of bias, so engagement with privacy and civil rights groups will be necessary”.
A spokesman said: “AI is likely to emerge in law enforcement activity over the next 10 years.”
But David Green, director of the Civitas think tank, called AI “Orwellian” and said it could unfairly target ethnic minorities. “Robocop policing has now arrived in England,” he said.
Renate Samson, chief executive of Big Brother Watch, said facial recognition technology had a low success rate.
“Data held by forces is far from accurate or complete,” she said. “With rubbish data, AI will give rubbish answers.”
Simon Kempton, of the Police Federation of England and Wales, warned that AI could drive “a wedge between the public and the police”.