The problems with police tech
Why tech and policing aren’t always the happiest of bedfellows
There’s a reason futuristic policing films such as Blade Runner and RoboCop are set in apocalyptic versions of our world – there’s a lot that can go wrong when technology and police combine.
While there’s potential for improved transparency, faster response times and cheaper but better results, there are also concerns around privacy invasion, excessive surveillance and abuse of power.
“While technological advances may bring benefits for policing, this must always be balanced with the risks they pose to our privacy and security,” said Pam Cowburn, communications director at the Open Rights Group. “The police have a duty to uphold human rights law and that includes ensuring that their techniques do not violate our fundamental rights.”
That’s not always easy, she noted, as tech develops more quickly than the law. “For example, drones armed with Tasers or driverless police cars could easily become a policing reality in the UK (they are already being adopted in other countries),” she said. “We need to ensure that we have the laws to regulate developments such as this.”
Legal wobbles
Existing laws aren’t always as robust as rights campaigners would like. The Investigatory Powers Act 2016 – the so-called Snoopers’ Charter – gives police the power to hack devices, for example.
But there are protections. Last December, the Court of Justice of the EU ruled that police must get independent authorisation to view communications data. “Despite this, the police continue to get internal sign-off for such requests,” said Cowburn. “We know that there have been abuses of such powers: for example, the accessing of journalists’ data. If the police want to maintain the public’s trust, they need to ensure that there is rigorous oversight and regulation of their powers.”
Even when we do have laws in place to protect us, police forces can bend or push the rules. “There have been a number of occasions when the UK police have refused to delete data because it can be useful in the future,” Cowburn said. “This goes against the core principle of data protection.”
For example, in March the independent Biometrics Commissioner criticised British police forces for storing the images of millions of people who have not been charged with any crime, and holding that data on searchable databases. In response to the commissioner’s report, the Home Office noted that people could request to have their images deleted… if they happen to know the police have their photo. The Home Office also noted that not all forces use the Police National Database, so it’s difficult to say how many photos are held nationwide. The collection in question has 19 million photos, but it doesn’t include those held by the Metropolitan Police Service, the largest force in the country.
Body-worn cameras, drones and intelligent CCTV will only increase the data collected on us and stretch the applicability of existing law. “For example, the government’s guidance on surveillance cameras says, ‘people in public places should normally be made aware whenever they are being monitored by a surveillance camera system’,” said Cowburn. “It is difficult to see how this can be met if police are continuously filming via body wearables.”
On the other hand, body-worn cameras don’t only carry risks; they also offer clear rewards, Cowburn noted. “Wearables could also make the police more accountable and provide evidence if they are accused of abusing their powers,” she said.
And what of Minority Report-style policing? The use of algorithms to predict where crime will happen – and potentially who will commit it – as well as to support decisions on bail, punishment and so on could prove problematic. “Predictive policing is particularly worrying as it can reverse the presumption of innocence,” Cowburn explained. “Algorithms can have inherent racial and social biases. This must be taken into account if they are being used, for example, to make decisions about bail or re-offending. There needs to be proper oversight to pick up on any bias.”
That’s a problem, as we tend to see technology as more neutral and fair than human judgement. “Officers may be more likely to trust a computer’s decision as more neutral, but this may not be the case,” she said. Sorry, RoboCop – the future of policing may well be better off human.