PC Pro

The problems with police tech

Why tech and policing aren’t always the happiest of bedfellows


There’s a reason futuristic policing films such as Blade Runner and RoboCop are set in apocalyptic versions of our world – there’s a lot that can go wrong when technology and police combine.

While there’s potential for improved transparency, faster response times and cheaper but better results, there are also concerns around privacy invasion, excessive surveillance and abuse of power.

“While technological advances may bring benefits for policing, this must always be balanced with the risks they pose to our privacy and security,” said Pam Cowburn, communications director at the Open Rights Group. “The police have a duty to uphold human rights law and that includes ensuring that their techniques do not violate our fundamental rights.”

That’s not always easy, she noted, as tech develops more quickly than the law. “For example, drones armed with Tasers or driverless police cars could easily become a policing reality in the UK (they are already being adopted in other countries),” she said. “We need to ensure that we have the laws to regulate developments such as this.”

Legal wobbles

Existing laws aren’t always as robust as rights campaigners would like. The Investigatory Powers Act of 2016 – the so-called Snoopers’ Charter – gives police the power to hack devices, for example.

But there are protections. Last December, the Court of Justice of the EU ruled that police must get independent authorisation to view communications data. “Despite this, the police continue to get internal sign-off for such requests,” said Cowburn. “We know that there have been abuses of such powers: for example, the accessing of journalists’ data. If the police want to maintain the public’s trust, they need to ensure that there is rigorous oversight and regulation of their powers.”

Even when we do have laws in place to protect us, police forces can bend or push the rules. “There have been a number of occasions when the UK police have refused to delete data because it can be useful in the future,” Cowburn said. “This goes against the core principle of data protection.”

For example, in March the independent Biometrics Commissioner criticised British police forces for storing the images of millions of people who have not been charged with any crimes, and holding that data on searchable databases. In response to the commissioner’s report, the Home Office noted that people could request to have their images deleted… if they happen to know the police have their photo. The Home Office also noted that not all forces use the Police National Database, so it’s difficult to say how many photos are held nationwide. The collection in question has 19 million photos, but it doesn’t include those held by the Metropolitan Police Service, the largest force in the country.

Body-worn cameras, drones and intelligent CCTV will only increase the data collected on us and stretch the applicability of existing law. “For example, the government’s guidance on surveillance cameras says, ‘people in public places should normally be made aware whenever they are being monitored by a surveillance camera system’,” said Cowburn. “It is difficult to see how this can be met if police are continuously filming via body wearables.”

On the other hand, body-worn cameras don’t only raise risks, but have clear rewards, Cowburn noted. “Wearables could also make the police more accountable and provide evidence if they are accused of abusing their powers,” she said.

And what of Minority Report-style policing? The use of algorithms to predict where crime will happen – and potentially who will commit it – as well as to support decisions on bail, punishment and so on could prove problematic. “Predictive policing is particularly worrying as it can reverse the presumption of innocence,” Cowburn explained. “Algorithms can have inherent racial and social biases. This must be taken into account if they are being used, for example, to make decisions about bail or re-offending. There needs to be proper oversight to pick up on any bias.”

That’s a problem, as we tend to trust technology as more neutral and fair than human decision-makers. “Officers may be more likely to trust a computer’s decision as more neutral, but this may not be the case,” she said. Sorry, RoboCop – the future of policing may well be better off human.

As body-worn cameras become more common, the questions about police-owned data will only grow