Face-recognition technology warning issued
CCTV firms argue that there’s a place for live facial recognition technology in the UK, but privacy and bias concerns remain
It has long been established that Britain is one of the most heavily surveilled countries in the world. London alone has more than 600,000 CCTV cameras, averaging just over 67 cameras for every 1,000 people, according to analysis by Comparitech. It’s the only city outside of China to make the top ten. And, slowly, the cameras are getting smarter.
At the moment, only a small proportion of cameras are capable of live facial recognition (LFR), but the numbers are growing. In a bid to get ahead of the technology, the Information Commissioner, Elizabeth Denham, has fired a warning shot at the industry in the form of a new official “opinion” from her office, setting out the rules around use of the technology.
“The law requires them to demonstrate that their processing can be justified as fair, necessary and proportionate,” the opinion concludes, arguing that: “Where LFR is used for the automatic, indiscriminate collection of biometric data in public places, there is a high bar for its use to be lawful.”
The key issues are the governance of LFR and the automatic collection of biometric data. There are also concerns about the lack of choice for individuals over whether such systems are used on them, the risks of bias and discrimination in facial recognition algorithms, and how data on children and vulnerable adults is processed.
“There isn’t this thicket of case law, [so] we need to go over and above to assess what we’re doing to make sure that it is okay,” said Simon Randall, CEO of Pimloc, a CCTV video analytics firm. “We need to make sure that the security and safety aspects are fine, the data privacy aspects are fine, that personal freedoms are fine, and that citizens are happy.”
Randall has seen the trade-off with LFR up close as someone in the industry, but he believes that the line between privacy and utility can be successfully managed.
He argues that when someone crosses into a different environment, such as a retail store or a football stadium, there’s often an implicit understanding of how data might be collected and used. “There’s a known contract that you’re stepping into a private environment and, therefore, there may be slightly different rules,” he says.
He gives the example of how the case for live facial recognition might be questionable in a shop, but stronger at Wembley Stadium – for instance, to keep out the hooligans who broke into the European Championship final.
So how can citizens – and the information commissioner – trust that LFR isn’t being misused? Randall argues that the key is greater transparency: being clear about what is collected, and allowing people to see the data held on them. “As long as that’s there, I think most people are pretty happy,” he said.