Face facts: it’s a free-for-all
Police and private security firms are routinely using facial recognition – but who, asks Stewart Mitchell, is checking it’s being used legally?
Tracking citizens via face recognition was once a dystopian fantasy – now the tech is advancing so fast that it's being deployed without proper legal checks. That's the opinion of privacy advocates Big Brother Watch (BBW), which has taken the Metropolitan Police to court over its trials of technology that uses live video footage of public spaces to identify passers-by against a “watchlist” of people of interest.
“The police are using live facial recognition to subject every passer-by to a highly invasive biometric identity check, akin to a fingerprint or DNA check, often without their knowledge or consent,” BBW’s legal and policy officer, Griff Ferris, told PC Pro. “Despite the lack of any legal basis for the use of this authoritarian technology, and despite significant concerns that its use infringes people’s human rights, the Metropolitan Police has announced it plans to use live facial recognition several more times this year. We’re hoping the court will intervene to stop this lawless technology.”
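The “biometric identity check” Ferris describes is, at its core, a similarity comparison: each detected face is reduced to a numeric “embedding” vector and flagged if it sits close enough to any watchlist entry. This is a rough illustration only – the function names, vectors and threshold below are invented, and real systems use deep-learning embeddings with hundreds of dimensions – but the matching step looks broadly like this:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_against_watchlist(face, watchlist, threshold=0.9):
    """Return the best watchlist match scoring at or above the threshold, or None.

    The threshold is the crucial tuning knob: set it too low and innocent
    passers-by are falsely matched; too high and genuine suspects slip through.
    """
    best_name, best_score = None, threshold
    for name, entry in watchlist.items():
        score = cosine_similarity(face, entry)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical three-dimensional embeddings, for demonstration only.
watchlist = {"suspect_a": [0.9, 0.1, 0.2]}
passer_by = [0.88, 0.12, 0.21]  # a similar-looking face

print(check_against_watchlist(passer_by, watchlist))  # → suspect_a
```

The toy example also hints at why false matches occur: a passer-by whose embedding merely resembles a watchlist entry clears the threshold just as a true match would.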
While it’s sometimes tempting to dismiss privacy groups as reactionary, many organisations are concerned about the unfettered rise of face recognition, including one of the watchdogs charged with overseeing deployments.
“The shortcoming in terms of the legislation is the question of legality,” Tony Porter, the government-appointed Surveillance Camera Commissioner, told us.
“On the one hand, the state will claim it’s operating under common law for policing purposes and therefore the use of technology is entirely appropriate.
“The opposing view is that common law provides none of the protections that are outlined within the European human rights articles, so it doesn’t provide a clarity in law under which the police or the state ought to be able to protect themselves.”
According to Porter, legislation has once again failed to keep pace with technology. “You’re reducing a member of the public to a digital signature that can be used, migrated and cross-checked,” he said.
“That is a massively different paradigm to a simple face capture and so raises a lot of questions about legality. What is the legal basis against which the capability is being deployed at the moment?
“The Data Protection Act doesn’t provide it, there’s a query over whether the Protection of Freedoms Act does. The government needs to say ‘This isn’t going away – we need to look at the legitimacy of using this in the first place’.”
Diluted regulation
The lack of legal clarity is mirrored by a regulatory system in which no central body takes responsibility for face recognition. Instead, the technology falls haphazardly across the remits of at least three separate watchdogs, none of which has much power to address abuse.
The Information Commissioner’s Office, the Biometrics Commissioner and Porter’s own Surveillance Camera Commissioner’s office all have some jurisdiction, but it’s unclear who would lead an investigation into abuse.
“At the moment the biometrics regulator clearly has responsibility under the Protection of Freedoms Act for certain types of biometrics – such as DNA – but has no responsibility for facial recognition or data analysis for biometrics,” Porter said. “That is something that’s an issue the government may look at, or not.”
The Information Commissioner has a role in biometrics under the Data Protection Act, and can issue stop orders – but only where data is being processed improperly. The Surveillance Camera Commissioner, meanwhile, operates under the Protection of Freedoms Act and the Surveillance Camera Code. While Porter can’t issue fines, he was at least recently given the power to reveal where investigations had broken the code.
“Where a relevant authority, be it a police or crime agency or local authority, uses biometrics and doesn’t comply with the code, it is disclosable to the CPS and defence lawyers,” said Porter.
“That is a game-changer – it is absolutely imperative that investigations can demonstrate they pay regard to the code. I can’t issue a fine, but a court can stop a prosecution and that, for the police, is more damaging to an organisation from an integrity point of view than a fine.”
Doubts over accuracy
The police claim that tracking helps catch criminals more quickly, although widespread reports about the current system’s accuracy undermine the theory.
“Facial recognition technology has the potential to help us disrupt crime networks and identify people who pose a threat to the public,” Chief Constable Mike Barton, National Police Chiefs’ Council lead for crime operations, told us in a statement. “The public would expect the police to consider all new technologies that could make them safer. Any wider rollout of this technology must be based on evidence showing it to be effective with sufficient safeguards and oversight.”
According to Big Brother Watch, the use of face recognition has so far resulted in no arrests, although several false matches have meant passers-by were stopped, searched and asked to produce ID. The Met says it plans ten further trials before an evaluation at the end of 2018.
Even in public tests, there remains a big difference between the police’s account of how the technology is deployed and those of expert witnesses. For example, in a recent trial in an East London shopping centre, the police claimed “the technology was used overtly. Information leaflets were handed to members of the public, posters were placed in the area and officers engaged with members of the public to explain the process and technology.”
It’s a stark contrast to the account of privacy group Liberty, which attended the trial as a witness and reported problems with the process. “Although the operation made for an intimidating scene, with a line of police officers and dogs alongside a knife-arch – a sort of walkthrough metal detector – there was actually alarmingly little information about the use of facial recognition technology,” said Hannah Couchman, an advocacy and policy officer at Liberty, in a report on the trial.
“Having been told there would be plenty of posters and information leaflets, we saw two small posters, positioned below people’s sightlines and just one leaflet being given out – to a man who was incorrectly apprehended, after the fact.”
Another concern for regulators and privacy groups is the lack of transparency at several stages of the face recognition process, not least the databases and watchlists the police use to initiate searches on live camera feeds.
At present, it’s not clear which types of suspect could be identified by face recognition. “The creation of a watchlist, by whom and for what, is a really interesting question,” said Porter. “What are the standards? What are the processes? What is the understanding of the public? What is the compliance of the state?
“The system could jump from the creation of a watchlist to algorithms being used. Do we know if it’s accurate? Do we care if it’s accurate? Is it capable of differentiating between race and ethnic minorities, age, sexuality? To all of the questions, the answer at the moment is, ‘We don’t know’.”
The ICO has even questioned the validity of the database of images that the police cross-reference against, asking whether all of the images should be accessible at all. “The use of images collected when individuals are taken into custody is of concern; there are over 19 million images in the Police National Database,” the Information Commissioner said in a blog post.
“I am considering the transparency and proportionality of retaining these photographs as a separate issue, particularly for those arrested but not charged.”
Privacy International is worried the database could be used for more than just identifying suspects. “If you link the faces to databases with everybody’s name on them – and this is what we see in China and the most dystopian scenario – you can say ‘I want to know where this person is’,” said Frederike Kaltheuner, a data exploitation expert at Privacy International.
“It’s always a slippery slope and it’s easy to start out with one system and then roll out a software update that’s a full-on dystopian one.”