PC Pro

Face facts: it’s a free-for-all

Police and private security firms are routinely using facial recognition – but who, asks Stewart Mitchell, is checking it’s being used legally?

Tracking citizens via face recognition was once a dystopian fantasy – now the tech is advancing so fast it is being deployed without proper legal checks. That’s the opinion of privacy advocates Big Brother Watch (BBW), which has taken the Metropolitan Police to court over its trials of technology that uses live video footage of public spaces to identify passers-by against a “watchlist” of people of interest.
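Under the hood, systems like this typically reduce each detected face to a numeric “embedding” and compare it against the embeddings of watchlist faces, declaring a match when similarity clears a tuned threshold. A minimal sketch of that matching step – with invented names and toy numbers; real systems use learned embeddings with hundreds of dimensions – helps show why false matches happen:

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_against_watchlist(face, watchlist, threshold=0.9):
    """Return the best watchlist match above the threshold, or None."""
    best_name, best_score = None, threshold
    for name, ref in watchlist.items():
        score = cosine_similarity(face, ref)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy 3-dimensional embeddings (purely illustrative).
watchlist = {"suspect_A": [0.9, 0.1, 0.4]}
passer_by = [0.88, 0.12, 0.41]  # a similar-looking but different face
print(check_against_watchlist(passer_by, watchlist))  # → suspect_A
```

Because a match is only a similarity score crossing a threshold, an innocent passer-by whose embedding happens to sit close to a watchlist entry is flagged just as confidently as the real subject – the root of the false-match problem discussed below.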

“The police are using live facial recognition to subject every passer-by to a highly invasive biometric identity check, akin to a fingerprint or DNA check, often without their knowledge or consent,” BBW’s legal and policy officer, Griff Ferris, told PC Pro. “Despite the lack of any legal basis for the use of this authoritarian technology, and despite significant concerns that its use infringes people’s human rights, the Metropolitan Police has announced it plans to use live facial recognition several more times this year. We’re hoping the court will intervene to stop this lawless technology.”

While it’s sometimes tempting to dismiss privacy groups as reactionary, many organisations are concerned about the unfettered rise of face recognition, including one of the watchdogs charged with overseeing deployments.

“The shortcoming in terms of the legislation is the question of legality,” Tony Porter, the government-appointed Surveillance Camera Commissioner, told us.

“On the one hand, the state will claim it’s operating under common law for policing purposes and therefore the use of technology is entirely appropriate.

“The opposing view is that common law provides none of the protections that are outlined within the European human rights articles, so it doesn’t provide a clarity in law under which the police or the state ought to be able to protect themselves.”

According to Porter, legislation has once again failed to keep pace with technology. “You’re reducing a member of the public to a digital signature that can be used, migrated and cross-checked,” he said.

“That is a massively different paradigm to a simple face capture and so raises a lot of questions about legality. What is the legal basis against which the capability is being deployed at the moment?

“The Data Protection Act doesn’t provide it, there’s a query over whether the Protection of Freedoms Act does. The government needs to say ‘This isn’t going away – we need to look at the legitimacy of using this in the first place’.”

Diluted regulation

The lack of legal clarity is mirrored by a regulatory system where no central body takes responsibility for face recognition. Instead, the technology sprawls haphazardly across the remits of at least three separate watchdogs, none of which has much power to address abuse.

The Information Commissioner’s Office, the Biometrics Commissioner and Porter’s Surveillance Camera regulating body all have some jurisdiction, but it’s unclear who would lead an investigation into abuse.

“At the moment the biometrics regulator clearly has responsibility under the Protection of Freedoms Act for certain types of biometrics – such as DNA – but has no responsibility for facial recognition or data analysis for biometrics,” Porter said. “That is something that’s an issue the government may look at, or not.”

The Information Commissioner has a role in biometrics under the Data Protection Act, and can issue stop orders, but only where data is being processed improperly. The Surveillance Camera Commissioner, meanwhile, operates under the Protection of Freedoms Act and Surveillance Camera Code. While Porter can’t issue fines, he was at least recently given the power to reveal where investigations had broken the code.

“Where a relevant authority, be it a police or crime agency or local authority, use biometrics and don’t comply to the code, it is disclosable to the CPS and defence lawyers,” said Porter.

“That is a game changer – it is absolutely imperative investigations can demonstrate that they pay regard to the code. I can’t issue a fine but a court can stop a prosecution and that, for the police, is more damaging from an integrity point of view to an organisation than a fine.”

Doubts over accuracy

The police claim that tracking helps catch criminals more quickly, although widespread reports about the current system’s accuracy undermine the theory.

“Facial recognition technology has the potential to help us disrupt crime networks and identify people who pose a threat to the public,” Chief Constable Mike Barton, National Police Chiefs’ Council lead for crime operations, told us in a statement. “The public would expect the police to consider all new technologies that could make them safer. Any wider rollout of this technology must be based on evidence showing it to be effective with sufficient safeguards and oversight.”

According to Big Brother Watch, the use of face recognition has so far resulted in no arrests, although several false matches have meant passers-by were stopped, searched and asked to produce ID. The Met says it plans ten further trials before an evaluation at the end of 2018.

Even in public tests, there remains a big difference between the police’s account of how the technology is deployed and those of expert witnesses. For example, in a recent trial in an East London shopping centre, the police claimed “the technology was used overtly. Information leaflets were handed to members of the public, posters were placed in the area and officers engaged with members of the public to explain the process and technology.”

It’s a stark contrast to the views of privacy group Liberty, which attended the trial as a witness and reported problems with the process. “Although the operation made for an intimidating scene with a line of police officers and dogs alongside a knife-arch – a sort of walkthrough metal detector – there was actually alarmingly little information about the use of facial recognition technology,” said Hannah Couchman, an advocacy and policy officer at Liberty, in a report on the trial.

“Having been told there would be plenty of posters and information leaflets, we saw two small posters, positioned below people’s sightlines, and just one leaflet being given out – to a man who was incorrectly apprehended, after the fact.”

Another concern for regulators and privacy groups is the lack of transparency at several stages of the face recognition process, not least the databases and watchlists the police use to initiate searches on live camera feeds.

At present, it’s not clear which types of suspect could be identified by face recognition. “The creation of a watchlist, by whom and for what, is a really interesting question,” said Porter. “What are the standards? What are the processes? What is the understanding of the public? What is the compliance of the state?

“The system could jump from the creation of a watchlist to algorithms being used. Do we know if it’s accurate? Do we care if it’s accurate? Is it capable of differentiating between race and ethnic minorities, age, sexuality? To all of the questions, the answer at the moment is, ‘We don’t know’.”

The ICO has even questioned the validity of the database of images that the police cross-reference against, asking whether all of the images should be accessible. “The use of images collected when individuals are taken into custody is of concern; there are over 19 million images in the Police National Database,” the Information Commissioner said in a blog post.

“I am considering the transparency and proportionality of retaining these photographs as a separate issue, particularly for those arrested but not charged.”

Privacy International is worried the database could be used for more than just identifying suspects. “If you link the faces to databases with everybody’s name on them – and this is what we see in China and the most dystopian scenario – you can say ‘I want to know where this person is’,” said Frederike Kaltheuner, a data exploitation expert at Privacy International.

“It’s always a slippery slope and it’s easy to start out with one system and then roll out a software update that’s a full-on dystopian one.”

Question marks hang over the effectiveness of face-recognition technology
