The Guardian (USA)

Facial recognition tech is arsenic in the water of democracy, says Liberty

- Ian Sample, Science editor

Automated facial recognition poses one of the greatest threats to individual freedom and should be banned from use in public spaces, according to the director of the campaign group Liberty.

Martha Spurrier, a human rights lawyer, said the technology had such fundamental problems that, despite police enthusiasm for the equipment, its use on the streets should not be permitted.

She said: “I don’t think it should ever be used. It is one of, if not the, greatest threats to individual freedom, partly because of the intimacy of the information it takes and hands to the state without your consent, and without even your knowledge, and partly because you don’t know what is done with that information.”

Police in England and Wales have used automated facial recognition (AFR) to scan crowds for suspected criminals in trials in city centres, at music festivals, sports events and elsewhere. The events, from a Remembrance Sunday commemoration at the Cenotaph to the Notting Hill carnival and the Six Nations rugby, drew combined crowds in the millions.

San Francisco recently became the first US city to ban police and other agencies from using automated facial recognition, following widespread condemnation of China’s use of the technology to impose control over millions of Uighur Muslims in the western region of Xinjiang.

When deployed in public spaces, automated facial recognition units use a camera to record faces in a crowd. The images are then processed to create a biometric map of each person’s face, based on measurements of the distance between their eyes, nose, mouth and jaw. Each map is then checked against a “watchlist” containing the facial maps of suspected criminals.
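For a concrete sense of that matching step, the sketch below mimics it in miniature: a face is reduced to a small feature vector and compared against a watchlist by distance, with a threshold deciding whether to raise a flag. The vectors, names and threshold are all hypothetical stand-ins; real systems use far richer biometric maps and carefully tuned thresholds.

```python
# A minimal sketch of the watchlist-matching step described above, not any
# force's actual system. The feature vectors, names and threshold here are
# hypothetical stand-ins for a real biometric "map" of a face.
from __future__ import annotations
import math

def distance(a: list[float], b: list[float]) -> float:
    """Euclidean distance between two facial feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical watchlist: identifier -> pre-computed feature vector.
WATCHLIST = {
    "suspect_a": [0.12, 0.80, 0.33, 0.57],
    "suspect_b": [0.90, 0.15, 0.62, 0.41],
}

MATCH_THRESHOLD = 0.25  # illustrative; tuning it trades missed suspects for false alarms

def check_face(features: list[float]) -> str | None:
    """Return the closest watchlist entry if it falls within the threshold."""
    name, entry = min(WATCHLIST.items(), key=lambda kv: distance(features, kv[1]))
    return name if distance(features, entry) < MATCH_THRESHOLD else None

# A passer-by whose features happen to fall near a watchlist entry is
# flagged, correctly or not; this is the source of the false positives
# discussed later in the trial figures.
print(check_face([0.14, 0.78, 0.35, 0.55]))  # close to suspect_a -> flagged
print(check_face([0.50, 0.50, 0.50, 0.50]))  # near nobody -> None
```

The threshold is the critical design choice: lower it and innocent faces are flagged more often; raise it and genuine suspects slip through unnoticed.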

Spurrier said: “I think it’s pretty salutary that the world capital of technology has just banned this technology. We should sit up and listen when San Francisco decides that they don’t want this on their streets.

“It goes far above and beyond what we already have, such as CCTV and stop-and-search. It takes us into uncharted invasive state surveillance territory where everyone is under surveillance. By its nature it is a mass surveillance tool.”

She said a lack of strong governance and oversight could allow the police to roll out live facial recognitio­n by stealth, without a meaningful debate on whether the public wanted it or not. The technology was developing so fast, she said, that government was failing to keep up.

“There is a real sense of technological determinism that is often pushed by the big corporations, but also by law enforcement and by government, that it’s inevitable we’ll have this, so we should stop talking about why we shouldn’t have it,” she said.

“What San Francisco shows us is that we can have the moral imagination to say, sure, we can do that, but we don’t want it. It’s so important not to assume that security outweighs liberty at every turn.”

Liberty brought a landmark legal case against South Wales police last month challenging their use of the technology. It is supporting the Cardiff resident Ed Bridges, who claimed an invasion of privacy when an AFR unit captured and processed his facial features as he popped out for a sandwich in December 2017, and again at a peaceful protest against the arms trade. A verdict is expected in the coming weeks.

Three UK forces have used AFR in public spaces since 2014: the Metropolitan police, South Wales police and Leicestershire police. A Cardiff University review of the South Wales police trials, which were backed by £2m from the Home Office, found the system froze and crashed when faced with large crowds, and struggled with bad light and poor quality images. The force’s AFR units flagged up 2,900 possible suspects, but 2,755 were false positives. An upgrade of the software, provided by the Japanese company NEC, led to confirmed matches increasing from 3% to 26%.
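As a back-of-envelope check on those figures (assuming the 2,900 flags and 2,755 false positives span the same set of deployments, which the report excerpt does not state), the blended match rate works out at roughly 5%, consistent with most flags coming from pre-upgrade deployments at 3% plus a smaller post-upgrade share at 26%:

```python
# Back-of-envelope arithmetic on the Cardiff University trial figures quoted
# above; assumes the 2,900 flags and 2,755 false positives cover the same
# deployments (an assumption, not stated in the report excerpt).
flagged = 2_900
false_positives = 2_755
true_matches = flagged - false_positives            # 145
print(f"blended match rate: {true_matches / flagged:.0%}")  # -> 5%
```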

The report described other technical problems with the system. Officers identified what they called “lambs”, people on the watchlist who were repeatedly matched to innocent members of the public. At Welsh rugby matches, for example, AFR flagged up one female suspect 10 times. It was wrong on every occasion.

There are also more insidious issues. The technology works better for white men than for any other group, meaning women and black and minority ethnic people are more likely to be flagged up in error, and so stopped and asked to identify themselves. Watchlists are suspect too, and reflect the kinds of biases that lead to areas with large black populations being over-policed.

Spurrier said: “You can see a train of injustice where the existing, entrenched prejudice in our society is codified in technology and then played out in the real world in a way that further breaks down trust between the police and communities, and further alienates and isolates those communities.”

While technical flaws can potentially be fixed, Spurrier opposes live facial recognition on a fundamental level. Mass surveillance has a chilling effect that distorts public behaviour, she said, a concern also raised in a May report by the London policing ethics panel. It found that 38% of 16- to 24-year-olds would stay away from events using live facial recognition, with black and Asian people roughly twice as likely to do so as white people.

Spurrier said: “It doesn’t take a great deal of imagination to see how something like facial recognition eats into the fabric of society and distorts relationships that are really human and really essential to a thriving democracy.

“Little by little, across the country, in tiny but very significant ways, people will stop doing things. From a person saying I’m not going to go to that protest, I’m not going to pray at that mosque, or hang out with that person, or walk down that street.

“Once that is happening at scale, what you have is a mechanism of social control. When people lose faith that they can be in public space in that free way, you have put arsenic in the water of democracy and that’s not easy to come back from.”

The Met police said that after the London policing ethics panel report the force was awaiting a second independent evaluation of its trials. “Our trial has come to an end so there are no plans to carry out any further deployments at this stage,” a spokesperson said, adding that the Met would then consider if and how to use the technology in the future.

Deputy chief constable Richard Lewis of South Wales police said the force had been cognisant of privacy concerns throughout its trials and understood it must be accountable and subject to “the highest levels of scrutiny”.

“We have sought to be proportionate, transparent and lawful in our use of AFR during the trial period and have worked with many stakeholders to develop our approach and deployments,” he said. “During this period we have made a significant number of arrests and brought numerous criminals to justice.”

A police notice alerting the public to an equipment trial in London in 2017. Photograph: Mark Kerrison/Alamy
