55% want curb on police use of face tech
MORE than half of people in the UK want the government to curb police use of facial recognition technology, a new survey suggests.
The research found 55% of the 4,109 adults who responded want restrictions on police use, while almost a third (29%) feel uncomfortable with forces using the technology at all.
However, nearly half (49%) said they would support its use in day-to-day policing, provided proper safeguards were in place.
Conducted by the Ada Lovelace Institute research body with YouGov, the survey also revealed nearly half (46%) of those asked want the right to opt out of its use.
People are even more uneasy with its use by private companies, with 77% saying they are uncomfortable with its use in shops to track customers and 76% reporting they are uncomfortable with it being used by HR departments in recruitment.
The survey also found around two-thirds of people (67%) are opposed to the use of the technology in schools and 61% do not want it used on public transport.
The Ada Lovelace Institute is calling for firms to stop selling and using the technology while the public is consulted.
Director Carly Kind said: “These findings show that companies and the government have a responsibility to act now. The UK is not ready for facial recognition technology.
“As a first step, a voluntary moratorium by all those selling and using facial recognition technology would enable a more informed conversation with the public about limitations and appropriate safeguards.”
The report, “Beyond face value: public attitudes to facial recognition technology”, was published yesterday after an activist lost the world’s first legal challenge over its use by police.
Ed Bridges, 36, from Cardiff, brought the challenge at the High Court after claiming his face was scanned while he was Christmas shopping in 2017 and at a peaceful anti-arms protest in 2018.
His lawyers argued the use of automatic facial recognition (AFR) by South Wales Police caused him “distress” and violated his privacy and data protection rights by processing an image taken of him in public.
But his case was dismissed on Wednesday by two leading judges, who said the use of the technology was not unlawful.
Facial recognition technology maps faces in a crowd, then compares results with a “watch list” of images – which can include suspects and persons of interest.