The Daily Telegraph - Saturday - The Telegraph Magazine

Who’s watching you in 2020? Mass surveillance, coming to a street near you. By Joe Shute

It may surprise you to learn that Britain is one of the most heavily monitored nations on earth.


Going to watch the football? Popping out for a sandwich? As you go about your everyday life, it’s now likely your face is being digitally stored by everyone from the police to retailers. So is facial recognition keeping you safe or turning us into Big Brother Britain? Joe Shute reports

Facial-recognition technology – biometric software that maps facial features and stores the data as a ‘faceprint’ for later comparison – is rapidly encroaching upon our lives, spreading throughout our streets, stations, shopping centres, supermarkets and stadia. What’s more, it is being developed with our tacit approval. Nowadays many iPhone users need only look at their screen to unlock it, while Facebook’s facial-tagging service was one of the early forerunners of the software. And at airports, we regularly file towards the e-passport gates and stand expressionless while a camera processes the unique intricacies of our face until the doors glide open. All of which is assisting the development of facial-recognition technology. As are the numerous apps that encourage us to take and submit photographs of ourselves to be digitally altered, prematurely aged, or transformed into a different gender.
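In essence, a faceprint is a long list of numbers, and recognition is a search for the stored list that most resembles it. The Python sketch below is purely illustrative – it is no vendor’s actual system, and random vectors stand in for the faceprints a trained model would produce – but it captures the matching step in miniature.

```python
# A minimal, illustrative sketch of faceprint matching (not any vendor's
# actual system). A face is reduced to a vector of numbers; recognition
# means finding the stored vector that most closely resembles it.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Score how alike two faceprints are (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def best_match(probe, database, threshold=0.9):
    """Return the label of the closest stored faceprint above the threshold."""
    best_label, best_score = None, threshold
    for label, faceprint in database.items():
        score = cosine_similarity(probe, faceprint)
        if score > best_score:
            best_label, best_score = label, score
    return best_label  # None means no confident match

# Random 128-number vectors standing in for real faceprints, which a
# trained neural network would normally produce from a photograph.
rng = np.random.default_rng(seed=1)
db = {"watchlist_entry_1": rng.normal(size=128),
      "watchlist_entry_2": rng.normal(size=128)}
probe = db["watchlist_entry_1"] + rng.normal(scale=0.05, size=128)  # noisy re-capture
print(best_match(probe, db))  # -> "watchlist_entry_1"
```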

As a society, we seem to have grown accustomed to being watched. In the normal course of events in a British city on any given day, your image is likely to be captured at least 70 times. There are now well in excess of six million CCTV cameras positioned across the UK, while automatic number-plate recognition (ANPR) has a national infrastructure of approximately 11,000 cameras, capturing up to 50 million pieces of data on our movements every day.

Where facial-recognition technology differs from traditional CCTV is that it relies on sophisticated algorithms to isolate data points detected in any given face and translate those into unique digital signatures. This information is then stored to detect the same person at a later date, or to be screened against a database. According to Tony Porter, surveillance camera commissioner for England and Wales, the new facial-recognition technology rapidly being deployed around us ‘significantly changes the paradigm of surveillance as we used to know it’. Porter, a former police assistant chief constable in charge of counter-terrorism during the 2012 Olympics, was appointed to his current role five years ago as an independent regulator employed by the Home Office. He believes an urgent public debate is required about what we as a society deem acceptable intrusion into our daily lives. ‘All of a sudden you start to see a greater web around the citizen that wasn’t there 10 years ago,’ he says. ‘And the question is: how do you like that?’

Not very much, it turns out. Between 2016 and 2018, two nondescript cameras were positioned over King’s Boulevard, a busy walkway popular with pedestrians and cyclists beside King’s Cross station in London. In recent months it has emerged that the developer behind the prestigious 67-acre site, home to Google’s new central London headquarters, had installed facial-recognition software to spot so-called ‘flagged’ individuals who had previously committed a crime. The developer (a consortium of private businesses) was one of the first to publicly admit to using the software ‘to help ensure public safety’, although following a backlash it announced in September it had abandoned any wider deployment and had ‘no plans to reintroduce any form of facial-recognition technology at the King’s Cross Estate’.

For six months during 2018, shoppers at the Trafford Centre in Manchester (which attracts around 30 million visitors a year) were under the scrutiny of similar facial-recognition technology, used to check them against a Greater Manchester Police database of wanted individuals and missing people. The trial was halted following concerns raised by Porter about the number of innocent people being logged. Ultimately just one positive match was found by police: a man wanted on recall to prison.

Convenience and security are put forward as the key societal benefits by those who champion the technology. Scanning a face is, after all, quicker and more difficult to counterfeit than manually entering a password. For police attempting to identify an offender, the speed of the algorithms far outstrips the human eye. But how much privacy are we content to concede, and how much do we trust those harvesting the data? The Ada Lovelace Institute, an independent research body established earlier this year to monitor developments in data use and AI, recently published the first national survey about facial-recognition technology. It revealed that 55 per cent of 4,109 respondents wanted the Government to impose restrictions on police use. ‘Essentially, I think there is a real problem around the feeling of inevitability that exists regarding these new technologies,’ says the Institute’s director, Carly Kind. ‘It feels like this is happening whether we like it or not – but we should take more time to have these conversations.’

In December 2017, Ed Bridges, a 36-year-old father-of-two from Cardiff, popped out from his office during his lunch break to buy a sandwich and some Christmas presents for his family. He noticed a police van parked on the main shopping street, with a sign informing passers-by that facial-recognition cameras were fitted (South Wales Police was one of the first forces to trial the technology). ‘By the time I got close enough to realise what it was, my unique biometric data would already have been captured,’ he says. ‘That seemed like an infringement of my privacy.’ Bridges, a former Liberal Democrat councillor, grumbled to a few friends and posted a complaining tweet but did nothing more. The following March he was taking part in a peaceful protest against an arms fair being held in the city and once more noticed the same van scanning the crowds. ‘I felt that it had been parked there to deter us from using our right for peaceful protests,’ he says. ‘That struck me as a line being crossed.’ Backed by the campaign group Liberty, he launched a crowdfunded legal claim against South Wales Police – the first time any court in the world had considered the use of facial-recognition technology. Three months ago, the High Court ruled in favour of the police, finding its use of the technology proportionate and subject to sufficient legal controls. Bridges insists the judgment ‘didn’t reflect the very serious threat facial recognition poses to our rights’ and is appealing against the decision.

Matt Jukes, the chief constable of South Wales Police, says the judgment should not mean a ‘green light’ for police forces. ‘The technology is moving at pace and if this High Court judgment demonstrates that our very discreet and focused use of technology is proportionate, I don’t think anybody should assume that means all future use of all future technologies necessarily leads to the same conclusion,’ he says. Still, in August South Wales Police launched a three-month trial of a new facial-recognition app, which 50 or so officers carried on their phones to verify the identity of suspects or missing people. For the past two years the force has also used facial-recognition software on CCTV images to identify people. Jukes says within that time the technology has led to 1,700 successful identifications, ranging from victims of road traffic accidents to serious violent offenders.

But not all uses have proved so fruitful. During the 2017 Champions League final in Cardiff, facial-recognition cameras wrongly identified more than 2,000 people in the crowd as potential criminals.

Both South Wales Police and the Met Police have relied upon something called the NeoFace Watch system, supplied by the Japanese company NEC, which markets the same technology to casinos and concert operators. Earlier this year the Met granted academics from the University of Essex access to six deployments of the system between June 2018 and February 2019. Their report, released in July, found that successful matches occurred in only a fifth of cases.

The researchers also raised concerns about potential bias, citing US research in 2018 into software provided by companies including IBM and Microsoft, which found the programmes were most likely to wrongly identify dark-skinned women and most likely to correctly identify light-skinned men. In October, the Home Office was criticised for rolling out an online passport-photo-checking system that proved incapable of recognising the faces of some people from ethnic minority groups – for example, falsely flagging lips as an open mouth in some black faces.

Sarah Drinkwater, 39, a director at the Omidyar Network, which works to promote ‘responsible tech’, saw such bias first-hand during a two-decade career in a tech industry still heavily dominated by white males. In particular, she cites a spell at Google working to develop its mapping software, where she criticised its location-tracking history function as ‘problematic’ for women. ‘We don’t want anyone to know exactly where we are the entire time for reasons of personal safety,’ she says. And the bias of facial-recognition software should, she says, ‘trouble anybody who is involved with democracy’, given the implication that innocent people could be wrongly identified.

Gordon’s Wine Bar is a Victorian drinking den on the banks of the Thames, all dark corners and vaulted clandestine cellars. It is also the unlikely home of one of the leading British facial-recognition technology companies. Facewatch is the brainchild of businessman Simon Gordon, and was initially developed to counter the pickpockets plaguing his wine bar. He installed a camera and used facial-recognition software to assemble a list of troublemakers and people he had seen stealing, then advertised the fact by putting warnings up. This was enough of a deterrent, and Gordon’s problem was over.

Today, Facewatch is used by nearly 20 retailers across the country and is being trialled by a supermarket chain. The technology relies on off-the-shelf facial-recognition software to match faces against watchlists of known offenders compiled by its customers. By 2022 the company hopes to have 5,000 active licences across the UK.

Nick Fisher, an affable Yorkshireman with a background in retail, is Facewatch’s chief executive. ‘We don’t track and record innocent people,’ he insists. ‘That’s not the business we are in.’ He explains how Facewatch works: cameras capture an image of each person entering a premises, which is then converted by an algorithm into a digital signature and automatically compared against a database for any matches (he won’t say how extensive the database is, citing commercial sensitivity). Should a match be logged with 90 per cent (or greater) confidence, the client will receive a notification – as will any other Facewatch client nearby (in London, for example, the agreed radius for sharing information is eight miles). With Facewatch, Fisher insists, if it’s not a person of interest then any image captured is deleted immediately. Still, the system does allow a private company to build an extensive database of images. ‘I’m a basic principles sort of person,’ Fisher says. ‘If you’ve got something to be worried about, you should be worried.’
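Reduced to its essentials, the alerting rule Fisher describes involves two checks: is the match score at or above the 90 per cent threshold, and which other subscribers sit within the agreed sharing radius? The sketch below is a toy illustration under those stated assumptions – none of the names, coordinates or code comes from Facewatch itself.

```python
# A toy illustration of the alerting rule described above (not Facewatch's
# actual software): a match at or above the 90 per cent threshold notifies
# the originating site plus any subscriber within the sharing radius
# (eight miles in London). All names and coordinates are made up.
from math import radians, sin, cos, asin, sqrt

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    h = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3959 * asin(sqrt(h))  # 3,959 miles = Earth's mean radius

def alert_recipients(score, origin, subscribers, threshold=0.90, radius=8.0):
    """Return which subscribers to notify for a given match score."""
    if score < threshold:
        return []  # below threshold: no person of interest, image discarded
    return [s["name"] for s in subscribers
            if miles_between(*origin, s["lat"], s["lon"]) <= radius]

shops = [{"name": "shop_a", "lat": 51.515, "lon": -0.141},   # central London
         {"name": "shop_b", "lat": 51.480, "lon": -0.610}]   # ~20 miles out
print(alert_recipients(0.93, (51.514, -0.142), shops))  # -> ['shop_a']
```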

In the US, Amazon has signed contracts with hundreds of police forces to carry out surveillance via its Ring doorbell (which records footage of whoever is standing on a doorstep). The tech giant also sells a service called Rekognition to various US police departments. The software has grown so sophisticated that it can detect eight different emotions, from sadness to fear.

So can (and should) the march of facial recognition be halted? Earlier this year, San Francisco, home of Silicon Valley, became the first city in the US to ban the use of facial recognition by local agencies – such as its transport authority – and law enforcement bodies. In July, the House of Commons Science and Technology Committee called for a similar ban to be imposed across England and Wales until proper regulation has been established. At present, police trials are conducted with human rights legislation, GDPR and various codes of conduct in mind, but there remains, campaigners argue, a distinct lack of clear overarching legislation. Norman Lamb, the former Liberal Democrat MP who chaired the Commons committee (and who stood down at the general election), describes the current situation as a ‘policy and legal vacuum’, and says there is an urgent need for a proper statutory framework. ‘We are talking about the extent to which there is a creeping surveillance state, which can affect any of us,’ he says. ‘How much do we want the state actually watching every move we make?’

China is considered the nadir of where such technology can lead a society. Its government is at present building up a vast social credit system to harvest data on every citizen and produce a ranking that can dictate every aspect of their lives, including punishments such as transport restrictions for those with low scores.

Drinkwater says she has even heard of facial-recognition products being tested in schools that track a child’s eyes during lessons to ensure they are looking at their textbooks. Meanwhile, the free deepfake face-swap app Zao, which went viral after being released in China in September, rocketed to the top of the app charts before it emerged that its privacy policy granted the developer (a subsidiary of Momo, a Chinese live-streaming company) ‘free, irrevocable, permanent, transferable and relicense-able’ access to any user-generated content. (This has since been altered following a public outcry, with Momo saying its app would no longer store biometric information nor ‘excessively collect’ user data.)

Tony Porter describes the way China uses some of its technology as ‘completely unacceptable’. In Britain, he warns, ‘you can see inroads towards that and we must say no’. In the year that marks the 70th anniversary of the publication of 1984, Porter points out that ‘Orwell provides a view of the dystopian society that I’m pretty sure most people in the UK want to avoid. The difficulty is we are already now living under the scrutiny of the technology that has the potential to lead us there.’


[Picture: A live demonstration of facial recognition at a Las Vegas AI convention this year]

[Picture: Ed Bridges took South Wales Police to court for capturing his image from a van]
