The Scottish Mail on Sunday

Creepy. Orwellian. UnBritish. Racist.

Facial recognition cameras are sweeping the country – with almost no debate. But PROFESSOR NOEL SHARKEY has witnessed them in action first-hand and says they’re ...

- By PROFESSOR NOEL SHARKEY, PROFESSOR OF ROBOTICS, UNIVERSITY OF SHEFFIELD

SURVEILLANCE cameras are filming us everywhere. We know that and we’re used to it. We also know the images won’t be analysed unless there’s been a crime – so that’s OK, isn’t it?

No it’s not. Because, as The Mail on Sunday reveals today, a number of police forces and councils have recently adopted a controversial new form of camera technology which is not accurate, reliable or trustworthy.

At the heart of this development is something called live facial recognition (LFR), software which can automatically recognise a face in a crowd within a fraction of a second. And this means we can be monitored everywhere we go, no matter who we are with or what we are doing.

Facial recognition could strike at the very core of our free society. Yet – without any public consultation – British police forces are introducing LFR as quickly as they can. It’s as if we’re standing in a perpetual identity parade without ever being told, let alone asked.

Needless to say, the dangers are huge and I have witnessed them first-hand. The South Wales police force has been testing facial recognition for a couple of years and, at the invitation of the Chief Constable, I attended a trial at last year’s Swansea air show.

What I saw was concerning, and not just because I object to the slippery introduction of Orwellian surveillance. For when I saw the technology up close, I could also see how fallible it is.

Hidden some distance away, I watched as individuals at the air show walked past an LFR camera and the computer studied each face, taking just 0.2 seconds per individual. In that time, it generated a ‘face print’ and compared it with mugshots of known criminals.

I witnessed how the computer flagged what the policeman beside me described as a ‘wrong ’un’ and how the individual’s face then flashed up alongside the corresponding criminal face.

Ten seconds later, however, after human intervention, it was decided it wasn’t a match after all. The human eye had picked up minor differences in the hairlines of the men in the two photographs – a very necessary check.
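For readers who want to peer under the bonnet, here is a minimal sketch, in Python, of how a comparison of that kind can work: a ‘face print’ is just a list of numbers, and a ‘match’ is nothing more than a similarity score creeping over a threshold. The names, figures and threshold below are my own illustrative assumptions, not the workings of the South Wales system or any other police software.

```python
# Illustrative sketch only: a toy version of how a live facial recognition
# pipeline might compare a 'face print' (an embedding vector) against a
# watchlist. Names, threshold and data are assumptions for explanation,
# not the actual deployment used by any police force.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_watchlist(face_print, watchlist, threshold=0.8):
    """Return the best-scoring watchlist entry if it clears the threshold.

    A score above the threshold only *flags* a possible match; as described
    above, a human operator still has to confirm or reject it.
    """
    best_name, best_score = None, -1.0
    for name, stored_print in watchlist.items():
        score = cosine_similarity(face_print, stored_print)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else (None, best_score)

# Toy data: random vectors standing in for the embeddings of mugshots.
rng = np.random.default_rng(0)
watchlist = {"suspect_A": rng.normal(size=128), "suspect_B": rng.normal(size=128)}
passer_by = rng.normal(size=128)

flagged, score = check_against_watchlist(passer_by, watchlist)
print(flagged, round(score, 2))  # Usually None: most passers-by should not match.
```

The crucial point the sketch makes is that everything hangs on where that threshold is set: too low and innocent passers-by are flagged, too high and the system misses the people it is looking for.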

Yet, with the rapid pace of technological advancement, it seems to me inevitable human beings will eventually be removed from the process altogether. And what then?

An investigation by the Big Brother Watch think-tank has shown LFR results are worryingly inaccurate, with the computer correct, at best, in only five per cent of cases. Accuracy fell to two per cent in a trial at the Notting Hill Carnival. It seems that LFR has particular trouble when it comes to people who have darker shades of skin.

More favourable results emerged from an Essex University review of the Metropolitan Police trials, in which the computer was found to have been accurate for 18 per cent of the individuals it ‘recognised’. This is still an astonishingly low figure, however.

Why, then, is the Met announcing the ‘successful’ completion of its trials? And why is it installing LFR across the capital? The police should look to America, where the technology is already taking hold, to see how misguided a decision this is.

When the American Civil Liberties Union (ACLU) ran photographs of members of Congress through US police software, the computer identified 28 of them as criminals. Wrongly, of course. And the people it picked out were mostly from African and Latino backgrounds.

A computer scientist at the Massachusetts Institute of Technology, Joy Buolamwini, experimented with the most commonly used facial recognition systems to find out how good they were at matching a photograph to a face stored in a memory bank. The worst results were for women with dark skin. IBM’s system, for example, failed 31 per cent of the time.

Ms Buolamwini, who is Ghanaian-American, began the research after discovering that LFR couldn’t even locate her face – let alone identify her features – unless she wore a white mask. LFR is biased both in terms of race and gender.

Then we must ask about the cost to our liberty, and the troubling example set by China, where the Communist leadership uses LFR to great effect. When citizens buy a smartphone, they are photographed for use with LFR, which means they can be tracked by the ubiquitous surveillance cameras.

It is used much like our numberplate recognition software, and the result is that you can get an automatic fine for antisocial behaviour. It has been used to publicly shame people who go outside in their pyjamas. And, bizarrely, public lavatories are being fitted with LFR to set a limit on the amount of toilet paper that those in need can take.

The omnipresent technology is used in Xinjiang province as an aid to repress the Uighur Muslim population. The computer keeps records of their comings and goings and can be connected to other apps that alert police if more than five Uighurs gather together. Many end up in re-education camps.

China is close to becoming the ultimate surveillance state, but Russia is at it too, with LFR applied to the vast Moscow CCTV network.

It is all too easy to assume that, because we are a democracy, such intensive surveillance would not be tolerated in Britain. But we have seen how quickly civil liberties can be ignored in the case of, say, major terror attacks. In the past few weeks, in fact, Sir Andrew Parker, head of MI5, spoke of the ‘need’ for our intelligence services ‘to be able to make sense of the data lives of thousands of people in as near to real time as we can get to’.

The Mail on Sunday’s revelation gives me grave cause for concern, because identifying people who are not already on police offenders’ lists requires records of the faces of all of us. And where do those come from? Enter the private companies who are always ready to help with large data banks of facial portraits and accompanying technology.

We know that big tech companies such as Facebook retain photographs of us, but even small startups are getting in on the act.

For example, the firm Clearview has scraped the images of more than three billion faces from internet sites and is selling these along with its technology to law-enforcement agencies and other companies.

It’s too early to tell what injustices this may bring in the future, but we can be confident there is a breathtaking arrogance to it all.

It emerged last year, for example, that a number of private shopping centres including Sheffield’s Meadowhall, the Trafford Centre in Manchester and the area around London’s King’s Cross had been trying out LFR without telling any of us.

They were mainly looking for shoplifters or antisocial behaviour. But if a shopkeeper felt someone had behaved suspiciously, they could put that individual’s picture on a watch list that was shared with all of the other shops.

The suspect had no idea they were on the list and no recourse to justice. That is plain wrong.

As Information Commissioner Elizabeth Denham put it recently, LFR is a ‘step change in policing techniques; never before have we seen technologies with the potential for such widespread invasiveness’.

When it comes to LFR, we are in the Wild West. There is sparse regulation and what we do have is ignored with impunity. It won’t be easy to put this genie back in the bottle but we must try – and try again until we succeed.

Our Government must immediately limit the scope of LFR to serious offences. We need strong laws to ensure that this horrific technology is contained.

The computer’s only accurate in five per cent of cases

This horrific technology must be contained by strong laws

