The Daily Telegraph

Philip Johnston

Amazing new technologies are transforming our lives, but we need to know how they are using our data

How can we stop artificial intelligence invading our privacy?

I have a new servant who wakes me up in the morning and turns on the radio. Her name is Alexa (I assume it is a her) and she and other voice-activated assistants daily perform the same function in millions of homes. They are marvels of modern communications technology and are transforming our lives. Moreover, Alexa can learn. Gradually, she will get to know our likes and dislikes in music and radio stations, know who we want to call on the phone and help solve problems, such as who I should get to repair the fence damaged by the recent storms.

Of course, I could look it up myself, but why bother when Alexa is on hand? Amazon, the creator of the Echo system, is developing new deep learning algorithms whereby the system can make automatic corrections based on a recognition of context. It may get an instruction wrong, but next time it will have mended its ways.

This is Artificial Intelligence (AI) in our homes and it is amazing. But how far would we want it to go? Without getting into the realms of science fiction, would we be happy to have a conversation with Alexa as though it were human? Technology is already being developed allowing Alexa to recognise the emotions of users and respond accordingly. Remember, Alexa is always listening in unless you switch it off at the mains, and most of us don’t. A US study has found smart speakers actually record conversations up to 19 times a day. These are, we are assured, accidental eavesdroppings, but the potential for misuse is clear.

All AI relies on building up data that we provide and then using the information to bring about certain outcomes. And here’s the conundrum. Despite the penchant that some have for placing their entire lives on the internet, many of us don’t actually like our personal data being bandied around. On the other hand, we want to feel secure and for governments to deliver services that work well. In order to do the latter, our personal data has to be mined and our privacy compromised. How to square this circle?

The data we most like to keep to ourselves concerns our health, and yet this is the information that governments, pharmaceutical companies and medical researchers are most anxious to take from us. It was reported recently that the Health Department has been selling the identifiable medical data of millions of NHS patients to American and other international drugs companies, despite telling people it would be “anonymous”. Patient data compiled from GP surgeries and hospitals can be linked back to individual medical records by companies and organisations that have identified people whose medical histories are of particular interest.

We are also complicit in this. If you ask Alexa why your chest hurts, it will trawl through the internet to give you an answer, most probably taken from the NHS website. In principle, this is great, except that Alexa could store your health questions and eventually allow Amazon to sell your data to the highest bidder.

Under a deal with the Government, Amazon has access to information on symptoms, causes and definitions of conditions and “all related copyrightable content and data from other materials”. In exchange for accessing this information, Amazon will pay the NHS a copyright fee. Ministers say this is all to our benefit, but it is potentially an extraordinary breach of privacy. Moreover, the issue here is not the benefits of data-sharing but transparency. What else is going on with the use of our data? We simply don’t know.

AI is being used to make value judgments such as who should qualify for which benefit, which crimes the police should investigate and who is likely to offend. It was reported on Monday that a government-funded trial seeks to predict which youngsters might be drawn into crime. Since the police rarely investigate offences like burglary any more, locking up all would-be crooks in advance, as in the film Minority Report, is a tempting proposition, but it would surely run into civil liberties objections.

One big concern is “data bias”, whereby systems have an inbuilt animus against particular people. In America, an algorithm for measuring recidivism, widely used to inform bail decisions, has been criticised for having a higher false positive rate for black defendants than for white ones. Police in London have been piloting live facial recognition technology, ostensibly on intelligence-led operations in specific locations, to help tackle serious crime (though this isn’t going to work very well when we are all wearing masks to ward off coronavirus).

The Met commissioner, Dame Cressida Dick, responded to complaints by saying privacy concerns were less important than being “knifed in the chest”. If you oppose this technology, you need to justify that opposition to victims of violent crime, she told a security conference in Whitehall.

But this is a false choice. There must, surely, be limits on the state’s incursion into our lives. Otherwise, why not insert a microchip in every baby and track them from cradle to grave? As it happens, the Met’s facial recognition pilots misidentified four out of five people flagged against a watchlist of suspects, leading to innocent people being stopped for no reason. But Dame Cressida said the technology had led to the arrest of eight people wanted for serious crimes, so any shortcomings were a price worth paying.

I am not alone in feeling a shiver of apprehension at these developments. A report from the Royal United Services Institute, the defence and security think tank, this week lamented the absence of any national guidelines to govern the use of algorithms by public bodies. In the private sphere, it is even worse – more like “the wild west”, said Sir Andrew Parker, MI5’s outgoing head.

If we are to have confidence in our governments, state agencies and internet behemoths to use this technology properly, we need to be told what is going on. Yet as Lord Evans of Weardale, who chairs the Committee on Standards in Public Life, said recently: “It is too difficult to find out where machine learning is currently being used”. It was “troubling”, he said, that so little was known. I suppose we could always ask Alexa.
