The Guardian (USA)

UK risks scandal over ‘bias’ in AI tools in use across public sector

- Kiran Stacey

Kate Osamor, the Labour MP for Edmonton, recently received an email from a charity about a constituent of hers who had had her benefits suspended apparently without reason.

“For well over a year now she has been trying to contact DWP [the Department for Work and Pensions] and find out more about the reason for the suspension of her UC [Universal Credit], but neither she nor our casework team have got anywhere,” the email said. “It remains unclear why DWP has suspended the claim, never mind whether this had any merit … she has been unable to pay rent for 18 months and is consequently facing eviction proceedings.”

Osamor has been dealing with dozens of such cases in recent years, often involving Bulgarian nationals. She believes they have been victims of a semi-automated system that uses an algorithm to flag up potential benefits fraud before referring those cases to humans to make a final decision on whether to suspend people’s claims.

“I was contacted by dozens of constituents around the beginning of 2022, all Bulgarian nationals, who had their benefits suspended,” Osamor said. “Their cases had been identified by the DWP’s Integrated Risk and Intelligence Service as being high risk after carrying out automated data analytics.

“They were left in destitution for months, with no means of appeal. Yet, in almost all cases, no evidence of fraud was found and their benefits were eventually restored. There was no accountability for this process.”

The DWP has been using AI to help detect benefits fraud since 2021. The algorithm detects cases that are worthy of further investigation by a human and passes them on for review.

In response to a freedom of information request by the Guardian, the DWP said it could not reveal details of how the algorithm works in case it helps people game the system.

The department said the algorithm does not take nationality into account. But because these algorithms are self-learning, no one can know exactly how they balance the data they receive.

The DWP said in its latest annual accounts that it monitored the system for signs of bias, but was limited in its capacity to do so where it had insufficient user data. The public spending watchdog has urged it to publish summaries of any internal equality assessments.

Shameem Ahmad, the chief executive of the Public Law Project, said: “In response to numerous Freedom of Information Act requests, and despite the evident risks, the DWP continues to refuse to provide even basic information on how these AI tools work, such as who they are being tested on, or whether the systems are working accurately.”

The DWP is not the only department using AI in a way that can have major impacts on people’s daily lives. A Guardian investigation has found such tools in use in at least eight Whitehall departments and a handful of police forces around the UK.

The Home Office has a similar tool to detect potential sham marriages. An algorithm flags marriage licence applications for review by a caseworker, who can then approve, delay or reject the application.

The tool has allowed the department to process applications much more quickly. But its own equality impact assessment found it was flagging a disproportionately high number of marriages from four countries: Greece, Albania, Bulgaria and Romania.

The assessment, which has been seen by the Guardian, found: “Where there may be indirect discrimination it is justified by the overall aims and outcomes of the process.”

Several police forces are also using AI tools, especially to analyse patterns of crime and for facial recognition. The Metropolitan police have introduced live facial recognition cameras across London in order to help officers detect people on their “watchlist”.

But, as with other AI tools, there is evidence the Met’s facial recognition systems can lead to bias. A review carried out this year by the National Physical Laboratory found that under most conditions the cameras had very low error rates, and errors were evenly spread over different demographics.

When the sensitivity settings were dialled down, however, as they might be in an effort to catch more people, they falsely detected at least five times more black people than white people.

The Met did not respond to a request for comment.

West Midlands police, meanwhile, are using AI to predict potential hotspots for knife crime and car theft, and are developing a separate tool to predict which criminals might become “high harm offenders”.

These are the examples about which the Guardian was able to find out the most information.

In many cases, departments and police forces used an array of exemptions to freedom of information rules to avoid publishing details of their AI tools.

Some worry the UK could be heading for a scandal similar to that in the Netherlands, where tax authorities were found to have breached European data rules, or in Australia, where 400,000 people were wrongly accused of giving authorities incorrect details about their income.

John Edwards, the UK’s information commissioner, said he had examined many AI tools being used in the public sector, including the DWP’s fraud detection systems, and not found any to be in breach of data protection rules: “We have had a look at the DWP applications and have looked at AI being used by local authorities in relation to benefits. We have found they have been deployed responsibly and there has been sufficient human intervention to avoid the risk of harm.”

However, he added that facial recognition cameras were a source of concern. “We are watching with interest the developments of live facial recognition,” he said. “It is potentially intrusive and we are monitoring that.”

Some departments are trying to be more open about how AI is being used in the public sphere. The Cabinet Office is putting together a central database of such tools, but it is up to individual departments whether to include their systems or not.

In the meantime, campaigners worry that those on the receiving end of AI-informed decision-making are being harmed without even realising.

Ahmad warned: “Examples from other countries illustrate the catastrophic consequences for affected individuals, governments, and society as a whole. Given the lack of transparency and regulation, the government is setting up the precise circumstances for it to happen here, too.”


Composite: Guardian Design/EPA. The DWP said in response to a FoI request that it could not reveal details of how the algorithm works in case it helps people game the system.
