PC Pro

Calls for algorithm regulator

Plans to govern by algorithm raise red flags in wake of A-level chaos



Critics have warned that algorithmic injustices highlighted by the A-level results furore will only increase without an official regulator to monitor their usage.

Officials in the UK are increasingly using computer models to automate decisions based on the government’s datasets, but as the recent A-level fiasco has shown, such practices have huge implications for individuals.

“We’ve talked about algorithm regulation in academia and now there’s going to be pressure for a regulatory body in charge of this,” said Paul Bernal, professor of law at the University of East Anglia. “The thing that made a difference with the A-levels is we saw a direct impact – people were going to lose university places because of this algorithm.”

While the A-level scandal, which initially saw thousands of students’ grades downgraded, was the first major event to bring the issue to the fore, Bernal claims it’s been a simmering issue that was always going to boil over. “Algorithms are being made to make decisions that they shouldn’t be used for and the natural consequence of that is that individuals are punished through no fault of their own,” he said. “The same things will have happened to people already – where algorithms have been used in credit ratings, job applications and personalised pricing – but it’s less visible.”

There’s immediate potential for similar scandals too, with the government already planning to use algorithms to allocate and adjudicate on benefit decisions, as well as in the medical arena. “They will look at AI to decide who’s going to get the Covid-19 vaccine first and decisions about access to medicine will be made through algorithms and health data,” said Bernal. “And they’re building up systems to assess people’s benefit entitlement algorithmically, which is a recipe for disaster.”

With the government continuously ramping up its “digital by default” ambitions, global watchdogs are also cautioning over the damage. “We are witnessing the gradual disappearance of the post-war British welfare state behind a webpage and an algorithm,” said special rapporteur Philip Alston in a report for the United Nations on poverty and human rights in late 2018. “The impact on the human rights of the most vulnerable in the UK will be immense.”

It’s not only government-led initiatives that concern critics: union bosses also fear that algorithmic bias will affect job seekers. “Employers already use all sorts of shortcuts to sift through applications. If an algorithm was used I wouldn’t expect applicants to be told,” said Karam Bales of the National Education Union.

“It can reinforce the postcode lottery. If you are from a deprived neighbourhood you are less likely to succeed due to having access to fewer opportunities, because algorithms would pick this up and downgrade candidates based on the area they grew up in.”

Damage limitation

Experts believe the only way to prevent further damage is to create a regulator to ensure public service algorithms are opened up before they are implemented, so that civil society can inspect them. “This is all happening after the event, but the methods should be made public to all concerned and made available to people with stakes in this,” said Bernal. “They should have a chance for input before they have an impact on individual people.

“Like with the NHS tracking app, data-risk assessments need to be done in advance to see what’s going to go on. Effectively, you could have an ‘algorithmic fairness assessment’ that is made public and people can criticise it and the methods. There is no reason to keep this method secret, unless you’re afraid that if you make it public somebody is going to criticise you,” Bernal added.

“Statutory oversight would involve at least a framework, setting out when algorithms can and can’t be used, what data they can and can’t consider, what standards of transparency are needed, and would empower an independent body to investigate the procurement, deployment, and use of algorithms.”
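To make the idea of a published “algorithmic fairness assessment” concrete, the snippet below is a minimal sketch, in Python, of one check such an assessment might include: comparing how often an algorithm downgrades candidates across different groups. The field names, the grade encoding and the choice of disparity measure are all hypothetical assumptions for illustration, not part of the A-level model or of any proposed regulatory standard.

```python
# Hypothetical sketch of one check a published fairness assessment might contain.
# The record fields, grade encoding and disparity measure are illustrative only.

from collections import defaultdict

def downgrade_rate_by_group(results, group_key):
    """Share of candidates whose algorithmic grade fell below their
    teacher-assessed grade, broken down by a grouping attribute."""
    totals = defaultdict(int)
    downgrades = defaultdict(int)
    for record in results:
        group = record[group_key]
        totals[group] += 1
        if record["algorithmic_grade"] < record["teacher_grade"]:
            downgrades[group] += 1
    return {group: downgrades[group] / totals[group] for group in totals}

# Hypothetical records: grades encoded numerically (higher is better).
results = [
    {"school_type": "state", "teacher_grade": 5, "algorithmic_grade": 4},
    {"school_type": "state", "teacher_grade": 4, "algorithmic_grade": 4},
    {"school_type": "independent", "teacher_grade": 5, "algorithmic_grade": 5},
    {"school_type": "independent", "teacher_grade": 4, "algorithmic_grade": 4},
]

rates = downgrade_rate_by_group(results, "school_type")
print(rates)  # e.g. {'state': 0.5, 'independent': 0.0}
print("disparity:", max(rates.values()) - min(rates.values()))
```

Publishing a figure like this in advance, alongside the model’s inputs and methods, is the kind of disclosure critics argue a regulator could require before an algorithm is allowed to affect real decisions.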

Closed shop

The situation is exacerbated, critics argue, by the fact that much of the planning and development of systems happens behind closed doors, either in government departments or outsourced to private companies, and there’s little that anyone who’s given a raw deal can do to contest decisions. On top of that, there’s a lack of transparency because neither companies nor departments want to share the details of the systems they’ve put in place.

“Sometimes they’re proprietary – it really depends on how they’ve been procured,” said Jennifer Cobbe, coordinator of the Trust & Technology Initiative at the University of Cambridge. “In some cases they’ll be outsourced to a private company, who will claim that they’re confidential.

“That’s obviously incompatible with basic principles of public sector transparency and accountability. Sometimes they’re developed in-house and public bodies will try to keep them secret – which again is incompatible with basic principles of public sector transparency and accountability,” Cobbe added.

