Must do better
Experts and students weigh in over the furore surrounding A-level algorithm modelling Hasan Chowdhury
It is a rite of passage for students reaching the final stages of secondary education, a summation of years of hard work to secure a place at university. But this year, A-level results day has felt a little different.
With coronavirus shutting schools for the vast majority of pupils, exam season was thrown out too. So how should students be judged?
With algorithms. Almost all of the roughly 300,000 students who received results yesterday will have had a grade assigned to them based on a piece of code put together by Ofqual, the exams regulator. But not everyone will be satisfied.
Early figures suggest 36pc of results in England were marked down by one grade, while 3pc were down by two. Ofqual numbers also show a 4.7pc rise in A and A* grades in private schools versus 2019, against just a 0.3pc rise for state institutions.
“These outcomes are not surprising – it is well established that the use of algorithms to score and predict creates risks that existing social biases and inequalities will become exacerbated and entrenched,” says Carly Kind, director of the Ada Lovelace Institute.
Years of brushing up on integration calculations, Chaucer, hydrocarbons or game theory, then, were for nought, as students rage that their results could have gone differently had their teachers’ predictions been given greater weight versus statistical modelling done by a computer.
Has a drastic mistake been made by putting students’ futures in the hands of artificial intelligence?
Little transparency has been afforded to date on the algorithm. The Government is expected to release a near-150-page document that could shine some light on its mechanism.
In the run-up to results day, Ofqual changed how it said it would assess students. First, it asked teachers to submit predicted grades, along with a ranking of students. Later, it decided predicted grades would play a minimal role versus statistical modelling for classes of over 15 students.
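The broad approach described above, in which a teacher's ranking is mapped onto the school's historic grade distribution rather than onto the teacher's predicted grades, can be sketched roughly as follows. This is an illustrative simplification, not Ofqual's actual model; every name and number here is hypothetical.

```python
# Illustrative sketch only (not Ofqual's code): assign grades to a ranked
# class by mapping rank order onto the school's historic grade proportions,
# as reportedly done for classes of over 15 students.

def assign_grades(ranked_students, historic_distribution):
    """Map a teacher's ranking onto historic grade proportions.

    ranked_students: list of names, best first.
    historic_distribution: list of (grade, proportion) pairs, best grade
    first; proportions are assumed to sum to 1.
    """
    n = len(ranked_students)
    results = {}
    cumulative = 0.0
    index = 0
    for grade, proportion in historic_distribution:
        cumulative += proportion
        # Students ranked within this cumulative share receive this grade.
        cutoff = round(cumulative * n)
        while index < cutoff and index < n:
            results[ranked_students[index]] = grade
            index += 1
    # Any rounding remainder falls into the lowest grade.
    while index < n:
        results[ranked_students[index]] = historic_distribution[-1][0]
        index += 1
    return results

# Hypothetical class of five and a school that historically awarded
# 20pc A, 40pc B, 40pc C:
class_ranking = ["Asha", "Ben", "Chloe", "Dev", "Ella"]
history = [("A", 0.2), ("B", 0.4), ("C", 0.4)]
print(assign_grades(class_ranking, history))
```

The sketch makes the complaint concrete: a student's grade depends on their rank position and the school's past results, so an individual's own attainment cannot lift them above the share of top grades the school has historically produced.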
Curtis Parfitt-Ford, an A-level student at Elthorne Park High School in west London who received results yesterday, saw tears on the faces of friends who had been marked down. One student was marked down “two entire grades”, snatching their university place from their grasp and forcing them to find a new offer through clearing. “When we found out it was all being done by computer … that really hit as a shock,” he says.
It’s why Parfitt-Ford is bringing a challenge to the Government over the algorithm with the support of Foxglove, a team of legal specialists challenging “digital injustices”.
Among the concerns raised by the organisation is the fact that a student’s life chances “hang on an estimated balance on their school’s historic performance”, given the data fed to the algorithm.
For Cori Crider, lawyer and cofounder of Foxglove, the algorithm robs students of an assessment that takes them on their own merit, particularly in state schools, where classes tend to be bigger, turning code into a bias that widens the chasms in Britain’s education system.
“We think that’s legally super problematic; the whole point of Ofqual is to assess individual achievement. If you are substituting teachers’ assessments with this algorithm in the name of trying to maintain your bell curve, then you’re not doing what the statute says you can do,” she says.
Foxglove is also arguing that the algorithm violates a key part of GDPR and the UK Data Protection Act, which “provide significant protections from automated decisions” that may have significant consequences for people.
With A-level results determining whether or not a student secures a spot on their chosen university course to pursue a dream career, the algorithm must be held to account if it has been unfair. Experts have warned that algorithms are easily skewed by data unless properly assessed and can proliferate existing biases.
The Home Office previously scrapped an algorithm used to grade visa applicants to the UK after campaigners described it as “racist”.
“We won the UK’s first judicial review of a government algorithm and this one looks equally unfair and biased to us,” Crider says.
Gavin Williamson, Education Secretary, claimed the “majority of young people will have received a calculated grade today” that will help them on towards “the destination they deserve”, stressing the option to appeal over disparities with mock results.
Parfitt-Ford is hesitant to jump the gun in dismissing the Education Secretary’s claims until statistics are available, but he says that it is “frankly disrespectful” for a minority of students, however small the group may be, to be unfairly treated.
“The fact that we’re in a bad situation, the fact that things are difficult doesn’t mean we should go for the worst possible solution,” he says. “We need something better than this.”
Students at Newham Collegiate Sixth Form, east London, after receiving A-level results