The Star Malaysia - Star2

Mutant algorithms are coming for your education

- By CATHY O’NEIL (Cathy O’Neil is a Bloomberg Opinion columnist. She is a mathematician who has worked as a professor, hedge-fund analyst and data scientist. She founded ORCAA, an algorithmic auditing company, and is the author of Weapons Of Math Destruction.)

BAD algorithms have been causing a lot of trouble lately. One, designed to supplant exam scores, blew the college prospects of untold numbers of students attending International Baccalaureate schools around the world.

Then another did the same for even more students in lieu of the UK’s high-stakes “A-level” exams, prompting Prime Minister Boris Johnson to call it a “mutant” and ultimately use human-assigned grades instead.

Actually, I would argue that pretty much all algorithms are mutants. People just haven’t noticed yet.

The foibles of algorithms usually go unseen and undiscussed, because people lack the information and power they need to recognise and address them. When, for example, computers issue scores that decide how much people pay for mortgage loans or insurance policies, or who gets a job, the victims of mistakes typically don’t know what’s going on.

Often, nobody even tells them their score, let alone how it was calculated or how to compare it against a benchmark. This makes it difficult for them to identify one another, band together and complain – or to compel authorities to come in and fix things.

The situation with the student-grading algorithms was an exception. Tens of thousands of students were assessed at the same time. They had “ground truths” with which to compare their scores – for example, how they were doing before the assessment and what grades their teachers expected them to receive.

They were well equipped to share and inspect their results for consistency and fairness – and to see, for example, bias against kids from disadvantaged neighbourhoods and schools. And they had a powerful weapon – outraged parents – to push politicians and university administrators to discard the flawed results.

So an unusually public scandal shed some light on how bad most algorithms really are. But it didn’t fix or eliminate them.

On the contrary, as the coronavirus crisis and pre-existing trends force budget cuts, computers are likely to be replacing human judgment even more, particularly in American higher education. Many admissions and student-services offices already use algorithms, typically to supplement humans.

Now that’s likely to flip, with a few humans overseeing a fleet of algorithms that essentially replace bureaucracy. And there’s no reason to expect those algorithms to be better than Boris Johnson’s mutant.

Worse, institutions across the US and around the world tend to buy their algorithms from the same third-party data companies. So those algorithms’ weaknesses and biases – which tend to perpetuate and amplify the racial and gender biases baked into the historical data – have the potential to proliferate broadly through the entire process of admitting students, distributing financial aid, grading, recommending classes or majors and surveilling students for Covid-19 or other risks.

This doesn’t bode well for kids who happen to be born in places with failing schools, poor health care, toxic environments and few successful college graduates.

The companies and institutions that deploy high-stakes algorithms tend to rely on plausible deniability – if they don’t know what’s going on in the black box, they can’t be held responsible. That’s not good enough. All algorithms should be seen as untrustworthy until proven otherwise.

Until we as a society acknowledge this, and insist on the transparency required for the public to assess reliability and fairness, we’re not ready to use them. – Bloomberg

UK students protest the government’s handling of A-level results, which used an algorithm to work out marks; many students received lower-than-expected grades after their exams were cancelled because of coronavirus restrictions. — AP
