Must do better

Experts and students weigh in over the furore surrounding A-level algorithm modelling

Hasan Chowdhury

The Daily Telegraph - Business - Front Page

It is a rite of passage for students reaching the final stages of secondary education, a summation of years of hard work to secure a place at university. But this year, A-level results day has felt a little different.

With coronavirus shutting schools for the vast majority of pupils, exam season was cancelled too. So how should students be judged?

With algorithms. Roughly 300,000 students who received results yesterday will almost all have had a grade assigned to them based on a piece of code put together by Ofqual, the exams regulator. But not everyone will be satisfied.

Early figures suggest 36pc of results in England were marked down by one grade, while 3pc were down by two. Ofqual numbers also show a 4.7pc rise in A and A* grades in private schools versus 2019, and a shift of just 0.3pc for state institutions.

“These outcomes are not surprising – it is well established that the use of algorithms to score and predict creates risks that existing social biases and inequalities will become exacerbated and entrenched,” says Carly Kind, director of the Ada Lovelace Institute.

Years of brushing up on integration calculations, Chaucer, hydrocarbons or game theory, then, were for nought, as students rage that their results could have gone differently had their teachers’ predictions been given greater weight relative to the statistical modelling done by a computer.

Has a drastic mistake been made by putting students’ futures in the hands of artificial intelligence?

Little transparency has been afforded on the algorithm to date. The Government is expected to release a near-150-page document that could shine some light on its mechanism.

In the run-up to results day, Ofqual changed how it said it would assess students. First, it asked teachers to submit predicted grades, along with a ranking of students. Later, it decided predicted grades would play a minimal role versus statistical modelling for classes of more than 15 students.
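Ofqual has not published its code, so the mechanism can only be sketched. The illustrative model below is an assumption based on the reporting, not Ofqual's actual implementation: the function name, the 15-pupil cut-off behaviour and the data shapes are all hypothetical. Small classes keep their teacher-predicted grades, while larger classes have grades reassigned from the school's historic grade distribution using the teacher-supplied ranking.

```python
# Illustrative sketch only - Ofqual's real model is unpublished.
# Small classes keep teacher predictions; larger classes are re-graded
# by mapping the teacher's ranking onto the school's historic results.

def assign_grades(predicted, ranking, historic_distribution, threshold=15):
    """predicted: dict of student -> teacher-predicted grade.
    ranking: list of students, best first, supplied by teachers.
    historic_distribution: list of grades, best first, sized to the
    class and drawn from the school's past results."""
    if len(ranking) <= threshold:
        # Teacher predictions dominate for small cohorts
        return dict(predicted)
    # Statistical moderation: the i-th ranked student receives the
    # i-th grade in the school's historic distribution
    return {student: historic_distribution[i]
            for i, student in enumerate(ranking)}

# Hypothetical class of four (below the threshold: predictions kept)
preds = {"A": "A*", "B": "B", "C": "C", "D": "B"}
rank = ["A", "B", "D", "C"]
hist = ["A", "B", "C", "D"]
print(assign_grades(preds, rank, hist))
```

Under this sketch, a strong pupil at a school with historically weak results would be capped by the historic distribution regardless of their predicted grade, which is the effect campaigners object to.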

Curtis Parfitt-Ford, an A-level student at Elthorne Park High School in west London who received results yesterday, saw tears on the faces of friends who had been marked down. One student was marked down “two entire grades”, snatching their university place from their grasp and forcing them to find a new offer through clearing. “When we found out it was all being done by computer … that really hit as a shock,” he says.

It’s why Parfitt-Ford is bringing a challenge to the Government over the algorithm with the support of Foxglove, a team of legal specialists challenging “digital injustices”.

Among the concerns raised by the organisation is the fact a student’s life chances “hang on an estimated balance on their school’s historic performance”, given the data fed to the algorithm.

For Cori Crider, lawyer and co-founder of Foxglove, it means the algorithm robs students of an assessment that takes them on their own merit, particularly in state schools where classes tend to be bigger, turning code into bias that exacerbates chasms in Britain’s education system.

“We think that’s legally super problematic; the whole point of Ofqual is to assess individual achievement. If you are substituting teachers’ assessments with this algorithm in the name of trying to maintain your bell curve, then you’re not doing what the statute says you can do,” she says.

Foxglove is also arguing that the algorithm violates a key part of GDPR and the UK Data Protection Act, which “provide significant protections from automated decisions” that may have significant consequences for people.

With A-level results determining whether or not a student secures a spot on their chosen university course to pursue a dream career, the algorithm must be held to account if it has been unfair. Experts have warned that algorithms are easily skewed by data unless properly assessed and can proliferate existing biases.

The Home Office previously scrapped an algorithm used to grade visa applicants to the UK after campaigners described it as “racist”.

“We won the UK’s first judicial review of a government algorithm and this one looks equally unfair and biased to us,” Crider says.

Gavin Williamson, Education Secretary, claimed the “majority of young people will have received a calculated grade today” that will help them towards “the destination they deserve”, stressing the option to appeal over disparities with mock results.

Parfitt-Ford is hesitant to jump the gun in dismissing the Education Secretary’s claims until statistics are available, but he says that it is “frankly disrespectful” for a minority of students, however small the group may be, to be unfairly treated.

“The fact that we’re in a bad situation, the fact that things are difficult doesn’t mean we should go for the worst possible solution,” he says. “We need something better than this.”

Students at Newham Collegiate Sixth Form, east London, after receiving A-level results
