The fourth industrial revolution is discriminating against Africans

Tackling bias in technology requires a new form of activism, writes Tshilidzi Marwala

The Sunday Independent | Dispatches ■ Marwala is the vice-chancellor and principal of the University of Johannesburg and the author of the book Artificial Intelligence for Rational Decision Making. He writes in his personal capacity.

RECENTLY I received a speech-recognition-based search device ("Google Assistant") created by the company Google. This device is able to hear and execute verbal instructions using artificial intelligence.

Artificial intelligence learns from past data about the problems it is trying to solve. For example, one can give this device a verbal instruction to play Thuma Mina by Hugh Masekela, and it listens, understands and then plays the requested song. It takes instructions through the voice just as human beings do. This is a device of the fourth industrial revolution.

The fourth industrial revolution is based on artificial intelligence and is giving us cyber-physical systems that blur the distinction between man and machine. If I ask the device about the weather, it answers correctly. But there is one thing it does not seem to hear, and that is when I ask it what my name is. However, if I ask it who the author of the book Artificial Intelligence for Rational Decision Making is, it answers with my name, but in an American accent. If I ask what my name is in an American accent, a feat I am naturally not good at, it hears and answers correctly. I have often wondered whether, if I were to change my name from Tshilidzi to Jack, as my grandfather did 80 years ago because his boss could not pronounce the name Tshamano, it might understand me.

It seems this Google device is biased against my name, my accent and my being, just as my grandfather's boss was biased against his name 80 years ago. This is an inherent form of discrimination that has traditionally been perpetrated by people and is now committed by machines.

This discrimination is present in the device because the data used to train it is largely collected in North America and not in South Africa. To make the device unbiased, we would have to record words and associate them with their meanings in all 11 of our official languages, and incorporate these into the device. Google Translate has started to incorporate a number of African languages, including Swahili and Zulu.
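The mechanism can be illustrated with a deliberately simple sketch. This is my own toy illustration, not Google's actual system: the "recognizer" below understands only the vocabulary present in its training corpus, so a name absent from the (North American) training data simply cannot be heard.

```python
# Toy illustration of training-data bias in a recogniser.
# This is a hypothetical sketch, not how Google Assistant actually works:
# the "model" here simply knows the vocabulary it was trained on.

def train_recognizer(training_corpus):
    """Return a recogniser that understands only words seen in training."""
    vocabulary = {word.lower() for word in training_corpus}

    def recognize(word):
        # Words outside the training vocabulary are simply not heard.
        return word.lower() if word.lower() in vocabulary else "<not understood>"

    return recognize

# Training data collected largely in North America:
# "Jack" is present, "Tshilidzi" is not.
recognizer = train_recognizer(["Jack", "weather", "play", "song"])

print(recognizer("Jack"))       # understood
print(recognizer("Tshilidzi"))  # absent from the training data
```

However crude, the sketch captures the article's point: the system's failure is not malice but an artefact of where its training data was gathered.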

Last year I went to Thohoyandou and booked a house using an application called Airbnb. To book in this system, one needs to enter credit card details and then scan a picture of one's identity document (ID) to match the card details to the ID. To validate my identity, the Airbnb application requested that I take a selfie, a picture of my face, to match the person doing the transaction to the ID and, by extension, to the credit card details.

This is a secure system, except that matching the ID photo to a selfie requires several more iterations for African faces than for European faces. Again, this artificial-intelligence-based system is implicitly discriminating against me for no reason other than my race. It is no longer human beings that discriminate but the artificially intelligent machines of the fourth industrial revolution. The matching algorithm is biased against African faces because the data used to train the system is largely gathered in North America, where the majority of people are of European descent, and not in South Africa, where the majority of people are of African descent.

Banks are increasingly using artificially intelligent machines to estimate the creditworthiness of individuals applying for loans. This is done using historical data on people's loans and, among other things, their repayment behaviour.

Unfortunately, in many banks this database is biased towards the middle class and the wealthy. This is exacerbated by the strong correlation between class and race. For this reason, the system is biased against the poor. Again, this intelligent fourth-industrial-revolution device is arguably discriminating based on race and class. Why are these devices, implicitly or otherwise, discriminating against certain people, just as human beings have historically done? To answer this question, one needs to go back to the philosopher Karl Marx, who stated: "It is not the consciousness of a man that determines his social being, but it is his social being that determines his consciousness."
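The credit-scoring problem can be sketched in the same toy fashion. This is my own hypothetical illustration, not any bank's actual model: a score estimated from a class-skewed database has almost no records for poor applicants, so it falls back to rejecting them, for lack of data rather than because of their behaviour.

```python
# Toy illustration of a class-skewed credit database.
# A hypothetical sketch, not a real bank's model: bands with too few
# historical records default to rejection.
from statistics import mean

# Historical loan records: (income_band, repaid). The database is
# dominated by middle-class and wealthy borrowers.
history = [
    ("wealthy", True), ("wealthy", True), ("wealthy", False),
    ("middle", True), ("middle", True), ("middle", True), ("middle", False),
    ("poor", True),  # only a single record for poor applicants
]

def approval_rate(band):
    """Estimated repayment rate for an income band; bands with too few
    records fall back to 0 because the model 'knows' nothing about them."""
    records = [repaid for b, repaid in history if b == band]
    if len(records) < 3:   # too little data to score reliably
        return 0.0         # default: treat as uncreditworthy
    return mean(records)

print(approval_rate("middle"))  # 0.75 -- well estimated from many records
print(approval_rate("poor"))    # 0.0  -- rejected for lack of data, not behaviour
```

The single poor applicant in the database actually repaid their loan, yet the score rejects the whole group: the bias lives in the database, not in any single rule.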

Now that we are in the fourth industrial revolution, where machines are being used to assist human beings in making decisions, perhaps this statement should be restated as: "It is not the consciousness of a machine that determines its social being, it is its social being that determines its consciousness." The prevailing social, political and economic conditions, in which Europe and North America dominate the social, economic, technological and political spaces, are being reproduced by artificially intelligent machines, and this is what makes the machines biased.

Traditionally, when human beings were biased and discriminating, social and political activists intervened to correct the situation. Our struggle against apartheid is a case in point. What kind of activists do we now require to fight the bias of these artificially intelligent machines, particularly given that these machines are not human and therefore have no consciousness? Is traditional activism, which has often involved marches and demonstrations, still relevant in the present era of the fourth industrial revolution?

Tackling bias and discrimination in technology requires a new form of activism, one that permeates the economic and educational spaces. For example, the University of Johannesburg (UJ) is undertaking a project to collect African faces so that they can be incorporated into these technological devices. Another UJ project examines how to ensure that an artificially intelligent security system based on the iris of the eye, which tends to deny people of African descent entry to highly secure premises more often than people of European and Asian descent, does not discriminate. This happens because the contrast between the iris and the pupil of the eye is lower for people of African descent than for people of European and Asian descent.

Going forward, South Africa should embark on projects to record and archive South African accents and sell this data to the companies that create these devices. The Chinese have incorporated Chinese writing characters into the digital archive, while the Ethiopian Ge'ez characters are not yet incorporated, and this has to change.

To answer the question posed in this article: traditional activism is not suitable for the present stage of the fourth industrial revolution. What we need are linguists, scientists, engineers, ethicists and others who are driven, and demographically diverse, to make a difference in society. Because the biases of machines mirror the biases of their makers, to overcome those biases we have to innovate, and thus the education of Africans should be the new form of activism.


DIGITAL POWER: UJ vice-chancellor Professor Tshilidzi Marwala holds a Google voice-detector device.

