Consumer report: How sexism in Silicon Valley affects your daily life.

Tech author Sara Wachter-Boettcher examines the prejudices built into everyday technology

The Washington Post, Sunday Outlook. Twitter: @sara_ann_marie

It was a rough week at Google. On Aug. 4, a 10-page memo titled “Google’s Ideological Echo Chamber” started circulating among employees. It argued that the disparities between men and women in tech and leadership roles were rooted in biology, not bias. On Monday, James Damore, the software engineer who wrote it, was fired; he then filed a labor complaint to contest his dismissal.

We’ve heard lots about Silicon Valley’s toxic culture this summer — venture capitalists who proposition female start-up founders, man-child CEOs like Uber’s Travis Kalanick, abusive nondisparagement agreements that prevent harassment victims from describing their experiences. Damore’s memo added fuel to the fire, arguing that women are more neurotic and less stress-tolerant than men, less likely to pursue status, and less interested in the “systemizing” work of programming. “We need to stop assuming that gender gaps imply sexism,” he concludes.

Like the stories that came before it, coverage of this memo has focused on how a sexist tech culture harms people in the industry — the women and people of color who’ve been patronized, passed over, and pushed out. But what happens in Silicon Valley doesn’t stay in Silicon Valley. It comes into our homes and onto our screens, affecting all of us who use technology, not just those who make it.

Take Snapchat. Last year, on April 20 (also known as 4/20, a holiday of sorts for marijuana fans), the app launched a new photo filter: “Bob Marley,” which applied dreadlocks and darker skin tones to users’ selfies. The filter was roundly criticized as “digital blackface,” but Snapchat refused to apologize. In fact, a few months later, it launched another racially offensive filter — this one morphing people’s faces into Asian caricatures complete with buckteeth, squinty eyes and red cheeks.

Then there’s Apple Health, which promised to monitor “your whole health picture” when it launched in 2014. The app could track exercise habits, blood alcohol content and even chromium intake. But for a full year after its launch, it couldn’t track menstruation, which affects a huge portion of the population.

And consider smartphone assistants such as Cortana and Siri. In 2016, researchers noted in JAMA Internal Medicine that these services couldn’t understand phrases such as “I was raped” or “I was beaten up by my husband.” They often responded to such queries with jokes.

In many cases, sexist or racist biases are also embedded in the powerful (yet invisible) algorithms behind much of today’s software.

Look at FaceApp, which came under fire this spring for its “hotness” photo filter. The filter smoothed wrinkles, slimmed cheeks — and dramatically lightened skin. The company behind the app acknowledged that the filter’s algorithm had been trained using a biased data set — meaning it learned what “beauty” was from faces that were predominantly white.

Likewise, in 2015, Google launched a new image-recognition feature for its Photos app. The feature would trawl users’ photos, identify their contents and automatically add labels to them — such as “dog,” “graduation” or “bicycle.” Brooklyn resident Jacky Alciné noticed a more upsetting tag: A series of photos of him and a friend, both black, was labeled with the word “gorillas.” The racial slur wasn’t intentional. The system simply wasn’t as good at identifying black people as it was white people. After the incident, Google engineers acknowledged this, promising improvements focused on “better recognition of dark-skinned faces.”

Then there’s Word2vec, a neural network Google researchers created in 2013 to assist with natural language processing — that is, computers’ ability to understand human speech. Word2vec combs through Google News articles to learn about the relationships between words. The program can complete analogies such as “Paris is to France as Tokyo is to _____.” But Word2vec also concluded “Man is to woman as computer programmer is to homemaker” and “Man is to architect as woman is to interior designer.”
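
How does a program “conclude” such things? The short sketch below is one way to reproduce the idea, assuming the open-source gensim library and Google’s publicly released News-trained vectors; the file name and exact tokens are illustrative, not taken from the researchers’ own setup. Completing an analogy is simply arithmetic over the learned word vectors, so whatever associations sit in the training data come out the other end.

    # Minimal sketch: analogy completion with pretrained Word2vec vectors.
    # Assumes the gensim library and Google's pretrained News vectors are
    # installed/downloaded; the file name below is illustrative.
    from gensim.models import KeyedVectors

    vectors = KeyedVectors.load_word2vec_format(
        "GoogleNews-vectors-negative300.bin", binary=True
    )

    # "Paris is to France as Tokyo is to ___?"
    # The answer is found with vector arithmetic: France - Paris + Tokyo.
    print(vectors.most_similar(positive=["France", "Tokyo"],
                               negative=["Paris"], topn=1))

    # The same arithmetic surfaces the kind of biased pairing described above,
    # e.g. "man is to computer programmer as woman is to ___?"
    print(vectors.most_similar(positive=["computer_programmer", "woman"],
                               negative=["man"], topn=1))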

These pairings aren’t surprising — they simply reflect the Google News data set the network was built on. But in an industry where white men are the norm and “disruption” trumps all else, technology such as Word2vec is often assumed to be objective and then embedded into all sorts of other software — recommendation engines, job-search systems. Kathryn Hume, of artificial-intelligence company Integrate.ai, calls this the “time warp” of AI: “Capturing trends in human behavior from our near past and projecting them into our near future.” The effects are far-reaching. Studies show that biased machine-learning systems result in problems ranging from job-search results that show women lower-paying positions to predictive-policing software that perpetuates disparities in communities of color.

Some of these flaws might seem small. But together, they paint a picture of an industry that’s out of touch with the people who use its products. And without a fundamental overhaul in the way Silicon Valley works — who is funded, who is hired, who is promoted and who is believed when abuses happen — it’s going to stay that way. That’s why calls to kill tech diversity initiatives are so misguided. The sooner we stop letting tech get away with being insular, inequitable and hostile to diversity, the sooner we’ll start building technology that works for all of us.

Sara Wachter-Boettcher is a Web consultant and the author of the forthcoming book “Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech.”
