How job listings can hurt diversity

Using certain words in ads can imply that some people are not welcome

San Francisco Chronicle - BUSINESS REPORT - By Marissa Lang

“Want to bro down and crush code?”

Hidden biases in job postings — like the infamous line above that tech startup Klout used in 2012 — as well as the words recruiters use to describe a position may be turning away potential employees long before they've had a chance to send in a resume.

As tech companies struggle with bringing more diverse candidates into the fold, that kind of slip-up is one diversity experts say they simply can't afford.

In an effort to root out coded language and divisive phrases, several companies have devised software that reads, detects and, in some cases, learns from job listings.

The most recent entry comes from business software firm SAP, which this month released a new feature for SuccessFactors, its online human resources service: a job-listing analysis tool that mines text and uses machine learning to identify problematic phrases, suggest more innocuous alternatives and adapt to how candidates react.

The software will become widely available in August.

“What we’ve seen is this classic approach in HR, which is looking in the rear-view mirror going, ‘Huh, your gender ratio is not right, you need to fix that.’ Or, ‘Your cultural diversity is lacking, you need to fix that.’ But isn’t that too late?” said Mike Ettling, president of SAP SuccessFactors. “You should be able to prevent the bias at the point where it gets originated. Actually when you write job specs, sometimes the bias is right there.”

In the two years since Google and Intel released dismal diversity numbers, the sector's skewed workforce demographics remain largely unchanged.

On average, tech firms are more than 70 percent male and more than 80 percent white and Asian. Women and underrepresented minorities — blacks, Latinos and Native Americans — are small in number. Among executives, those numbers shrink further.

Older workers are vastly outnumbered by young employees. According to numbers collected and released by Payscale Inc., a compensation-data firm, the average age of employees at big Bay Area tech companies ranges from 28 at Facebook to 33 at Adobe.

Countless meetings, seminars and think pieces have examined why diversity remains challenging.

Many believe the problem begins before underrepresented candidates even sit down for an interview.

In a 2013 paper published in the Journal of Personality and Social Psychology, researchers at the University of Waterloo and Duke University presented five studies showing that job listings for positions in engineering and other largely male professions tended to use more masculine language, while listings for more female-dominated professions like nursing or human resources avoided those words, favoring instead what experts call "more inclusive language."

The studies showed that even women who were qualified and well suited for a job would refrain from applying if masculine-coded language was present.

In the highly competitive world of tech, where companies constantly seek to outdo one another in attracting the best talent, recruiters repeat and reuse words that research has shown carry clear biases.

Job postings asking for "rockstars," "wizards" and "ninjas" skew male. Those seeking recent graduates or explicitly noting a maximum amount of experience alienate older applicants. Ads asking for graduates from "top-tier" colleges may lose out on underrepresented minority candidates who may not think historically black colleges and universities, Latino-serving institutions or state colleges count as "top tier." When tech employees seem overwhelmingly white and male, diverse candidates may not see themselves as fitting a job asking for people with a "startup mentality," experts said.

Other problematic language may be more subtle.

Asking for the "best of the best" will largely give you white male applicants. Telling would-be employees that a company has a "work hard, play hard" culture may signal to older workers that it's a Millennials club. Including the phrase "competitive salary" can be a turn-off to women, who may be less inclined to negotiate.

“The words we use send a really powerful message to people about whether this is a place you’ll fit in or a place where you’ll belong,” said Joelle Emerson, founder and chief executive of Paradigm, a consulting firm that focuses on growing companies’ diversity. “Typically, job descriptions reflect in many ways the underlying culture and values of the organizations that write them.”

Using more inclusive words — "creative" instead of "hacker," "thoughtful" instead of "genius" or "willing to learn" instead of "natural ability" — broadens the pool of candidates to include many who might not otherwise have thought to apply.
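The substitution idea above can be sketched in a few lines of code. This is an illustrative toy, not how SAP's or Textio's software actually works; the term list is assembled from examples quoted in this article, and the function name is hypothetical:

```python
import re

# Hypothetical map of coded terms to more inclusive alternatives,
# drawn from the examples mentioned in this article.
SUGGESTIONS = {
    "hacker": "creative",
    "genius": "thoughtful",
    "natural ability": "willing to learn",
    "rockstar": "skilled engineer",
    "ninja": "expert",
}

def scan_listing(text):
    """Return (found_term, suggested_alternative) pairs for a job listing."""
    findings = []
    lowered = text.lower()
    for term, alternative in SUGGESTIONS.items():
        # Word boundaries keep "ninja" from matching inside unrelated words.
        if re.search(r"\b" + re.escape(term) + r"\b", lowered):
            findings.append((term, alternative))
    return findings

listing = "We want a rockstar hacker with natural ability."
for term, alternative in scan_listing(listing):
    print(f"flagged '{term}' -> consider '{alternative}'")
```

A real tool of the kind the article describes would go further — using machine learning to weight terms by how actual candidates respond rather than relying on a fixed list — but a static dictionary like this is the simplest starting point.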

Textio, a company that grew out of a successful crowdfunding campaign, has until now been the only such program that relies on machine learning to build its lexicon of problem terms and suggest more inclusive synonyms to its users.

Ettling said the SAP software was built initially to scrub out gender biases, though as it becomes more widely adopted and the algorithm can track a greater breadth of behavior, he expects to expand it to other areas of bias, including age and race.

“The tech industry is, as we say in South Africa, very male and pale, but the world’s not that way. The world has never been that, and the industry needs to change,” said Ettling, a South African native who fits his rhyming characterization of the industry. “In tech, it’s not about your gender or the color of your skin or the language you speak, it really is about your intelligence and your ability to innovate. It’s an industry where you would think diversity should be super high on their priorities because it will only help innovation.”

The issue of diversifying tech, and Silicon Valley at large, has attracted attention from workers, federal regulators and entrepreneurs who have built businesses around that goal.

Explicitly advertising for workers of a particular race or gender is outlawed by the Civil Rights Act, but, experts said, companies often hint at what kind of person they imagine in the job — whether they know it or not.

The software the industry builds, as it turns out, may be part of the solution.
