How job listings can hurt diversity
Using certain words in ads can imply that some people are not welcome
“Want to bro down and crush code?”
Hidden biases in job postings — like the infamous line above that tech startup Klout used in 2012 — as well as the words recruiters use to describe a position may be turning away potential employees long before they’ve had a chance to send in a resume.
As tech companies struggle with bringing more diverse candidates into the fold, that kind of slip-up is one diversity experts say they simply can’t afford.
In an effort to root out coded language and divisive phrases, several companies have devised software that reads, detects and, in some cases, learns from job listings.
The most recent entry comes from business software firm SAP, which this month released a new feature for SuccessFactors, its online human resources service: a job-listing analysis tool that mines text and uses machine learning to identify problematic phrases, suggest more innocuous alternatives and adapt to how candidates react.
The software will become widely available in August.
“What we’ve seen is this classic approach in HR, which is looking in the rear-view mirror going, ‘Huh, your gender ratio is not right, you need to fix that.’ Or, ‘Your cultural diversity is lacking, you need to fix that.’ But isn’t that too late?” said Mike Ettling, president of SAP SuccessFactors. “You should be able to prevent the bias at the point where it gets originated. Actually when you write job specs, sometimes the bias is right there.”
In the two years since Google and Intel released dismal diversity numbers, the sector’s skewed workforce demographics remain largely unchanged.
On average, tech firms are more than 70 percent male and more than 80 percent white and Asian. Women and underrepresented minorities — blacks, Latinos and Native Americans — are small in number. Among executives, those numbers shrink further.
Older workers are vastly outnumbered by young employees. According to numbers collected and released by Payscale Inc., a compensation-data firm, the average age of employees at big Bay Area tech companies ranges from 28 at Facebook to 33 at Adobe.
Countless meetings, seminars and think pieces have examined why diversity remains challenging.
Many believe the problem begins before underrepresented candidates even sit down for an interview.
In a 2013 paper published in the Journal of Personality and Social Psychology, researchers at the University of Waterloo and Duke University presented five studies showing that job listings for engineering and other largely male professions tended to use more masculine language, while listings for female-dominated professions such as nursing or human resources avoided those words, favoring instead what experts call “more inclusive language.”
The studies showed that even women who were qualified and well-suited for a job would refrain from applying if masculine-coded language was present.
In the highly competitive world of tech, where companies constantly seek to outdo each other in their attempts to attract the best talent, many companies repeat and reuse words that research has shown have clear biases.
Job postings asking for “rockstars,” “wizards” and “ninjas” skew male. Those seeking recent graduates or explicitly noting a maximum amount of experience alienate older applicants. Ads asking for graduates from “top-tier” colleges may lose out on underrepresented minority candidates who may not think historically black colleges and universities, Latino-serving institutions or state colleges count as “top tier.” When tech employees seem overwhelmingly white and male, diverse candidates may not see themselves as fitting a job asking for people with a “startup mentality,” experts said.
Other problematic language may be more subtle.
Asking for the “best of the best” will largely give you white male applicants. Telling would-be employees that a company has a “work hard, play hard” culture may signal to older workers that it’s a Millennials club. Including the phrase “competitive salary” can be a turn-off to women, who may be less inclined to negotiate.
“The words we use send a really powerful message to people about whether this is a place you’ll fit in or a place where you’ll belong,” said Joelle Emerson, founder and chief executive of Paradigm, a consulting firm that focuses on growing companies’ diversity. “Typically, job descriptions reflect in many ways the underlying culture and values of the organizations that write them.”
Inclusive language — “creative” instead of “hacker,” “thoughtful” instead of “genius” or “willing to learn” instead of “natural ability” — broadens the pool of candidates to include many who might not otherwise have thought to apply.
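At its simplest, this kind of substitution can be sketched as a keyword scan. The sketch below is purely illustrative: it uses a fixed lookup table built from term pairs mentioned in this article, whereas tools like SAP SuccessFactors and Textio rely on machine learning rather than a hand-written list. The function name and term mapping are assumptions for the example, not anyone's actual product.

```python
import re

# Hypothetical mapping of flagged terms to more inclusive alternatives,
# drawn from examples cited in the article. Real tools learn these
# associations from data instead of hard-coding them.
SUGGESTIONS = {
    "hacker": "creative",
    "genius": "thoughtful",
    "natural ability": "willing to learn",
    "rockstar": "skilled engineer",
    "ninja": "expert",
}

def scan_listing(text):
    """Return (flagged term, suggested alternative) pairs found in text."""
    findings = []
    for term, alternative in SUGGESTIONS.items():
        # Match whole words/phrases, case-insensitively.
        if re.search(r"\b" + re.escape(term) + r"\b", text, re.IGNORECASE):
            findings.append((term, alternative))
    return findings

print(scan_listing("We need a coding rockstar with natural ability."))
# → [('natural ability', 'willing to learn'), ('rockstar', 'skilled engineer')]
```

A dictionary lookup like this catches only exact phrases; the machine-learning systems described here go further by tracking how candidates actually respond to different wordings and updating their suggestions accordingly.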
Textio, a company that grew out of a successful crowdfunding campaign, has until now been the only such program that relies on machine learning to build its lexicon of problem terms and suggest more inclusive synonyms to its users.
Ettling said the SAP software was built initially to scrub out gender biases, though as it becomes more widely adopted and the algorithm can track a greater breadth of behavior, he expects to expand it to cover other areas of bias, including age and race.
“The tech industry is, as we say in South Africa, very male and pale, but the world’s not that way. The world has never been that, and the industry needs to change,” said Ettling, a South African native who fits his rhyming characterization of the industry. “In tech, it’s not about your gender or the color of your skin or the language you speak, it really is about your intelligence and your ability to innovate. It’s an industry where you would think diversity should be super high on their priorities because it will only help innovation.”
The issue of diversifying tech, and Silicon Valley at large, has attracted attention from workers, federal regulators and entrepreneurs who have built businesses around that goal.
Explicitly advertising for workers of a particular race or gender is outlawed by the Civil Rights Act, but, experts said, companies often hint at what kind of person they imagine in the job — whether they know it or not.
The software the industry builds, as it turns out, may be part of the solution.