Can We Rely on Algorithmic Hiring?
Let’s agree on one point: humans are biased decision makers. Here’s a striking example: hiring managers tend to hire candidates who remind them of themselves, resulting in further homogeneity in the workplace. In the tech sector, this homogeneity has been especially pronounced: Google’s first diversity report, released a year ago, showed that only 2 percent of its staff are black and 3 percent Hispanic. Facebook announced it will try the NFL’s “Rooney Rule,” which requires that NFL teams interview minority candidates for coaching positions, in order to become more diverse.
One proposed solution is to strip out some of those predispositions with a systematic analysis of data, i.e., to rely on algorithmic hiring. Companies administer personality tests during screening, then use data analysis to identify their ideal hires. The algorithm generally depends on what a company is looking for; common approaches include using personality-test data to predict whether a candidate will quit or steal on the job.
Algorithmic hiring has lately become the norm. Google used an algorithm to staff up rapidly, relying on a detailed survey to zero in on candidates who would fit the workplace culture. One study of algorithmic hiring found that a simple equation was consistently better than humans at identifying high-performing workers. The result held across industries and levels of employment, and the researchers attributed it to humans paying close attention to unimportant details and weighing information about candidates inconsistently.
Now one company is reporting that algorithmic hiring can improve diversity. Infor Talent Science provides software that helps companies hire using behavioral data: it builds a predictive model from a client’s top performers, then evaluates candidates by how closely they match those top performers. Analyzing data from 50,000 hires across its clients, the company found an average increase of 26 percent in African American and Hispanic hires across a range of industries and jobs.
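The match-to-top-performers idea can be sketched in a few lines. Everything below is invented for illustration (the trait names, the scores, and the simple distance-based match); Infor’s actual model is proprietary.

```python
# Sketch: profile top performers from behavioral-assessment scores,
# then rank candidates by how closely they match that profile.
# Traits and numbers are hypothetical, not Infor's real variables.
from statistics import mean

def top_performer_profile(employees):
    """Average each assessment trait across employees flagged as top performers."""
    top = [e for e in employees if e["top_performer"]]
    traits = top[0]["scores"].keys()
    return {t: mean(e["scores"][t] for e in top) for t in traits}

def match_score(candidate_scores, profile):
    """Higher is better: negative mean absolute distance from the profile."""
    return -mean(abs(candidate_scores[t] - profile[t]) for t in profile)

employees = [
    {"top_performer": True,  "scores": {"conscientiousness": 8, "sociability": 6}},
    {"top_performer": True,  "scores": {"conscientiousness": 7, "sociability": 7}},
    {"top_performer": False, "scores": {"conscientiousness": 4, "sociability": 9}},
]
profile = top_performer_profile(employees)

candidates = {
    "A": {"conscientiousness": 7.5, "sociability": 6.5},
    "B": {"conscientiousness": 3.0, "sociability": 9.0},
}
ranked = sorted(candidates, key=lambda c: match_score(candidates[c], profile),
                reverse=True)
print(ranked)  # ['A', 'B'] -- A sits closer to the top-performer profile
```

A real system would use many more traits and a trained statistical model rather than raw distance, but the ranking step is the same idea.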
The effect held regardless of industry, whether call center, restaurant, or retail. In Infor’s report, a wholesale client was able to increase Hispanic hires by 31 percent, and a restaurant client increased African American hires by 60 percent.
One caveat of Infor’s study is that its data covers only hires who disclosed an ethnic background. As with most surveys, checking the racial box is voluntary. Collecting racial data has long been tricky, since candidates often worry it will lead to racial discrimination. (The Census Bureau suffers from the same issue, and it is exploring different approaches to collecting data about race and origin.) But it’s not clear that minority candidates are undercounted: some may believe that disclosing race will attract diversity-minded employers.
So will hiring algorithms eliminate bias from the hiring process? Researchers warn that big data’s apparent objectivity can also mask other biases built into the algorithm. Chelsea Barabas, a researcher at MIT’s Center for Civic Media, says:
Decisions based on algorithms are being used for everything from predicting workplace conduct to denying opportunity, in a way that can mask prejudices while maintaining a patina of scientific objectivity.

Other researchers, such as Kate Crawford, have raised the same concern, arguing against the claim that big data does not discriminate against social groups.
There’s a lot of research on why diversity is good for the workplace: it increases efficiency, improves critical thinking, and has even been shown to increase sales and generate more revenue. The question of whether workplace diversity is good appears to have been answered. But how do we achieve it?
These results may suggest that algorithmic hiring can reduce bias, but an organization has to want to use it that way. Infor’s results are impressive; the trouble is that very few companies make increasing workplace diversity a priority.
Hiring managers are good at specifying what a position requires and eliciting information from candidates, yet they’re terrible at measuring the outcomes. An analysis of numerous studies of candidate assessments shows that even the simplest equations beat human hiring decisions by no less than 25 percent. The result holds even with large candidate pools, regardless of the job being interviewed for.
In a study described in Harvard Business Review, Brian S. Connelly of the University of Toronto found that the people making the call were more familiar with the company and had more information about the applicants than the equation included. The issue is that people are easily distracted by things that may be only marginally relevant, and they often use that information inconsistently.
Studies suggest that when evaluating people, 85 to 97 percent of hiring professionals rely to some degree on intuition. Hiring managers believe they can make the best choice by poring over a candidate’s file and looking into his or her eyes, no algorithm necessary. They would argue that an algorithm cannot substitute for a veteran’s accumulated knowledge.
Still, it may not be appropriate to leave the decision-making process up to the machines altogether. Companies can use an algorithmic hiring process to narrow the field before calling on human judgment to pick a few finalists for the job. Better still, have several managers weigh in on the final hiring decision. This way, you capture the benefits offered by the algorithm while accommodating managers’ need to exercise their hiring power.
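A minimal sketch of that two-stage process, with invented scores and ratings: the algorithm shortlists the top-scoring candidates, then several managers’ ratings decide among them.

```python
# Hybrid hiring sketch: algorithmic shortlist, then human judgment.
# All candidate names, scores, and ratings are hypothetical.

def shortlist(scores, k=3):
    """Algorithmic stage: keep the k highest-scoring candidates."""
    return sorted(scores, key=scores.get, reverse=True)[:k]

def pick_finalist(shortlisted, manager_ratings):
    """Human stage: average each manager's rating over the shortlist."""
    def avg(c):
        return sum(r[c] for r in manager_ratings) / len(manager_ratings)
    return max(shortlisted, key=avg)

algo_scores = {"A": 0.91, "B": 0.87, "C": 0.74, "D": 0.52, "E": 0.40}
finalists = shortlist(algo_scores, k=3)  # the algorithm narrows the field

# Three managers independently rate only the shortlisted candidates.
ratings = [
    {"A": 3, "B": 5, "C": 4},
    {"A": 4, "B": 5, "C": 3},
    {"A": 4, "B": 4, "C": 4},
]
hire = pick_finalist(finalists, ratings)
print(hire)  # 'B' -- highest average manager rating among the finalists
```

Note that the algorithm’s top scorer need not win: here candidate A leads the algorithmic ranking, but the managers’ combined judgment selects B.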
Software like Doxa matches candidates with tech companies, and even with particular teams and managers, based on values, skills, and compatibility: whether a team is highly collaborative, for example, or whether women’s opinions are taken seriously.
In this way, Doxa has revealed aspects of working at companies that are never made public to candidates. This data, drawn from anonymous employee surveys, also covers timekeeping, weekly working hours, and which departments have the biggest gender pay gaps.
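One way such matching could work, sketched with invented attributes and weights (Doxa’s actual method is not public): profile each team from survey responses, then score candidates by how well their stated priorities line up with each team.

```python
# Hypothetical values-based matching: team profiles come from anonymous
# employee surveys (0-1 per attribute); candidates weight each attribute
# by how much it matters to them. Attributes and numbers are invented.

def compatibility(candidate_prefs, team_profile):
    """Weighted agreement: team attribute scores weighted by candidate priorities."""
    return sum(w * team_profile.get(attr, 0.0)
               for attr, w in candidate_prefs.items())

teams = {
    "platform": {"collaboration": 0.9, "flexible_hours": 0.4, "pay_equity": 0.7},
    "growth":   {"collaboration": 0.5, "flexible_hours": 0.8, "pay_equity": 0.9},
}

# This candidate cares most about pay equity, only a little about collaboration.
candidate = {"collaboration": 0.2, "pay_equity": 1.0}

best = max(teams, key=lambda t: compatibility(candidate, teams[t]))
print(best)  # 'growth' -- the team whose survey profile best fits the candidate
```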
Another piece of software, Textio, uses machine learning and language analysis to break down job postings for companies like Starbucks and Barclays. Textio has identified more than 25,000 phrases that signal gender bias. Language like “top-tier” and “aggressive,” and sports or military analogies like “mission critical,” diminishes the proportion of women who apply for a job; language like “passion for learning” and “collaboration” attracts more women.
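A toy version of this kind of analysis, using only the handful of phrases quoted above. Textio’s real model covers more than 25,000 phrases learned from outcome data, not a static list like this.

```python
# Minimal job-posting audit: flag phrases the article cites as skewing
# applicant pools. The phrase lists are just the examples from the text.

MASCULINE_CODED = ["top-tier", "aggressive", "mission critical"]
INCLUSIVE = ["passion for learning", "collaboration"]

def audit_posting(text):
    """Return which flagged phrases appear in the posting."""
    lower = text.lower()
    return {
        "masculine_coded": [p for p in MASCULINE_CODED if p in lower],
        "inclusive": [p for p in INCLUSIVE if p in lower],
    }

posting = (
    "We need an aggressive, top-tier engineer for a mission critical "
    "project. A passion for learning is a plus."
)
report = audit_posting(posting)
print(report["masculine_coded"])  # ['top-tier', 'aggressive', 'mission critical']
print(report["inclusive"])        # ['passion for learning']
```

A production tool would also suggest replacements and predict the effect of each phrase on the applicant pool, rather than simply counting matches.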
But we cannot get overconfident relying on data: human expertise is still necessary when making hiring decisions.