Los Angeles Times (Sunday)

Provide a resume, cover letter and access to your brain?

Regulators are right to take on ‘mind reading’ by employers who subject workers and job applicants to potentially invasive testing

By Nita Farahany

Nita Farahany is a professor of law and philosophy at Duke University and the author of “The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology.”

Modern workers increasingly find companies no longer content to consider their résumés, cover letters and job performance. More and more, employers want to evaluate their brains.

Businesses are screening prospective job candidates with tech-assisted cognitive and personality tests, deploying wearable technology to monitor brain activity on the job and using artificial intelligence to make decisions about hiring, promoting and firing people. The brain is becoming the ultimate workplace sorting hat — the technological version of the magical device that distributes young wizards among Hogwarts houses in the “Harry Potter” series.

Companies touting technological tools to assess applicants’ brains promise to dramatically “increase your quality of hires” by measuring the “basic building blocks of the way we think and act.” They claim their tools can even decrease bias in hiring by “relying solely on cognitive ability.”

But research has shown that such assessments can lead to racial disparities that are “three to five times greater than other predictors of job performance.” When social and emotional tests are part of the battery, they may also screen out people with autism and other neurodiverse candidates. And applicants may be required to reveal their thoughts and emotions through AI-based, gamified hiring tools without fully understanding the implications of the data being collected. With recent surveys showing that more than 40% of companies use assessments of cognitive ability in hiring, federal employment regulators have rightly begun to pay attention.

Once workers are hired, new wearable devices are integrating brain assessment into workplaces worldwide for attention monitoring and productivity scoring on the job. The SmartCap tracks worker fatigue, Neurable’s Enten headphones promote focus and Emotiv’s MN8 earbuds promise to monitor “your employees’ levels of stress and attention using … proprietary machine learning algorithms” — though, the company assures, they “cannot read thoughts or feelings.”

The growing use of brain-oriented wearables in the workplace will undoubtedly put pressure on managers to use the insights gleaned from them to inform hiring and promotion decisions. We are vulnerable to the seductive allure of neuroscientific explanations for complex human phenomena and drawn to measurement even when we don’t know what we should be measuring.

Relying on AI-based cognitive and personality testing can lead to simplistic explanations of human behavior that ignore the broader social and cultural factors that shape the human experience and predict workplace success. A cognitive assessment for a software engineer may test for spatial and analytical skills but ignore the ability to collaborate with people from diverse backgrounds. The temptation is to turn human thinking and feeling into puzzle pieces that can be sorted into the right fit.

The U.S. Equal Employment Opportunity Commission seems to have awakened to these potential problems. It recently issued draft enforcement guidelines on “technology-related employment discrimination,” including the use of technology for “recruitment, selection, or production and performance management tools.”

While the commission has yet to clarify how employers can comply with nondiscrimination statutes while using technological assessments, it should work to ensure that cognitive and personality testing is limited to employment-related skills lest it intrude on the mental privacy of employees.

The growing power of these tools may tempt employers to “hack” candidates’ brains and screen them based on beliefs and biases, assuming such decisions aren’t unlawfully discriminatory because they aren’t directly based on protected characteristics. Facebook “likes” can already be used to infer sexual orientation and race with considerable accuracy. Political affiliation and religious beliefs are just as easily identifiable. As wearables and brain wellness programs begin to track mental processes over time, age-related cognitive decline will also become detectable.

All of this points to an urgent need for regulators to develop specific rules governing the use of cognitive and personality testing in the workplace. Employers should be required to obtain informed consent from candidates before they undergo cognitive and personality assessment, including clear disclosure of how candidates’ data is being collected, stored, shared and used. Regulators should also require that assessments be regularly tested for validity and reliability to ensure that they’re accurate, reproducible and related to job performance and outcomes — and not unduly sensitive to factors such as fatigue, stress, mood or medications.

Assessment tools should also be regularly audited to ensure that they don’t discriminate against candidates based on age, gender, race, ethnicity, disability, thoughts or emotions. And companies developing and administering these tests should regularly update them to account for changing contextual and cultural factors. More broadly, we should consider whether these methods of assessing job applicants are promoting excessively reductionist views of human abilities. That’s especially true as the capabilities of human workers are more frequently compared with those of generative AI.

While the use of cognitive and personality assessments is not new, the increasing sophistication of neurotechnology and AI-based tools to decode the human brain raises important ethical and legal questions about cognitive liberty.

Employees’ minds and personalities should be subject to the most stringent protection. While these new tests may offer some benefits for employers, they must not come at the cost of workers’ privacy, dignity and freedom of thought.

Illustration by Jim Cooke, Los Angeles Times