The Guardian Australia

Australian AI company says sorry for asking potential staff to describe their skin tone

- Josh Taylor

An Australian artificial intelligence company has apologised for a question on a recruitment application which asked potential employees to describe their skin tone.

The Australian Securities Exchange-listed company, Appen, boasts 1 million contractors at 70,000 locations across the globe who label photographs, text, audio and other data to improve AI systems used by large tech companies.

Houston-based Charné Graham was approached by recruiters on LinkedIn to apply for a contract social media evaluator role with Appen, so she started filling out an application form.

After ticking a box saying she is “Black or African American”, she was asked to select her complexion, from light to brown to black. Her tweet about the application form went viral, gaining 16,400 retweets and 73,100 likes.

She said she had not continued with her application for the role after seeing the “paper bag test” – a term used to describe a 20th century discriminatory practice where an African American person’s skin colour was compared to a brown paper bag.

Guardian Australia has sought comment from Graham. She told Nine newspapers she could not understand how information about her complexion was relevant for the tasks involved in the job.

“I’m aware that Appen is an artificial intelligence company but as a Black woman the question is very off putting and triggering with no clear explanation as to why you would need that information,” she said.

Appen’s senior vice-president of human resources and crowdsourcing, Kerri Reynolds, told Guardian Australia in a statement the question had been removed after Graham pointed it out.

“We collect data from our crowd of contractors in an effort to take the bias out of AI,” she said. “We acknowledge that without an explanation up front as to why it is so important to ask some of these questions, and the way the question was presented, it missed the mark and that’s on us to fix …

“To be clear, there is no intended racism in our hiring processes, practices or policies. We continually work to reflect the cultural and ethnic diversity both in our workforce, and with crowd workers in 170+ countries who speak 235+ languages.”

It comes at a time when there is increased focus on ethics in AI. Two Google engineers quit the company in February over concern about the impact the company’s research could have on marginalised groups.

Three groups – Black in AI, Queer in AI and Widening NLP – wrote an open letter this week stating they would no longer take Google funding in response to the company’s treatment of the two engineers.

“The potential for AI technologies to cause particular harm to members of our communities weighs heavily on our organisations,” they said. “We share a mandate to not merely increase the representation of members from our respective communities in the field of AI, but to create safe environments for them and to protect them from mistreatment.”

Photograph: Linda Nylind/The Guardian. A woman who ticked a box on Appen’s application form saying she is ‘Black or African American’ was then asked to select her complexion, from light to brown to black.
