
Facebook risks traumatising its moderators

Content monitors for the social media giant are often left with psychological damage


This week, The National reported on the working conditions endured by the 15,000 people hired by Facebook to monitor potentially inappropriate content on the social networking site.

In 2019, Chris Gray, a content moderator previously employed by one of Facebook's contractors for monitoring work, spoke out about the post-traumatic stress disorder he sustained as a result of his role. It is a tragic but perhaps expected outcome, given that his job involved watching executions, rapes and child abuse on a routine basis.

Last May, Facebook was ordered to pay $52 million in compensation to more than 11,000 current and former content moderators in the US for job-related harm to their mental health. In Europe, 30 workers from Ireland, Spain and Germany have launched legal action against the social media giant and four of its third-party outsourcing agents, seeking compensation for psychological damage. The fact that judges are finding not only Facebook but also its subcontractors liable indicates the nebulous way in which the social media platform allegedly distances itself from the reality of moderators' jobs.

Mr Gray reports being set targets of checking 1,000 flagged posts a day, which he had to categorise with 98 per cent accuracy. He was given only eight days' training before being handed a copy of Facebook's guidelines and starting the job. Mr Gray said the office was staffed largely by young language students, who were paid about $32,000 a year.

Politicians are critical of Facebook's slow response. On Wednesday, the chairman of Ireland's justice committee suggested that Facebook is failing to address these problems. Alan Rusbridger, a member of Facebook's semi-independent oversight board, said the group would investigate the company's core algorithm, in an attempt to delve deeper into how the website censors content and gives it prominence. Whether or not the board succeeds, its intentions are correct.

All of this points to a company that has lost control of a technological revolution it created. And with inadequate training and occupational support, it is no surprise that moderators are struggling.

Employees will, of course, have had some indication of the difficult nature of the job before they applied. But this is no excuse for denying them adequate safety measures and access to counselling of the kind that other professions dealing with traumatic circumstances, such as medics and police officers, are entitled to.

Identifying and adjudicating on evidence of abuses by human beings will always require other human beings to make the necessary moral judgments. Algorithms, especially in their current form, can only go so far in that process. Those who profit from social media platforms should support moderators when they demand better conditions. And the public, shielded on a daily basis from exposure to the horrors the moderators must witness, ought to second those demands.
