The Guardian (USA)

Facebook moderation system favours ‘business partners’, says oversight board

- Dan Milmo and Alex Hern

A policy designed to protect high-profile Facebook and Instagram users from moderation was structured to satisfy their parent company’s business interests, Meta’s “supreme court” has found, and did not prioritise protecting free speech and civil rights.

The oversight board, which scrutinises moderation decisions on Facebook and Instagram, said the platforms’ “cross-check” system appeared to favour “business partners” – such as celebrities who generate money for the company – while journalists and civil society organisations had “less clear paths” to access the programme.

“While Meta told the board that cross-check aims to advance Meta’s human rights commitments, we found that the program appears more directly structured to satisfy business concerns,” said the board, adding that it had concerns about the “lack of transparency” around the programme.

It said cross-check grants certain users greater protection than others because content from users on the cross-check list is allowed to stay up while it is vetted by human moderators applying the “full range” of content policies. Meta described it as a “mistake-prevention strategy” that protected important users from erroneous content takedowns.

Ordinary users, by contrast, are much less likely to have their content reach reviewers who can apply the full range of Meta’s content guidelines.

The board said a user’s “celebrity or follower count” should not be the sole criterion for receiving the special protection offered by the programme. Meta admitted to the board that criteria for including “business partners” on the list included the amount of revenue they generated.

Meta also told the board that it exempts some content from takedowns. The company described this system as “technical corrections” and said it carried out about 1,000 a day. The board recommended that Meta conduct audits of enforcement actions that are blocked under the system.

The board added that the technical corrections system is viewed as an “allow list” or “whitelist”. In September last year the Wall Street Journal, using documents disclosed by whistleblower Frances Haugen, reported that the Brazilian footballer Neymar had responded to a rape accusation in 2019 by posting Facebook and Instagram videos defending himself, which included showing viewers his WhatsApp correspondence with his accuser. The clips from WhatsApp – also owned by Meta – included the accuser’s name and nude photos of her.

Moderators were blocked for more than a day from removing the video, according to the WSJ, while the normal punishment of disabling his accounts was not implemented. An internal document seen by the WSJ said Neymar’s accounts were left active after “escalating the case to leadership”. Neymar denied the rape allegation and no charges were filed against the footballer.

Citing the Neymar example, the board said that despite Meta saying it had a system for prioritising content decisions, some content still remained online for “significant periods” while this happened.

“In the Neymar case, it is difficult to understand how non-consensual intimate imagery posted on an account with more than 100 million followers would not have risen to the front of the queue for rapid, high-level review if any system of prioritisation had been in place,” said the board.

The board went on to say that the cross-check “business partner” category includes users who are likely to generate money for the company, either through formal business relationships or by drawing users to Meta platforms. It said that, because of the “perception of censorship”, Meta preferred keeping such content up to taking it down. The board said the business partner category was likely to include major companies, political parties and campaigns, and celebrities.

The board made 32 recommendations. They included: removing special protection for commercially important accounts that frequently break content rules; prioritising moderation of posts that are important for human rights; and removing or hiding “high severity” violating content from cross-check users while reviews are taking place.

The board said Meta viewed as highly sensitive the risk of a content decision resulting in “escalation at the highest levels”, meaning to an organisation’s chief executive or chief operating officer. Such content carries an “extremely high severity” tag under the cross-check system. It said Meta therefore seemed more focused on the business-related consequences of its decisions than on those related to human rights.

Meta’s president of global affairs, Nick Clegg, said that in order to “fully address” the board’s recommendations, the company would respond within 90 days.

Platforms’ ‘cross-check’ system lets down ordinary users, whose content is less likely to be seen by moderators. Photograph: Dado Ruvić/Reuters
