The Sunday Post (Inverness)

Safety chief: We can’t do it on our own

Facebook has insisted it needs the support of users to win the fight against online hatred.

The social media giant has asked for help from the public after being criticised for not doing enough to remove extremist views. According to the company’s own figures, its software algorithms are failing to pick up two thirds of hate speech on the platform.

Antigone Davis, Facebook’s head of global safety, said more “human involvement” was needed to weed out hate speech.

She added: “When you’re talking about hate speech, it can require a good deal of context in which to understand the term that someone has used or how they’re using it.

“I think that is an area in which we need human involvement.”

Ms Davis said she did not believe the firm was struggling to detect offensive text, but added: “I’d say it’s not as good or not as valuable for hate speech as it is for other content.”

A company report showed Facebook’s detection programmes found 99.7% of spam and 99.5% of terrorist material before it was reported by users, but just 38% of hate speech was found by its software.

Bosses have promised to increase the number of moderators from 10,000 to 20,000 in a bid to tackle hateful content, but refused to say how many would be working on UK pages. Germany, which has a law penalising websites that do not remove hate speech within a set time frame, has 400 Facebook moderators.

Antigone Davis, head of global safety at Facebook
