Facebook reveals secret guidelines used to police extremist content

- CAROLINE BYRNE

Facebook has published the secret rules its 7,500 content monitors use to remove posts likely to promote terrorism, incite violence or breach company policies covering everything from hate speech to child exploitation, sex, bullying and drugs.

The 27-page Facebook rulebook released yesterday offers an unprecedented insight into how the company decides what its two billion users may or may not share, and how the social media giant navigates the line between censorship and free speech. The rules update the short “community standards” guidelines Facebook has previously allowed users to see.

“You should, when you come to Facebook, understand where we draw these lines and what is OK and what’s not OK,” Facebook’s vice president of product policy and counter-terrorism, Monika Bickert, a former US federal prosecutor, said yesterday.

In its Graphic Violence section, for example, Facebook explains that it removes content that “glorifies violence or celebrates the suffering or humiliation of others” but allows graphic content, with some limitations, to help people raise awareness about issues.

In its Hate Speech section, Facebook said it does not allow speech that “creates an environment of intimidation and exclusion and in some cases may promote real-world violence”.

The rulebook does not address controversial issues that have dogged Facebook for months, however, including the publication of fake news, the Cambridge Analytica data harvesting scandal, or questions about whether Facebook is doing enough to protect the welfare of children online.

On Monday, Facebook took another hit when it was sued for defamation by Martin Lewis, a British financial expert who claims his image has been used in 50 fake Facebook adverts to scam millions from vulnerable people.

Siobhan Cummiskey, Facebook’s head of policy for Europe, the Middle East and Africa, admitted the company’s enforcement of its policies was not perfect but insisted Facebook had the interests of its users at heart and planned to hire additional content reviewers to beef up its 7,500-strong team worldwide.

In an interview with Sky News, Ms Cummiskey said the company uses a combination of technology, human reviewers and the flagging of problem content to remove texts, pictures and video posts that breach the site’s standards. She said that Facebook considers the safety of its users to be paramount “and that’s really why we are publishing this new set of community standards”.

Facebook told reporters that it considers changes to its content policy every two weeks at a meeting called the Content Standards Forum, led by Ms Bickert.

Its standards are based, in part, on feedback from more than 100 organisations and experts in counter-terrorism, countering child exploitation and other areas.

Facebook said it would also introduce a mechanism that would allow users to appeal against decisions to take down content. Previously, users could appeal only against the removal of accounts, groups and pages.

The new standards highlight Facebook’s determination to act on unacceptable content, but they are also an admission by Facebook that it needs to improve.

“Our policies are only as good as the strength and accuracy of our enforcement and our enforcement isn’t perfect. We make mistakes because our processes involve people and people are not infallible,” Ms Bickert wrote in a blog post.
