The Press

Facebook reveals why it bans posts

It was just one page in 2008. Now Facebook’s content policy and moderation guidelines number 27 pages, report Elizabeth Dwoskin and Tracy Jan.


Among the most challenging issues for Facebook is its role as the police officer for the free expression of its 2 billion users.

Now the social network is opening up about its decision-making over which posts it takes down - and why.

For the first time, the company has published the 27-page guidelines, called Community Standards, that it gives to its workforce of thousands of human censors.

The set of guidelines encompasses dozens of topics, including hate speech, violent imagery, misrepresentation, terrorist propaganda and disinformation. Facebook said it would offer users the opportunity to appeal its decisions.

The move adds a new degree of transparency to a process that users, the public and advocates have criticised as arbitrary and opaque.

The newly released guidelines offer suggestions on topics including how to distinguish between humour, sarcasm and hate speech.

They explain that images of female nipples are generally prohibited, but exceptions are made for images that promote breast-feeding or address breast cancer.

‘‘We want people to know our standards, and we want to give people clarity,’’ Monika Bickert, Facebook’s head of global policy management, said in an interview.

She added that she hoped publishing the guidelines would spark dialogue. ‘‘We are trying to strike the line between safety and giving people the ability to really express themselves.’’

The company’s censors, called content moderators, have been chastised by civil rights groups for mistakenly removing posts by minorities who had shared stories of being the victims of racial slurs.

Moderators have struggled to tell the difference between someone posting a slur as an attack and someone who was using the slur to tell the story of their own victimisation.

In another instance, moderators removed an iconic Vietnam War photo of a child fleeing a napalm attack, claiming the girl’s nudity violated its policies. (The photo was restored after protests from news organisations.)

The release of the guidelines is part of a wave of transparency that Facebook hopes will quell its many critics.

The company is being investigated by the US Federal Trade Commission over the misuse of data by a Trump-connected consultancy known as Cambridge Analytica, and Facebook chief executive Mark Zuckerberg recently testified before Congress about the issue.

Bickert said discussions about sharing the guidelines started last fall and were not related to the Cambridge Analytica controversy.

As Facebook has come to reach nearly a third of the world’s population, Bickert’s team has expanded significantly and is expected to grow even more in the coming year.

A team of 7500 reviewers assesses posts 24 hours a day, seven days a week, in more than 40 languages. Moderators are sometimes temporary contract workers without much cultural familiarity with the content they are judging, and they make complex decisions in applying Facebook’s rules.

Bickert also employs high-level experts, including a human rights lawyer, a rape counsellor, a counterterrorism expert from West Point and a PhD researcher with expertise in European extremist organisations, as part of her content review team.

Activists and users have been particularly frustrated by the absence of an appeals process when their posts are taken down. (Facebook users are allowed to appeal the shutdown of an entire account but not individual posts.)

Malkia Cyril, a Black Lives Matter activist in Oakland, California, who is also the executive director for the Centre for Media Justice, was among a coalition of more than 70 civil rights groups that pressured Facebook in 2017 to fix its ‘‘racially-biased’’ content moderation system. Among the changes the coalition sought was an appeals process for posts that are taken down.

‘‘At the time they told us they could not do it, they would not do it, and actually stopped engaging at that point,’’ Cyril said, adding that Facebook’s actions did not go far enough in addressing the white supremacist groups allowed on the platform.

Zahra Billoo, executive director of the Council on American-Islamic Relations’ office for the San Francisco Bay area, said adding an appeals process and opening up the guidelines would be a ‘‘positive development’’ but added that the social network still has a way to go.

Billoo said that at least a dozen pages representing white supremacists are still up on the platform, even though the policies forbid hate speech and Zuckerberg testified before Congress that Facebook does not allow hate groups.

‘‘An ongoing question many of the Muslim community have been asking is how to get Facebook to be better at protecting users from hate speech and not to be hijacked by white supremacists, right-wing activists, Republicans or the Russians as a means of organising against Muslim, LGBT and undocumented individuals,’’ she said.

Billoo was censored by Facebook two weeks after Donald Trump’s election, when she posted an image of a handwritten letter mailed to a San Jose mosque and quoted from it: ‘‘He’s going to do to you Muslims what Hitler did to the Jews.’’

Bickert’s team has been working for years to develop a software system that can classify the reasons a post was taken down so that users could receive clearer information.

Currently, people who have their posts taken down receive a generic message saying they have violated Facebook’s community standards.

After Tuesday’s announcement, people will be told whether their posts violated guidelines on nudity, hate speech and graphic violence.

A Facebook executive said the teams were working on building more tools. ‘‘We do want to provide more details and information for why content has been removed,’’ said Ellen Silver, Facebook’s vice president of operations. ‘‘We have more work to do there, and we are committed to making those improvements.’’

Though Facebook’s content moderation is still very much driven by humans, the company does use technology to assist in its work. It uses software to identify duplicate reports, a timesaving technique for reviewers that helps them avoid reviewing the same piece of content over and over because it was flagged by many people at once.

Every two weeks, employees and senior executives who make decisions about the most challenging issues around the world meet.

They debate the pros and cons of potential policies. Teams who present are required to come up with research showing each side, a list of possible solutions, and a recommendation.

They are required to list the organisations outside Facebook with which they consulted.

- The Washington Post

"We have more work to do there, and we are committed to making those improvemen­ts." Ellen Silver

RICHARD DREW/AP: Facebook is shining a light on how and why it bans some content.
