The Denver Post

Facebook explains why it bans some content

By Elizabeth Dwoskin and Tracy Jan

SAN FRANCISCO» Among the most challenging issues for Facebook is its role as the policeman for the free expression of its 2 billion users.

Now the social network is opening up about its decision-making over which posts it decides to take down — and why. On Tuesday the company for the first time published the 27-page guidelines, called Community Standards, that it gives to its workforce of thousands of human censors. The guidelines encompass dozens of topics, including hate speech, violent imagery, misrepresentation, terrorist propaganda and disinformation. Facebook said it would offer users the opportunity to appeal its decisions.

The move adds a new degree of transparency to a process that users, the public and advocates have criticized as arbitrary and opaque. The newly released guidelines offer suggestions on topics including how to determine the difference between humor, sarcasm and hate speech. They explain that images of female nipples are generally prohibited, but exceptions are made for images that promote breastfeeding or address breast cancer.

“We want people to know our standards, and we want to give people clarity,” Monika Bickert, Facebook’s head of global policy management, said in an interview. She added that she hoped publishing the guidelines would spark dialogue. “We are trying to strike the line between safety and giving people the ability to really express themselves.”

The company’s censors, called content moderators, have been chastised by civil rights groups for mistakenly removing posts by minorities who had shared stories of being the victims of racial slurs. Moderators have struggled to tell the difference between someone posting a slur as an attack and someone who was using the slur to tell the story of their own victimization.

In another instance, moderators removed an iconic Vietnam War photo of a child fleeing a napalm attack, claiming the girl’s nudity violated its policies. (The photo was restored after protests from news organizations.) Moderators have deleted posts from activists and journalists in Myanmar and in disputed areas such as the Palestinian territories and Kashmir and have told pro-Trump activists Diamond and Silk they were “unsafe to the community.”

The release of the guidelines is part of a wave of transparency that Facebook hopes will quell its many critics. It has also published political ads and streamlined its privacy controls after coming under fire for its lax approach to protecting consumer data.

The company’s content policies, which began in earnest in 2005, initially addressed issues such as nudity and Holocaust denial.

As Facebook has come to reach nearly a third of the world’s population, Bickert’s team has expanded significantly and is expected to grow even more in the coming year. A team of 7,500 reviewers, in places like Austin, Dublin and the Philippines, assesses posts 24 hours a day, seven days a week, in more than 40 languages. Moderators are sometimes temporary contract workers without much cultural familiarity with the content they are judging, and they make complex decisions in applying Facebook’s rules.

Currently, people who have their posts taken down receive a generic message that says that they have violated Facebook’s community standards. After Tuesday’s announcement, people will be told whether their posts violated guidelines on nudity, hate speech and graphic violence. A Facebook executive said the teams were working on building more tools. “We do want to provide more details and information for why content has been removed,” said Ellen Silver, Facebook’s vice president of operations.
