USA TODAY US Edition

Facebook ‘guidelines’ dictate your posts

‘Guardian’ report shows where social media giant draws line on free speech

- Brett Molina @brettmolina23 USA TODAY

A report published Monday sheds light on how Facebook moderates content, providing insight into where it draws a line on violent or threatening posts as the social network faces criticism that it has allowed the platform to become a dangerous tool. The report from Britain's The Guardian cites "more than 100 internal training manuals, spreadsheets and flowcharts" that reveal how Facebook chooses which content you will see.

A key example is what comments are deemed “credible violence.” For example, Facebook’s policies would allow someone to write “let’s beat up fat kids” or to describe how to “snap” a woman’s neck, but not “someone shoot Trump.”

The former, according to the documents, don’t signal an intent to act, while the latter could.

“People commonly express disdain or disagreement by threatening or calling for violence in generally facetious and unserious ways,” reads one of the documents leaked to The Guardian. “Not all disagreeable or disturbing content violates our community standards.”

Other guidelines include allowing images or videos of non-sexual physical abuse or bullying of children, so long as there is not a “sadistic or celebratory element.” The guidelines also let users post live streams of people trying to harm themselves, because Facebook doesn’t want to “censor or punish people in distress.”

In a statement, Monica Bickert, Facebook’s head of global policy management, said the social network’s most important priority is keeping its users safe.

“We work hard to make Facebook as safe as possible while enabling free speech,” Bickert said. “This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously.”

The report surfaced as Facebook’s response to questionable content — such as videos depicting violence — has been criticized as harmful to users and victims, and as possibly encouraging copycats. Recently in Thailand, a father used Facebook Live to stream the killing of his 11-month-old daughter. He later committed suicide off camera. The videos remained on Facebook for 24 hours before they were taken down.

Following the death of Cleveland grandfather Robert Godwin, whose murder was captured in a video posted to Facebook, Jesse Jackson and officials in Chicago sought a 30-day moratorium on the use of Facebook Live.

CEO Mark Zuckerberg has announced Facebook would add 3,000 more people to its community team to review videos. “If we’re going to build a safe community, we need to respond quickly,” he said this month.

Even in cases where it follows standards to curtail violence, Facebook sometimes finds itself in trouble. Last fall, Facebook reinstated the iconic photo of a naked girl fleeing a napalm attack during the Vietnam War after originally removing it for violating community standards.

To enforce user standards, Facebook employs a community operations team with the assistance of automated systems and enlists users to help weed out any controversial content. However, in a separate piece, The Guardian reports moderators often feel overwhelmed by the number of reports they receive.

Along with adding more people to its team of moderators, Facebook says it will offer them more tools to help them respond faster.


JOSH EDELSON, AFP/GETTY IMAGES — Facebook says its most important priority is keeping users safe.
