Facebook publishes content removal rules for first time
SAN FRANCISCO: For the first time, Facebook Inc. is letting people know its specific rules for taking down content once it’s reported to the social network’s moderators.
The 27-page document governs the behaviour of more than 2 billion users, giving Facebook’s definitions of hate speech, violent threats, sexual exploitation and more. It’s the closest the world has come to seeing an international code of conduct that was previously enforced behind closed doors. The release of the document follows frequent criticism and confusion about the company’s policies.
The community standards read like the result of years of trial and error, written to give workers enough specificity to make quick and consistent judgments. Fully nude close-ups of buttocks, for example, aren’t allowed, unless they are “photoshopped on a public figure.”
Facebook published the policies to help people understand where the company draws the line on nuanced issues, Monika Bickert, vice-president of global policy management, said in a blog post. The company will for the first time give people a right to appeal its decisions.
The release of the content policies comes just days after chief executive officer Mark Zuckerberg testified to Congress, where he faced frequent questions about the company’s practices. Lawmakers asked whether Facebook unfairly takes down more conservative content than liberal content, and why bad content, such as fake profiles and posts selling opioid drugs, stays up even after it has been reported.
“Our policies are only as good as the strength and accuracy of our enforcement—and our enforcement isn’t perfect,” Bickert said. “In some cases, we make mistakes because our policies are not sufficiently clear to our content reviewers. More often than not, however, we make mistakes because our processes involve people, and people are fallible.”