Facebook forced to expand content-screening team
CEO says company will hire 3,000 new people to police posts for offensive material
SAN FRANCISCO— As a business, Facebook is more successful than ever. On Wednesday afternoon, it reported another quarter of huge growth, with nearly two billion people actively using the service and revenue up 49 per cent in the first quarter compared with a year ago.
But with the company’s vast reach has come another kind of problem: Facebook is becoming too big for its computer algorithms and relatively small team of employees and contractors to manage the trillions of posts on its social network.
Mark Zuckerberg, the company’s chief executive, acknowledged the problem on Wednesday. In a Facebook post, he said that over the next year, the company would add 3,000 people to the team that polices the site for inappropriate or offensive content, especially in the live videos the company is encouraging users to broadcast.
“If we’re going to build a safe community, we need to respond quickly,” he wrote. “We’re working to make these videos easier to report so we can take the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.” He offered no details on what would change.
The announcement came after Facebook Live, the company’s popular video-streaming service, was used to broadcast a series of horrific acts, including a man boasting about his apparently random killing of a Cleveland man and the murder of an infant in Thailand.
More broadly, the company has been criticized for doing a poor job of weeding out content that violates its rules, including the nonconsensual sharing of nude photographs of female Marines and illegal gun sales.
Facebook is also grappling with the limitations of its automated algorithms on other fronts, from the prevalence of fake news on the service to a News Feed that tends to show people information that reinforces their views rather than challenging them.
Despite Zuckerberg’s pledge to do a better job of screening content, many Facebook users did not seem to believe that much would change. Hundreds of commenters on Zuckerberg’s post related personal experiences of reporting inappropriate content that the company declined to remove.