Facebook to add more staff to keep vigil on content
SAN FRANCISCO — As a business, Facebook is more successful than ever. On Wednesday, it reported another quarter of huge growth, with nearly 2 billion people using the service and revenue up 49 percent in the first quarter compared with a year ago.
But with the company’s vast reach has come another kind of problem: Facebook is becoming too big for its computer algorithms and relatively small team of employees and contractors to manage the trillions of posts on its social network.
Earlier Wednesday, Mark Zuckerberg, the company’s chief executive, acknowledged the problem. In a Facebook post, he said that over the next year, the company would add 3,000 people to the team that polices the site for inappropriate or offensive content, especially in the live videos the company is encouraging users to broadcast.
“If we’re going to build a safe community, we need to respond quickly,” he wrote. “We’re working to make these videos easier to report so we can take
the right action sooner — whether that’s responding quickly when someone needs help or taking a post down.” He offered no details on what would change.
The announcement came after Facebook Live, the company’s video-streaming service, was used to broadcast a series of horrific acts to viewers, including a man boasting about his apparently random killing of a Cleveland man and the murder of an infant in Thailand.
More broadly, the company has been criticized for doing a poor job weeding out content that violates its rules, including the sharing of nude photographs of female Marines without their consent and illegal gun sales.
Facebook is also grappling with the limitations of its automated algorithms on other fronts, from the prevalence of fake news on the service to a News Feed that tends to show people information that reinforces their views rather than challenges them.
Most of the company’s reviewers are low-paid contractors overseas who spend an average of just a few seconds on each post.