Facebook’s year of reckoning
SAN FRANCISCO – Mark Zuckerberg’s crusade to fix Facebook this year is beginning with a startling retreat. The social network, its chief executive said, would step back from its role in choosing the news that 2 billion users see on its site every month.
The company is “uncomfortable” making such decisions in a world that has become so divided, Zuckerberg explained recently.
The move was one result of a tumultuous, 18-month struggle by Facebook to come to grips with its dark side, interviews with 11 current and former executives show. As outsiders criticized the social network’s harmful side effects, such as the spread of disinformation and violent imagery, vigorous internal debates played out over whether to denounce Donald Trump directly, how forthcoming to be about Russian meddling on its platform in the 2016 election, and how to fight the perception that Facebook is politically biased.
Whether Zuckerberg’s proposed changes can address these issues will soon be tested with another major election only 10 months away. Right now, the company isn’t confident that it can prevent the problems that roiled Facebook during the 2016 presidential election, a top executive acknowledged.
“I can’t make a final assessment, other than we are substantially better today in light of the experience than we were a year ago,” Elliot Schrage, Facebook’s vice president for communications, policy and marketing, said in an interview. “We will be dramatically better even still a year from now.”
Some current and former executives think Facebook has not fully owned up to the negative consequences of its tremendous power. At the heart of the dilemma is the very technology that makes the social network work, they said.
“The problem with Facebook’s whole position is that the algorithm exists to maximize attention, and the best way to do that is to make people angry and afraid,” said Roger McNamee, an investor and mentor to Zuckerberg in Facebook’s early days. He and others – including the company’s first president, its former head of growth and Zuckerberg’s former speechwriter – have been criticizing the company in increasingly harsh terms.
Altering the formula may diminish what made Facebook successful in the first place – a risk Zuckerberg and his team have said they are willing to take.
“Until the last few years, and certainly until the last year, the focus of our investment has been on building our service and offering new tools and experiences,” Schrage said. “One of the discoveries that the election in particular really demonstrated was, at the same time that we were making investments in positive new experiences, we were underinvesting in the tools and oversights to protect against inappropriate, abusive and exploitative experiences.”
That insight led to changes this month to revamp the news feed, the scrolling page that pops up when Facebook users sign in. Posts shared by family and close friends will now rank above content from news organizations and brands, Facebook said. On Friday, the company said it is also going to let users vote on which news organizations are the most trustworthy and should get the biggest play on Facebook, diminishing its own role in the distribution of news.
Much is at stake for Facebook as it seeks to fix itself. The company is now the fifth most valuable in the United States and one of the world’s biggest distributors of news.
The chorus of criticism – especially from within its own ranks – threatens to demoralize its workforce and spur regulators to use a stronger hand. The focus on quality control could come at the cost of growth and disappoint investors – who already signaled their frustration by sending Facebook’s stock down 4.5 per cent the day after Zuckerberg began laying out changes this year.
Before Facebook became an online advertising behemoth, and before an array of humbling events in 2017, the company was far more willing to tout its ability to influence society. In 2011, managers published a white paper bragging that an ad campaign on the social network was able to tilt the results of a local election in Florida. In 2014, Facebook rolled out an “I voted” button that aimed to increase voter turnout. Facebook researchers found that it increased turnout in a California election by more than 300,000 votes.
That same year, Facebook openly endorsed social engineering. Its prestigious data science research division published an emotion manipulation study on 700,000 Facebook users, which showed that the company could make people feel happier or more depressed by tweaking the content of news feeds.
As the 2016 election neared, concerns about appearing politically biased weighed on Facebook executives. In addition, by October of 2016, Facebook’s security team had identified and purged 5.8 million fake accounts, some of which had spread phony news reports – including one about Pope Francis endorsing Trump.
The day after the election, Facebook employees, including several top executives, came to work “sick to their stomachs,” according to a former executive, who said that the internal discussions turned immediately to the role of false stories on the platform.
Still, the following week, Zuckerberg dismissed the idea that fake news on Facebook had an impact on the election, describing that possibility as “crazy.”
Even after receiving a personal warning from President Barack Obama, who cautioned Zuckerberg to take the subject more seriously, executives worried that taking down too many accounts would make the company appear biased against conservatives, according to a former executive.
Over the following months, Facebook’s security teams began to unearth more evidence pointing to the role of Russian operatives.
As the company was discovering evidence of Russian meddling, Sen. Mark Warner, D-Va., visited Facebook’s headquarters, where he pressured the firm to disclose what it knew. In the months after Warner’s visit, the company found evidence of 3,000 ads bought by the Internet Research Agency, a troll farm with Kremlin ties.
Initially, the company planned to disclose the ads only to Congress – and declined to share information on Russian strategies and their reach, such as the content of the ads, which types of Americans were targeted, and how many people were exposed to Russian content that was not advertising and looked like regular Facebook posts.
After independent researchers claimed that the reach of Russian operatives was much greater than the company had disclosed, Facebook conducted an investigation and discovered that 126 million Americans were exposed to Russian disinformation through posts on Facebook, in addition to an estimated 10 million users who saw the ads created by the Russian troll farm. (At the same time, the company blocked access to metrics and posts about Russian accounts that had enabled outside researchers to estimate the reach of the Russian ads.)
“They only act in response to public pressure,” said Tristan Harris, Google’s former design ethicist and the founder of TimeWellSpent, an advocacy group pressuring technology companies to make less addictive products.
Facebook ultimately promised to publish all political ads going forward and built a feature, released on the Friday before Christmas, that allows individual Facebook users to see whether they were targets of Russian disinformation. The company has also vowed to hire more than 10,000 workers, including academics, subject matter experts, and content moderators, to step up its efforts to ensure election integrity, boost its understanding of propaganda, and evaluate violent videos and hate speech.
Zuckerberg has since apologized for dismissing the impact of fake news on the election. And Schrage says that, after the growing pains of 2017, the company is learning to be more forthcoming – and to make painful choices. But Schrage emphasized Zuckerberg’s willingness to hurt the company’s bottom line if it meant making decisions that would improve user safety. He added that Facebook had still been more transparent than other tech giants, pointing out that Facebook was the first Internet company to admit Russian interference.