The Prince George Citizen

Facebook’s year of reckoning


SAN FRANCISCO – Mark Zuckerberg’s crusade to fix Facebook this year is beginning with a startling retreat. The social network, its chief executive said, would step back from its role in choosing the news that 2 billion users see on its site every month.

The company is too “uncomfortable” to make such decisions in a world that has become so divided, Zuckerberg explained recently.

The move was one result of a tumultuous, 18-month struggle by Facebook to come to grips with its dark side, interviews with 11 current and former executives show. As outsiders criticized the social network’s harmful side effects, such as the spread of disinformation and violent imagery, vigorous internal debates played out over whether to denounce Donald Trump directly, how forthcoming to be about Russian meddling on its platform in the 2016 election, and how to fight the perception that Facebook is politically biased.

Whether Zuckerberg’s proposed changes can address these issues will soon be tested, with another major election only 10 months away. Right now, the company isn’t confident that it can prevent the problems that roiled Facebook during the 2016 presidential election, a top executive acknowledged.

“I can’t make a final assessment, other than we are substantially better today in light of the experience than we were a year ago,” Elliot Schrage, Facebook’s vice president for communications, policy and marketing, said in an interview. “We will be dramatically better even still a year from now.”

Some current and former executives think Facebook has not fully owned up to the negative consequences of its tremendous power. At the heart of the dilemma is the very technology that makes the social network work, they said.

“The problem with Facebook’s whole position is that the algorithm exists to maximize attention, and the best way to do that is to make people angry and afraid,” said Roger McNamee, an investor and mentor to Zuckerberg in Facebook’s early days. He and others – including the company’s first president, its former head of growth and Zuckerberg’s former speechwriter – have been criticizing the company in increasingly harsh terms.

Altering the formula may diminish what made Facebook successful in the first place – a risk Zuckerberg and his team have said they are willing to take.

“Until the last few years, and certainly until the last year, the focus of our investment has been on building our service and offering new tools and experiences,” Schrage said. “One of the discoveries that the election in particular really demonstrated was, at the same time that we were making investments in positive new experiences, we were underinvesting in the tools and oversights to protect against inappropriate, abusive and exploitative experiences.”

That insight led to changes this month to revamp the news feed, the scrolling page that pops up when Facebook users sign in. Posts shared by family and close friends will now rank above content from news organizations and brands, Facebook said. On Friday, the company said it is also going to let users vote on which news organizations are the most trustworthy and should get the biggest play on Facebook, diminishing its own role in the distribution of news.

Much is at stake for Facebook as it seeks to fix itself. The company is now the fifth most valuable in the United States and one of the world’s biggest distributors of news.

The chorus of criticism – especially from within its own ranks – threatens to demoralize its workforce and spur regulators to use a stronger hand. The focus on quality control could come at the cost of growth and disappoint investors – who already signaled their frustration by sending Facebook’s stock down 4.5 per cent the day after Zuckerberg began laying out changes this year.

Before Facebook became an online advertising behemoth, and before an array of humbling events in 2017, the company was far more willing to tout its ability to influence society. In 2011, managers published a white paper bragging that an ad campaign on the social network was able to tilt the results of a local election in Florida. In 2014, Facebook rolled out an “I voted” button that aimed to increase voter turnout. Facebook researchers found that it increased turnout in a California election by more than 300,000 votes.

That same year, Facebook openly endorsed social engineering. Its prestigious data science research division published an emotion manipulation study on 700,000 Facebook users, which showed that the company could make people feel happier or more depressed by tweaking the content of news feeds.

As the 2016 election neared, concerns about appearing biased continued to weigh on Facebook executives. In addition, by October of 2016, Facebook’s security team had identified and purged 5.8 million fake accounts, some of which had spread phony news reports – including one about Pope Francis endorsing Trump.

The day after the election, Facebook employees, including several top executives, came to work “sick to their stomachs,” according to a former executive, who said that the internal discussions turned immediately to the role of false stories on the platform.

Still, the following week, Zuckerberg dismissed the idea that fake news on Facebook had an impact on the election, describing that possibility as “crazy.”

Even after receiving a personal warning from President Barack Obama, who cautioned Zuckerberg to take the subject more seriously, executives worried that taking down too many accounts would make the company appear biased against conservatives, according to a former executive.

Over the following months, Facebook’s security teams began to unearth more evidence pointing to the role of Russian operatives.

As the company was discovering evidence of Russian meddling, Sen. Mark Warner, D-Va., visited Facebook’s headquarters, where he pressured the firm to disclose what it knew. In the months after Warner’s visit, the company found evidence of 3,000 ads bought by the Internet Research Agency, a troll farm with Kremlin ties.

Initially, the company planned to disclose the ads only to Congress – and declined to share information on Russian strategies and their reach, such as the content of the ads, which types of Americans were targeted, and how many were exposed to Russian content that was not advertising and looked like regular Facebook posts.

After independent researchers claimed that the reach of Russian operatives was much greater than the company had disclosed, Facebook conducted an investigation and discovered that 126 million Americans were exposed to Russian disinformation through posts on Facebook, aside from an estimated 10 million users who saw the ads created by the Russian troll farm. (At the same time, the company blocked access to metrics and posts about Russian accounts that had enabled outside researchers to estimate the reach of the Russian ads.)

“They only act in response to public pressure,” said Tristan Harris, Google’s former design ethicist and the founder of TimeWellSpent, an advocacy group pressuring technology companies to make less addictive products.

Facebook ultimately promised to publish all political ads going forward and built a feature, released on the Friday before Christmas, that allows individual Facebook users to see whether they were targets of Russian disinformation. The company has also vowed to hire more than 10,000 workers, including academics, subject matter experts, and content moderators, to step up its efforts to ensure election integrity, boost its understanding of propaganda, and evaluate violent videos and hate speech.

Zuckerberg has since apologized for dismissing the impact of fake news on the election. And Schrage says that, after the growing pains of 2017, the company is learning to be more forthcoming – and to make painful choices. But Schrage emphasized Zuckerberg’s willingness to hurt the company’s bottom line if it meant making decisions that would improve user safety. He added that Facebook had still been more transparent than other tech giants, pointing out that Facebook was the first Internet company to admit Russian interference.

AP FILE PHOTO: Facebook CEO Mark Zuckerberg delivers the commencement address at Harvard University in Cambridge, Mass., in May 2017.
