National Post

FACEBOOK’S FAKE NEWS MEA CULPA

BARACK OBAMA TRIED TO GIVE CHIEF EXECUTIVE MARK ZUCKERBERG A WAKE-UP CALL IN A PERSONAL MEETING FOLLOWING THE U.S. ELECTION

- Adam Entous

Nine days after Facebook chief executive Mark Zuckerberg dismissed as “crazy” the idea that fake news on his company’s social network played a key role in the U.S. election, President Barack Obama pulled the youthful tech billionaire aside and delivered what he hoped would be a wake-up call.

For months leading up to the vote, Obama and his top aides quietly agonized over how to respond to Russia’s brazen intervention on behalf of the Donald Trump campaign without making matters worse. Weeks after Trump’s surprise victory, some of Obama’s aides looked back with regret and wished they had done more.

Now huddled in a private room on the sidelines of a meeting of world leaders in Lima, Peru, two months before Trump’s inauguration, Obama made a personal appeal to Zuckerberg to take the threat of fake news and political disinformation seriously. Unless Facebook and the government did more to address the threat, Obama warned, it would only get worse in the next presidential race.

Zuckerberg acknowledged the problem posed by fake news, but he told Obama those messages weren’t widespread on Facebook and that there was no easy fix, according to people briefed on the exchange.

The conversation on Nov. 19 was a flashpoint in a tumultuous year in which Zuckerberg came to recognize the magnitude of a new threat — a co-ordinated assault on a U.S. election by a shadowy foreign force that exploited the social network he created.

Like the U.S. government, Facebook didn’t foresee the wave of disinformation that was coming and the political pressure that followed. The company then grappled with a series of hard choices designed to shore up its own systems without impinging on free discourse for its users around the world.

One outcome of those efforts was Zuckerberg’s admission last week that Facebook had indeed been manipulated and that the company would now turn over to Congress more than 3,000 politically themed advertisements that were bought by suspected Russian operatives.

But that highly public moment came after months of manoeuvring behind the scenes. Some critics say Facebook dragged its feet and is acting now only because of outside political pressure.

“There’s been a systematic failure of responsibility” on Facebook’s part, said Zeynep Tufekci, an associate professor at the University of North Carolina at Chapel Hill who studies social media companies’ impact on society and governments. “It’s rooted in their overconfidence that they know best, their naiveté about how the world works, their expensive effort to avoid oversight, and their business model of having very few employees so that no one is minding the store.”

Facebook says it responded appropriately.

“We believe in the power of democracy, which is why we’re taking this work on elections integrity so seriously, and have come forward at every opportunity to share what we’ve found,” said Elliot Schrage, vice-president for public policy and communications.

This account — based on interviews with more than a dozen people involved in the government’s investigation and Facebook’s response — provides the first detailed backstory of a 16-month journey in which the company came to terms with an unanticipated foreign attack on the U.S. political system and searched for tools to limit the damage. Among the revelations is how Facebook detected elements of the Russian information operation in June 2016 and then notified the FBI.

Yet in the months that followed, the government and the private sector struggled to work together to diagnose and fix the problem.

The growing political drama over these issues has come at a time of broader reckoning for Facebook, as Zuckerberg has wrestled with whether to take a more active role in combating an emerging dark side on the social network — including fake news, suicides on live video, and allegations that the company was censoring political speech.

These issues have forced Facebook and other Silicon Valley companies to weigh core values, including freedom of speech, against the problems created when malevolent actors use those same freedoms to pump messages of violence, hate and disinformation.

There has been a rising bipartisan clamour, meanwhile, for new regulation of a tech industry that has largely had its way in Washington despite concerns raised by critics about its behaviour.

“There is no question that the idea that Silicon Valley is the darling of our markets and of our society — that sentiment is definitely turning,” said Tim O’Reilly, an adviser to tech executives and chief executive of the influential Silicon Valley-based publisher O’Reilly Media.

The encounter in Lima was not the first time Obama had sought Facebook’s help.

In the aftermath of the December 2015 shooting in San Bernardino, Calif., the president dispatched members of his national security team to huddle with leading Silicon Valley executives over ways to thwart the Islamic State’s practice of using U.S.-based technology platforms to recruit members and inspire attacks.

The result was a summit, on Jan. 8, 2016, that was attended by one of Zuckerberg’s top deputies, chief operating officer Sheryl Sandberg. The outreach effort paid off in the view of the Obama administration when Facebook agreed to set up a special unit to develop tools for finding Islamic State messages and blocking their dissemination.

Facebook’s efforts were aided in part by the relatively transparent ways in which the extremist group sought to build its global brand. Most of its propaganda messages on Facebook incorporated the Islamic State’s distinctive black flag — the kind of image that software programs can be trained to automatically detect.
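
That kind of detection is standard supervised image classification. The Python sketch below is a rough illustration only — a hypothetical stand-in, not Facebook’s actual system, which has never been published. It trains a tiny scikit-learn classifier on colour-histogram features from synthetic images, where the “flagged” class is dominated by dark pixels the way a black-flag graphic would be.

```python
# Illustrative sketch (NOT Facebook's real detector): train a small classifier
# to flag images whose colour distribution resembles a known symbol's palette.
# All images here are synthetic stand-ins generated for the example.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def colour_histogram(image: np.ndarray, bins: int = 8) -> np.ndarray:
    """Reduce an RGB image to a normalized per-channel colour histogram."""
    hist = [np.histogram(image[..., c], bins=bins, range=(0, 255))[0]
            for c in range(3)]
    feats = np.concatenate(hist).astype(float)
    return feats / feats.sum()

# Synthetic data: "flag-like" images are dominated by near-black pixels;
# ordinary images are uniformly random colour. A production system would use
# labelled real screenshots and a deep network rather than histograms.
flagged = [np.clip(rng.normal(20, 15, (32, 32, 3)), 0, 255) for _ in range(50)]
ordinary = [rng.uniform(0, 255, (32, 32, 3)) for _ in range(50)]

X = np.array([colour_histogram(im) for im in flagged + ordinary])
y = np.array([1] * 50 + [0] * 50)  # 1 = contains the symbol

model = LogisticRegression(max_iter=1000).fit(X, y)

test = np.clip(rng.normal(20, 15, (32, 32, 3)), 0, 255)
print("flag probability:", model.predict_proba([colour_histogram(test)])[0, 1])
```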

In contrast, the Russian disinformation effort has proven far harder to track and combat because Russian operatives were taking advantage of Facebook’s core functions, connecting users with shared content and with targeted native ads to shape the political environment in an unusually contentious political season, say people familiar with Facebook’s response.

Unlike the Islamic State, what Russian operatives posted on Facebook was, for the most part, indistinguishable from legitimate political speech. The difference was that the accounts set up to spread the misinformation and hate were illegitimate.

It turned out that Facebook, without realizing it, had stumbled into the Russian operation as it was getting underway in June 2016.

At the time, cybersecurity experts at the company were tracking a Russian hacker group known as APT28, or Fancy Bear, which U.S. intelligence officials considered an arm of the Russian military intelligence service, the GRU, according to people familiar with Facebook’s activities.

Members of the Russian hacker group were best known for stealing military plans and data from political targets, so the security experts assumed that they were planning some sort of espionage operation — not a sweeping disinformation campaign designed to shape the outcome of the U.S. presidential race.

Facebook executives shared with the FBI their suspicions that a Russian espionage operation was in the works, a person familiar with the matter said. An FBI spokesperson had no immediate comment.

Soon thereafter, Facebook’s cyber experts found evidence that members of APT28 were setting up a series of shadowy accounts — including a persona known as Guccifer 2.0 and a Facebook page called DCLeaks — to promote stolen emails and other documents during the presidential race. Facebook officials once again contacted the FBI to share what they had seen.

After the November election, Facebook began to look more broadly at the accounts that had been created during the campaign.

A review by the company found that most of the groups behind the problematic pages had clear financial motives, which suggested that they weren’t working for a foreign government.

But amid the mass of data the company was analyzing, the security team did not find clear evidence of Russian disinformation or ad purchases by Russian-linked accounts.

Nor did any U.S. law enforcement or intelligence officials visit the company to lay out what they knew, said people familiar with the effort, even after the nation’s top intelligence official, James Clapper, testified on Capitol Hill in January that the Russians had waged a massive propaganda campaign online.

As Facebook struggled to find clear evidence of Russian manipulation, the idea was gaining credence in other influential quarters.

In the electrified aftermath of the election, aides to Hillary Clinton and Obama pored over polling numbers and turnout data, looking for clues to explain what they saw as an unnatural turn of events.

One of the theories to emerge from their post-mortem was that Russian operatives who were directed by the Kremlin to support Trump may have taken advantage of Facebook and other social media platforms to direct their messages to American voters in key demographic areas in order to increase enthusiasm for Trump and suppress support for Clinton.

These former advisers didn’t have hard evidence that Russian trolls were using Facebook to micro-target voters in swing districts — at least not yet — but they shared their theories with the House and Senate intelligence committees, which launched parallel investigations into Russia’s role in the presidential campaign in January.

Sen. Mark Warner, vice-chairman of the Senate Intelligence Committee, initially wasn’t sure what to make of Facebook’s role. U.S. intelligence agencies had briefed the Virginia Democrat and other members of the committee about alleged Russian contacts with the Trump campaign and about how the Kremlin leaked Democratic emails to WikiLeaks to undercut Clinton.

But the intelligence agencies had little data on Russia’s use of Facebook and other U.S.-based social media platforms, in part because of rules designed to protect the privacy of communications between Americans.

Facebook’s effort to understand Russia’s multi-faceted influence campaign continued as well.

Zuckerberg announced in a 6,000-word blog post in February that Facebook needed to play a greater role in controlling its dark side.

“It is our responsibility,” he wrote, “to amplify the good effects (of the Facebook platform) and mitigate the bad — to continue increasing diversity while strengthening our common understanding so our community can create the greatest positive impact on the world.”

The extent of Facebook’s internal self-examination became clear in April, when Facebook chief security officer Alex Stamos co-authored a 13-page white paper detailing the results of a sprawling research effort that included input from experts from across the company, who in some cases also worked to build new software aimed specifically at detecting foreign propaganda.

“Facebook sits at a critical juncture,” Stamos wrote in the paper, adding that the effort focused on “actions taken by organized actors (governments or non-state actors) to distort domestic or foreign political sentiment, most frequently to achieve a strategic and/or geopolitical outcome.”

He described how the company had used a technique known as machine learning to build specialized data-mining software that can detect patterns of behaviour — for example, the repeated posting of the same content — that malevolent actors might use.
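
To make that behavioural signal concrete, here is a minimal sketch, under assumed data structures rather than Facebook’s real software: it fingerprints normalized post text and flags accounts that repeatedly post near-identical content. The account names and threshold below are invented for illustration.

```python
# Hypothetical sketch of the repeated-posting signal described above:
# fingerprint normalized post text, then flag accounts that reuse the same
# fingerprint too often. Not Facebook's actual code.
import hashlib
import re
from collections import Counter, defaultdict

def fingerprint(text: str) -> str:
    """Collapse whitespace and case so trivial edits map to the same hash."""
    normalized = re.sub(r"\s+", " ", text.strip().lower())
    return hashlib.sha256(normalized.encode()).hexdigest()

def repeated_posters(posts, min_repeats=3):
    """posts: iterable of (account_id, post_text) pairs.
    Returns accounts that posted some fingerprint >= min_repeats times."""
    counts = defaultdict(Counter)
    for account, text in posts:
        counts[account][fingerprint(text)] += 1
    return {acct for acct, c in counts.items()
            if any(n >= min_repeats for n in c.values())}

sample = [("acct1", "Vote now!"), ("acct1", "VOTE   now!"),
          ("acct1", "vote now!"), ("acct2", "hello world")]
print(repeated_posters(sample))  # {'acct1'}
```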

The software tool was given a secret designation, and Facebook is now deploying it and others in the run-up to elections around the world. It was used in the French election in May, where it helped disable 30,000 fake accounts, the company said. It was put to the test again on Sunday when Germans went to the polls. Facebook declined to share the software tool’s code name. Another recently developed tool shows users when articles have been disputed by third-party fact checkers.

Notably, Stamos’s paper did not raise the topic of political advertising — an omission that was noticed by Capitol Hill investigators.

A few weeks after the French election, Warner flew out to California to visit Facebook in person. It was an opportunity for the senator to press Stamos directly on whether the Russians had used the company’s tools to disseminate anti-Clinton ads to key districts.

Officials said Stamos underlined to Warner the magnitude of the challenge Facebook faced policing political content that looked legitimate.

Stamos told Warner that Facebook had found no accounts that used advertising but agreed with the senator that some likely existed. The difficulty for Facebook was finding them.

Finally, Stamos appealed to Warner for help: If U.S. intelligence agencies had any information about the Russian operation or the troll farms it used to disseminate misinformation, they should share it with Facebook. The company is still waiting, people involved in the matter said.

Warner’s visit spurred Facebook to make some changes in how it was conducting its internal investigation. Instead of searching through impossibly large batches of data, it decided to focus on a subset of political ads.

Technicians then searched for “indicators” that would link those ads to Russia. To narrow down the search further, Facebook zeroed in on a Russian entity known as the Internet Research Agency, which had been publicly identified as a troll farm.
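
In principle, an indicator search of this kind is a metadata filter over ad records. The sketch below is purely hypothetical: the field names and indicator values are invented for illustration, since the actual signals Facebook used have not been disclosed.

```python
# Hypothetical indicator filter over political ad records. The fields and
# values below are illustrative assumptions, not Facebook's known criteria.
RUSSIA_INDICATORS = {
    "payment_currency": {"RUB"},
    "account_language": {"ru", "ru_RU"},
}

def matches_indicators(ad: dict) -> bool:
    """True if any metadata field matches a known indicator value."""
    return any(ad.get(field) in values
               for field, values in RUSSIA_INDICATORS.items())

ads = [
    {"id": 1, "payment_currency": "USD", "account_language": "en"},
    {"id": 2, "payment_currency": "RUB", "account_language": "en"},
]
suspicious = [ad for ad in ads if matches_indicators(ad)]
print([ad["id"] for ad in suspicious])  # [2]
```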

By early August, Facebook had identified more than 3,000 ads addressing social and political issues that ran in the United States between 2015 and 2017 and that appear to have come from accounts associated with the Internet Research Agency.

After making the discovery, Facebook reached out to Warner’s staff to share what they had learned.

Congressional investigators say the disclosure only scratches the surface. One called Facebook’s discoveries thus far “the tip of the iceberg.” Nobody really knows how many accounts are out there and how to prevent more of them from being created to shape the next election — and turn American society against itself.

TECHNICIANS SEARCHED FOR “INDICATORS” THAT WOULD LINK ADS TO RUSSIA.

JOSH EDELSON / AFP / GETTY IMAGES The Facebook sign and logo in Menlo Park, Calif. The social media platform played a huge role in the 2016 U.S. presidential election.

JUSTIN SULLIVAN / GETTY IMAGES Then-U.S. President Barack Obama talks with Facebook CEO Mark Zuckerberg during a town hall meeting at Facebook headquarters in April 2011 in Palo Alto, Calif. Obama met with Zuckerberg last year to warn of the dangers of Facebook being used to spread...

JUSTIN SULLIVAN / GETTY IMAGES U.S. President Barack Obama and Facebook CEO Mark Zuckerberg in April 2011 in Palo Alto, Calif. Warnings from Obama prompted Facebook to investigate the use of their platform by Russian hackers to spread propaganda and misinformation during the 2016...

MANDEL NGAN / AFP / GETTY IMAGES Originally skeptical of warnings that his platform was being manipulated, Facebook CEO Zuckerberg, seen here with Obama in 2011, started an investigation that so far has discovered at least 3,000 political ads traced to a recognized Russian troll farm.
