Northwest Arkansas Democrat-Gazette

Documents indicate Facebook scramble as Capitol attacked

- COMPILED BY DEMOCRAT-GAZETTE STAFF FROM WIRE REPORTS

As supporters of Trump stormed the U.S. Capitol on Jan. 6, 2021, battling police and forcing lawmakers into hiding, an insurrection of a different kind was taking place inside the world’s largest social media company.

Thousands of miles away, in California, Facebook engineers were racing to tweak internal controls to slow the spread of misinformation and inciteful content.

Emergency actions — some of which were rolled back after the 2020 election — included banning Trump, freezing comments in groups with records of hate speech, filtering out the “Stop the Steal” rallying cry and empowering content moderators to act more assertively by labeling the U.S. a “Temporary High Risk Location” for political violence.

But other measures, such as preventing groups from changing their names to terms such as Stop the Steal, were not fully implemented because of last-minute technology glitches, according to a company spreadsheet.

At the same time, frustration inside Facebook rose over what some saw as the company’s halting and inconsistent response to rising extremism in the U.S.

“Haven’t we had enough time to figure out how to manage discourse without enabling violence?” one employee wrote on an internal message board at the height of the Jan. 6 turmoil. “We’ve been fueling this fire for a long time, and we shouldn’t be surprised it’s now out of control.”

It’s a question that still hangs over the company today, as Congress and regulators investigate Facebook’s part in the Jan. 6 riots.

Facebook has publicly blamed the proliferation of election falsehoods on former President Donald Trump and other social platforms.

In mid-January, Sheryl Sandberg, Facebook’s chief operating officer, said the Jan. 6 riot was “largely organized on platforms that don’t have our abilities to stop hate.”

Mark Zuckerberg, Facebook’s CEO, told lawmakers in March that the company “did our part to secure the integrity of our election.”

But newly obtained company documents show the degree to which Facebook knew of extremist movements and groups on its site that were trying to polarize American voters before the election.

The documents also give new details on how aware company researchers were, after the election, of the flow of misinformation that posited that votes had been manipulated against Trump.

Sixteen months before last November’s presidential election, a researcher at Facebook described an alarming development. She was getting content about the conspiracy theory QAnon within a week of opening an experimental account, she wrote in an internal report.

On Nov. 5, two days after the election, another Facebook employee posted a message alerting colleagues that comments with “combustible election misinformation” were visible below many posts.

Four days after that, a company data scientist wrote in a note to his co-workers that 10% of all U.S. views of political material — a high figure — were of posts that alleged that the vote was fraudulent.

In each case, Facebook’s employees sounded an alarm about misinformation and inflammatory content on the platform and urged action — but the company failed or struggled to address the issues.

The internal dispatches were among a set of Facebook documents obtained by The New York Times that give new insight into what happened inside the social media network before and after the November election, when the company was caught flat-footed as users weaponized its platform to spread lies about the vote.

What the documents do not offer is a complete picture of decision-making inside Facebook. Some internal studies suggested that the company struggled to exert control over the scale of its network and how quickly information spread, while other reports hinted that Facebook was concerned about losing engagement or damaging its reputation.

Yet what was unmistakable was that Facebook’s own employees believed the social network could have done more, according to the documents.

“Enforcement was piecemeal,” read one internal review in March of Facebook’s response to Stop the Steal groups, which contended that the election was rigged against Trump. The report’s authors said they hoped the post-mortem could be a guide for how Facebook could “do this better next time.”

Many of the dozens of Facebook documents reviewed by the Times have not been previously reported. Some of the internal reports were initially obtained by Frances Haugen, a former Facebook product manager turned whistleblower.

Andy Stone, a Facebook spokesperson, said the company was “proud” of the work it did to protect the 2020 election. He said Facebook worked with law enforcement, rolled out safety measures and closely monitored what was on its platform.

“The measures we did need remained in place well into February, and some, like not recommending new, civic or political groups remain in place to this day,” he said.

“The responsibility for the violence that occurred on Jan. 6 lies with those who attacked our Capitol and those who encouraged them.”

A QANON JOURNEY

For years, Facebook employees warned of the social network’s potential to radicalize users, according to the documents.

In July 2019, a company researcher studying polarization made a startling discovery: A test account that she had made for a “conservative mom” in North Carolina received conspiracy theory content recommendations within a week of joining the social network.

The internal research, titled “Carol’s Journey to QAnon,” detailed how the Facebook account for an imaginary woman named Carol Smith had followed pages for Fox News and Sinclair Broadcasting. Within days, Facebook had recommended pages and groups related to QAnon, the conspiracy theory that falsely claimed that Trump was facing down a shadowy cabal of Democratic pedophiles.

By the end of three weeks, Carol Smith’s Facebook account feed had devolved further. It “became a constant flow of misleading, polarizing and low-quality content,” the researcher wrote.

“We’ve known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and groups,” the researcher wrote.

“In the meantime, the fringe group/set of beliefs has grown to national prominence with QAnon congressional candidates and QAnon hashtags and groups trending in the mainstream.”

INTO ELECTION DAY

Facebook tried to leave little to chance with the 2020 election.

For months, the company refined emergency measures known as “break glass” plans — such as slowing down the formation of new Facebook groups — in case of a contested result. Facebook also hired tens of thousands of employees to secure the site for the election, consulted with legal and policy experts, and expanded partnerships with fact-checking organizations.

In a September 2020 public post, Zuckerberg wrote that his company had “a responsibility to protect our democracy.” He highlighted a voter registration campaign that Facebook had funded and laid out steps the company had taken — such as removing voter misinformation and blocking political ads — to “reduce the chances of violence and unrest.”

But many of those safeguards did not last, according to Haugen. “As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety,” she said in an interview with “60 Minutes.”

Many measures appeared to help. Election Day came and went without major hitches at Facebook.

But after the vote counts showed a tight race between Trump and Joe Biden, then the Democratic presidential candidate, Trump posted in the early hours of Nov. 4 on Facebook and Twitter: “They are trying to STEAL the Election.”

The internal documents show that users had found ways on Facebook to undermine confidence in the vote.

On Nov. 5, one Facebook employee posted a message to an internal online group called “News Feed Feedback.”

In his note, he told colleagues that voting misinformation was conspicuous in the comments section of posts. Even worse, the employee said, comments with the most incendiary election misinformation were being amplified to appear at the top of comment threads, spreading inaccurate information.

Even so, Facebook began relaxing its emergency steps in November, three former employees said.

The critical postelection period appeared to have passed, and the company was concerned that some preelection measures, such as reducing the reach of fringe right-wing pages, would lead to user complaints, they said.

JAN. 6

On the morning of Jan. 6, with protesters gathered near the U.S. Capitol building in Washington, some Facebook employees turned to a spreadsheet. There, they began cataloging the measures that the company was taking against election misinformation and inflammatory content on its platform.

User complaints about posts that incited violence had soared that morning, according to data in the spreadsheet.

Over the course of that day, as a mob stormed the Capitol, the employees updated the spreadshee­t with actions that were being taken, one worker involved in the effort said. Of the dozens of steps that Facebook employees recommende­d, some — such as allowing company engineers to mass-delete posts that were being reported for pushing violence — were implemente­d.

Zuckerberg and Mike Schroepfer, Facebook’s chief technology officer, posted notes internally about their sadness over the Capitol riot. But some Facebook employees responded angrily, according to message threads viewed by the Times.

“I wish I felt otherwise, but it’s simply not enough to say that we’re adapting, because we should have adapted already long ago,” one employee wrote.

“There were dozens of Stop the Steal groups active up until yesterday, and I doubt they minced words about their intentions.”

Another wrote: “I’ve always felt that on the balance my work has been meaningful and helpful to the world at large. But, honestly, this is a really dark day for me here.”

In a Jan. 7 report, the scope of what had occurred on Facebook became clear. User reports of content that potentially violated the company’s policies were seven times as high as in previous weeks, the report said. Several of the most reported posts, researchers found, “suggested the overthrow of the government” or “voiced support for the violence.”

POST-MORTEMS

In March, Facebook researchers published two internal reports assessing the company’s role in social movements that pushed the election fraud lies.

In one, a group of employees said Facebook had exhibited “the pattern.” That involved the company initially taking “limited or no action” against QAnon and election delegitimization movements, only to remove that content after the movements had already gained traction. The document was earlier reported by The Wall Street Journal.

Part of the problem, the employees wrote, was that Facebook’s election misinformation rules left too many gray areas.

As a result, posts that “could be construed as reasonable doubts about election processes” were not removed because they did not violate the letter of those rules.

Those posts then created an environment that contributed to social instability, the report said.

Another report, titled “Stop the Steal and Patriot Party: The Growth and Mitigation of an Adversarial Harmful Movement,” laid out how people had exploited Facebook’s groups feature to rapidly form election delegitimization communities on the site before Jan. 6.

“Hindsight being 20/20 makes it all the more important to look back, to learn what we can about the growth of the election delegitimizing movements that grew, spread conspiracy, and helped incite the Capitol insurrection,” the report stated.

Another study turned over to congressional investigators, titled “Understanding the Dangers of Harmful Topic Communities,” discussed how like-minded individuals embracing a borderline topic or identity can form “echo chambers” for misinformation that normalizes harmful attitudes, spurs radicalization and can even provide a justification for violence.

Examples of such harmful communitie­s include QAnon and hate groups promoting theories of a race war.

“The risk of offline violence or harm becomes more likely when like-minded individuals come together and support one another to act,” the study concludes.
