Milwaukee Journal Sentinel

Facebook faced its own insurrection Jan. 6

By Alan Suderman and Joshua Goodman

WASHINGTON – As supporters of Donald Trump stormed the U.S. Capitol on Jan. 6, fighting police and forcing lawmakers into hiding, an insurrection of a different kind was taking place inside the world’s largest social media company.

Thousands of miles away, in California, Facebook engineers were racing to tweak internal controls to slow the spread of misinformation and inciteful content. Emergency actions – some of which were rolled back after the 2020 election – included banning Trump, freezing comments in groups with a record of hate speech, filtering out the “Stop the Steal” rallying cry and empowering content moderators to act more assertively by labeling the U.S. a “Temporary High Risk Location” for political violence.

At the same time, frustration inside Facebook erupted over what some saw as the company’s halting and inconsistent response to rising extremism in the U.S.

“Haven’t we had enough time to figure out how to manage discourse without enabling violence?” one employee wrote on an internal message board at the height of the Jan. 6 turmoil. “We’ve been fueling this fire for a long time and we shouldn’t be surprised it’s now out of control.”

It’s a question that still hangs over the company today, as Congress and regulators investigate Facebook’s part in the Jan. 6 riots.

New internal documents provided by former Facebook employee-turned-whistleblower Frances Haugen provide a rare glimpse into how the company appears to have stumbled into the Jan. 6 riot. It quickly became clear that even after years under the microscope for insufficiently policing its platform, the social network had missed how riot participants spent weeks vowing – on Facebook itself – to stop Congress from certifying Joe Biden’s election victory.

The documents also appear to bolster Haugen’s claim that Facebook put its growth and profits ahead of public safety, opening the clearest window yet into how Facebook’s conflicting impulses – to safeguard its business and protect democracy – clashed in the days and weeks leading up to the Jan. 6 riots.

This story is based in part on disclosures Haugen made to the Securities and Exchange Commission and provided to Congress in redacted form by Haugen’s legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including the Associated Press.

What Facebook called “Break the Glass” emergency measures put in place on Jan. 6 were essentially a toolkit of options designed to stem the spread of dangerous or violent content that the social network had first used in the run-up to the bitter 2020 election. As many as 22 of those measures were rolled back at some point after the election, according to an internal spreadsheet analyzing the company’s response.

“As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety,” Haugen said in an interview with “60 Minutes.”

An internal Facebook report following Jan. 6, previously reported by BuzzFeed, faulted the company for having a “piecemeal” approach to the rapid growth of “Stop the Steal” pages, related misinformation sources, and violent and inciteful comments.

Facebook said the situation is more nuanced and that it carefully calibrates its controls to react quickly to spikes in hateful and violent content, as it did on Jan. 6. The company said it’s not responsible for the actions of the rioters and that having stricter controls in place prior to that day wouldn’t have helped.

Facebook’s decisions to phase certain safety measures in or out took into account signals from the Facebook platform as well as information from law enforcement, said spokeswoman Dani Lever.

“When those signals changed, so did the measures,” she said.

Lever said some of the measures stayed in place well into February and others remain active today.

Research conducted by Facebook well before the 2020 campaign left little doubt that its algorithm could pose a serious danger of spreading misinformation and potentially radicalizing users.

One 2019 study, entitled “Carol’s Journey to QAnon – A Test User Study of Misinfo & Polarization Risks Encountered through Recommendation Systems,” described results of an experiment conducted with a test account established to reflect the views of a prototypical “strong conservative” – but not extremist – 41-year-old North Carolina woman. This test account, using the fake name Carol Smith, indicated a preference for mainstream news sources like Fox News, followed humor groups that mocked liberals, embraced Christianity and was a fan of Melania Trump.

Within a day, page recommendations for this account generated by Facebook itself had evolved to a “quite troubling, polarizing state,” the study found. By Day 2, the algorithm was recommending more extremist content, including a QAnon-linked group.
