East Bay Times

Amid the Capitol riot, Facebook was dealing with its own insurrection

By Alan Suderman and Joshua Goodman

WASHINGTON >> As supporters of Donald Trump stormed the U.S. Capitol on Jan. 6, battling police and forcing lawmakers into hiding, an insurrection of a different kind was taking place inside the world’s largest social media company.

Thousands of miles away, in California, Facebook engineers were racing to tweak internal controls to slow the spread of misinformation and inciteful content. Emergency actions — some of which were rolled back after the 2020 election — included banning Trump, freezing comments in groups with a record of hate speech, filtering out the “Stop the Steal” rallying cry and empowering content moderators to act more assertively by labeling the U.S. a “Temporary High Risk Location” for political violence.

At the same time, frustration inside Facebook erupted over what some saw as the company’s halting and inconsistent response to rising extremism in the U.S.

“Haven’t we had enough time to figure out how to manage discourse without enabling violence?” one employee wrote on an internal message board at the height of the Jan. 6 turmoil. “We’ve been fueling this fire for a long time and we shouldn’t be surprised it’s now out of control.”

It’s a question that still hangs over the company today as Congress and regulators investigate Facebook’s part in the Jan. 6 riots.

New internal documents provided by former Facebook employee-turned-whistleblower Frances Haugen provide a rare glimpse into how the company appears to have simply stumbled into the Jan. 6 riot. It quickly became clear that even after years under the microscope for insufficiently policing its platform, the social network had missed how riot participants spent weeks vowing — on Facebook itself — to stop Congress from certifying Joe Biden’s election victory.

The documents also appear to bolster Haugen’s claim that Facebook put its growth and profits ahead of public safety, opening the clearest window yet into how Facebook’s conflicting impulses — to safeguard its business and protect democracy — clashed in the days and weeks leading up to the attempted Jan. 6 coup.

This story is based in part on disclosures Haugen made to the Securities and Exchange Commission and provided to Congress in redacted form by Haugen’s legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.

What Facebook called “Break the Glass” emergency measures put in place Jan. 6 were essentially a toolkit of options designed to stem the spread of dangerous or violent content that the social network had first used in the run-up to the bitter 2020 election. As many as 22 of those measures were rolled back at some point after the election, according to an internal spreadsheet analyzing the company’s response.

“As soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety,” Haugen said in an interview with “60 Minutes.”

An internal Facebook report following Jan. 6, previously reported by BuzzFeed, faulted the company for having a “piecemeal” approach to the rapid growth of “Stop the Steal” pages, related misinformation sources, and violent and inciteful comments.

Facebook says the situation is more nuanced and that it carefully calibrates its controls to react quickly to spikes in hateful and violent content, as it did on Jan. 6. The company said it’s not responsible for the actions of the rioters and that having stricter controls in place prior to that day wouldn’t have helped.

Facebook’s decisions to phase certain safety measures in or out took into account signals from the Facebook platform as well as information from law enforcement, said spokeswoman Dani Lever. “When those signals changed, so did the measures.”

Lever said some of the measures stayed in place well into February, and others remain active today.

Some employees were unhappy with Facebook’s handling of problematic content even before the Jan. 6 riots. One employee who departed the company in 2020 left a long note charging that promising new tools, backed by strong research, were being constrained by Facebook for “fears of public and policy stakeholder responses” (translation: concerns about negative reactions from Trump allies and investors).

“Similarly (though even more concerning), I’ve seen already built & functioning safeguards being rolled back for the same reasons,” wrote the employee, whose name is blacked out.

Research conducted by Facebook well before the 2020 campaign left little doubt that its algorithm could pose a serious danger of spreading misinformation and potentially radicalizing users.

One 2019 study, titled “Carol’s Journey to QAnon — A Test User Study of Misinfo & Polarization Risks Encountered through Recommendation Systems,” described results of an experiment conducted with a test account established to reflect the views of a prototypical “strong conservative” — but not extremist — 41-year-old North Carolina woman. This test account, using the fake name Carol Smith, indicated a preference for mainstream news sources like Fox News, followed humor groups that mocked liberals, embraced Christianity and was a fan of Melania Trump.

Within a single day, page recommendations for this account generated by Facebook itself had evolved to a “quite troubling, polarizing state,” the study found. By day 2, the algorithm was recommending more extremist content, including a QAnon-linked group, which the fake user didn’t join because she wasn’t innately drawn to conspiracy theories.

A week later, the test subject’s feed featured “a barrage of extreme, conspiratorial and graphic content,” including posts reviving the Barack Obama birther lie and linking the Clintons to the murder of a former Arkansas state senator. Much of the content was pushed by dubious groups run from abroad or by administrators with a track record of violating Facebook’s rules on bot activity.

Those results led the researcher, whose name was redacted by the whistleblower, to recommend safety measures ranging from removing content with known conspiracy references and disabling “top contributor” badges for misinformation commenters to lowering the threshold number of followers required before Facebook verifies a page administrator’s identity.

Among the other Facebook employees who read the research, the response was almost universally supportive.

“Hey! This is such a thorough and well-outlined (and disturbing) study,” one user wrote, the name blacked out by the whistleblower. “Do you know of any concrete changes that came out of this?”

Facebook said the study was one of many examples of its commitment to continually studying and improving its platform.

Another study turned over to congressional investigators, “Understanding the Dangers of Harmful Topic Communities,” discussed how like-minded individuals embracing a borderline topic or identity can form “echo chambers” for misinformation that normalizes harmful attitudes, spurs radicalization and can even provide a justification for violence.

Examples of such harmful communities include QAnon and hate groups promoting theories of a race war.

“The risk of offline violence or harm becomes more likely when like-minded individuals come together and support one another to act,” the study concludes.

JOSE LUIS MAGANA — THE ASSOCIATED PRESS: Insurrectionists try to open a door of the U.S. Capitol on Jan. 6. New internal documents provided by former Facebook employee-turned-whistleblower Frances Haugen provide a rare glimpse into how the company, after years under the microscope for the policing of its platform, appears to have simply stumbled into the Jan. 6 riot.
