
Facebook’s staff dissects its election role

Employees sounded alarm about falsehoods spread on the website

By Ryan Mac and Sheera Frenkel

Sixteen months before last November’s presidential election, a researcher at Facebook described an alarming development. She was getting content about the conspiracy theory QAnon within a week of opening an experimental account, she wrote in an internal report.

On Nov. 5, two days after the election, another Facebook employee posted a message alerting colleagues that comments with “combustible election misinformation” were visible below many posts.

Four days after that, a company data scientist wrote in a note to his co-workers that 10 percent of all U.S. views of political material — a startlingly high figure — were of posts that alleged the vote was fraudulent.

In each case, Facebook’s employees sounded an alarm about misinformation and inflammatory content on the platform and urged action — but the company failed or struggled to address the issues.

The internal dispatches were among a set of Facebook documents obtained by the New York Times that give new insight into what happened inside the social network before and after the November election, when the company was caught flat-footed as users weaponized its platform to spread lies about the vote.

Facebook has publicly blamed the proliferation of election falsehoods on former President Donald Trump and other social platforms.

In mid-January, Sheryl Sandberg, Facebook’s chief operating officer, said the Jan. 6 riot at the Capitol was “largely organized on platforms that don’t have our abilities to stop hate.” Mark Zuckerberg, Facebook’s CEO, told lawmakers in March that the company “did our part to secure the integrity of our election.”

But the company documents show the degree to which Facebook knew of extremist movements and groups on its site that were trying to polarize American voters before the election. The documents also give new detail on how aware company researchers were after the election of the flow of misinformation that claimed votes had been manipulated against Trump.

What the documents don’t offer is a complete picture of decision-making inside Facebook. Some internal studies suggested that the company struggled to exert control over the scale of its network and how quickly information spread, while other reports hinted that Facebook was concerned about losing engagement or damaging its reputation.

Yet what was unmistakable was that Facebook’s own employees believed the social network could have done more, according to the documents.

“Enforcement was piecemeal,” read one internal review in March of Facebook’s response to Stop the Steal groups, which contended that the election was rigged against Trump. The report’s authors said they hoped the post-mortem could be a guide for how Facebook could “do this better next time.”

Many of the dozens of Facebook documents reviewed by the Times haven’t been previously reported.

Some of the internal reports were initially obtained by Frances Haugen, a former Facebook product manager turned whistleblower.

Andy Stone, a Facebook spokesperson, said the company was “proud” of the work it did to protect the 2020 election. He said Facebook worked with law enforcement, rolled out safety measures, and closely monitored what was on its platform.

“The measures we did need remained in place well into February, and some, like not recommending new, civic or political groups, remain in place to this day,” he said. “The responsibility for the violence that occurred on Jan. 6 lies with those who attacked our Capitol and those who encouraged them.”

For years, Facebook employees warned of the social network’s potential to radicalize users, according to the documents.

In July 2019, a company researcher studying polarization made a startling discovery: A test account she had made for a “conservative mom” in North Carolina received conspiracy theory content recommendations within a week of joining the social network.

The internal research, titled “Carol’s Journey to QAnon,” detailed how the Facebook account for an imaginary woman named Carol Smith had followed pages for Fox News and Sinclair Broadcasting. Within days, Facebook had recommended pages and groups related to QAnon, the conspiracy theory that falsely claimed Trump was facing down a shadowy cabal of Democratic pedophiles.

By the end of three weeks, Carol Smith’s Facebook account feed had devolved further. It “became a constant flow of misleading, polarizing and low-quality content,” the researcher wrote.

“We’ve known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and groups,” the researcher wrote. “In the meantime, the fringe group/set of beliefs has grown to national prominence with QAnon congressional candidates and QAnon hashtags and groups trending in the mainstream.”

Rioters clashing with police use a ladder to try to force entry into the Capitol during the insurrection on Jan. 6 in Washington, D.C. (Lev Radin / Tribune News Service)

Former Facebook employee Frances Haugen spoke to a Senate committee on the company’s failings leading up to Jan. 6. (Alex Brandon / Associated Press)
