How Facebook amplifies anger and lies
What happened
Facebook was hit this week with a tidal wave of revelations from leaked internal documents, which depict the social media giant as a malignant force that knowingly prioritized profit over public good. A consortium of 17 media organizations released dozens of news stories culled from thousands of pages of reports, memos, discussion threads, and other documents provided by Frances Haugen, a Facebook product manager turned whistleblower. The documents reveal disillusioned employees expressing dismay when CEO and founder Mark Zuckerberg and other top executives ignored Facebook’s own internal research on its damaging social effects. Zuckerberg, employees said, would not listen to internal or external calls for reform, because he prizes user engagement and ad revenue above all else. “It’s not normal for a large number of people in the ‘make the site safe’ team to leave saying, ‘Hey, we’re actively making the world worse FYI,’” wrote one employee on an internal message board. Another wrote: “History will not judge us kindly.”
The documents showed that Facebook intentionally amplified divisive content in its News Feed, promoting posts that provoked clicks on the “angry” emoji for years because they increased user engagement—even though they were more likely to contain misinformation and “leave users divided and depressed,” as a company researcher wrote. Documents depict how lax oversight in non-English-speaking countries such as India allowed the proliferation of hate speech and disinformation that fomented political violence. In the U.S., the documents revealed, Facebook executives deliberately allowed right-wing websites such as Breitbart and the Daily Wire to violate content rules by posting false or incendiary material, for fear that sanctioning them would unleash retaliation by conservatives or former President Trump. In some documents, employees express disgust about Facebook’s role in the Jan. 6 Capitol invasion, and its use by QAnon followers and Trump supporters to spread misinformation about the election and calls to violence. “We’ve been fueling this fire for a long time, and we shouldn’t be surprised it’s now out of control,” wrote one staffer.
In a call with financial analysts, Zuckerberg called the news coverage “a coordinated effort to selectively use leaked documents to create a false picture about our company.” The same day, Haugen told British lawmakers that Facebook’s algorithms promote and reward “anger and hate,” and called for more-aggressive regulation of Zuckerberg’s empire. “Until the incentives change, Facebook will not change,” she said.
What the editorials said
“The revelations about how Facebook platforms can spread misinformation and undermine democracy keep coming,” said The Sacramento Bee. Now we learn how its lax “security safeguards” following last year’s election “allowed the undemocratic lunacy” that led to “the storming of the U.S. Capitol on Jan. 6 to fester.” Facebook deserves all “the public scorn it has earned.”
The site has wreaked havoc in “vulnerable communities around the world,” said the Minneapolis Star-Tribune. Facebook has more than 2.8 billion global users—but applies only 16 percent of its monitoring efforts outside the U.S. “The results can be catastrophic.” The site allowed the proliferation of anti-Muslim hate speech in India that an internal report linked to deadly riots. A test News Feed set up by a Facebook researcher there became “a near-constant barrage of polarizing nationalist content, misinformation, and violence and gore.” In other countries, the site was used by drug cartels to find new members and by human traffickers to lure in women.
What the columnists said
The documents “leave little room for doubt about Facebook’s crucial role in advancing the cause of authoritarianism in America and around the world,” said Adrienne LaFrance in TheAtlantic.com. Employees alarmed at Facebook’s lack of “a moral compass” pleaded with company leaders to address how its algorithms amplify extremism and misinformation and encourage hatred and polarization. “Again and again they were ignored.”
The truth is that, with billions of users, any effort by Facebook to screen out repellent content “is always going to be a game of whack-a-mole,” said Jim Geraghty in NationalReview.com. “If you give the whole world a blank canvas,” some will make art and some “will create horrible, hateful stuff.” Is that really Facebook’s fault? How could it possibly control what users post?
It’s hard not to be cynical that these revelations will make any real difference, said Jacob Silverman in NYMag.com. After all the previous “frontpage blockbusters,” congressional hearings, and “Zuckerberg’s robotic promises to do better,” Facebook’s “power and profitability continue to grow,” with $9.2 billion in profits last quarter, and no real competition. With only Zuckerberg’s “whims” dictating its direction, “the public sphere seems to be helpless to tame Facebook, and our lawmakers are similarly useless.”
There’s only one way to reach Zuckerberg, said Greg Bensinger in The New York Times: if advertisers “start paring back their spending.” It may be a tall order to ask companies to deny themselves access to 3.6 billion marketing targets. “But if aligning with a site facilitating human trafficking, ethnic cleansing, and vicious cartels isn’t sufficient to give advertisers pause, it’s hard to imagine what would.”