Social media algorithms threaten democracy, researchers tell senators
WASHINGTON — A Senate hearing on Tuesday pitted three powerful social media companies against researchers who testified that the algorithms the platforms use to generate revenue by keeping users engaged pose existential threats to individual thought and to democracy itself.
The hearing before the Judiciary Subcommittee on Privacy, Technology and the Law featured a bipartisan approach to the issue from the new chairman, Democratic Sen. Chris Coons of Delaware, and ranking member, GOP Sen. Ben Sasse of Nebraska. Algorithms can be useful, the senators agreed, but they also amplify harmful content and may need to be regulated.
Government relations and content policy executives from Facebook, YouTube, and Twitter described for the senators how their algorithms help them identify and remove content in violation of their terms of use, including hateful or harassing speech and disinformation. And they said their algorithms have begun “down-ranking,” or suppressing, “borderline” content.
Monika Bickert, Facebook’s vice president for content policy, said it would be “self-defeating” for social media companies to direct users toward extreme content.
But Tristan Harris, a former industry executive who became a data ethicist and now runs the Center for Humane Technology, told the committee that no matter what steps the companies took, their core business would still depend on steering users into individual “rabbit holes of reality.”
“It’s almost like having the heads of Exxon, BP, and Shell here and asking about what you’re doing to responsibly stop climate change,” Harris said. “Their business model is to create a society that’s addicted, outraged, polarized, performative and disinformed.”
“While they can try to skim the major harm off the top and do what they can — and we want to celebrate that, we really do — it’s just that they are fundamentally trapped in something they cannot change,” Harris continued.
Joan Donovan, the research director at the Harvard Kennedy School’s Shorenstein Center on Media, Politics and Public Policy, said the platforms should be required to offer users a “public interest” version of their news feeds or timelines and provide robust tools to moderate content.
“We didn’t build airports overnight but tech companies are flying the planes with nowhere to land,” Donovan said. “The cost of doing nothing is nothing short of democracy’s end.”
Coons and Sasse commended the platforms for steps taken to curb the spread of hate speech but questioned whether they would do enough if left to their own devices. Coons noted that Facebook recently took special measures to limit misinformation and violent content ahead of the verdict in the trial of former Minneapolis police officer Derek Chauvin, who was convicted of murdering George Floyd, a Black man, in May 2020.