The Morning Journal (Lorain, OH)

Regulate social media business models to mitigate harmful speech

- Yosef Getachew is director of the Media & Democracy Program for Common Cause. He wrote this for InsideSources.com.

On Jan. 6, a violent mob of insurrectionists stormed the U.S. Capitol in an attempt to overturn our country’s 2020 presidential election. The attack, which resulted in the deaths of five people, was fueled by a constant stream of disinformation and hate speech that Donald Trump and other bad actors flooded across social media platforms before, during, and after the election. Despite their civic integrity and content moderation policies, platforms have been slow or unwilling to take action to limit the spread of content designed to disrupt our democracy.

This failure is inherently tied to platforms’ business models and practices, which incentivize the proliferation of harmful speech. The content that generates the most engagement on social media tends to be disinformation, hate speech and conspiracy theories. Platforms have implemented business models designed to maximize user engagement and prioritize profits over combating harmful content.

While the First Amendment limits our government’s ability to regulate speech, there are tools at its disposal that can rein in the social media business practices that bad actors exploit to spread and amplify speech that interferes with our democracy.

The core component of every major social media platform’s business model is to collect as much user data as possible. Platforms then share relevant data points with advertisers for targeted advertising. It should come as no surprise that disinformation agents exploit social media platforms’ data-collection practices and targeted advertising capabilities to micro-target harmful content.

Comprehensive privacy legislation, if passed, can require data minimization standards, which limit the collection and sharing of personal data to what is necessary to provide service to the user. Legislation can also restrict the use of personal data to engage in discriminatory practices that spread harmful content such as online voter suppression. Without the vast troves of data platforms collect on their users, bad actors will face more obstacles targeting users with disinformation.

In addition to data-collection practices, platforms use algorithms that determine what content users see. Algorithms track user preferences through clicks, likes and other forms of engagement. Platforms optimize their algorithms to maximize user engagement, which can mean leading users down a rabbit hole of hate speech, disinformation and conspiracy theories. Algorithms can also amplify disinformation, as when conspiracy theorists used the “stop the steal” moniker across social media platforms to organize and mobilize offline violence.

Unfortunately, platform algorithms are a “black box,” with little known about their inner workings. Congress should pass legislation that holds platform algorithms accountable.

Federal agencies with enforcement and rulemaking capabilities can apply their authority to limit the spread of harmful online speech that results from platform business practices. For example, the Federal Trade Commission can use its enforcement power against unfair and deceptive practices to investigate platforms for running ads with election disinformation despite having policies that prohibit such content. The Federal Election Commission can complete its longstanding rulemaking to require greater disclosure of online political advertisements, providing greater transparency about which entities are trying to influence our elections.

Outside of legislative and regulatory processes, the Biden administration should create a task force for the internet, consisting of representatives from federal, state and local governments, business, labor, public interest organizations, academia and journalism. The task force would identify tools to combat harmful speech online and make recommendations for an internet that would better serve the public interest.

There is no silver bullet for eliminating disinformation, hate speech and other harmful online content. In addition to these policy ideas, federal lawmakers must provide greater support for local journalism to meet the information needs of communities.

But social media companies have shown that they value profits over the safety and security of our democracy. Lawmakers and regulators must enact policies, as part of a holistic approach, to hold social media platforms accountable for the proliferation of harmful and false content. The insurrection revealed that our democracy may depend on that accountability.
