Lodi News-Sentinel

Repealing Section 230 would limit Americans’ speech


Section 230 of the Communications Decency Act prevents digital intermediaries from being treated as the “publisher or speaker” of their users’ speech and blocks litigation over platforms’ decisions to remove speech they deem violent, obscene or otherwise objectionable. Platforms are under no obligation to remove speech, with some exceptions, but cannot be required to carry speech either. The law applies universally to digital intermediaries; Facebook is not liable for its users’ speech, and The New York Times is not liable for its comments section. By properly placing responsibility for harmful or unlawful speech with the speaker, Section 230 maximizes the ability of companies to produce publishing tools.

In the 25 years since its passage, this prescient rule has paid tremendous dividends. Americans are served by a dizzying array of publishing intermediaries, allowing us to communicate in real time via text, audio and video. We have created forums for work, worship, play and romance, serving every imaginable niche interest and minority. Of course, not all interconnection has been positive. Extremists and criminals use the internet too. Some argue that amending or repealing Section 230 would compel platforms to suppress extremist speech and criminal activity.

However, exposing platforms to broad liability for user speech would lead to the removal of much more than dangerous speech.

Platforms already make extensive use of their ability to remove unwanted speech, filtering spam, threats, advertisements for illegal goods, foreign propaganda and even simply off-topic speech. Popular platforms review millions of posts a day, often with the assistance of imperfect software. At this scale, some innocent speech will inevitably be misunderstood, mislabeled and removed. Over the past few years, major platforms’ rules have become more stringent and expansive, prompting concerns about censorship and bias.

Demanding that platforms assume liability for their users’ speech would at best exacerbate the accidental removal of innocent speech. It also runs the risk of limiting who can speak online at all. Digital intermediaries usually review speech after publication. Speech may be flagged, whether by other users, human moderators or algorithms, and placed in a queue for adjudication. Section 230 allows platforms to remain open by default and worry about excluding misuse when it occurs, giving a voice to everyone with an internet connection.

In contrast, newspapers and other traditional publishers filter, edit and modify submissions before publication. While this allows them to safely assume full ownership of the speech they publish, it dramatically limits who can speak. Editing is a laborious and time-consuming process. Even if a newspaper wanted to publish every letter to the editor, it would have neither the space nor the time to do so. This model often produces consistently high-quality speech, but tends to favor some perspectives over others, offering only a narrow slice of elite sentiment.

Repealing Section 230 would make social media more like traditional media by making it exclusive. With limited resources to review speech before publication, platforms would have to determine whose perspectives should be prioritized. There is little reason to think their selections would differ greatly from those of newspapers. If replies and responses had to be reviewed as well, social media would lose most of its interactivity, becoming another conduit through which speech is passively received.

Without Section 230, platform moderators would not become more deliberate; they would simply remove more. The threat of costly litigation does little to inspire thoughtful decision making — moderators will act quickly to eliminate any source of legal risk. When Congress amended Section 230 in 2018 to expose platforms to liability for speech promoting prostitution or sex trafficking, Craigslist did not moderate its personal advertisements page more cautiously; it shut the page down.

Indeed, without Section 230’s protections, many smaller forums would simply shut down, or look to be acquired by larger firms. Could the operators of V8Buick.com, a forum for antique car collectors with 38,000 users, afford even a single yearslong defamation lawsuit? The easiest way to avoid legal liability is acquisition.

Apart from suppressing speech, repealing Section 230 would suppress competition, agglomerating activity onto large platforms such as Facebook. Without Section 230, Facebook, but not V8Buick.com, could afford to litigate controversies over user speech.

Repealing Section 230 is a drastic step that would upend the internet, punishing successful firms and internet users for the behavior of an antisocial minority. Heaping legal liability on platforms will not render them more thoughtful or judicious. It will cause some to close, and others to exclude all but the most inoffensive sentiments.

Will Duffield is a policy analyst in the Cato Institute’s Center for Representative Governance. He wrote this for InsideSources.com.
