Chattanooga Times Free Press

BALANCING CENSORSHIP AND RESPONSIBILITY


After a white nationalist slaughtered 50 Muslims in New Zealand, Margaret Sullivan, media critic of The Washington Post, posed this question to the digital platforms used by the assassin to spread his murderous message: “Where are the lines between censorship and responsibility?”

Those platforms — YouTube and Facebook, Twitter and Reddit — must now answer that question with clarity and candor, because their role in the massacre is undeniable. As Neal Mohan, YouTube’s chief product officer, told the Post: “This was a tragedy that was almost designed for the purpose of going viral.”

The shooter was, in effect, playing a deadly video game, live-streaming his attack while encouraging his followers to reproduce and repost the images of carnage faster than social media platforms could remove them. The platforms tried; Facebook blocked more than 1 million instances of the 17-minute clip in the first 24 hours, but they were hopelessly outmanned.

The internet did not create white nationalism or anti-Muslim fervor. And digital tools are used every day for countless positive purposes. But as New Zealand damnably demonstrates, social media platforms are highly vulnerable to corruption and abuse. Facebook, YouTube and the rest are not merely common carriers like the phone company, neutral pipes transmitting any and all information. They constantly make editorial and ethical decisions that influence what consumers are exposed to, so the question is how those decisions are made and what standards are used. What is the proper balance between responsibility and censorship?

As journalists who cherish the First Amendment, we always tilt against censorship. Social media outlets — let alone the federal government — should not be the ultimate arbiter of what people know and learn.

One area where social media companies must improve, however, is crisis management. Even ardent civil libertarians admit that when words and images present a “clear and present danger,” when they threaten to unleash immediate violence, society has an obligation to protect itself and contain that danger.

When the New Zealand shooter’s videos started cascading through the internet, platforms relied on a combination of artificial intelligence and human moderators to thwart their spread, and they failed miserably. Facebook didn’t even know the original video had been posted on its site until local police alerted the company.

But crisis management is only a small part of the problem. A much deeper issue facing digital platforms is the way they encourage and enable radicalization online.

As users explore a topic, algorithms crafted by the platform suggest new videos that draw them deeper into “rabbit holes” of twisted and tendentious ideologies. The goal is profit: keep viewers watching, increase the time they spend online and maximize ad revenue.

But this relentless pursuit of eyeballs and earnings has devastating side effects. Not only do users see and absorb increasingly extremist ideas, but they also bond online with others who are drawn into the same vortex of hate and violence.

Here’s where the balance between censorship and responsibility must swing toward responsibility.

If those platforms don’t act on their own, society will fight back in the form of onerous rules and regulations that restrict free speech. The only way to avoid censorship is to accept responsibility.

Cokie & Steve Roberts
