USA TODAY International Edition

Liability suits before high court threaten internet as we know it

- Ken Paulson Ken Paulson is the director of the Free Speech Center at Middle Tennessee State University, a former editor of USA TODAY and a member of USA TODAY's Board of Contributors. Follow him on Twitter: @kenpaulson1

The Supreme Court is mulling the future of the internet. Critics warn that if the court isn't careful, it could destroy the web as we know it. We'll still have online shopping and plenty of porn. But the whole "marketplace of ideas" thing? That's another matter.

Two cases with similar elements are before the Supreme Court. Families who lost loved ones at the hands of terrorists are suing social media companies they believe supported the Islamic State group on their platforms.

At the heart of the cases is Section 230 of the Communications Decency Act of 1996, a good idea in the middle of a bad piece of legislation. While much of the act was struck down a year later as government overreach violating the First Amendment, Section 230 survived, providing cover for the then-emerging online industries.

26 words 'created the internet'

Professor and author Jeff Kosseff has written that these are the 26 words that “created the internet:” “No provider or user of an interactiv­e computer service shall be treated as the publisher or speaker of any informatio­n provided by another informatio­n content provider.”

"Interactive computer service" sounds so quaint. In 1996, it was a place early web users could pull over on the "Information Superhighway."

With internet-based media still in a nascent stage, Congress recognized that this new industry could never grow and succeed if it could be held liable for everything posted by users. Section 230 meant that internet companies didn't have to review everything before it was posted, allowing dramatic growth over the next quarter-century.

The new cases before the court use novel theories to try to break through the Section 230 shield.

In Gonzalez v. Google, heard by the court Tuesday, the family of Nohemi Gonzalez, a young American woman killed in 2015 in an ISIS attack in Paris, argued that YouTube algorithms “recommend” ISIS propaganda. A user who clicks on an ISIS video may be offered similar videos.

In Twitter v. Taamneh on Wednesday, the court heard claims by the family of Nawras Alassaf, a Jordanian man killed in an ISIS attack in Istanbul in 2017. The family contends that access to Twitter gave ISIS "substantial assistance" in violation of a California antiterrorism law. The contention is that Twitter didn't work hard enough to bar terrorist posts.

Both YouTube, owned by Google, and Twitter have attempted to monitor and remove ISIS content, but some propaganda still gets posted, if only for a brief period. Can YouTube be sued for having an algorithm that doesn't automatically prevent ISIS propaganda from being displayed? Can Twitter be liable for not catching all the ISIS tweets before they're posted?

Those would be high standards, in part because posts from terrorist organizations aren't always branded.

Until recently, cases like these would never have made the docket of the U.S. Supreme Court. Section 230 was established law. But this court decided it wanted to review not one but two Section 230 cases. On Tuesday, Justice Elena Kagan aptly noted that the questions stem from a pre-algorithm statute in a post-algorithm world.

What the justices said

Kagan's observation came during oral arguments in Gonzalez, where the justices appeared to be generally supportive of Section 230. Justice Clarence Thomas noted that it appeared the YouTube algorithm was neutral in its application, treating the videos in question the same as it would videos about rice pilaf. Chief Justice John Roberts expanded on that idea.

"They're still not responsible for the content of the videos or the text that is transmitted. Your focus is on the actual selection and recommendations," Roberts told an attorney for the Gonzalez family. "It may be significant if the algorithm is the same … across the different subject matters, because then they don't have a focused algorithm with respect to terrorist activities or pilaf.

"And then I think it might be harder for you to say that there's selection involved for which they could be held responsible."

It would be welcome if the court dispatched both cases with similarly practical reasoning.

The First Amendment rights of a private business limit the options of those who want to remake social media to their liking. That’s why Section 230 is an easy and obvious target.

Blocking criminals, terrorists

There's no need for us to look to the Supreme Court to determine the fate of Section 230. The same Congress that passes laws can rewrite them. There's already a federal law punishing online services that knowingly promote sexual exploitation of children. The same can be done to address terrorism.

Of course, it would mean a functioning Congress capable of crafting a law that both ensures internet freedom and mandates good-faith efforts to block criminal and terrorist communications. No robes required.

Photo: ALEX BRANDON/AP. Beatriz Gonzalez and Jose Hernandez, mother and stepfather of ISIS victim Nohemi Gonzalez, speak Tuesday outside the Supreme Court.
