The Mercury News Weekend

High court grapples with definition of publisher

- Larry Magid is a tech journalist and internet safety activist. larry@

Section 230 of the 1996 Communications Decency Act is a great unifier. Members of Congress from both sides of the aisle agree that it should be struck down or heavily modified. And, based on their responses to 2½ hours of testimony on Tuesday, Supreme Court justices nominated by presidents from both parties seemed to be united in their confusion over the arguments and their reluctance to strike it down.

The court heard arguments in Gonzalez v. Google, which focused mostly on whether tech companies should be held civilly liable for content promoted by their algorithms.

Section 230 says that "no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider," which sets them apart from newspapers, broadcasters and other media companies whose content is mostly created and vetted by writers, producers and editors who work for the company. So, if a newspaper accuses you of a crime you didn't commit, or publishes something demonstrably harmful, you can sue that publisher because it made the decision to publish it. You might not win, but you have a clear target to sue. But if someone posts something on Facebook or YouTube that harms you, your beef is with the person who posted it, not the company that allowed it to be posted.

Section 230 was written years before Facebook or even MySpace entered our world, when people went online using services like Compuserve and Prodigy. Both of these companies had forums. At that time, I was a columnist at Compuserve and also a columnist and forum host at Prodigy. Prodigy's forums were moderated — a human decided whether a post was appropriate. Compuserve's forums were more of a free-for-all.

In a 1991 suit against Compuserve, a court found that Compuserve could not be held responsible for content because it didn't review the content and therefore could not be expected to know whether it would be harmful. But in a 1995 case against Prodigy, the company was held responsible because it did have a moderation policy and therefore should have known. In other words, if you're going to moderate any content, you can be held liable for all of it, including offensive content that slips through the cracks.

In many ways, Section 230 is like a Good Samaritan law that protects health care workers and others who render aid in an emergency. Without such a law, if you stop and help, you could wind up in court. But if you just drive by and do nothing, you won't get into trouble. What Section 230 did was allow companies to moderate content without adding the risk of being sued if offensive content got through anyway.

Politicians from both parties have criticized 230 for opposite reasons. Some Democrats argue that it takes away any consequences if social media companies allow things like hate speech, misinformation or defamatory comments. Some Republicans argue that 230 gives social media companies the power to suppress political speech.

While I don't fully agree with the Democrats' argument, I at least understand it, but I'm baffled by the Republican argument. If 230 weren't in place, social media companies would face even more pressure to suppress speech that might lead to violence, misinformation about vaccines or other alleged harms, because they could be held liable. For example, in 2017, then-President Trump posted a series of tweets falsely implying that MSNBC talk-show host Joe Scarborough had culpability for the 2001 death of a former employee. If Section 230 weren't in place, Scarborough could have sued Twitter. Whether or not he would have won is an open question, but without 230 in place, Twitter would have had a very strong motivation to suppress tweets like those and might have been compelled to suspend Trump from the service years before it finally did in the aftermath of the Jan. 6 attack on the Capitol. Trump is among the many Republicans calling for an end to, or modification of, 230.

But the anti-230 stance of some Democrats could also lead to unintended consequences. Striking down 230 could disincentivize companies from moderating content. If you go back to those two cases from the '90s, Compuserve was exonerated for doing nothing while Prodigy was penalized for at least attempting to moderate content.

It's complicated

What is interesting about the Gonzalez case is the claim that the algorithms that promote and amplify content change social media companies from mere conduits for content the public posts into publishers of that content.

A case could be made that there is a distinction between merely carrying content and promoting it, but as some of the justices pointed out, there does need to be some way to organize content online.

The mere presence of an algorithm doesn't make someone a publisher, especially if that algorithm is content-neutral, as the companies say it is. They reportedly recommend content that the user is likely to want to see, regardless of what that content is, much the way Netflix recommends movies based on what you've previously watched.

It's complicated, which is probably why Justice Elena Kagan commented, "We're a court. We really don't know about these things. These are not the nine greatest experts on the internet." Justice Samuel Alito, whose political views are far to the right of Kagan's, was equally perplexed, remarking to the plaintiff's attorney, "I admit I'm completely confused by whatever argument you're making at the present time."

In this case, the Gonzalez family's attorneys argued that YouTube had created thumbnail images and web addresses for videos posted by supporters of the ISIS terrorists who killed their daughter in Paris, and was therefore a publisher of that content. Kagan questioned that assumption, pointing out that 230, crafted in the mid-1990s, is a "pre-algorithm statute."

The Supreme Court's role has always been to attempt to adjudicate complicated questions, and 230, like a lot of laws that impact speech, has nuances and mixed consequences.

230 helps prevent censorship

On balance, I think 230 has served us well, enabling any company that allows user content to provide moderation services without being held responsible for everything that gets posted on its servers. Without 230, these companies would not only be burdened with the almost unimaginable task of having to review billions of pieces of content a day, but they would also be in the position of having to take down — dare I say "censor" — any content that could conceivably wind up triggering a lawsuit. If people on both sides of the political spectrum worry about companies suppressing content, they had better get ready for far more suppression if 230 is taken off the books.

