Las Vegas Review-Journal (Sunday)

Imagine a more responsible internet

U.S. Supreme Court Justice Elena Kagan sparked laughter in the court last week by speaking honestly about her colleagues’ and her own lack of expertise on the internet. “We are a court — we really don’t know about these things. We are not, like, the nine greatest experts on the internet,” she said.

Although made in jest, the comment is not a laughing matter. Significant updates to the interpretation of current law are sorely needed to reflect evolving technology. Two cases came before the court last week that will shape legal liability for online communication for years to come. Experts or not, those nine justices will decide whether the online marketplace of ideas becomes an impregnable haven for all manner of libel, slander and criminal communications.

At the heart of the cases is whether tech companies are legally responsible for the content posted on their sites by third-party users and then distributed by algorithms that are designed to amplify the content to specific individuals most likely to engage. We believe they should be, provided there are carefully drawn distinctions.

After his daughter was killed in a 2015 ISIS attack, Reynaldo Gonzalez sued Google, the parent company of YouTube and itself a subsidiary of Alphabet, over the role the company’s personalized distribution algorithms played in promoting ISIS recruitment videos.

Current law, Section 230 of the Communications Decency Act, enacted as part of the Telecommunications Act of 1996, has been interpreted to shield companies from almost all liability for content posted by third-party users. Under the law, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

This interpretation was, at the time, a natural outgrowth of common carrier laws that shielded telephone companies from liability for telephone conversations about criminal activity. The logic held that because phone lines were merely conduits through which conversations traveled, and the companies did not speak, control, distribute or otherwise publish the words, the phone companies were not liable for the specific content of the conversations.

Gonzalez contends that technology has changed and that the targeted personalized algorithms used by companies like Alphabet and Meta, the parent company of Facebook and Instagram, represent affirmative action to distribute certain content to specific users. In other words, by utilizing algorithms that effectively read, understand and redistribute content to specific users, online platforms are no longer simply conduits for conversations and should not be given blanket immunity.

In a separate but related case, the family of Nawras Alassaf, another victim of ISIS, is seeking to hold Twitter, Alphabet and Meta liable for their failure to remove ISIS content from their platforms, a failure the family claims helped ISIS grow and thrive.

Both families contend that online platforms already have a responsibility to remove illegal materials, such as child pornography and copyrighted material, from their sites. Expanding those responsibilities to include removing, rather than distributing, recruitment and organizing materials for terrorists and other violent extremists, and allowing liability for a failure to do so, is not an unreasonable expectation.

The families’ arguments are compelling and should be sustained by the court. Congress has already carved out exceptions to Section 230 for speech that clearly violates federal law such as human trafficking, child pornography and international drug trade. These exceptions should extend to active recruitment and conspiracy by criminal organizations.

However, the court should go even further and recognize that, unlike platforms that provide nothing more than a conduit or “bulletin board” for user-generated content, any platform that promotes content for redistribution without user consent is acting as a publisher. As such, it should be fully liable for any content that is redistributed without the specific opt-in, consent or “subscription” of a targeted user.

Under this model, users could still be presented with relevant content by following or subscribing to certain content creators, publications or subject-matter index tags determined by the content creators. The number of available tags would be limited and used only for basic indexing of content, in much the same way public libraries index content using the Dewey Decimal System.

Imagine logging on to Facebook, Twitter or Instagram and seeing only content that comes from your friends, from people or organizations you signed up to follow, or that directly relates to subjects you signed up to see and can change at any time.

Advertising would not disappear, but you would have more control over it: rather than targeting ads at you based on an invasive profile of your personal activity, advertisers would be able to target ads only via tags, publications or content creators. This would change the advertising model of the internet, but it would not run the risk of “eliminating innovation,” “destroying the internet” or any other hyperbole being offered by those with a financial interest in the current model.

Most importantly, content producers would be responsible for the content they produce or distribute. Existing libel and slander laws could be enforced. Online bullies could be identified and held accountable. And propaganda and spam farms would be far less effective because they wouldn’t have algorithms assisting them in the distribution of their lies.

This model would inoculate society from the worst of the internet without creating an overwhelming impediment to innovation, vigorous debate or the marketplace of ideas. Those were the major concerns of Section 230’s authors, Sen. Ron Wyden, D-Ore., and former Rep. Chris Cox, R-Calif., when they proposed the law.

Overall, this would be a major step toward restoring accountability and responsibility for the cesspool that too much of the internet has become. And it would make it much harder for bad actors to use the internet to accomplish their goals.
