Loveland Reporter-Herald

Accountability debate changes under Biden

- BY BRIAN CONTRERAS

Two people were dead; one was injured; and Jason Flores-Williams wanted to hold Facebook responsible.

But after filing a lawsuit in September alleging that the website’s lax moderation standards led to 17-year-old Kyle Rittenhouse killing two protesters in Kenosha, Wis., over the summer, Flores-Williams withdrew the suit in January. His fight for accountability had collided with a law the activist attorney came to see as a “brick wall.”

“You have no levers of control, no leverage,” he told The Times. “You’re up against Section 230.”

A snippet of text buried in the 1996 Telecommunications Act, Section 230 is the regulation under which websites enjoy broad freedom to choose if and how they moderate user-generated content. Flores-Williams had alleged that a Facebook post by the Kenosha Guard militia summoning armed civilians to the city had laid the groundwork for Rittenhouse’s violence there; but as Section 230 is written, Facebook and its peers are rarely liable for what their users post — even when it results in death.

Flores-Williams isn’t alone in seeing the law as outdated. President Joe Biden, former president Donald Trump and a long list of Democrats and Republicans have all pushed for the law to be restructured or scrapped entirely amid increasingly bipartisan criticism of Big Tech.

But if liberals and conservatives are united in their calls for reform, they’re split on what that reform should look like — leaving internet companies stuck in a limbo where a massive forced change to their business model is constantly discussed yet never quite materializes.

Meanwhile, those who seek to hold the platforms accountable for the harms caused by content spread there are left searching for new approaches that might offer a greater chance of success — which is to say, any at all.

Section 230 takes a two-pronged approach to content moderation: not only does it absolve websites of liability for user content they don’t moderate, but it also says they can moderate user content when they choose to. That lets social networks, chat forums and review websites host millions of users without having to go to court every time they leave up a post that’s objectionable, or take one down that’s not.

Online platforms usually, though not uniformly, support leaving Section 230 the way it is. In a congressional hearing last fall, Alphabet Chief Executive Sundar Pichai and Twitter CEO Jack Dorsey warned that the internet only works thanks to the protections afforded by the law; Facebook CEO Mark Zuckerberg broke ranks to say the law should be updated, citing a need to promote transparency around moderation practices.

Of the law’s critics, conservatives typically lean toward unrestricted speech. A Trump executive order sought to modify the law so users could sue platforms if they restricted content that wasn’t violent, obscene or harassing, although legal experts said the order was unlikely to hold up in court and it appears to have had little impact on how the platforms conduct themselves.

On the left, critics have called for a version of Section 230 that would encourage more rigorous moderation. Reforms targeting sex trafficking and child abuse have also garnered bipartisan support in the past.

Both sides have only gotten louder in recent weeks: the Jan. 6 siege of the U.S. Capitol prompted concern from the left about the role unregulated social media can play in organizing real-world violence, while the subsequent banning of Trump’s Facebook and Twitter accounts gave the right a striking example of how easily tech platforms can silence their users.

With Democrats now controlling the presidency and both houses of Congress, the party has an opportunity to rewrite Section 230, but it has yet to achieve consensus, with members floating multiple differently calibrated proposals over the last year.

The latest of those is the SAFE TECH Act, proposed last month by Sens. Mazie Hirono, D-Hawaii, Amy Klobuchar, D-Minn., and Mark R. Warner, D-Va. The bill would increase platforms’ liability for paid content and in cases involving discrimination, cyberstalking, targeted harassment and wrongful death.

It’s not clear how broadly lawmakers and prosecutors would try to interpret the SAFE TECH Act’s provisions, but if passed, the bill could force tech companies to rethink how they engage with user-generated content.

The legislation faces a rocky path forward. Opposition to content moderation became a major Republican rallying cry under Trump, and the party has significant power to block legislation in the Senate through filibusters. With Democrats preoccupied by the COVID-19 pandemic and accompanying economic crisis, liberal leaders might be hesitant to spend their time and energy on abstruse social media policies.

In the absence of imminent reform, some lawyers have adopted another strategy: trying to find novel legal theories with which to hold platforms liable for user content while Section 230 still remains in force.

“For as long as [Section 230] has been around, there have been plaintiff’s attorneys attempting to plead around the immunity it affords,” said Jeffrey Neuburger, a partner at Proskauer who co-leads the law firm’s technology, media and telecommunications group.

But the courts have “usually, with few exceptions” shot those efforts down, Neuburger added. For instance, he wrote via email, courts have “routinely and uniformly” rejected arguments that websites become liable for user content if they perform editorial functions such as removing content or deleting accounts; and have similarly rejected arguments that websites’ “acceptable use” policies constitute legally binding promises. And in the few cases where plaintiffs have managed to circumvent Section 230 defenses, the verdicts have generally been reversed on appeal.

“There are no easy answers,” Neuburger said. “It’s hard to regulate content online.”

DENIS CHARLET / Getty Images — This photo taken Oct. 21 shows the logos of technology companies, from left: Google, Facebook, Twitter, Instagram, Snapchat and TikTok on a computer screen.
