The Mercury News

How we can tame the Wild West of Big Tech media

- By Steven Hill. Steven Hill is the former policy director at the Center for Humane Technology.

Why do so many people, including both former President Donald Trump and new President Joe Biden, keep talking about getting rid of an obscure law called Section 230?

The short answer is that Section 230, part of the Communications Decency Act of 1996, is the legal underpinning for one of the largest and most consequential experiments in American history.

Since the birth of Big Tech media 15 years ago, our nearly 250-year-old republic has become a test case. Can a nation’s news and information infrastructure, the lifeblood of any democracy, be dependent on digital media technologies that allow a global free speech zone of unlimited audience size, combined with algorithmic (nonhuman) curation of massive volumes of disinformation that can be spread with unprecedented ease?

This experiment has been possible because Section 230 grants Big Tech media immunity from responsibility for the mass content that is published and broadcast across their platforms. A mere 26 words in the bipartisan law were originally intended to protect “interactive computer services” from being sued over what their users post, just as telephone companies can’t be sued over any gossip told by Aunt Mabel to every busybody in town.

But as Facebook, Google, Twitter and other services scaled over time to an unimaginable size, the platforms’ lack of human editors has resulted in a gushing firehose of mis- and disinformation, where scandals and conspiracies are prioritized over real news for mass distribution.

As the gripping videos and photos of a pro-Trump mob storming the Capitol make clear, this experiment has veered frighteningly off course. So President Biden has called for ending Section 230 immunity in order to stop the Frankenstein’s monster this law helped create.

Facebook is no longer simply a “social networking” website — it is the largest media giant in the history of the world, a combination publisher and broadcaster, with approximately 2.6 billion regular users and billions more on the Facebook-owned WhatsApp and Instagram. One study found that 104 pieces of COVID-19 misinformation on Facebook were shared 1.7 million times and had 117 million views. That’s far more than the number of daily viewers on the Wall Street Journal, New York Times, USA Today, ABC News, Fox News, CNN and other major networks combined.

Traditional news organizations are subject to certain laws and regulations, including a degree of liability over what they broadcast. While there is much to criticize about mainstream media, at least they use humans to pick and choose what’s in and out of the news stream. That results in a degree of accountability, including legal liability.

But Facebook-Google-Twitter’s robot algorithm curators are on automatic pilot, much like killer drones for which no human bears responsibility or liability.

Our government must impose a whole new business model on these corporations — just as the United States did, in years past, with telephone, railroad and power companies.

The government should treat these companies more like investor-owned utilities, guided by a digital license that defines the rules and regulations of the business model (Mark Zuckerberg himself has suggested such an approach).

To begin with, such a license would require platforms to obtain users’ permission before collecting anyone’s personal data — i.e., opt-in rather than opt-out.

The new model also should encourage more competition by limiting the mega-scale audience size of these media machines. Smaller user pools could be accomplished either through an antitrust breakup of the companies or through incentives to shift to a revenue model based more on monthly subscribers than on hyper-targeted advertising, which would likely shrink the user base. The utility model also should restrain the use of specific engagement techniques, such as hyper-targeting of content, automated recommendations and addictive behavioral nudges (like autoplay and popup screens).

I believe we can retain what is good about the internet without the toxicities. It is crucial that regulation evolves in order to shape this new digital infrastructure — and the future of our societies — in the right way.
