The Times Herald (Norristown, PA)

Rule for social media firms should be first, do no harm


Many Facebook employees felt the company helped instigate and organize the mob that stormed the U.S. Capitol on Jan. 6.

“Haven’t we had enough time to figure out how to manage discourse without enabling violence?” one worker wrote afterward. “We’ve been fueling this fire for a long time, and we shouldn’t be surprised it’s now out of control.”

An enormous trove of internal documents leaked to the press by former Facebook employee Frances Haugen makes the answer to that question crystal clear. It is “no” — Facebook has not figured out how to encourage free speech, a bedrock principle of American democracy, while discouraging the use of its platform to undermine that same system.

Many factors contributed to the polarization that erupted on Jan. 6 — including treacherous leaders like President Donald Trump — but Facebook was a prime co-conspirator. As Haugen told Congress: “Facebook’s products harm children, stoke division and weaken our democracy. The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people.”

The case against Facebook boils down to two points. One: It has been too slow to restrict the reach of figures like Trump and his toadies, who use the platform to spread damaging disinformation — “the election was rigged, vaccines are dangerous, climate change is a hoax,” etc.

Two: Facebook doesn’t just tolerate disinformation. It employs powerful algorithms to amplify its impact by promoting posts that trigger anger and outrage. These emotional reactions lead users to spend more time on Facebook, which in turn makes them far more valuable to advertisers.

At the core of this debate is the “harm principle.” The nonprofit Ethics Centre defines it this way: “The harm principle says people should be free to act however they wish unless their actions cause harm to somebody else.” The harm done by Facebook abusers is obvious. Disinformation about vaccines, for example, can cost countless lives. Therefore, limiting how those abusers are free to act is certainly justified.

But who gets to define “harm”? What standards are used in reaching that judgment? And how is that definition applied to real-life situations? None of the answers are easy. But they are critical to the functioning of a healthy democracy. Overly harsh restrictions on free speech can be even more detrimental than overly timid ones. So what are the options?

Platforms like Facebook could regulate themselves, but Facebook has largely failed to do so. The profit motive is simply too powerful. And in fact, company chief Mark Zuckerberg largely agrees with Haugen. Zuckerberg has often said that he and his brainchild should not be the “arbiters of truth.” He wants to see a more active role for governments and regulators.

But is that really the answer? The First Amendment says pretty bluntly, “Congress shall make no law … abridging the freedom of speech.”

Should the partisan politicians who run the government have the power to define what counts as harmful speech, and therefore to dilute that freedom?

One promising option is the Oversight Board created by Facebook, a panel of 20 independent experts who are empowered to make critical decisions for the company. But that concept has flaws, too. The board recently accused Facebook of not being fully forthcoming about policies toward prominent platform users.

Another reasonable alternative: legislation that would force Facebook to be far more transparent about its algorithms.

Policymakers should remember another version of the harm principle, contained in an adage for doctors: “First, do no harm.”
