Times-Herald

First, do no harm

- Steven Roberts

Many Facebook employees felt their own company helped instigate and organize the mob that stormed the U.S. Capitol on Jan. 6.

"Haven't we had enough time to figure out how to manage discourse without enabling violence?" one worker wrote afterward. "We've been fueling this fire for a long time, and we shouldn't be surprised it's now out of control."

An enormous trove of internal documents leaked to the press by a former Facebook employee, Frances Haugen, makes the answer to that question crystal clear. It is "no" – Facebook has not figured out how to encourage free speech, a bedrock principle of American democracy, while discouraging the use of its platform to undermine that same system.

Many factors contributed to the poisonous polarization that erupted on Jan. 6 – including treacherous leaders like Donald Trump – but Facebook was a prime co-conspirator. As Haugen told Congress: "Facebook's products harm children, stoke division and weaken our democracy. The company's leadership knows how to make Facebook and Instagram safer, but won't make the necessary changes because they have put their astronomical profits before people. Congressional action is needed. They cannot solve this crisis without your help."

The case against Facebook boils down to two points. One: The company has been far too slow to restrict the reach of figures like Trump and his toadies, who use the platform to spread damaging disinformation – "the election was rigged, vaccines are dangerous, climate change is a hoax," etc.

Two: Facebook doesn't just tolerate disinformation. The company employs powerful and secret algorithms to amplify its impact by promoting posts that trigger anger and outrage. These emotional reactions lead users to spend more time on Facebook, which in turn makes them far more valuable to advertisers. That's what Haugen means by putting "profits before people."

At the core of this debate is the "harm principle," articulated by the 19th-century British philosopher John Stuart Mill. The nonprofit Ethics Centre defines it this way: "The harm principle says people should be free to act however they wish unless their actions cause harm to somebody else."

The harm done by Facebook abusers is obvious. Disinformation about vaccines, for example, can cost countless lives. Therefore, limiting how those abusers are free to act is certainly justified.

But here's the problem: Who gets to define "harm"? What standards are used in reaching that judgment? And how is that definition applied to real-life situations?

None of the answers are easy. But they are critical to the functioning of a healthy democracy. Overly harsh restrictions on free speech can be even more detrimental than overly timid ones. So what are the options?

Platforms like Facebook could regulate themselves, but as Haugen notes, her former employer has largely failed to do that. The profit motive is simply too powerful. And in fact, the company's Maximum Leader, Mark Zuckerberg, who controls more than half of Facebook's stock, largely agrees with her.

Facebook hosts nearly 3 billion monthly users, and Zuckerberg has often said that he and his brainchild should not be the "arbiters of truth." Amen to that.

"Every day, we make decisions about what speech is harmful, what constitutes political advertising, and how to prevent sophisticated cyberattacks," he has written. "But if we were starting from scratch, we wouldn't ask companies to make these judgments alone. I believe we need a more active role for governments and regulators."

But is that really the answer? Call me old-fashioned, but I kind of like the First Amendment, which says pretty bluntly, "Congress shall make no law ... abridging the freedom of speech."

"There oughta be a law!" is not always the right answer to a public policy crisis. In fact, it often is not. Should the partisan politicians who run the government have the power to define what counts as harmful speech – and therefore dilute it?

One promising third option is the Oversight Board created by Facebook, a panel of 20 independent experts who are empowered to make critical decisions for Zuckerberg & Co. But that concept has flaws, too. The board recently issued a report accusing Facebook of not being "fully forthcoming" about its policies toward prominent platform users.

Another reasonable alternative: legislation that would force Facebook to be far more transparent about the algorithms it employs, which can spread so many toxic falsehoods so quickly.

As policymakers grapple with how to apply Mill's "harm principle" to the digital space, they should remember another version of that idea, contained in an adage often preached to young doctors: "First, do no harm."

(EDITOR'S NOTE: Steven Roberts teaches politics and journalism at George Washington University. He can be contacted by email at stevecokie@gmail.com.)
