Los Angeles Times

How Facebook tidily punts the Trump question

The tech giant will leave the free-speech call to an independent Oversight Board.

- BY BRIAN CONTRERAS

For Facebook, the conclusion of President Trump’s term in office meant a respite from the regular provocations of a leader who seemed intent on pushing the limits of what social media companies would allow.

It also brought one final dilemma: whether to reinstate his account, locked down indefinitely in the aftermath of the Jan. 6 Capitol riot, or shut it down for good.

But Facebook didn’t decide. Instead, the company punted the question to a third-party organization convened last year explicitly to take such thorny questions off Chief Executive Mark Zuckerberg’s shoulders.

“Facebook is referring its decision to indefinitely suspend former U.S. President Donald Trump’s access to his Facebook and Instagram accounts to the independent Oversight Board,” the company announced. “We believe our decision was necessary and right. Given its significance, we think it is important for the board to review it and reach an independent judgment on whether it should be upheld.”

In a separate statement, the Oversight Board said that a five-member panel would review the case and make a recommendation to the entire board about whether to uphold or overturn the ban. Facebook has committed to implementing whatever the majority ends up choosing.

Over the last few years — as the question of what internet platforms should do about disinformation, hate speech and harassment has become a topic of national discussion — industry and political leaders have settled into a familiar routine. Politicians demand the platforms do a better job balancing free speech and public safety; tech executives invite politicians to put their demands into the form of new regulations they can try to follow. Neither side has been willing to commit to anything that could involve political blowback or unforeseen consequences.

For Zuckerberg, referring the Trump ban to the Oversight Board represents a way out of this impasse — and a tactic other companies are likely to copy, if only for want of better options.

“It’s just like trying to offload the responsibility and say, ‘Oh, we’re doing our best and this is somebody else’s problem now,’ ” said Tracy Chou, founder and CEO of anti-harassment software company Block Party.

The Oversight Board aims to serve as a Supreme Court to Facebook’s nation-state. Its funding and structure are independent of Facebook proper, and it’s tasked with ruling on specific moderation decisions the platform makes — whether a controversial post was rightly deleted, for instance, or whether a certain ex-president should stay banned.

“We as a society have to decide who we want to be making these sorts of determinations,” said Talia Stroud, a University of Texas at Austin professor and co-founder of digital media research group Civic Signals, in an email. “Do we want the choices to be in the hands of government? In the hands of a company? External oversight?”

The Oversight Board falls under that third category — but “external” is a relative term.

Facebook played a role in selecting the initial crop of board members (which includes academics, political advocates, journalists and lawyers from around the world), but it can’t remove them and won’t get to hire new ones in the future.

It’s also unclear how wide an effect the board’s rulings will have. The board says it plans to rule on “highly emblematic cases,” enabling it to develop a body of precedent that could guide future moderation questions. But while the board issues binding decisions on specific cases it hears, it can only make recommendations about Facebook policy more generally.

“The entity cannot demand to review particular decisions about Facebook’s operations, nor can it compel Facebook to modify its rules or to accept any recommendations for new policies,” said Sharon Bradford Franklin, policy director of New America’s Open Technology Institute, via email.

In the case of Trump, Facebook has requested recommendations for how to handle future suspensions of political leaders, suggesting that wherever Trump winds up, other world leaders will likely follow.

Not everyone is happy with the way Facebook is delegating responsibility.

“We are concerned that Facebook is using its Oversight Board as a fig leaf to cover its lack of open, transparent, coherent moderation policies, its continued failure to act against inciters of hate and violence and the tsunami of mis- and disinformation that continues to flood its platform,” said The Real Facebook Oversight Board, a nonprofit project whose aim is to keep the company accountable through external pressure.

The Trump case “underlines the urgent necessity for democratic regulation,” the group added in its statement.

Also calling for democratic regulation is Facebook itself. In the statement announcing that the Oversight Board would review Trump’s case, vice president of global affairs Nick Clegg said “it would be better if these decisions were made according to frameworks agreed by democratically accountable lawmakers. But in the absence of such laws, there are decisions that we cannot duck.”

It’s a familiar refrain from company leadership. Over the last year or so, Zuckerberg has started calling for more government regulation of social media, deeming it “better for everyone, including us, over the long term.” At a congressional hearing in October, he said Congress should update Section 230, the small chunk of legislation that gives websites flexibility to choose if and when they’ll censor users, “to make sure that it’s working as intended.”

But with politicians divided over whether the law should require platforms to do more moderation or bar them from doing any at all, prospects for a government-based solution remain dim in the short term. The Oversight Board’s third-party model offers an out.

“So far, the Facebook Oversight Board is the only fully developed independent institution tasked with doing review,” said Noah Feldman, a Harvard law professor who initially came up with the idea for it, then persuaded Facebook to make it a reality. “I hope that if it performs well ... other industry actors, and maybe eventually people in other industries, will follow suit.”

Meanwhile, some European countries where “the governments themselves don’t actually want to do all the work of content moderation” are already adopting regulations that require companies to have third-party oversight, he added.

But without the force of law behind it, that model also has its limits.

“They want to be able to say, ‘Look, we banned Trump, and our independen­t body agreed,’ ” said Mark Coatney, a former Tumblr employee who’s now working on a third-party social media moderation tool. “But if they did something that they thought the Oversight Board would not agree with, I don’t know that they would be sending it to the Oversight Board.”

While a third-party referee may solve the hot-potato problem for tech executives and politicians, it still leaves a small number of individuals making judgments that affect the rights of millions. To some, the ultimate answer is to let social media users themselves decide what’s acceptable.
