The Mercury News

Facebook’s oversight board has important work to do

- Larry Magid

In 2018, Facebook CEO Mark Zuckerberg announced that the company would appoint a content oversight board to act as the final decision-maker on content moderation. Like the U.S. Supreme Court, the board wouldn't look at all decisions, but a small selection that rise to the top, perhaps because they affect a great many people or because they are particularly thorny. Also like the Supreme Court, the board's decisions are meant to be final. Even Zuckerberg can't override the board.

On Wednesday, the newly formed oversight board announced its first 20 members and briefed the press on how it plans to operate. The roster of members reads like a who's who of international luminaries, including the former prime minister of Denmark, Helle Thorning-Schmidt. Other members include Nobel Peace Prize laureate Tawakkol Karman from Yemen; Pakistan's Nighat Dad, founder of the Digital Rights Foundation; and several law professors and other high-level individuals from around the world. You can learn more about the board and its members at oversightboard.com.

At the news briefing, Facebook's policy director Brent Harris said, "Facebook will implement the board's decisions unless doing so violates the law." Thorning-Schmidt, one of four board co-chairs, said, "Some of the most difficult decisions around content have been made by Facebook, and you could say ultimately by Mark Zuckerberg. And that's why I feel that this is a huge step for the global community that Facebook has decided to change that with the oversight board. We will now for the first time have an independent body, which will make final and binding decisions on what content stays up and what content is removed."

One could argue that creating a board like this to handle appeals of moderation decisions is overkill, but if you look at the impact of Facebook's content decisions on how people make important decisions, it starts to make sense. Facebook's role in helping to spread misinformation during the 2016 election is well known. But even as the company strives to rid itself of fake or misleading information, it encounters obstacles, including from politicians and elected officials who feel that Facebook is making decisions based on ideology and political preferences rather than on a strict set of rules that applies to people of all persuasions. The company has argued that its decisions are based on behavior and that it's not putting its thumb on the political scales, but that hasn't quieted critics who are convinced that Facebook, along with Google and to some extent Apple and Amazon, is biased. Most of the criticism comes from conservatives, but I've seen it coming from the left as well.

On its website, the oversight board argues that "freedom of expression is a fundamental human right … but there are times when speech can be at odds with authenticity, safety, privacy, and dignity. Some expression can endanger other people's ability to express themselves freely." The board will make content decisions for Facebook and its subsidiary Instagram.

Relevant to pandemic

Although the board was announced long before the COVID-19 pandemic, it's relevant now because the pandemic has brought out a lot of people using Facebook to promote false cures, sell shoddy products and spread false information about the nature of the virus. Facebook has been pretty aggressive about taking down what it considers to be dangerous false information, but what one person considers dangerous, another person might consider a worthwhile cure. Right now Facebook is on its own when it comes to establishing and enforcing policies regarding dangerous content, but now it has a board to advise it on policy and make final decisions on content for those cases selected for review.

The board can also make policy recommendations, but Facebook will continue to set its own policies. Facebook has provided the board with a $130 million operating budget that it says cannot be revoked. Members are compensated, but board officers did not disclose the amount.

Analysis

I don’t blame you if you’re skeptical. After all, this board was set up and funded by Facebook, which has a lot of work to do to protect its reputation given all of the problems related to inappropriate content and accusations of bias. It’s also clear that Zuckerberg has been on the defensive in his dealings with Congress and other elected officials. But, after listening to several of the members of this new board and reading through its charter, I’m willing to give the board and Facebook the benefit of the doubt.

It won’t be perfect. Mistakes will be made, and some important content decisions that should be reviewed probably won’t be, because the board can’t possibly review all the cases. Decisions will be made that many people disagree with. In that sense, it’s no different from courts that have to weigh competing values and sometimes unclear facts, and come to a decision that some people will not like.

But one thing I like about this board is that it will be able to make its decisions without worrying about stockholder approval, the bottom line or the ire of Zuckerberg, who will no doubt agree with some of its decisions and disagree with others. I also like the fact that the board says it will be transparent and publish its decisions so the public knows what it considered.

As CEO of ConnectSafely.org and a member of safety advisory boards at Facebook, Snapchat, Twitter, Roblox and other companies, I’ve learned that content decisions are often nuanced and not always tied to the company’s business interests. Sometimes there are competing rights, like the right of free speech versus the right not to be harassed or bullied, or not to be exposed to content that is highly disturbing, racist, sexist or otherwise hateful. These are often difficult decisions that require serious deliberation, and I’m pleased to know that there is a board of people who are equipped to do that hard work.
