Los Angeles Times

Before verdict, Facebook acted to curb violence. Why not always?

It took steps during Chauvin trial to head off violence. Critics ask: Why not always?

- By Brian Contreras

As lawyers for both sides offered their closing statements in the trial of Derek Chauvin on Monday, a thousand miles away, executives at Facebook were preparing for the verdict to drop.

Seeking to avoid incidents like the one last summer in which 17-year-old Kyle Rittenhouse shot and killed two protesters in Kenosha, Wis., the social media company said it would take actions aimed at “preventing online content from being linked to offline harm.”

(Chauvin is the former Minneapolis police officer found guilty Tuesday of the second-degree murder of George Floyd last May; the Kenosha shootings took place in August 2020 after a local militia group called on armed civilians to defend the city amid protests against the police shooting of another Black man, Jacob Blake.)

As precautions, Facebook said it would “remove Pages, groups, Events and Instagram accounts that violate our violence and incitement policy,” and would also “remove events organized in temporary, high-risk locations that contain calls to bring arms.” It also promised to take down content violating prohibitions on “hate speech, bullying and harassment, graphic violence, and violence and incitement,” as well as “limit the spread” of posts its system predicts are likely to later be removed for violations.

“Our teams are working around the clock to look for potential threats both on and off of Facebook and Instagram so we can protect peaceful protests and limit content that could lead to civil unrest or violence,” Monika Bickert, Facebook’s vice president of content policy, wrote in a blog post.

But in demonstrating the power it has to police problematic content when it feels a sense of urgency, Facebook invited its many critics to ask: Why not take such precautions all the time?

“Hate is an ongoing problem on Facebook, and the fact that Facebook, in response to this incident, is saying that it can apply specific controls to emergency situations means that there is more that they can do to address hate, and that … for the most part, Facebook is choosing not to do so,” said Daniel Kelley, associate director of the Anti-Defamation League’s Center for Technology and Society.

“It’s really disheartening to imagine that there are controls that they can put in place around so-called ‘emergency situations’ that would increase the sensitivity of their tools, their products, around hate and harassment [generally].”

This isn’t the only time Facebook has “turned up the dials” in anticipation of political violence. Just this year, it has taken similar steps around President Biden’s inauguration, the coup in Myanmar and India’s elections.

Facebook declined to discuss why these measures aren’t the platform’s default, or what downside there would be to keeping them in place permanently. In a 2018 essay, Chief Executive Mark Zuckerberg said content that flirts with violating site policies receives more engagement in the form of clicks, likes, comments and shares. Zuckerberg called it a “basic incentive problem” and said Facebook would reduce distribution of such “borderline content.”

Central to Facebook’s response seems to be its designation of Minneapolis as a temporary “high-risk location” — a status the company said may be applied to additional locations as the situation in Minneapolis develops. Facebook has previously described comparable moderation efforts as responses specifically geared toward “countries at risk of conflict.”

“They’re trying to get ahead of … any kind of outbreak of violence that may occur if the trial verdict goes one way or another,” Kelley said before the verdict was announced. “It’s a mitigation effort on their part, because they know that this is going to be … a really momentous decision.”

He said Facebook needs to make sure it doesn’t interfere with legitimate discussion of the Chauvin trial — a balance the company has more than enough resources to be able to strike, he added.

Another incentive for Facebook to handle the Chauvin verdict with extreme caution is to avoid feeding into the inevitable criticism of its impending decision about whether former President Trump will remain banned from the platform. Trump was kicked off earlier this year for his role in the Jan. 6 Capitol riots; the case is now being decided by Facebook’s third-party oversight committee.

Shireen Mitchell — founder of Stop Online Violence Against Women and a member of “The Real Facebook Oversight Board,” a Facebook-focused watchdog group — sees the steps being taken this week as an attempt to preemptively “soften the blow” of that decision.

Trump, “who has incited violence, including an insurrection; has targeted Black people and Black voters; is going to get back on their platform,” Mitchell predicted. “And they’re going to in this moment pretend like they care about Black people by caring about this case. That’s what we’re dealing with, and it’s such a false flag over decades of … the things that they’ve done in the past, that it’s clearly a strategic action.”

As public pressure mounts for web platforms to strengthen their moderation of user content, Facebook isn’t the only company that has developed powerful moderation tools and then faced questions as to why it only selectively deploys them.

Earlier this month, Intel faced criticism and mockery over Bleep, an artificially intelligent moderation tool aimed at giving gamers more granular control over what sorts of language they encounter via voice chat — including sliding scales for how much misogyny and white nationalism they want to hear, and a button to toggle the N-word on and off.

And this week, Nextdoor launched an alert system that notifies users if they try to post something racist, but then doesn’t actually stop them from publishing it.

Photo: Jeff Chiu / Associated Press. FACEBOOK CEO Mark Zuckerberg has acknowledged that content that flirts with violating site policies received more engagement in the form of clicks, likes and shares. Above, a protest outside his home in 2020.
