The Columbus Dispatch

Facebook takes on German disinformation

- David Klepper

Days before Germany’s federal elections, Facebook took what it called an unprecedented step: the removal of a series of accounts that worked together to spread COVID-19 misinformation and encourage violent responses to COVID restrictions.

The crackdown, announced Sept. 16, was the first use of Facebook’s new “coordinated social harm” policy aimed at stopping not state-sponsored disinformation campaigns but otherwise typical users who have mounted an increasingly sophisticated effort to sidestep rules on hate speech or misinformation.

In the case of the German network, the nearly 150 accounts, pages and groups were linked to the so-called Querdenken movement, a loose coalition that has protested lockdown measures in Germany and includes vaccine and mask opponents, conspiracy theorists and some far-right extremists.

Facebook touted the move as an innovative response to potentially harmful content; far-right commenters condemned it as censorship. But a review of the content that was removed – as well as the many more Querdenken posts that are still available – reveals Facebook’s action to be modest at best. At worst, critics say, it could have been a ploy to counter complaints that it doesn’t do enough to stop harmful content.

“This action appears rather to be motivated by Facebook’s desire to demonstrate action to policymakers in the days before an election, not a comprehensive effort to serve the public,” concluded researchers at Reset, a U.K.-based nonprofit that has criticized social media’s role in democratic discourse.

Facebook regularly updates journalists about accounts it removes under policies banning “coordinated inauthentic behavior,” a term it created in 2018 to describe groups or people who work together to mislead others. Since then, it has removed thousands of accounts, mostly what it said were bad actors attempting to interfere in elections and politics in countries around the world.

But there were constraints, since not all harmful behavior on Facebook is “inauthentic”; there are plenty of perfectly authentic groups using social media to incite violence, spread misinformation and hate. So the company was limited by its policy on what it could take down.

But even with the new rule, a problem remains with the takedowns: they don’t make it clear what harmful material remains up on Facebook, making it difficult to determine just what the social network is accomplishing.

Case in point: the Querdenken network. Reset had already been monitoring the accounts removed by Facebook and issued a report that concluded only a small portion of content relating to Querdenken was taken down while many similar posts were allowed to stay up.

The dangers of COVID-19 extremism were underscored days after Facebook’s announcement when a young German gas station worker was fatally shot by a man who had refused to wear a mask. The suspect followed several far-right users on Twitter and had expressed negative views about immigrants and the government.

Facebook initially declined to provide examples of the Querdenken content it removed, but ultimately released four posts to The Associated Press that weren’t dissimilar to content still available on Facebook. They included a post falsely stating that vaccines create new viral variants and another that wished death on police who broke up violent protests against COVID restrictions.

Reset’s analysis of comments removed by Facebook found that many were actually written by people trying to rebut Querdenken arguments, and did not include misinformation.

Facebook defended its action, saying the account removals were never meant to be a blanket ban of Querdenken, but instead a carefully measured response to users who were working together to violate its rules and spread harmful content.
