The Guardian (USA)

Don’t blame Russian trolls for America’s anti-vaxx problem. Our misinformation is homegrown

- Sophie Zhang

On 18 May 2021, German YouTuber Mirko Drotschmann tweeted an unusual message: a marketing agency was asking him to share allegedly leaked documents on Covid-19 vaccine deaths. Within a week, French YouTuber Léo Grasset shared similar news. News reports followed: Fazze, a London-based marketing firm with ties to Russia, was offering money to influencers to falsely disparage a Covid-19 vaccine.

This month, Facebook announced that it was banning Fazze. In addition to bribing influencers, Fazze had created misleading anti-vaccine content off-platform and used fake accounts to spread it on Facebook.

Before we continue, let’s clear up some common confusion between inauthentic behavior and misinformation. Misinformation refers to what someone says: “The Earth is flat” is misinformation regardless of who says it. Inauthenticity is about the identity of the speaker: if 1,000 fake accounts say “The Earth is round”, this is still inauthentic.

Facebook banned Fazze not because of its message, but because of the shady methods it used to spread that message. In contrast, users spreading misinformation authentically are generally left alone by Facebook.

The Fazze campaign and Facebook’s takedown resulted in significant media coverage of Russian disinformation and lent credence to the narrative that Russia is an important source of the anti-vaccine propaganda that floods social media.

But the Fazze campaign was a failure. Memes spread by Fazze “received few if any likes, and some were ridiculed by real people … the operation’s Instagram posts attracted around 1,000 likes combined, with most receiving zero”, according to Facebook. Attempts by Fazze to recruit influencers resulted in the campaign’s exposure; ultimately only two influencers signed up.

Meanwhile, the same week that Facebook banned Fazze, a video of an Indiana physician, Dr Dan Stock, making false or misleading claims about masks and vaccines at a local school board meeting went viral on social media. The Stock video racked up more than 92m engagement actions on Facebook – at least a thousand times more than Fazze’s campaign. But as a real person expressing his authentic views, and as an American speaking to fellow Americans, Stock appears to have garnered significantly less concern – and media coverage – than the ineffective but Russian-backed and inauthentic campaign.

When I worked at Facebook, I spent two and a half years combating inauthentic behavior; I was responsible for Facebook taking down inauthentic campaigns by two national governments, and became a whistleblower because Facebook was unwilling to prioritize my findings. It’s hence ironic that today, I’m arguing that the west is too focused on inauthentic behavior and not enough on harm done by people acting authentically. (My most important work was in the global south, where governments act with impunity with serious consequences.)

If Fazze was directed by the Russian government – as many suspect but remains unproven – I would argue that the campaign has been a success for Vladimir Putin. The media attention on Fazze played into the incorrect belief that Russia is responsible for significant amounts of western misinformation, inflating the perception of its power and influence.

This gives Putin too much credit. As the Stock video demonstrates, the misinformation is coming from inside the house.

The Fazze case illustrates that inauthentic behavior can receive attention disproportionate to impact. Furthermore, the media has frequently covered questionable allegations of inauthentic behavior, misleading news consumers about the most likely source of misinformation. Days before the 2019 British election, a researcher alleged that fake accounts were spreading misinformation, but our investigation found that real people were responsible. And in February 2020, when news outlets suspected a North Carolina Facebook page to be Russian interference, we found it to be run by a real American pretending to be a Russian troll.

Ultimately, by focusing on inauthenticity in the western world, the media and Facebook have been refighting the last war of 2016 – a focus with consequences. After my departure, Facebook failed to inhibit the Stop the Steal movement, which falsely alleged that Donald Trump had won the 2020 election, because the company could not determine if the movement was “a coordinated effort to delegitimize the election” or “free expression by users”. But misrepresentation and inauthentic behavior are only two out of 26 community standards. Facebook explicitly bans coordinating or advocating harmful/criminal activity, but nevertheless failed to stop extremists who organized on Facebook to storm the US Capitol on 6 January. “We learned a lot from these cases,” Facebook staff wrote in a post-mortem. But the damage to society and the rule of law had already happened. Today, Facebook is probably trying to figure out how to stop a repeat of 2020. Four years ago, it sought to avoid repeating the mistakes of 2016. Facebook must learn to respond more flexibly to new threats, or it will be conducting another post-mortem in 2025.

Sophie Zhang was a data scientist for Facebook for two and a half years. In 2021, she blew the whistle on the company’s failure to prevent politicians and world leaders from using the platform to deceive the public and harass opponents

Anti-vaccination protesters in Los Angeles. Photograph: David McNew/AFP/Getty Images
