The Reporter (Lansdale, PA)

COVID-19, vaccine hesitancy and misinformation

- Catherine Rampell

It’s hard to teach an algorithm to identify misinformation when humans themselves can’t agree on what misinformation is — and when political leaders can’t decide whether we should have more or less of whatever it entails.

Lately, vaccine hesitance has been calcifying into outright vaccine refusal. That’s partly because so many Americans have been fed a steady diet of misinformation and conspiracy theories about vaccine risks. Roughly 90% of Americans who don’t plan to get vaccinated say they fear possible side effects from the shot more than they fear COVID-19 itself, a recent YouGov poll found. Roughly half of those who reject the vaccine believe the U.S. government is using the vaccine to microchip the population. (Hey, that would at least explain the global chip shortage.)

Where are Americans getting these kooky ideas? Politicians and pundits have been quick to blame social media platforms.

That’s understandable. Misinformation has flourished on Facebook and other sites for many years. Unlike truths, lies are unconstrained by reality, which means they can be crafted to be maximally interesting, sexy, terrifying. In other words, they’re optimized to generate traffic, which happens to be good for tech companies’ bottom lines.

Whether out of principle or financial self-interest, tech executives initially said they weren’t in the business of taking down content simply because it was false. (This included, infamously, Holocaust-denial claims.) Intense blowback followed, along with pressure for tech companies to recognize how their tools were being exploited to undermine democracy, stoke violence and generally poison people’s brains; the firms have since ramped up fact-checking and content moderation.

During the pandemic, Facebook has removed “over 18 million instances of COVID-19 misinformation” and made less visible “more than 167 million pieces of COVID-19 content debunked by our network of fact-checking partners,” the company wrote in a blog post over the weekend. This was in response to President Joe Biden’s comments that social media platforms were “killing people” by allowing vaccine misinformation to flourish.

On the one hand, yes, social media companies absolutely still can and must do more to scrub misinformation from their platforms. Case in point: A recent report on the “Disinformation Dozen” estimated that 12 accounts are responsible for 65% of anti-vaccine content on Facebook and Twitter. But on the other hand: Actually doing more to stamp out this misinformation is challenging. Not because these firms lack the workers or technology to identify problematic content; the real obstacle is political.

Politicians of both parties hate Big Tech’s approach to content moderation and think it should change — but propose diametrically opposite directions.

Democrats are mad that the companies suppress too little speech, allowing conspiracy theories to proliferate. Republicans are mad that these companies are suppressing too much speech, since often it’s right-wing content that gets (rightly) flagged as fake. The fact that a lot of this same disinformation is being disseminated on primetime cable also makes it politically harder for tech companies to justify taking it down.

Fox News host Tucker Carlson, for instance, recently gave an entire monologue linking the government’s coronavirus vaccination effort to historical forced sterilization campaigns. His show then posted the clip on Facebook, which flagged it with a generic note about how coronavirus vaccines have been tested for safety.

Should Carlson’s insinuations have been removed entirely? That’s risky. As conspiracy-theorizing becomes more mainstream, and gobbles up an entire political party and the media ecosystem that sustains it, policing those conspiracy theories and the conservative leaders who promote them appears more politically motivated.

Now, one could argue that tech firms should step up and impose the moderation policies they think are right, political (and perhaps financial) fallout be damned. Perhaps these companies could more forcefully rebut Republicans’ claims of politically motivated censorship and “shadow-banning” by pointing out that right-wing content still dominates the most popular posts on Facebook every day.

But if even White House officials appear tentative about picking fights with the right-wing industrial complex, it’s not surprising that tech firms would follow suit.
