Santa Cruz Sentinel

Vaccine hesitancy and the misinformation conundrum

By Catherine Rampell

It’s hard to teach an algorithm to identify misinformation when humans themselves can’t agree on what misinformation is — and when political leaders can’t decide whether we should have more or less of whatever it entails.

Lately, vaccine hesitancy has been calcifying into outright vaccine refusal. That’s partly because so many Americans have been fed a steady diet of misinformation and conspiracy theories about vaccine risks. Roughly 90% of Americans who don’t plan to get vaccinated say they fear possible side effects from the shot more than they fear COVID-19 itself, a recent YouGov poll found. Roughly half of those who reject the vaccine believe the U.S. government is using the vaccine to microchip the population. (Hey, that would at least explain the global chip shortage.)

Where are Americans getting these kooky ideas? Politicians and pundits have been quick to blame social media platforms.

That’s understandable. Misinformation has flourished on Facebook and other sites for many years. Unlike truths, lies are unconstrained by reality, which means they can be crafted to be maximally interesting, sexy, terrifying. In other words, they’re optimized to generate traffic, which happens to be good for tech companies’ bottom lines. “Fake news” — whether fashioned by enterprising Macedonian teenagers, malicious state actors, U.S. political groups, snake-oil salesmen or your standard-issue tinfoil-hatters — drove tons of engagement on these sites in the lead-up to the 2016 election and has continued to do so.

Whether out of principle or financial self-interest, tech executives initially said they weren’t in the business of taking down content simply because it was false. Intense blowback followed, along with pressure for tech companies to recognize how their tools were being exploited to undermine democracy, stoke violence and generally poison people’s brains; the firms have since ramped up fact-checking and content moderation.

During the pandemic, Facebook has removed “over 18 million instances of COVID-19 misinformation” and made less visible “more than 167 million pieces of COVID-19 content debunked by our network of fact-checking partners,” the company wrote in a blog post over the weekend. This was in response to President Joe Biden’s comments Friday that social media platforms were “killing people” by allowing vaccine misinformation to flourish.

On the one hand, yes, social media companies absolutely still can and must do more to scrub misinformation from their platforms. Case in point: A recent report on the “Disinformation Dozen” estimated that 12 accounts are responsible for 65% of anti-vaccine content on Facebook and Twitter. Their claims include that vaccines have killed more people than COVID and are a conspiracy to “wipe out” Black people. All 12 remain active on at least Facebook or Twitter.

But on the other hand: Actually doing more to stamp out this misinformation is challenging. Not because these firms lack the workers or technology to identify problematic content; the real obstacle is political.

Democrats are mad that the companies suppress too little speech, allowing conspiracy theories to proliferate. Republicans are mad that these companies are suppressing too much speech, since often it’s right-wing content that gets (rightly) flagged as fake. Absent some political consensus on which way these companies are at fault, or regulation that tells Facebook and other platforms what content is acceptable (a move that would likely face First Amendment challenges), the firms will always be nervous about censoring too aggressively.

The fact that a lot of this same disinformation is being disseminated on prime-time cable also makes it politically harder for tech companies to justify taking it down.

It’s not merely a few no-name Facebook accounts promoting anti-vaccine nonsense. Fox News host Tucker Carlson, for instance, recently gave an entire monologue linking the government’s coronavirus vaccination effort to historical forced sterilization campaigns. His show then posted the clip on Facebook, which flagged it with a generic note about how coronavirus vaccines have been tested for safety.

Should Carlson’s insinuations have been removed entirely? That’s risky. As conspiracy-theorizing becomes more mainstream, and gobbles up an entire political party and the media ecosystem that sustains it, policing those conspiracy theories and the conservative leaders who promote them appears more politically motivated. Not coincidentally, the White House has reserved its harshest criticism about anti-vaccine content for social media companies rather than conservative news organizations parroting similar messages. Already despised by both parties, Big Tech is a safer target.
