Big tech has a vaccine misinformation problem
WITH less than half the United States population fully vaccinated for Covid-19 and as the Delta variant sweeps the nation, the US surgeon general issued an advisory that called misinformation an urgent threat to public health. The advisory said efforts by social-media companies to combat misinformation are “too little, too late and still don’t go far enough.” The advisory came more than a year after the World Health Organization warned of a Covid-related “infodemic.”
There’s good reason to be concerned. A study in the UK and the US found that exposure to online misinformation about Covid-19 vaccines reduced the number of people who said they would get vaccinated and increased the number of people who said they would not.
A serious threat online is that fake news spreads faster than verified news from credible sources. Articles connecting vaccines and death have been among the content people engage with most.
Algorithms on social-media platforms are primed for engagement. Recommendation engines in these platforms create a rabbit-hole effect by pushing users who click on anti-vaccine messages toward more anti-vaccine content. Individuals and groups that spread medical misinformation are well-organized to exploit the weaknesses of the engagement-driven ecosystems on social-media platforms.
Social media is being manipulated on an industrial scale, including a Russian campaign pushing disinformation about Covid-19 vaccines. Researchers have found that people who rely on Facebook as their primary source of news about the coronavirus are less likely to be vaccinated than people who get their coronavirus news from any other source.
While social-media companies have actively tagged and removed misinformation about Covid-19 generally, stories about vaccine side effects are more insidious because conspiracy theorists may not be trafficking in outright false information so much as selectively distorting the risks of vaccination. These efforts are part of a well-developed disinformation ecosystem on social-media platforms that extends to offline anti-vaccine activism.
Here are two key steps social-media companies can take to reduce vaccine-related misinformation.
▪ BLOCK KNOWN SOURCES OF VACCINE MISINFORMATION. There have been popular anti-vaccine hashtags such as #vaccineskill. Though the hashtag was blocked on Instagram two years ago, it was allowed on Facebook until July 2021. Aside from vaccines, misinformation on multiple aspects of Covid-19 prevention and treatment abounds, including misinformation about the health benefits of wearing a mask.
Twitter recently suspended US Rep. Marjorie Taylor Greene for a couple of days, citing a post of Covid misinformation. But social-media companies could do a lot more to block disinformation spreaders. Reports suggest that most of the vaccine disinformation on Facebook and Twitter comes from a dozen users, referred to as the "disinformation dozen," who remain active on social media. The list is topped by businessman and physician Joseph Mercola and prominent anti-vaccine activist Robert F. Kennedy Jr.
Evidence suggests that infodemic superspreaders engage in coordinated sharing of content, which increases their effectiveness in spreading disinformation and, correspondingly, makes it all the more important to block them. Social-media platforms need to more aggressively flag harmful content and remove people known to traffic in vaccine-related disinformation.
▪ DISCLOSE MORE ABOUT MEDICAL MISINFORMATION.
Facebook claims that it has taken down 18 million pieces of coronavirus misinformation. However, the company doesn't share data about misinformation on its platforms. Researchers and policy-makers don't know how much vaccine-related misinformation is on the platforms, or how many people are seeing and sharing it.
Another challenge is distinguishing between different types of engagement. My own research studying medical information on YouTube found different levels of engagement: some people simply view information relevant to their interests, while others comment on and provide feedback about it. The issue is how vaccine-related misinformation fits into people's preexisting beliefs and to what extent their skepticism of vaccines is accentuated by what they are exposed to online.
Social-media companies can also partner with health organizations, medical journals and researchers to more thoroughly and credibly identify medical misinformation.
Data about social media will help researchers answer key questions about medical misinformation, and the answers in turn could lead to better ways of countering the misinformation.