“Anti-vaxxers” spreading conspiracy theories on Facebook
As a disturbing number of measles outbreaks crop up across the United States, Facebook is facing challenges combating widespread misinformation about vaccinations on its platform, which has become a haven for the antivaccination movement.
The World Health Organization recently named “vaccine hesitancy” as one of the biggest global health threats of 2019. But on Facebook, in public pages and private groups with tens of thousands of members, false information about vaccines — largely stemming from a now-debunked 1998 study that tied immunizations to autism — is rampant and tough to pin down. In the bubble of closed groups, users warn against the dangers of vaccinations, citing pseudoscience and conspiracy theories.
Facebook has publicly declared that fighting misinformation is one of its top priorities. But when it comes to policing misleading content about vaccinations, the site faces a thorny challenge. The bulk of antivaccination content doesn’t violate Facebook’s community guidelines for inciting “real-world harm,” according to a spokesperson, and the site’s algorithms often promote unscientific pages or posts about the issue. Parents are left to wade through the mire, and as the viral spread of fake news has shown, many users have trouble distinguishing between reliable sources and unreliable ones.
The rise of “anti-vaxx” Facebook groups has coincided with a resurgence of measles, a disease that was declared “eliminated” in the U.S. in 2000 thanks to the measles, mumps and rubella vaccine. But cases have increased in recent years, and at least 10 states have seen outbreaks this winter.
Last month, Democratic Washington Gov. Jay Inslee declared a state of emergency after 35 cases of measles cropped up in a single county, where nearly a quarter of kids attend school without measles, mumps and rubella immunizations. The WHO has named the highly contagious disease a leading cause of death for children.
Although the spread of misinformation about immunizations has potentially fatal repercussions, a Facebook spokesperson said the company doesn’t believe removing such content would help to increase awareness.
“While we work hard to remove content that violates our policies, we also give our community tools to control what they see as well as use Facebook to speak up and share perspectives with the community around them,” Facebook said in a statement that was emailed to The Washington Post. “If the content they’re posting crosses the line and violates our policies, we would remove the content as soon as we become aware of it.”
The company is considering options to make accurate information about vaccinations more accessible to users, but these efforts are in the early stages. In the meantime, Facebook sees factually accurate counterspeech by users as a possible safeguard, the spokesperson said.
Wendy Sue Swanson, a pediatrician at Seattle Children’s Hospital and spokeswoman for the American Academy of Pediatrics, recently met with Facebook strategists about dealing with public health issues, including misinformation about vaccines, on the platform. Swanson said it’s not Facebook’s job to police the dialogue around immunizations, but rather to make sure users have ample access to scientifically valid content.
“You wouldn’t go see a pediatrician who doesn’t hold medical certification, but on the internet, you might listen to them,” Swanson said. “Facebook isn’t responsible for changing quacks, but they do have an opportunity to change the way information is served up.”