Facebook may start curbing posts that promote anti-vaccine content.
As public pressure intensifies over how Facebook promotes misinformation about vaccines, the social media giant is considering removing anti-vaccination content from its recommendation systems, Bloomberg reported Thursday.
Facebook has become something of a haven for a small but vocal community of parents who reject mainstream medical wisdom about immunizations, often citing junk science or conspiracy theories, and opt out of having their children vaccinated.
This week, Facebook has come under fire for promoting anti-vaccination material, especially ads targeting women in regions with high numbers of measles cases, according to reporting from the Daily Beast.
The outcry intensified after Rep. Adam Schiff, D-Calif., wrote a letter to founder and chief executive Mark Zuckerberg asking how Facebook planned to protect users from misleading material about vaccinations. Schiff sent a similar letter to Sundar Pichai, chief executive of Google, which is also under scrutiny over how its search engine and subsidiary YouTube promote potentially dangerous misinformation.
“The algorithms which power these services are not designed to distinguish quality information from misinformation or misleading information, and the consequences of that are particularly troubling for public health issues,” Schiff wrote to both tech executives.
In a response to Bloomberg about questions raised by Schiff, Facebook said it is “exploring additional measures to combat the problem,” including “reducing or removing this type of content from recommendations, including Groups You Should Join.” The company also said it is considering demoting this content in search results and ensuring that “higher quality and more authoritative” information is available.
These tensions come as the United States faces a troubling resurgence of measles, a disease the Centers for Disease Control and Prevention declared eliminated in the United States in 2000, thanks to widespread use of the measles, mumps and rubella vaccine.
More than 100 cases have been confirmed in 10 states this year, already surpassing the total number of cases confirmed in 2016. Last month, Washington Gov. Jay Inslee, a Democrat, declared a state of emergency after 25 cases of measles cropped up in a single county where nearly a quarter of kids attend school without measles, mumps and rubella immunizations.
Facebook has contended that most anti-vaccination content doesn’t violate its community guidelines against inciting “real-world harm.” The company told The Washington Post this week that it didn’t believe removing such material was the best way to raise awareness of the facts about vaccinations. Facebook said it thinks accurate counterspeech is a more productive safeguard against misinformation.
“While we work hard to remove content that violates our policies, we also give our community tools to control what they see as well as use Facebook to speak up and share perspectives with the community around them,” Facebook told The Post on Wednesday.
Facebook did not immediately respond to a request for comment Friday on whether the company’s stance on these issues has changed.
The platform has a spotty record when it comes to the quality of information in popular health content seen by its users.
A recent study from the Credibility Coalition and Health Feedback, a network of scientists who evaluate the accuracy of health media coverage, found that the majority of the most-clicked health stories on Facebook in 2018 were fake or contained a significant amount of misleading information.
The study looked at the top 100 health stories with the most engagements on social media and had a network of experts assess their credibility. The study found less than half were “highly credible.”
Vaccinations ranked among the three most popular story topics.