The Middletown Press (Middletown, CT)
YouTube bans false vaccine claims
YouTube announced a sweeping crackdown on vaccine misinformation Wednesday that booted popular anti-vaccine influencers from its site and deleted false claims about a range of immunizations.
The video-sharing platform said it will no longer allow users to baselessly speculate that approved vaccines, like the ones given to prevent the flu or measles, are dangerous or cause diseases.
YouTube’s latest attempt to stem the tide of vaccine misinformation comes as countries around the globe struggle to convince a vaccine-hesitant public to accept the free immunizations that scientists say will end the COVID-19 pandemic, which began 20 months ago. The tech platform, which is owned by Google, already tried to ban COVID-19 vaccine misinformation last year, at the height of the pandemic.
“We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines,” YouTube said in a blog post.
Up until Wednesday, anti-vaccine influencers with thousands of subscribers had used YouTube to stoke fears around vaccines that health experts point out have been safely administered for decades. The YouTube channel of an organization run by environmental activist Robert F. Kennedy Jr. was one of several popular anti-vaccine accounts that were gone by Wednesday morning.
In an emailed statement to The Associated Press, Kennedy criticized the ban: “There is no instance in history when censorship and secrecy have advanced either democracy or public health.”
YouTube declined to provide details on how many accounts were removed in the crackdown.
Under its new policy, YouTube says it will remove misinformation about any vaccine that has been approved by health authorities, such as the World Health Organization, and is currently being administered. False claims that those vaccines are dangerous or cause health issues, like cancer, infertility or autism — theories that scientists have discredited for decades but have endured on the internet — should also be removed.
In March, Twitter began labeling content that made misleading claims about COVID-19 vaccines and said it would ban accounts that repeatedly share such posts. Facebook, which also owns Instagram, had already prohibited posts claiming COVID-19 vaccines cause infertility or contain tracking microchips, and in February announced it would similarly remove claims that vaccines are toxic or can cause health problems such as autism.
Yet popular anti-vaccine influencers remain active on Facebook, Instagram and Twitter, where they use the platforms to sell books or videos. On Facebook and Instagram alone, a handful of anti-vaccine influencers still have a combined 6.4 million followers, according to the social media watchdog group the Center for Countering Digital Hate.