The Middletown Press (Middletown, CT)

YouTube bans false vaccine claims


YouTube announced a sweeping crackdown on vaccine misinformation Wednesday that booted popular anti-vaccine influencers from its site and deleted false claims that have been made about a range of immunizations.

The video-sharing platform said it will no longer allow users to baselessly speculate that approved vaccines, like the ones given to prevent the flu or measles, are dangerous or cause diseases.

YouTube’s latest attempt to stem a tide of vaccine misinformation comes as countries around the globe struggle to convince a somewhat vaccine-hesitant public to accept the free immunizations that scientists say will end the COVID-19 pandemic that began 20 months ago. The tech platform, which is owned by Google, already tried to ban COVID-19 vaccine misinformation last year, at the height of the pandemic.

“We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines,” YouTube said in a blog post.

Up until Wednesday, anti-vaccine influencers, who have thousands of subscribers, had used YouTube to stoke fears around vaccines that health experts point out have been safely administered for decades. The YouTube channel of an organization run by environmental activist Robert F. Kennedy Jr. was one of several popular anti-vaccine accounts that was gone by Wednesday morning.

In an emailed statement to The Associated Press, Kennedy criticized the ban: “There is no instance in history when censorship and secrecy have advanced either democracy or public health.”

YouTube declined to provide details on how many accounts were removed in the crackdown.

Under its new policy, YouTube says it will remove misinformation about any vaccine that has been approved by health authorities, such as the World Health Organization, and is currently being administered. False claims that those vaccines are dangerous or cause health issues, like cancer, infertility or autism — theories that scientists have discredited for decades but have endured on the internet — should also be removed.

In March, Twitter began labeling content that made misleading claims about COVID-19 vaccines and said it would ban accounts that repeatedly share such posts. Facebook, which also owns Instagram, had already prohibited posts claiming COVID-19 vaccines cause infertility or contain tracking microchips, and in February announced it would similarly remove claims that vaccines are toxic or can cause health problems such as autism.

Yet popular anti-vaccine influencers remain live on Facebook, Instagram and Twitter, where they actively use the platforms to sell books or videos. On Facebook and Instagram alone, a handful of anti-vaccine influencers still have a combined 6.4 million followers, according to the social media watchdog group the Center for Countering Digital Hate.
