Tech platforms face ‘whack-a-mole’ situation in battling health misinformation
When Dr. Garth Graham thinks about health misinformation on social media platforms, he envisions a garden.
No matter how bountiful or verdant that garden is, even the head of YouTube’s global health division admits it’s often in need of tending.
“How do you weed and pull out the bad information?” he asked.
“But also...how do you plant the seeds and make sure people have access to good information as well as high quality information?”
For social media companies, these have become perennial questions that have only grown in importance as the number of platforms multiplied and people began spending increasing amounts of time online.
Now, it’s not uncommon to spot misinformation with almost every scroll.
A 2022 paper published in the Bulletin of the World Health Organization reviewed 31 studies examining how prevalent misinformation is. The analysis found misinformation in up to 51 per cent of social media posts associated with vaccines, up to 28.8 per cent of content associated with COVID-19, and up to 60 per cent of posts related to pandemics.
An estimated 20 to 30 per cent of YouTube videos about emerging infectious diseases were also found to contain inaccurate or misleading information.
The consequences can be harmful, if not deadly.
Research released by the Council of Canadian Academies in 2023 estimated that COVID-19 misinformation alone contributed to more than 2,800 Canadian deaths and at least $300 million in hospital and ICU costs.
Platforms take the risks seriously, Graham said in an interview. “We are always concerned about anything that may produce harm.”
That concern often leads platforms to remove anything violating their content policies.
YouTube, for example, has banned content denying the existence of some medical conditions or contradicting health authority guidance on prevention and treatment.
Examples embedded in its medical misinformation policy show the company removes posts promoting turpentine, gasoline and kerosene as treatments for certain conditions because ingesting these substances can cause death. Ivermectin, used to treat parasitic worms in animals and humans, and hydroxychloroquine, a malaria drug, are also barred from being promoted as COVID-19 cures.
When it comes to vaccines, YouTube bans videos alleging immunizations cause cancer or paralysis.
Facebook and Instagram parent company Meta Platforms Inc. refused to comment for this story and TikTok did not respond to a request for comment, but in broad strokes, these companies have similar policies to YouTube.