Fanning the flame of fake news
Disinformation and conspiracy theories abound on YouTube, which prioritises video views over accuracy
Forget Reddit — YouTube is the cesspool of the Internet. In this hand-wringing phase over the evils of social media, much has been made of Reddit, the discussion site that has been a haven for conspiracy theories and right-wing hatred.
The site — which has prided itself on not “interfering” in its readers’ discourse — recently shut down numerous discussion boards linked to hate-mongering and outlandish theories.
The spark for this was the horrific shooting at Marjory Stoneman Douglas High School in Parkland, Florida, where 17 people were killed by yet another mentally unbalanced person who could freely buy an assault rifle. Immediately after the tragedy, a maelstrom of propaganda emerged about “crisis actors” and other patent nonsense.
I’d never heard of this odd phrase, which will go down in history as one of the nastiest forms of propaganda. What kind of person describes the teenage survivors of a gun massacre as actors, pretending to be victims so as to “take our guns away”?
The nexus for spreading such conspiracy theories, it turns out, was YouTube — a free-for-all of blatant untruths, deranged theories and other absurd disinformation.
Once considered the greatest way to share videos online, YouTube has become the worst of the Internet.
Last year the site was rocked by scandal when major brands pulled their advertising after their ads appeared alongside videos depicting hate speech, anti-Semitism and extremism.
YouTube says it tries to control the spread of hate-mongering and fake news, but that creators of this content are quick to adapt and circumvent its attempts at censorship.
So Google can build artificial intelligence that can spot diabetic retinopathy, an eye disease that leads to blindness, but it can’t solve hate speech on its own video service. Who believes that?
YouTube has algorithms in place to try to keep its 1.5bn viewers online for longer. It does that through a panel of recommended “up next” videos. But numerous commentators report how these suggested videos grow increasingly violent, controversial or outlandishly inaccurate.
“YouTube is the most overlooked story of 2016. Its search and recommendation algorithms are misinformation engines,” tweeted University of North Carolina academic and New York Times contributor Zeynep Tufekci.
Former Google engineer Guillaume Chaslot wrote in the run-up to the November 2016 US presidential elections that “80% of recommended [YouTube] videos were favourable to Donald Trump, whether the initial query was ‘Trump’ or ‘Clinton’. A large proportion of these recommendations were divisive and fake news”.
We live in a disinformation age in which Facebook, Google and Twitter may have been exploited by “bad actors” — but they are not blameless themselves. In YouTube’s case, its very own “give us your eyeballs for longer” recommendation algorithms are the problem.