Financial Mail

Fanning the flame of fake news

Disinformation and conspiracy theories abound on YouTube, which prioritises video views over accuracy

- @shapshak

Forget Reddit — YouTube is the cesspool of the Internet. In this handwringing phase over the evils of social media, much has been made of Reddit, the chat site that is a haven for conspiracy theories and right-wing hatred.

The site — which has prided itself on not “interfering” in its readers’ discourse — recently shut down numerous discussion boards linked to hatemongering and outlandish theories.

The spark for this was the horrific shooting at Marjory Stoneman Douglas High School in the US, where 17 people were killed by another mentally unbalanced person who could freely buy an assault rifle. Immediately after the tragedy, a maelstrom of propaganda emerged about “crisis actors” and other patent nonsense.

I’d never heard of this odd phrase, which will go down in history as one of the nastiest forms of propaganda. What kind of person describes the teenage survivors of a gun massacre as actors, pretending to be victims so as to “take our guns away”?

The nexus for spreading such conspiracy theories, it turns out, was YouTube — a free-for-all of blatant untruths, deranged theories and other absurd disinformation.

Once considered the greatest way to share videos online, YouTube has become the worst of the Internet.

Last year the site was rocked by scandal when major brands pulled their advertising after it was displayed alongside videos depicting hate speech, anti-Semitism and extremism.

YouTube says it tries to control the spread of hatemongering and fake news, but that creators of this content are quick to adapt and circumvent its attempts at censorship.

So Google can build artificial intelligence that can spot diabetic retinopathy, an illness that leads to blindness, but it can’t solve hate speech on its own video service. Who else doesn’t believe that?

YouTube has algorithms in place to try to keep its 1.5bn viewers online for longer. It does that through a panel of recommended “up next” videos. But numerous commentators report that these suggested videos become increasingly violent, controversial or outlandishly inaccurate.

“YouTube is the most overlooked story of 2016. Its search and recommendation algorithms are misinformation engines,” tweeted University of North Carolina academic and New York Times contributor Zeynep Tufekci.

Former Google engineer Guillaume Chaslot wrote that, in the run-up to the November 2016 US presidential election, “80% of recommended [YouTube] videos were favourable to Donald Trump, whether the initial query was ‘Trump’ or ‘Clinton’. A large proportion of these recommendations were divisive and fake news”.

We live in a disinformation age in which Facebook, Google and Twitter may have been usurped by “bad actors” — but they are not blameless themselves. In YouTube’s case, its very own “give us your eyeballs for longer” recommendation algorithms are the problem.