YouTube, the Great Radicalizer

At one point during the 2016 presidential election campaign, I watched a bunch of videos of Donald Trump rallies on YouTube. I was writing an article about his appeal to his voter base and wanted to confirm a few quotations.

Soon I noticed something peculiar. YouTube started to recommend and “autoplay” videos for me that featured white supremacist rants, Holocaust denials and other disturbing content.

Since I was not in the habit of watching extreme right-wing fare on YouTube, I was curious whether this was an exclusively right-wing phenomenon. So I created another YouTube account and started watching videos of Hillary Clinton and Bernie Sanders, letting YouTube’s recommender algorithm take me wherever it would.

Before long, I was being directed to videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of September 11. As with the Trump videos, YouTube was recommending content that was more extreme than the mainstream political fare I had started with.

Intrigued, I experimented with nonpolitical topics. The same basic pattern emerged. Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons.

It seems as if you are never “hard core” enough for YouTube’s recommendation algorithm. It promotes, recommends and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century.

This is not because a cabal of YouTube engineers is plotting to drive the world off a cliff. A more likely explanation has to do with the nexus of artificial intelligence and Google’s business model. (YouTube is owned by Google.) For all its lofty rhetoric, Google is an advertising broker, selling our attention to companies that will pay for it. The longer people stay on YouTube, the more money Google makes.

What keeps people glued to YouTube? Its algorithm seems to have concluded that people are drawn to content that is more extreme than what they started with — or to incendiary content in general.

Is this suspicion correct? Good data is hard to come by; Google is loath to share information with independent researchers. But we now have the first inklings of confirmation, thanks in part to a former Google engineer named Guillaume Chaslot.

Mr. Chaslot worked on the recommender algorithm while at YouTube. He grew alarmed at the tactics used to increase the time people spent on the site. Google fired him in 2013, citing his job performance. He maintains the real reason was that he pushed too hard for changes in how the company handles such issues.

The Wall Street Journal conducted an investigation of YouTube content with the help of Mr. Chaslot. It found that YouTube often “fed far-right or far-left videos to users who watched relatively mainstream news sources,” and that such extremist tendencies were evident with a wide variety of material. If you searched for information on the flu vaccine, you were recommended anti-vaccination conspiracy videos.

It is also possible that YouTube’s recommender algorithm has a bias toward inflammatory content. In the run-up to the 2016 election, Mr. Chaslot created a program to keep track of YouTube’s most recommended videos as well as its patterns of recommendations. He discovered that whether you started with a pro-Clinton or pro-Trump video on YouTube, you were many times more likely to end up with a pro-Trump video recommended.

Combine this finding with other research showing that during the 2016 campaign, fake news, which tends toward the outrageous, included much more pro-Trump than pro-Clinton content, and YouTube’s tendency toward the incendiary seems evident.

YouTube has recently come under fire for recommending videos promoting the conspiracy theory that the outspoken survivors of the school shooting in Parkland, Florida, are “crisis actors” masquerading as victims. Jonathan Albright, a researcher at Columbia University in New York, recently “seeded” a YouTube account with a search for “crisis actor” and found that following the “up next” recommendations led to a network of some 9,000 videos promoting that and related conspiracy theories.

What we are witnessing is the computational exploitation of a natural human desire: to look “behind the curtain,” to dig deeper into something that engages us. As we click and click, we are carried along by the exciting sensation of uncovering more secrets and deeper truths. YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.

Human beings have many natural tendencies that need to be monitored in the context of modern life. For example, our craving for fat, salt and sugar, which served us well when food was scarce, can lead us astray in an environment in which fat, salt and sugar are all too plentiful and heavily marketed to us. So too our natural curiosity about the unknown can lead us astray on a website that leads us too much in the direction of lies, hoaxes and misinformation.

In effect, YouTube has created a restaurant that serves us increasingly sugary, fatty foods, loading up our plates as soon as we are finished with the last meal. Over time, our tastes adjust, and we seek even more sugary, fatty foods, which the restaurant dutifully provides. When confronted, the restaurant managers reply that they are merely serving us what we want.

This situation is especially dangerous given how many people — especially young people — turn to YouTube for information. Google’s cheap and sturdy Chromebook laptops, which now make up more than 50 percent of the pre-college education laptop market in the United States, typically come loaded with ready access to YouTube.

This state of affairs is unacceptable but not inevitable. There is no reason to let a company make so much money while potentially helping to radicalize billions of people, reaping the financial benefits while asking society to bear so many of the costs.
