The Guardian (USA)

I warned in 2018 that YouTube was fueling far-right extremism. Here's what the platform should be doing

- Becca Lewis

In the fall of 2018, I released a research report warning of a growing trend of far-right radicalization on YouTube. Specifically, I identified a loosely connected network of reactionary YouTubers, ranging from mainstream conservatives and libertarians all the way to overt white supremacists and neo-Nazis, who were all broadcasting their political ideas to young audiences. Tethered together by a shared opposition to “social justice warriors” and the mainstream media, they frequently collaborated with each other and amplified each other’s content. In the process, they made it extremely easy for a viewer to move bit by bit into more extremist content.

The following March, I watched in horror, along with much of the rest of the world, as a white supremacist gunman killed 51 people and injured 40 more at the Al Noor Mosque and the Linwood Islamic Centre in Christchurch, New Zealand. Throughout the chaos of the day, researchers parsed his manifesto and found that, under the layers of irony and memes, the message was quite clear. He had been radicalized to believe in the Great Replacement, a white nationalist conspiracy theory that claims that white populations are being purposefully replaced with (often Muslim) immigrants.

The shooter’s manifesto clearly spelled out his racist and Islamophobic beliefs, but it provided scant information on how he came to embrace them. On Monday, with the release of the Royal Commission’s inquiry into the attacks, we got a fuller picture: the Christchurch shooter was radicalized on YouTube, by many of the propagandists that other researchers and I had warned about. So why didn’t YouTube take action sooner, and what should it be doing now?

There are a million different ways YouTube could have taken action, and could take action now. It could enforce its terms of service more aggressively, or make those terms more robust. It could change its algorithm so that it stops recommending ever-more-extreme content. It could de-prioritize borderline content that acts as a first step to radicalization. It could refine its content moderation algorithms to catch content more effectively. And, in fact, YouTube consistently claims it has done many of those things.

And yet, there is often a great disconnect between the actions YouTube says it is taking and what users and creators actually experience. This is in part because those actions mean little if the platform has no clear idea of how it defines hate speech, extremism, harassment or borderline content, and what values it seeks to uphold. Indeed, YouTube has often backed itself into a corner by attempting to stay as “apolitical” as possible, turning deeply value-based judgments into the parsing of minor details. In an attempt to avoid accusations of politicized censorship, the platform has frequently tied itself up in knots, focusing its decisions on the smallest technicalities when determining whether a piece of content has violated its terms.

The great irony is that by attempting to stay apolitical, YouTube consistently makes the political choice not to care about or protect vulnerable communities. It can tweak its algorithms and update its policies as much as it likes, but it won’t truly address the underlying issues until it makes a firm commitment to protect Muslim creators and users of YouTube and to stop the spread of Islamophobia on its platform. This does not just mean stating this commitment clearly, although that would be a reasonable first step. (YouTube could, for example, follow the example of the New Zealand prime minister, Jacinda Ardern, and apologize for the role it played in facilitating the terrorist attack.) It also would mean devoting significant resources to that commitment and framing its approach to content along those lines.

Because, despite YouTube’s claims to be taking hate speech seriously, Islamophobia is still alive and well on the platform. Ben Shapiro, the conservative pundit who frequently promotes Islamophobic ideas, is thriving on YouTube, with almost 2.5 million subscribers and an additional 2.4 million on his outlet, the Daily Wire. Steven Crowder, a controversial creator with more than 5 million subscribers, has claimed that “Islamophobia is a perfectly rational ‘phobia’,” among other similar statements. This propaganda is coming not only from small, fringe creators but from some of the biggest political commentators on the platform.

In the end, YouTube’s approach strangely mirrors that of the New Zealand government in the lead-up to the attack. Muslim community members interviewed for the commission’s report said they had been raising the alarm about rising Islamophobia with the government, but that no one listened. As one Muslim New Zealander said, “The events of the day were presaged by so many tell-tale signs of its coming, all of which were evident and all of which were ignored by those who had power to act.”

Instead, the government was hyperfocused on potential terrorist threats from Muslim individuals, leading one interviewee to say that “they were watching us, not watching our backs”.

Likewise, social media platforms such as YouTube have consistently taken swift and decisive action against Isis recruitment channels and other threats they see coming from Muslim extremists, while simultaneously allowing widespread Islamophobic content to thrive. For YouTube, just like the New Zealand government, the question is whether it can watch the backs of Muslims instead of simply watching them.

Becca Lewis is a PhD candidate at Stanford University and a graduate affiliate at the University of North Carolina’s Center for Information, Technology, and Public Life

YouTube claims it is taking hate speech seriously, but Islamophobia is still alive and well on the platform. Photograph: Toby Melville/Reuters

A plaque memorializes the victims of the Christchurch shooting at Al Noor Mosque in 2019. Photograph: Kai Schwörer/Getty Images
