Sunday Star-Times

Henry Cooke

How the internet promotes hate.

- Sunday Politics henry.cooke@stuff.co.nz

Everything about this horrific act was designed for the internet. Terrorist attacks are always intended to spread fear through the media, but this was something more than that. This attacker knew his motives and modus operandi would spread all over the world in minutes, both on mainstream news websites and through social media. He was right, and it did.

The giant technology companies that are now such an essential part of our lives must reckon with how they allowed it. Media should too. It is obviously difficult to shut this stuff down immediately, but YouTube is already extremely adept at stamping out copyright-infringing content.

If it cannot confidently stop this kind of content being uploaded with the tools it has to hand, simply stopping all uploads for a brief period would have been worth considering. It would be worth the disruption to normal service and would be the exact opposite of letting terrorism “win”.

But the technology companies have another thing to reflect on: the algorithmic encouragement of radicalisation.

The internet cannot be blamed for this. No-one but the accused can. Racism existed long before the internet and would not cease if everything was turned off tomorrow.

Yet we cannot ignore the ways the internet encourages extreme racist views. YouTube, in particular, hosts an appallingly large library of hate-filled content that doesn’t quite meet the bar to be banned. Worse, it appears to naturally lead people into this content, with each recommended video more radical than the last.

This is not because the people who design YouTube want viewers to become extremists. It is because the recommendation engine, as many have written about and demonstrated, naturally seems to lead people from casual interests to “harder” content.

As Zeynep Tufekci noted in the New York Times, you start with jogging videos and are soon recommended content about ultra-marathons. Google, which owns YouTube, has repeatedly promised to fix these problems and has rolled out solutions in an attempt to do so. But a Wall Street Journal investigation last February found this was not enough and that extreme content was still often recommended.

Our security agencies appear to be blind to this kind of radicalisation right now, given this man was not on a single watchlist despite several worrying posts in the lead-up to the attack.

Because these recommendation engines send users towards more extreme content, that same extreme content becomes lucrative to create, both in real dollar terms and in terms of online following. It seeps out of social media and into the mainstream language of politicians and the press.

Online, there is a small community of anonymous Twitter accounts obsessed with attacking the one refugee MP in our Parliament, and some politicians interact with them. These politicians generally engage with the milder content on the accounts and may have no idea what other messages they send, but they should look closer.

In our Parliament, we have no-one close to Australian Senator Fraser Anning, who blamed the attack on the Muslims themselves. Keeping it that way requires constant vigilance.

