The Post

Down the online rabbit hole

The internet was supposed to bring us closer together. But some corners of it are a seething snakepit of hate speech, conspiracy theories and misogyny. Katie Kenny and Tommy Livingston report.

-

Ten minutes after the gunshots had stopped ringing through the Masjid Al Noor in Christchurch last Friday, James was sent a link to the livestreamed attack. The accused was still on the run, and emergency services were still arriving at the scene.

In the following minutes, he was sent more links – this time to message boards – where he watched an anonymous community begin to spread the video, along with the man’s manifesto, as far and as wide as possible.

‘‘As soon as it was posted there was a group of people who got to work making sure this material went viral,’’ says James, who asked us not to publish his surname. ‘‘No matter what Facebook or Twitter did, they were never going to be able to get rid of it.’’

In the following hours, he saw the global giants of the tech world struggle to contain the video (which has been classified in New Zealand as ‘‘objectionable’’, meaning it’s banned) as it was downloaded and uploaded again all over the internet. This is what the attacker had wanted, James assumes.

New technologies should be bringing us closer together. And often, they do. The internet allows us to overcome geographical and social barriers, and interact with new people and perspectives. But in other ways, it pulls us further apart.

Facebook, Twitter, Google and other online platforms are designed to give us more of what we want, and less content from strangers we disagree with. What we see in our personalised, online worlds is determined by our previous behaviour and recommendation algorithms.

For years, platforms have been promising to fix the likes of YouTube’s ‘‘Up Next’’ system, so they don’t so readily drag users down rabbit holes of hate speech, conspiracy theories, and hyperpartisan, misogynist and racist videos.

Those rabbit holes aren’t as hard to find as you might think. While sites like the ones used by the accused are more obscure than Facebook and Twitter, alt-Right ideology can easily be found on YouTube and Instagram. Popular online porn sites also have an array of videos with racist and white supremacist themes. Comments sections contain links to more intense and vitriolic content.

It’s natural for people to search out others who support their world views, making them feel safe and validated. It can be liberating for geographically isolated people to be able to connect with like-minded individuals. So we end up organising ourselves into homogenous online groups, known as echo chambers, says Walter Quattrociocchi, a computer scientist at the University of Venice and a world-leading researcher on the subject. This is where the problem starts.

The search for information online is driven by emotion rather than truth, he says. So countering ill-informed views with facts doesn’t change people’s minds. People who don’t trust official institutions in real life, or science, are drawn to conspiracy-related content online. When faced with dissenting information, they only become more committed to their erroneous beliefs.

When the rhetoric within these ecosystems escalates, ‘‘one bar at a time’’, Quattrociocchi says, then polarising and even radical opinions emerge. ‘‘The more you are engaged by an echo chamber the more you tend to be extreme with regards to the identity of the shared narrative.’’

At this stage, a person is probably isolating themselves socially in real life, too. It’s not clear what sort of person in particular is vulnerable to being radicalised online, he says. ‘‘We’re in the middle of radical change and we’re still trying to understand what’s going on.’’

The man accused of Friday’s massacre represents the worst-case scenario for online extremism. Before the attack, he’d set up social media accounts, posted photos of his weapons, and linked to a rambling manifesto laced with references to alt-Right online communities.

He began his Facebook livestream of the killings with a casual reference to an internet meme. Anonymous users from across the world supported and encouraged him online.

Many of them then watched the massacre live, or within minutes. They celebrated the video and shared it widely. (Facebook said it removed 1.5 million copies of the video from its platform within 24 hours.)

All signs point to the gunman being steeped in a culture of what cyberhate expert and Troll Hunting author Ginger Gorman describes as predator trolling – repeated, sustained threats or attacks on an individual or group through the use of electronic devices, which result in real-life harm to the target.

The social posts, manifesto and video were probably meticulously planned to arouse media attention and trick journalists into providing a platform for the gunman’s ideology, drawing more people – mainly young white men – into his extremist echo chamber.

It’s difficult to separate the issues of radicalisation, terrorism, and predator trolling, Gorman says. Many of the most notorious predator trolls are white supremacists. Often, they spent their childhoods online, ‘‘from a very young age imbibing torrents of hate, and they get radicalised into these behaviours and ideologies’’.

When politicians and public figures tout bigotry, and news outlets quote them, ‘‘this type of thinly veiled white supremacy’’ becomes an accepted part of public discourse, she says. ‘‘It’s normalised and it shouldn’t be. There’s plenty of evidence to show it leads to real-life harm. Let’s remember the Holocaust didn’t start with murder; it started with hate speech.’’

It is that very hate speech to which Hunter Pollack believes social media platforms need to pay more attention. His 18-year-old sister Meadow was killed by Nikolas Cruz when he opened fire at Marjory Stoneman Douglas High School in Florida last year.

Cruz had posted pictures of his weapons and other threatening images on Instagram in the lead-up to the attack – but they went unnoticed by the platform.

‘‘Instagram should have blocked him and alerted police,’’ Pollack says. ‘‘Social media companies need to be more proactive and less reactive.’’

Sites that promote violence and extremist views are the breeding ground for people like Cruz, he says. Those who follow through on their threats become idolised and cherished in these communities, which encourages others to be like them.

‘‘These killers are motivated by the internet. They can read forums and fan pages about what other killers did. They are talked about so much and these isolated people want the same thing so much. They will kill because they want to be famous and be a celebrity.’’

Gorman agrees it’s ‘‘high time law enforcement and social media companies connected the dots and started viewing predator trolling as a canary in a coalmine’’.

But Netsafe chief executive Martin Cocker says while sites like the ones used in the Christchurch case host graphic imagery and allow people to spout racist views, they are also used by people who need to communicate anonymously in countries with oppressive regimes.

‘‘Some of these technologi­es enable people to do what we support – like fight for freedom. But they are the exact same sites which are used for the complete opposite.’’

Trying to stamp out sites that promote extremist ideologies will not work, according to Cocker. Even a unified approach by multiple governments would struggle to shut a site down completely, because a majority of governments around the world would have to agree on a consistent approach.

‘‘People who want to run a site which contravenes those rules could easily find a country which would allow them.’’

Short-term fixes include telco companies blocking certain sites – which happened after the Christchurch video was distributed. However, that is a temporary move, due to easily accessible technology that allows users to bypass blocks. There is also a social dilemma about restricting an entire site.

‘‘Our law typically leans towards the idea that anything which is not illegal can be accessed, even if we don’t like it.

‘‘In blocking something like [the sites used by the accused], we are blocking stuff which is harmful. But if we block the whole site we are also blocking a whole lot of conversations which people have the right to have – even if we find it offensive.’’

Working with large social media platforms is better than throwing them under the bus, Cocker believes. ‘‘Once the video was up, the big social media platforms worked very hard [to get it down]. I am a lot happier with that than the sites who hosted and promoted the video who, when we requested they take it down, gave us two fingers . . .’’

But John Parsons, an internet safety and risk assessment consultant who lectures at schools, says that while nobody could have conceived of the issues we’d be facing now when the likes of Facebook were built, ‘‘we have to expect more from them’’.

Parents often tell him their children have been exposed to objectiona­ble material on social media. When he tells them to report it, they often say they have, multiple times.

‘‘That shows these systems are still focused on making money above all, and they need to better respond to our needs. I’m firmly of the opinion they need to employ more people. Thousands more.’’

Hate speech doesn’t happen in one dark corner of the internet, he says. It’s everywhere, and spreads ‘‘at the speed of light’’. Because of this it’s hard to get a sense of the scale of the problem. If social media sites recorded incidents of content designed to spread hate and civil unrest when they removed it, and shared that data with governments, we would better understand what we’re dealing with.

‘‘But it starts with good reporting mechanisms from the organisati­ons themselves.’’

In the meantime, parents need to explain to children, in a way they’ll understand, what has happened, and keep young children off social media. ‘‘Don’t dismiss their fears. But do point out all the people around them who support them and keep them safe.’’

InternetNZ chief executive Jordan Carter agrees serious analysis will need to take place to determine what could have been done differently. ‘‘We need to have a broad-based conversation about this. We need to have Facebook and Google at the table, we need to have the government at the table, the broader tech community, and the Muslim community.’’

But we will need to grapple with the fact that the massacre was not just the internet’s fault. ‘‘We are going to need to talk about how we deal with some of the underlying fears and hatred which gives space to these conversations.

‘‘That is not a technology problem, that is a deep values conversati­on.’’

‘‘Let’s remember the Holocaust didn’t start with murder; it started with hate speech.’’

Ginger Gorman, author of Troll Hunting

Within minutes, video of last week’s shootings at the Al Noor Mosque in Christchurch was spreading across the internet. (Photo: Stuff)

Netsafe chief executive Martin Cocker says trying to stamp out all extremist sites won’t work.

Florida shooting suspect Nikolas Cruz posted pictures of weapons on Instagram before the attack.

As police tended victims, the global giants of the tech world struggled to contain the spread of the video. (Photo: Stuff)
