The Post

Online censorship funding doubles

- Henry Cooke

The Government is doubling the funding for one of its censorship offices so it can crack down on terrorist content alongside child exploitation images.

It is also moving to strengthen censorship laws and will investigate whether social media should be regulated more heavily in line with traditional domestic media.

The package is the main domestic component of Prime Minister Jacinda Ardern’s more globally-focused Christchurch Call.

The Call is a set of pledges and practices she is promoting following the Christchurch terror attack of March 15 – and the spread of the livestream of the attack and the alleged killer’s manifesto.

The $17 million funding boost will go towards the Chief Censor and the Censorship Compliance Unit within the Department of Internal Affairs, a technical team of 13 people that is currently focused on detecting and investigating child sexual exploitation images. The funds will pay for about 17 new staff focused specifically on stopping the spread of violent terrorist content online.

But Internal Affairs officials cautioned yesterday that they lack any legal authority to force social media companies to remove posts – and that removing terrorist content from the internet completely was impossible. “Fundamentally we can’t eliminate this content from the internet; what we can do is try to prevent New Zealanders as much as possible from seeing that content,” an official said.

New Zealand laws do allow for prosecution of those who spread or possess content deemed “objectionable” by the chief censor – such as the alleged killer’s manifesto and livestream. It took two days for the chief censor to rule the livestream was objectionable, something the officials said could be sped up with new funding for his office.

The officials said the team had relationships with international agencies such as Interpol and social media companies for its work on child exploitation.

It generally found the social media companies to be cooperative when it issued takedown requests – despite mostly lacking jurisdiction.

Internal Affairs Minister Tracey Martin said the changes would allow the Government to work very quickly with social media companies.

“While terrorist and violent extremist content is objectionable and therefore illegal under current law, the changes mean we can target this material in a similar way to how we target child sexual exploitation material, by working quickly with online content hosts to remove it as quickly as possible,” Martin said.

The latest far Right terrorist attack in Halle, Germany, has confirmed several things. Firstly, that the Christchurch mosque shootings have provided something of a model for other white supremacists around the world.

Following on from Anders Breivik in Norway, and Christchurch, the aim is mass casualties from target groups. Also, following both Breivik and Christchurch, a manifesto is provided that employs white supremacist and ultra-nationalist rhetoric to justify what is about to happen. As the Halle shooter noted, he wanted to “strengthen the morale of other suppressed whites”.

At the core of these arguments is the notion that “whites” are being replaced – demographically and in terms of national sovereignty and identity – usually, it is argued, by Muslims, although this is often associated with a conspiracy that it’s being orchestrated by a Jewish elite. As marchers in Charlottesville chanted, “Jews will not replace us”.

Perhaps the only difference between these major attacks is the target: Muslims in Christchurch, immigrants in El Paso, Jews in Pittsburgh and Halle.

And then there is the livestreaming, in the Christchurch case, on Facebook.

The “Christchurch Call” is an attempt to seek international co-operation, involving both the major online platforms and other countries and agencies, to monitor and act against extreme racist content and violence in cyberspace. And it was great to see the New Zealand Chief Censor acting so fast to deem the Halle shooting video objectionable. But will these actions be enough?

In the Halle case, the video was streamed on Twitch, a subsidiary of Amazon. The material was removed from Twitch after 30 minutes, and with about 2000 views. However, and this is where it is going to get challenging, that was not the end of the video’s circulation.

A number of smaller platforms or subchannels then got involved and posted the material from the Halle shooter. One researcher has estimated that there have been more than 50,000 views subsequently – and presumably this is growing.

As we heard last week at a meeting in Melbourne to discuss violent extremism, hosted by Hedayah and Deakin University, there are new online options for the extreme Right.

The decentralisation of online services has produced platforms that individuals or groups can host themselves using new software – freeing them from reliance on major platforms like YouTube, Facebook or Twitter.

The Pittsburgh shooter used Gab; others use Telegram, but there are many more. One estimate is that there are at least 150 catering to far Right groups and ideologies, with 100 of those established this year.

Platforms such as Telegram and Gab claim to be free speech sites which do not censor the material being posted and which resist any sort of external intervention or regulation.

The point is that while these sub-channels often have small audiences and reach, they are part of the online ecosystem that allows extremist groups to recruit, and to circulate their ideology and tactics internationally. And they are not subject to moderation or regulation.

As one expert in Melbourne noted, they are “takedown-resistant”.

This year has confirmed that Christchurch has provided something of a model for other extremists. It was not a one-off.

Secondly, tactics and options are changing for extremists. At the moment, when major platforms like Facebook are doing more to manage content and as countries and agencies such as the European Court of Justice impose new requirements, the challenge is going to be to manage self-hosted and dispersed sites that cater specifically for extremist groups and activists.

There is growing evidence that what happens online has real-world consequences. Research by Karsten Müller and Carlo Schwarz has shown that there is a real-time correlation between an increase in racist and hate speech on Twitter and hate crimes directed at religious and ethnic minorities.

Equally, when there are internet outages in countries like Germany or the US, the number of hate crimes goes down.

In the week after the Christchurch mosque shootings, the UK saw a spike in hate crimes, with 95 recorded. Eighty-five referenced the Christchurch shootings. The El Paso shooter praised what happened in Christchurch.

The online ecosystem that encourages racial and religious vilification, and provides both rhetoric and tactics, is proving difficult to counter. It is becoming more difficult as the far Right migrates from using major platforms to those that are sympathetic to their cause – and which are not easily subject to regulation or removal.

Distinguished Professor Paul Spoonley is from the College of Humanities and Social Sciences at Massey University. His research focuses on white supremacy movements, racism, immigration and population.

GETTY: Mourners outside the synagogue in Halle, Germany, after two people were killed by a gunman, who livestreamed on Twitch his unsuccessful attempt to attack the synagogue.
