Online censorship funding doubles
The Government is doubling the funding for one of its censorship offices so it can crack down on terrorist content alongside child exploitation images.
It is also moving to strengthen censorship laws and will investigate whether social media should be regulated more heavily in line with traditional domestic media.
The package is the main domestic component of Prime Minister Jacinda Ardern’s more globally focused Christchurch Call.
The Call is a set of pledges and practices she is promoting following the Christchurch terror attack of March 15 – and the spread of the livestream of the attack and the alleged killer’s manifesto.
The $17 million funding boost will go towards the Chief Censor and the Censorship Compliance Unit within the Department of Internal Affairs, a technical team of 13 people that is currently focused on detecting and investigating child sexual exploitation images. The funds will see about 17 new staff focused specifically on stopping the spread of violent terrorist content online.
But Internal Affairs officials cautioned yesterday that they lack any legal authority to force social media companies to remove posts – and that removing terrorist content from the internet completely was impossible. ‘‘Fundamentally we can’t eliminate this content from the internet; what we can do is try to prevent New Zealanders as much as possible from seeing that content,’’ an official said.
New Zealand laws do allow for prosecution of those who spread or possess content deemed ‘‘objectionable’’ by the chief censor – such as the alleged killer’s manifesto and livestream. It took two days for the chief censor to rule the livestream was objectionable, something the officials said could be sped up with new funding for his office.
The officials said the team had relationships with international agencies such as Interpol and social media companies for its work on child exploitation.
It generally found the social media companies to be cooperative when it issued takedown requests – despite mostly lacking jurisdiction.
Internal Affairs Minister Tracey Martin said the changes would allow the Government to work very quickly with social media companies.
‘‘While terrorist and violent extremist content is objectionable and therefore illegal under current law, the changes mean we can target this material in a similar way to how we target child sexual exploitation material, by working quickly with online content hosts to remove it as quickly as possible,’’ Martin said.
The latest far Right terrorist attack in Halle, Germany, has confirmed several things. Firstly, that the Christchurch mosque shootings have provided something of a model for other white supremacists around the world.
Following on from Anders Breivik in Norway, and Christchurch, the aim is mass casualties from target groups. Also, following both Breivik and Christchurch, a manifesto is provided that employs white supremacist and ultra-nationalist rhetoric to justify what is about to happen. As the Halle shooter noted, he wanted to ‘‘strengthen the morale of other suppressed whites’’.
At the core of these arguments is the notion that ‘‘whites’’ are being replaced – demographically and in terms of national sovereignty and identity – usually, it is argued, by Muslims, although this is often associated with a conspiracy that it’s being orchestrated by a Jewish elite. As marchers in Charlottesville chanted, ‘‘Jews will not replace us’’.
Perhaps the only difference in the major attacks committed in 2019 is that the target has been different – Muslims in Christchurch, immigrants in El Paso, Jews in Pittsburgh and Halle.
And then there is the livestreaming, in the Christchurch case, on Facebook.
The ‘‘Christchurch Call’’ is an attempt to seek international co-operation, involving both the major online platforms and other countries and agencies, to monitor and act against extreme racist content and violence in cyberspace. And it was great to see the New Zealand Censor acting so fast to deem the Halle shooting video objectionable. But will these actions be enough?
In the Halle case, the video was streamed on Twitch, a subsidiary of Amazon. The material was removed from Twitch after 30 minutes, and with about 2000 views. However, and this is where it is going to get challenging, that was not the end of the video’s circulation.
A number of smaller platforms or sub-channels then got involved and posted the material from the Halle shooter. One researcher has estimated that there have been more than 50,000 views subsequently – and presumably this is growing.
As we heard last week at a meeting in Melbourne to discuss violent extremism, hosted by Hedayah and Deakin University, there are new online options for the extreme Right.
The decentralisation of online platforms has generated platforms that can be hosted by individuals or groups using new software – and allowing them not to be reliant on major platforms like YouTube, Facebook or Twitter.
The Pittsburgh shooter used Gab; others use Telegram, but there are many more. One estimate is that there are at least 150 catering to far Right groups and ideologies, with 100 of those established this year.
Platforms such as Telegram and Gab claim to be free speech sites which do not censor the material being posted and which resist any sort of external intervention or regulation.
The point is that while these sub-channels often have small audiences and reach, they are part of the online ecosystem that allows extremist groups to recruit, and to circulate their ideology and tactics internationally. And they are not subject to moderation or regulation.
As one expert in Melbourne noted, they are ‘‘takedown-resistant’’.
This year has confirmed that Christchurch has provided something of a model for other extremists. It was not a one-off.
Secondly, tactics and options are changing for extremists. At the moment, when major platforms like Facebook are doing more to manage content and as countries and agencies such as the European Court of Justice impose new requirements, the challenge is going to be to manage self-hosted and dispersed sites that cater specifically for extremist groups and activists.
There is growing evidence that what happens online has real-world consequences. Research by Karsten Müller and Carlo Schwarz has shown that there is a real-time correlation between an increase in racist and hate speech on Twitter and hate crimes directed at religious and ethnic minorities.
Equally, when there are internet outages in countries like Germany or the US, the number of hate crimes goes down.
In the week after the Christchurch mosque shootings, the UK saw a spike in hate crimes, with 95 recorded. Eighty-five referenced the Christchurch shootings. The El Paso shooter praised what happened in Christchurch.
The online ecosystem that encourages racial and religious vilification, and provides both rhetoric and tactics, is proving difficult to counter. It is becoming more difficult as the far Right migrates from using major platforms to those that are sympathetic to their cause – and which are not easily subject to regulation or removal.
Distinguished Professor Paul Spoonley is from the College of Humanities and Social Sciences at Massey University. His research focuses on white supremacy movements, racism, immigration and population.