The Post

Stopping the unstoppable

How does the chief censor cope when hundreds of potentially harmful videos are uploaded every day? Katie Kenny reports.

-

On Wednesday, October 9, two people died and more were injured in an antisemitic attack in the eastern German city of Halle.

Chief Censor David Shanks in Wellington learned of the attack at 6am the following day. It was a copycat of New Zealand’s March 15 terror attack, when an alleged white supremacist opened fire in two Christchurch mosques, killing 51 worshippers while broadcasting live on Facebook.

For the second time in just over six months, Shanks would find himself fronting media on issues relating to terrorist and violent extremist content online.

The German shooter’s platform of choice was streaming site Twitch, known for its video game content. He apologised to his viewers after failing to enter a synagogue where up to 80 people had gathered for Yom Kippur, the holiest day of the year in Judaism. He was later arrested.

Twitch confirmed about five people watched the livestream in real time and thousands of others saw it before it was flagged and removed. While it was still circulating on darker corners of the internet, it wasn’t easily found on the bigger social media platforms.

That was in contrast to the video of the Christchurch attack, which by any definition of the term went viral. Users attempted to re-upload it 1.5 million times on Facebook. YouTube at one point was removing one copy of it per second.

On March 20, Shanks classified the Christchurch video as objectionable because of its depiction and promotion of extreme violence and terrorism – meaning it’s illegal for anyone in New Zealand to view, possess, or distribute it. Three days later, he also banned a document, or manifesto, said to have been written by the terrorist.

That Thursday morning after the German attack, Shanks and several classification officers watched the Halle video. Reporters were already asking if he’d ban it.

By 11.30am, he made the call. ‘‘While this video is not filmed in New Zealand and fatalities are fewer than in Christchurch, the fundamentals of this publication are the same as that of the March 15 livestream,’’ he said in a statement. ‘‘It appears on the face of it to be a racially motivated terrorist attack depicting cold-blooded murder of innocent people.’’

An old model for a new age

In 1915, a conference of representatives of 45 organisations called for the introduction of a censorship system. They claimed: ‘‘The class of moving pictures at present exhibited in New Zealand constitutes a grave danger to the moral health and social welfare of the community.’’

The first film censor was appointed the following year. He snipped naughty bits from magazines and banned some books entirely.

The Office of Film and Literature Classification was established as an independent Crown entity under the Films, Videos, and Publications Classification Act 1993.

‘‘In 1993, the idea of the internet and what it could become was just a twinkle in the legislator’s eye,’’ Shanks says. Then, ‘‘everything was physical’’: tapes, books, magazines. ‘‘Fast-forward to 2017 and the universe is fundamentally changed in terms of how people consume and conceive of, and market and provide, media. When I came into the role [that year] I knew I’d need to match the framework against the reality.

‘‘I think about this role as fundamentally about being a media regulator, who has a responsibility to keep people safe from harm and also to protect people’s freedoms.’’

Shanks has a background in legal roles and came to the job from being in charge of health, safety and security at the Ministry of Education. He was thrust immediately into the limelight over the controversial Netflix series 13 Reasons Why.

The programme, targeted at teenagers, addresses or depicts rape, suicide, drug use, and bullying. It was easily accessible for young people to watch unsupervised via the Netflix streaming service.

Shanks introduced a new classification for the show: RP18. This meant anyone under 18 should only watch the programme with the support of an adult to process the topics raised in the series.

But 13 Reasons is, in one respect, not typical of the kinds of potentially harmful content viewed by young people in 2019.

‘‘We know from our research on young people that a large amount of their content is not from cinema or TV or even streamed services. It’s YouTube or other similar free tubes,’’ Shanks says.

‘‘If you think about that as an example, [YouTube’s] current stats are about 500 hours of content going up every minute. There is no sensible way you can have human moderation or classification of tubes generating that amount of content.’’

Whereas Shanks could see the second season of 13 Reasons coming, and speak with Netflix about its release, there is no way censors could know where the next white supremacist meme is coming from.

Canterbury University sociologist Michael Grimshaw points to the banning of the alleged Christchurch shooter’s so-called manifesto as further evidence of the problem.

‘‘The aim of banning manifestos worked when you could shut down the means of publication and also shut down the means of distribution; that is, in the world of physical media,’’ he says.

Now, documents circulate independently and can contain many embedded links, making each one much more than a single document.

‘‘So every manifesto is a multiplicity of parts that can be divided up and circulated, and so the model is not up to date,’’ Grimshaw says.

This is where digital solutions, such as artificial intelligence (AI) that finds and flags up dangerous content, enter the conversation.

Big platforms like YouTube and Facebook are already using AI to identify and remove extremist content, pornography or other types of material. Facebook last month announced a range of measures to better clamp down on violent extremists, terrorists and hate groups on its platforms. This includes using first-person military videos to train artificial intelligence to identify terror attacks like the live-streamed Christchurch massacre more quickly.
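To give a rough sense of one piece of that machinery, the sketch below is a minimal, hypothetical Python illustration of matching an upload against a database of hashes of videos already ruled objectionable – the general idea behind blocking re-uploads. It is not Facebook’s or YouTube’s actual system, which relies on perceptual hashing and machine-learning classifiers rather than exact file hashes; the BANNED_HASHES set and should_block helper are invented for illustration.

    # Illustrative sketch only, not any platform's real pipeline.
    import hashlib
    from pathlib import Path

    # Hypothetical SHA-256 digests of videos already classified objectionable.
    BANNED_HASHES = {
        "3f79bb7b435b05321651daefd374cdc681dc06faa65e374e38337b88ca046dea",
    }

    def file_hash(path: Path) -> str:
        # Hash the file in chunks so large videos need not fit in memory.
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def should_block(upload: Path) -> bool:
        # Exact matching is easily defeated by re-encoding or cropping, which
        # is why real platforms use perceptual hashes and trained classifiers;
        # this only sketches the basic lookup step.
        return file_hash(upload) in BANNED_HASHES

In practice an exact hash would miss the slightly altered copies that flooded platforms after March 15, which is why the shared industry databases use fuzzier, perceptual fingerprints alongside AI classifiers.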

The Office of Film and Literature Classification is developing a tool of its own. It’s essentially a filter for New Zealand’s sensitivities, applied over the top of the self-classification

DAVID UNWIN/STUFF Chief censor David Shanks is uncertain about the future of his office as the media landscape changes.
