The Post

Online content needs controls


Prime Minister Jacinda Ardern has done a superb job of setting the national mood and guiding our conversations in the days after the Christchurch terror attack. It has been said again and again that she has acted exactly as a leader should. Her natural empathy and strong moral principles are evident, and her speech to Parliament on Tuesday was a perfect example.

By pointing out that she will not speak the alleged killer’s name, preferring instead to honour the victims, Ardern has sensitively set the parameters of national grieving. She also voiced impatience with the technology companies that have been so incapable of controlling the spread of dangerous ideologies that they allowed 17 minutes of footage of a massacre to stream live to the world.

Just as the alleged killer appears to have been radicalised online, his act was designed for maximum online consumption. He went from observer to instigator and seems to have wanted his video and his writings to circulate as widely and quickly as possible in an online world that thrives on extremism and discord.

It is this ecosystem that Ardern referred to on Tuesday when she said that, while the ideas and language of division have existed for decades, ‘‘their form of distribution, the tools of organisation’’ are new. She added: ‘‘We cannot simply sit back and accept that these platforms just exist and that what is said on them is not the responsibility of the place where they are published. They are the publisher. Not just the postman. There cannot be a case of all profit, no responsibility.’’

When mass shootings have occurred in the United States, a denial of responsibility is a familiar response. Gun lobbyists argue that individuals, not guns, are to blame. Just as Ardern’s Government quickly called for a reform of gun laws, her position on social media shows she is determined to indicate where responsibility lies and find a way to make highly profitable tech companies do a better job of controlling content.

Facebook has stated that it removed 1.5 million copies of the alleged killer’s video in the first 24 hours after the attack. Of that number, 1.2 million were blocked at upload and never made it online. That means 300,000 copies circulated. Over at YouTube, a new copy appeared on the platform every second.

This speaks to both the enormity of the problem and the ghoulish appetites of those sharing and viewing videos. For many observers, there are questions about whether Facebook’s ability to moderate and monitor content has kept up with its rapid growth, and whether the introduction of livestreaming was premature.

There have been notorious precedents. A French terrorist live-streamed an attack from his mobile phone in 2016. A year later, a shooter in the US uploaded video of a murder to Facebook, where it stayed online for more than two hours. Facebook responded by saying it would add another 3000 reviewers to the 4500 people already vetting violent footage. But US information studies specialist Sarah T Roberts warned the increase was ‘‘a drop in the bucket’’ compared to the footage that is shared by Facebook’s more than two billion users.

‘‘It is not a matter of if another incident will happen, but when,’’ Roberts added. After the horrors in Christchurch, some form of regulation may follow. It is long overdue.


