Online content needs controls
Prime Minister Jacinda Ardern has done a superb job of setting the national mood and guiding our conversations in the days after the Christchurch terror attack. It has been said again and again that she has acted exactly as a leader should. Her natural empathy and strong moral principles are evident, and her speech to Parliament on Tuesday was a perfect example.
By declaring that she will not speak the alleged killer’s name, preferring instead to honour the victims, Ardern has sensitively set the parameters of national grieving. She also voiced impatience with the technology companies that have proved so incapable of controlling the spread of dangerous ideologies that they allowed 17 minutes of footage of a massacre to stream live to the world.
Just as the alleged killer appears to have been radicalised online, his act was designed for maximum online consumption. He went from observer to instigator and seems to have wanted his video and his writings to circulate as widely and quickly as possible in an online world that thrives on extremism and discord.
It is this ecosystem that Ardern referred to on Tuesday when she said that, while the ideas and language of division have existed for decades, "their form of distribution, the tools of organisation" are new. She added: "We cannot simply sit back and accept that these platforms just exist and that what is said on them is not the responsibility of the place where they are published. They are the publisher. Not just the postman. There cannot be a case of all profit, no responsibility."
When mass shootings have occurred in the United States, a familiar response has been to deflect responsibility. Gun lobbyists argue that individuals, not guns, are to blame. Just as Ardern’s Government quickly called for a reform of gun laws, her position on social media shows she is determined to indicate where responsibility lies and find a way to make highly profitable tech companies do a better job of controlling content.
Facebook has stated that it removed 1.5 million copies of the alleged killer’s video in the first 24 hours after the attack. Of that number, 1.2 million were blocked at upload and never made it online. That means 300,000 copies circulated. Over at YouTube, a new copy appeared on the platform every second.
This speaks to both the enormity of the problem and the ghoulish appetites of those sharing and viewing videos. For many observers, there are questions about whether Facebook’s ability to moderate and monitor content has kept up with its rapid growth, and whether the introduction of livestreaming was premature.
There have been notorious precedents. A French terrorist live-streamed an attack from his mobile phone in 2016. A year later, a shooter in the US uploaded video of a murder to Facebook, where it stayed online for more than two hours. Facebook responded by saying it would add another 3000 reviewers to the 4500 people already vetting violent footage. But US information studies specialist Sarah T. Roberts warned the increase was "a drop in the bucket" compared with the footage shared by Facebook’s more than two billion users.
"It is not a matter of if another incident will happen, but when," Roberts added. After the horrors in Christchurch, some form of regulation may follow. It is long overdue.