The Guardian Australia

Sydney church stabbing: social media pages ‘infamous’ for spreading misinformation taken down

- Josh Taylor and Mostafa Rachwani

Social media pages “infamous” for spreading misinformation have been taken down after the Wakeley church stabbing attack, the New South Wales premier, Chris Minns, said on Thursday, while expressing alarm at the “wildfire” of rumour and graphic content still proliferating on tech platforms.

On Monday night YouTube was live broadcasting Bishop Mar Mari Emmanuel’s service at the Assyrian Christ the Good Shepherd church. After the stabbing occurred, video clips spread through WhatsApp groups before police had arrived at the scene.

A 19-year-old man, Dani Mansour, fronted court on Thursday charged with riot, affray and damage to property for his alleged actions outside the church, where an estimated 2,000 people gathered on Monday night.

Mansour was granted strict bail with a ban on social media access. NSW police based their investigation on Mansour’s Instagram posts, the court heard on Thursday. Police continue to comb through social media material to identify other alleged rioters.

WhatsApp, owned by Meta, is the platform most cited in recent days as a source of much of the violent imagery and misinformation. In recent years it has attempted to limit the speed at which misinformation can be shared by restricting the forwarding of content to five chats at once and by labelling messages that have been forwarded multiple times; such highly forwarded messages can only be sent on to one chat at a time.

Meta said in 2020 the change had helped reduce the spread of viral messages on the platform by 70%.

Since introducing end-to-end encryption of communications on the platform as a measure to protect user privacy, Meta no longer has access to the content of messages and so cannot monitor what is spreading. But the company now says it has technology to spot accounts engaging in abnormal behaviour, with 8m accounts banned a month – 75% of which are banned before those accounts are reported by users.


Minns told reporters on Thursday that NSW police and the state government were concerned about the amount of unsubstantiated rumour and graphic content still accessible on social media sites.

“It proves very difficult to foster community cohesion and harmony, to calm down the community, to send messages of unity in a difficult period when social media firms still continue to disseminate terrible pieces of information, untruths, rumours that circulate like wildfire through an anxious community,” he said.

He said in the immediate aftermath of the attack, the NSW government liaised with the federal government and the eSafety commissioner to have pages “that have become famous or infamous for spreading misinformation in the community” taken down.

“They are down, which is good news [to] stop, in many instances, [misinformation] about damage to mosques and churches [that] was being spread like wildfire and inflaming tensions in the community.”

Minns did not specify on which platform the pages were hosted.

The eSafety commissioner has no powers to regulate the spread of misinformation, but since the Bondi stabbing attack on Saturday and the church attack on Monday has been in communication with the platforms about the removal of violent content. Violent content or content inciting violence is classified as “class 1” material under Australian classification law.

The takedown process has involved informal requests to remove some of the more graphic content related to the Bondi stabbing attack, as well as formal notices issued to Facebook’s parent company, Meta, and X over content related to the church stabbing.

On Wednesday night, a spokesperson for the eSafety commissioner said Meta had complied with the notices, while the compliance of X – the platform formerly known as Twitter before it was bought by the billionaire Elon Musk in 2022 – was still being reviewed.

The attacks and the social media fallout have drawn attention back to the federal government’s proposed misinformation legislation, which would give stronger powers to the Australian Communications and Media Authority. Under the bill, Acma could force social media companies to get tougher on “content [that] is false, misleading or deceptive, and where the provision of that content on the service is reasonably likely to cause or contribute to serious harm”.

The bill’s introduction was delayed last year after initial consultation on the proposal led to claims it would stifle speech online, and would not protect religious speech. But the government has remained committed to releasing the legislation later this year.

On Wednesday the communications minister, Michelle Rowland, said the incidents highlighted the need for action.

“If we needed to see any case study about what can happen when misinformation spreads at speed and scale, we only need to look at what happened in western Sydney the other night – the damage to public property, threats to life and health,” she told the ABC.

“We know the platforms have incredible powers and abilities to be able to examine content on their platforms. Their algorithms are opaque. They need to do more.”

X did not respond to a request for comment.

Photograph: Ayush Kumar/AFP/Getty Images. Chris Minns and NSW police commissioner Karen Webb. The premier has expressed concern at the amount of misinformation and violent imagery on social media.
