The Guardian Australia

'Law unto themselves': the Australian battle to curb Facebook and Twitter's power

- Paul Karp

Nationals MP Anne Webster and Labor MP Sharon Claydon are less concerned with why Donald Trump was taken off social media, and more concerned with what platforms such as Facebook are doing to stop online defamation and abuse.

Webster and Claydon are the co-chairs of the Parliamentary Friends of Making Social Media Safe, a group to “highlight the environment of social media and the risks associated” and to make the platforms more accountable. It now boasts more than 50 members thanks partly to Twitter and Facebook’s response to last week’s attack on the US Capitol.

For Webster, it’s personal. After winning a defamation case against a conspiracy theorist who falsely accused her of being “a member of a secretive paedophile network”, she wants Facebook treated as a publisher.

The decision of Twitter and other social media platforms to first remove posts and then suspend Trump’s account prompted outrage among some conservatives, including Nationals MP George Christensen and Liberal MP Craig Kelly.

The outspoken pair both favour changes to stop social media platforms from censoring any lawful content created by their users – a push in the direction of more free speech and less responsibility for content on the part of the platforms.

Webster tells Guardian Australia that, although she is glad the Trump controversy and the Chinese foreign ministry tweet accusing Australia of war crimes in Afghanistan have “put fire under the debate”, there is now a broader discussion to be had about the regulation of social media.

Webster says social media companies “are a law unto themselves, largely”. Her defamation case “cost me dearly, both financially and emotionally”, and she says most aggrieved people cannot afford to fight defamatory posts in court.

The legal position on social media defamation is unclear. University of Sydney professor David Rolph, a defamation law expert, says that “in principle” the social media companies can be liable.

Just as media companies were held liable for comments on their Facebook pages in the Dylan Voller case because they were “responsible for everything that flows” from setting up a public page, “that analysis might extend to the social media platform itself”, Rolph says.

He says there are also “problems of jurisdiction and enforcement” in taking on overseas-based companies, so plaintiffs rarely go after the internet giants, as well as a possible immunity in the Broadcasting Services Act if social media companies can argue they are an “internet content host”.

Webster says in her case Facebook’s handling was “appalling – it took months” and was only prompted by her taking legal action.

“Freedom of speech must be valued but it shouldn’t give people the right to incite a riot or lie about people.

“Social media companies have profited from online conversations but there are rights and responsibilities … If they’re not held responsible the number of falsehoods will increase at the rate of knots.”

Mia Garlick, Facebook’s director of policy in Australia and New Zealand, has told a parliamentary committee the company did geo-block some posts from Webster’s accuser and the account was removed after repeated breaches of community standards. She blamed “additional legal complexities in that case” for the delay.

Claydon got involved due to her constituents’ experiences of “online harassment, posting intimate photos, cyber-stalking, and of women who were found by family violence perpetrators through social media platforms”.

“I had a growing interest because there were posts and pages that allowed the abuse of women – and when people complained they fell into a deep dark void somewhere, and the complaints didn’t really go anywhere.”

According to Claydon, users agree not to peddle hate speech, incite violence, or deliberately spread dangerous misinformation – so the platforms are not doing anything wrong by removing users who breach the terms, such as Trump.

For Claydon, the de-platforming of Trump raises the question of “why it took four years when he’s clearly in breach of their terms” – and the fact social media platforms have found courage only on the eve of a new presidency shows the limits of self-regulation.

“They regard themselves as big global entities, and are not particularly accountable to anyone,” she says.

According to the e-safety commissioner, 14% of Australians have faced online hate speech. Claydon wants to build cross-party support to prevent social media becoming “a dangerous weapon for half our citizens”, rather than “let those with the biggest mouths rush out and determine the shape” of the reform conversation.

Despite calls from Christensen to swing back in the direction of free speech, the government is also heading in the direction of creating a safer space.

In December, the communications minister, Paul Fletcher, released a draft online safety bill proposing to give the e-safety commissioner powers to order the take-down of harmful content.

The e-safety commissioner, Julie Inman Grant, has said the bill would ensure moderation of social media is applied “fairly and consistently” but does not address concerns from some in the Coalition about de-platforming.

The legislation would be the first of its kind to tackle not just illegal content “but also serious online harms, including image-based abuse, youth cyberbullying and … serious adult cyber-abuse with the intent to harm”.

There is also a voluntary code on disinformation, to be devised by the social media giants and enforced by the Australian Communications and Media Authority, expected to be finalised by mid-year.

While senior Coalition figures including the acting prime minister, Michael McCormack, and the deputy Liberal leader, Josh Frydenberg, expressed disquiet at Trump’s removal, there were no suggestions the government would change course to accommodate Christensen’s call to abolish community standards in favour of an approach where anything but unlawful speech goes.

Fletcher has signalled he is cold on the idea of going beyond the existing package, arguing that it already creates “a public regulatory framework within which decisions about removing content are made by social media platforms (and, if necessary, can be overridden by decisions of government)”.

One common strand in reform calls is that participants want to see greater transparency around decisions that are made to block posts or remove users.

The Australian Competition and Consumer Commission chairman, Rod Sims, who led the digital platforms review, has said that, given the degree of control the platforms exercise over what we see and read, “we definitely need the government to get to grips with this; we can’t just leave it with the digital platforms”.

The e-safety commissioner says the platforms “aren’t always transparent in how they enforce and apply these policies and it’s not always clear why they may remove one piece of content and not another”.

Transparency would be improved by the online safety bill’s basic online safety expectations, which would “set out the expectation on platforms to reflect community standards, as well as fairly and consistently implementing appropriate reporting and moderation on their sites”, she tells Guardian Australia.

“This could include, for example, the rules that platforms currently apply to ensuring the safety of their users online, including from threats of violence.”

Liberal MP Trent Zimmerman supported the platforms’ decision to remove Trump, whom he accused of “stoking the flames” of a threat to the peaceful transition of power in the US.

Yet the episode demonstrated the “inconsistent standards being applied”, as Trump was removed while “many authoritarian leaders remain able to use these platforms for their propaganda”.

“We need clear, transparent rules. And it would be helpful to clarify what avenues there are to seek explanation or appeal those decisions.”

Despite unease at the highest levels of the Australian government about de-platformin­g, the prevailing mood is still for more – not less – regulation.

For those such as Webster or Claydon’s constituen­ts, basic enforcemen­t of existing standards would be an improvemen­t.

The Twitter and Facebook logos, along with binary code, are seen in this illustration taken on 26 November 2019. Photograph: Dado Ruvić/Reuters

Nationals MP Anne Webster won a defamation case against a conspiracy theorist who falsely accused her of being ‘a member of a secretive paedophile network’. Photograph: Mike Bowers/The Guardian
