Irish Independent

Who will get to decide what can and can’t be said online in a post-Trump era?

- Adrian Weckler TECHNOLOGY EDITOR

It’s been a big week for tech companies and content moderation. US President Trump was excommunicated from social media platforms ranging from Facebook and YouTube to Twitter and Reddit. Right-wing social media platform Parler was shut out from Amazon Web Services, as well as the Apple and Google app stores.

While many have cheered the tech companies’ actions, they do raise serious questions about control: who should decide whether elected officials, or others, have a voice online?

Do the events of the last week show that civil society doesn’t have the guts to legislate for the big calls, hoping instead that a handful of billionaire tech CEOs will make the right decisions?

This unease was articulated most clearly by German chancellor Angela Merkel in her response to Twitter’s Trump ban.

Ms Merkel said that it’s better for society to regulate speech “in accordance with the laws and within a framework defined by the legislator, not by the decision of the management of social media platforms”. French ministers agreed.

While this response might be dismissed as part of a wider European narrative, that the tech giants are too powerful and need tighter regulation anyway, it points to the same obvious gap in governance.

Amazon’s intervention adds another question: how far down the technology stack should moderation responsibility go? Simply within the site or social media service itself? Or deeper, to a hosting provider, an app store or even a domain registrar?

We know that advertisers increasingly feel an obligation to pull commercial support from controversial content.

But might responsibility soon rest with a mobile or broadband telecoms provider, as Irish online safety and anti-piracy campaigners argued in the 2000s?

Few in politics or in the tech industry are keen to grasp what is a fundamental ethical quandary: exactly where responsibility for platforming content starts and finishes.

“It’s not a technical problem, it’s a society problem,” says Tanya Lokot, an associate professor in digital media and society at Dublin City University’s School of Communications who specialises in internet freedom, censorship and internet governance.

“Part of the problem is that the companies themselves are entirely unprepared for the kind of power they now wield as spaces for public debate. I don’t think any of them set out to be that.”

Cloudflare, a large internet infrastructure provider, has been to the fore of the debate in recent years.

In 2017, it removed the neo-Nazi site The Daily Stormer from its services. In 2019, it did the same thing with 8chan following violence that was planned on the site.

On both occasions, Cloudflare CEO Matthew Prince admitted to being uneasy about being the arbiter of who or what gets to have a presence online.

“Without a clear framework as a guide for content regulation, a small number of companies will largely determine what can and cannot be online,” he wrote.

“Law enforcement, legislators, and courts have the political legitimacy and predictability to make decisions on what content should be restricted. Companies should not.”

Those supporting the status quo say that issues of communication or speech don’t arise, as Twitter, Facebook, YouTube and others are private companies that set their own rules and are entitled to enforce them.

However, when a hosting company, or a number of them, cuts off access, it becomes “less like booing a speaker off the stage than cutting off the electricity to a building”, said Sara Fischer, a media reporter at US website Axios, describing Amazon’s action against Parler.

What might happen to make decision-making clearer? EU lawmakers are introducing tighter measures on social media firms which will make them more accountable for content deemed unacceptable to civic institutions.

This may take some of the power, as well as the responsibi­lity, away from tech CEOs.

“The Digital Services Act recognises that the infrastructure we have may not be sufficient,” says Ms Lokot.

“And so maybe some of that power should be devolved back to the states or each country, because obviously the context in each country is also different.”

Well-known industry commentators such as Ben Evans believe that this EU move will be to international moderation standards what the General Data Protection Regulation (GDPR) is to privacy rules, becoming a global law through the back door.

“There probably isn’t any one perfect answer as to which institution should be regulating all of it,” says DCU’s Ms Lokot.

“The only consensus so far is that it’s really hard to make rules for content moderation, because the context keeps changing,” she says.

“There’s a reason why Facebook’s content moderation manual is now practically as vast as an Encyclopedia Britannica and yet they still have to make exceptions because it’s the edge cases that cause these issues. I think the only consensus is that companies shouldn’t have all the power and states shouldn’t have all the power. Polarisation doesn’t just happen because of the Facebook algorithm. It happens because there’s inequality and it’s structural.”

PHOTO: REUTERS / ERIN SCOTT. Defending free speech: The National Guard outside the US Capitol

Ban: The Parler app was removed from Apple and Google app stores
