Who will get to decide what can and can’t be said online in a post-Trump era?
It’s been a big week for tech companies and content moderation. US president Donald Trump was excommunicated from social media platforms ranging from Facebook and YouTube to Twitter and Reddit. The right-wing social media platform Parler was shut out of Amazon Web Services, as well as the Apple and Google app stores.
While many have cheered the tech companies’ actions, they do raise serious questions about control: who should decide whether elected officials, or others, have a voice online?
Do the events of the last week show that civic society doesn’t have the guts to legislate for the big calls, hoping instead that a handful of billionaire tech CEOs will make the right decisions?
This unease was articulated most clearly by German chancellor Angela Merkel in a response to Twitter’s Trump ban.
Ms Merkel said that it’s better for society to regulate speech “in accordance with the laws and within a framework defined by the legislator, not by the decision of the management of social media platforms”. French ministers agreed.
While this response might be dismissed as part of a broader European narrative, namely that the tech giants are too powerful and need to be more tightly regulated anyway, it points to the same obvious gap in governance.
Amazon’s intervention adds another question: how far down the technology stack should moderation responsibility go? Simply within the site or social media service itself? Or deeper, to a hosting provider, an app store or even a domain registrar?
We know that advertisers increasingly feel an obligation to pull commercial support from controversial content.
But might responsibility soon rest with a mobile or broadband telecoms provider, as Irish online safety and anti-piracy campaigners argued in the 2000s?
Few in politics or in the tech industry are keen to grasp what might be considered a fundamental quandary in the ethics of exactly where responsibility for platforming content starts and finishes.
“It’s not a technical problem, it’s a society problem,” says Tanya Lokot, an associate professor in digital media and society at Dublin City University’s School of Communications who specialises in internet freedom, censorship and internet governance.
“Part of the problem is that the companies themselves are entirely unprepared for the kind of power they now wield as spaces for public debate. I don’t think any of them set out to be that.”
Cloudflare, a large hosting provider, has been to the fore of the debate in recent years.
In 2017, it removed the neo-Nazi site The Daily Stormer from its services. In 2019, it did the same with 8chan, following violence that was planned on the site.
On both occasions, Cloudflare CEO Matthew Prince admitted to being uneasy about being the arbiter of who or what gets to have a presence online.
“Without a clear framework as a guide for content regulation, a small number of companies will largely determine what can and cannot be online,” he wrote.
“Law enforcement, legislators, and courts have the political legitimacy and predictability to make decisions on what content should be restricted. Companies should not.”
Those supporting the status quo say that issues of communication or speech don’t arise, as Twitter, Facebook, YouTube and others are private companies who set their own rules that they are entitled to enforce.
However, when a hosting company, or a number of them, cuts off access, it becomes “less like booing a speaker off the stage than cutting off the electricity to a building”, as Sara Fischer, a media reporter at US website Axios, said of Amazon’s action against Parler.
What might happen to make decision-making clearer? EU lawmakers are introducing tighter measures on social media firms which will make them more accountable for content deemed unacceptable to civic institutions.
This may take some of the power, as well as the responsibility, away from tech CEOs.
“The Digital Services Act recognises that the infrastructure we have may not be sufficient,” says Ms Lokot.
“And so maybe some of that power should be devolved back to the states or each country, because obviously the context in each country is also different.”
Well-known industry commentators such as Ben Evans believe that this EU move will do for international moderation standards what the General Data Protection Regulation (GDPR) did for privacy rules, becoming a global law through the back door.
“There probably isn’t any one perfect answer as to which institution should be regulating all of it,” says DCU’s Ms Lokot.
“The only consensus so far is that it’s really hard to make rules for content moderation, because the context keeps changing,” she says.
“There’s a reason why Facebook’s content moderation manual is now practically as vast as an Encyclopedia Britannica and yet they still have to make exceptions because it’s the edge cases that cause these issues. I think the only consensus is that companies shouldn’t have all the power and states shouldn’t have all the power. Polarisation doesn’t just happen because of the Facebook algorithm. It happens because there’s inequality and it’s structural.”