National Post (National Edition)

Social media clamps down too late

- TIMOTHY GRAHAM Postmedia News Timothy Graham is a senior lecturer at Queensland University of Technology. This article originally appeared online at theconversation.com, an independent source of news and views, from the academic and research community.

Amid the chaos in the U.S. Capitol, stoked largely by rhetoric from President Donald Trump, Twitter locked his account, with 88.7 million followers, for 12 hours.

Facebook and Instagram quickly followed suit, freezing Trump's accounts — with 35.2 million followers and 24.5 million, respectively — for 24 hours.

The locks are the latest effort by social media platforms to clamp down on Trump's misinformation and baseless claims of election fraud.

They came after Twitter labelled a video posted by Trump and said it posed a “risk of violence.” Twitter removed users' ability to retweet, like or comment on the post — the first time this has been done.

In the video, Trump told the agitators at the Capitol to go home, but at the same time called them “very special” and said he loved them for disrupting the Congressional certification of president-elect Joe Biden's win.

That tweet has since been taken down for “repeated and severe violations” of Twitter's civic integrity policy. YouTube and Facebook have also removed copies of the video.

But as people across the world scramble to make sense of what's going on, one thing stands out: The events that transpired this week were not unexpected.

Given the lack of regulation and responsibility shown by social media platforms over the past few years, it's fair to say the writing was on the wall.

While Trump is no stranger to contentious and even racist remarks on social media, Twitter's action to lock the president's account is a first.

The line was arguably crossed by Trump's implicit incitement of violence and disorder within the halls of the U.S. Capitol itself.

Nevertheless, it would have been a difficult decision for Twitter (and Facebook and Instagram), with several factors at play. Some of these are short-term, such as the immediate potential for further violence.

Then there's the question of whether tighter regulation could further incite rioting Trump supporters by feeding into their theories claiming the existence of a large-scale “deep state” plot against the president. It's possible.

But a longer-term consideration — and perhaps one at the forefront of the platforms' priorities — is how these actions will affect their value as commercial assets.

I believe the platforms' biggest concern is their own bottom line. They are commercial companies legally obliged to pursue profits for shareholders. Commercial imperatives and user engagement drive their decisions.

What happens when you censor a Republican president? You can lose a huge chunk of your conservative user base, or upset your shareholders.

Despite what we think of them, or how we might use them, platforms such as Facebook, Twitter, Instagram and YouTube aren't set up in the public interest.

For them, it is risky to censor a head of state when they know that content is profitable. Doing it involves a complex risk calculus — with priorities being shareholders, the companies' market value, and their reputation.

The platforms' decisions not only to force the removal of several of Trump's posts but also to lock his accounts carry an enormous potential loss of revenue. It's a major and irreversible step.

And they are now forced to keep a close eye on one another. If one appears too “strict” in its censorship, it may attract criticism and lose user engagement, and ultimately profit. At the same time, if platforms are too loose with their content regulation, they must weather the storm of public critique.

You don't want to be the last organization to make the tough decision, but you don't necessarily want to be the first, either — because then you're the “trial balloon” who volunteered to potentially harm the bottom line.

For all major platforms, the past few years have presented high stakes. Yet there have been plenty of opportunities to stop the situation snowballing to where it is now.

From Trump's baseless election fraud claims to his false ideas about the coronavirus, time and again platforms have turned a blind eye to serious cases of mis- and disinformation.

The storming of the Capitol is a logical consequenc­e of what has arguably been a long time coming.

The coronavirus pandemic illustrated this: While Trump was partially censored by Twitter and Facebook for misinformation, the platforms failed to take lasting action to deal with the issue at its core.

In the past, platforms have cited constitutional reasons to justify not censoring politicians. They have claimed a civic duty to give elected officials an unfiltered voice.

This line of argument should have ended with the “Unite the Right” rally in Charlottesville in August 2017, when Trump responded to the killing of an anti-fascism protester by claiming there were “very fine people on both sides.”

While there is no silver bullet for online misinformation and extremist content, there is also no doubt that platforms could have done more in the past to prevent the scenes witnessed in Washington.

In a crisis, there is a rush to make sense of everything. But we need only look at what led us to this point. Experts have been crying out for platforms to do more to combat disinformation and its growing domestic roots.

In 2021, extremists such as neo-Nazis and QAnon believers no longer have to lurk in the depths of online forums or commit lone acts of violence. Instead, they can violently storm the Capitol.

It would be a cardinal error not to appraise the severity and importance of the neglect that led us here. In some ways, perhaps that is the biggest lesson we can learn.
