Eastern Eye (UK)

How to disagree on Positive Twitter Day

PLATFORMS URGED TO DO MORE AMID CALL TO END TOXIC SOCIAL MEDIA CULTURE

- By SUNDER KATWALA Director, British Future

THE potential of social media to spread toxicity has been back in the headlines this year, so how can we get the social media culture that we want?

On Friday (27), Positive Twitter Day offers one small opportunity for all of us on social platforms to do something about that – deepening the public conversation about what needs to change to make these online platforms a more civil place.

Positive Twitter Day has become a regular annual fixture on the last Friday in August. The simple idea is to offer a nudge to social media users to think before they tweet, as a way to promote more civil conversations online.

It is not about everybody having to agree about everything, but it can be a day to work on how we could disagree better – and perhaps to try to have a conversation, rather than a shouting match, with somebody that you don’t agree with.

I conceived of this initiative in August 2012, shortly after the London Olympics, in response to the public appetite to maintain that positive spirit, along with concern at how often the incivility of social media discourse was a barrier to doing so.

Positive Twitter Day can be a day to forge unusual alliances. The blogger Guido Fawkes was an early adopter of the Positive Twitter Day message in 2012, and a consistent advocate of it over the years since. That may help the message reach those for whom promoting a more civil online culture – at least for one day – takes more of an effort, not just those of us who seek to do so all year round.

Users should take responsibility for their contribution to the online climate, but social media platforms must do more to play their part too.

“We condemn racism in all its forms – our aim is to lead the industry in stopping such abhorrent views from being seen on our platform,” said Twitter this month, setting out what it had tried to do to stop racist abuse against England’s footballers after Euro 2020.

But a great deal needs to change for that aspiration to become a credible claim.

Twitter rules allow an astonishing level of racism. For example, the company confirmed to me that “black goals don’t count – no blacks in the England team” does not break its rules. So Twitter’s antiracism statements to the media are clearly contradicted by platform rules that permit racism.

Twitter did introduce new rules against “dehumanising” a faith or ethnic group. It was acting in response to New Zealand’s Christchurch mosque massacre (in March 2019), which demonstrated the tragic offline consequences of online hate. Yet Twitter’s current interpretation mainly prohibits racist metaphors – calling minorities rats or viruses – rather than extreme overt racism itself. “We must deport all blacks, Asians and Jews so that white children have a future in our country” is another example that Twitter confirmed is within its rules.

Twitter has not attempted a public defence of why it permits racism of this kind on its platform. Ministers and MPs, the FA and the Premier League, the media and NGOs must keep asking this. MPs should organise an on-the-record public committee hearing – ahead of the online harms bill this autumn – so that social media executives either defend the current rules or set out what they will change.

But stronger rules against racism would not make any difference without the capacity to uphold them. This will have to involve human beings, as well as artificial intelligence (AI). A major reason why racist users are making a mockery of Twitter so easily is that AI cannot make the intuitive leaps that real people easily can.

This is exemplified by the farcical scale of Twitter’s failing with banned users. Repeat offenders – the “racist respawners” – are a major cause of the most toxic content on Twitter. Those banned around the final of Euro 2020 included 60 hardcore repeat offenders, many with dozens of previous red cards. Yet 30 of this hardcore group had new accounts shortly after they were banned, some even using the same persona that had been banned. Twitter’s statement said it knew the identities of 99 per cent of the banned users – so why are so many back on its platform so easily?

Several simple steps would make a big difference. Adding a “previously banned user” flag to the user reporting options and cracking down on troll networks, which openly assist banned accounts to rebuild their networks, would be a start. More capacity could be unlocked by working constructively with the networks of volunteers who currently have better real-time information on these “racist respawners” than Twitter does itself.

With legislation and new regulation under consideration, British Future’s research shows strong public support for action against hatred online – 72 per cent of both ethnic minority and white British respondents back it, while only seven per cent disagree.

Positive Twitter Day is a chance for users to make a contribution to better social norms online. But this must be the year when Twitter, Facebook and other major platforms step up and play their part too.

(Inset below) STRONG MESSAGE: There is a need for more civil conversations online, says Sunder Katwala
