The Daily Telegraph

We now have a once-in-a-generation chance to hold tech giants to account for hate content

Public urged to report abuse on social media platforms to help bring in new Online Safety Bill

By Damian Collins. Damian Collins MP is chairman of the joint parliamentary committee scrutinising the Online Safety Bill

The senior Facebook executive Andrew Bosworth wrote in a now-infamous internal company memo: “the ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good”. The trouble the tech giants face now is that, increasingly, governments and parliaments around the world no longer believe this to be the case.

These companies make money from users’ engagement with their services, and have developed business models based on holding that attention for as long as possible, whatever it takes.

Tuesday’s quarterly results call with Facebook’s investors proved that their model continues to be profitable. It is this approach that has allowed social media to become a quagmire of hate speech, anti-vaccine conspiracy theories and harmful content.

They’ve allowed their systems to be used by foreign states to interfere in the elections of other countries, and even for the organisation of an insurrection in Washington DC on Jan 6 that led to the storming of the US Capitol building.

In recent high-profile examples in this country, such as the abuse on social media directed at black members of the England men’s football team after their defeat against Italy in the final of the European Championships, the problem wasn’t just that they were the targets of racist language and monkey emojis, but that the recommendation tools of the platforms were actively promoting this content to other users.

This example represents one of a series of questions to be answered by a new law to be put through Parliament, the Online Safety Bill. This draft law will finally put a legal framework around hate speech and harmful content and empower an independent regulator to hold the tech giants to account for their role in hosting and promoting it. This is a once-in-a-generation piece of legislation that will update our laws for the digital age.

A new joint parliamentary committee, comprising members of both the Lords and Commons, and which I chair, is now charged with examining this Bill line by line to make sure it is fit for purpose. However, we need the help of the public and expert bodies with this work.

Our social media feeds are personalised to show content the algorithm thinks we want to see and engage with. One of the ways the public can help this committee is to tell us what you see on your feeds that you think might be harmful. This could be something recommended by, for example, the “For You” feature on TikTok, YouTube’s “Up next”, or content ranked near the top of your Facebook news feed. Unless content is reported, these incidents – whether they be terrible images of self-harm or violence – can be unknowable to regulators.

This is part of the problem with creating new laws in this area: the tech companies don’t always allow independent researchers to access this content and to understand how it spreads. So if you see harmful content, take a screenshot and send it to us at jconlinesafetybill@parliament.uk. We won’t share your identity, but it will help our research. In the social media age, we have still not got the balance right between stopping harm and protecting freedom of speech, and now we have the opportunity to fix it.
