The Daily Telegraph

Self-harm content to be banned by law

- By Charles Hymas, Steven Swinford and Mike Wright

SOCIAL media firms will be banned from promoting self-harm or suicide content to children under a legally enforced code of conduct to prevent another death similar to that of Molly Russell.

Under laws that will place a statutory duty of care on all tech firms, a new independent regulator will have powers to require firms to have processes to “stop algorithms promoting self-harm or suicide content to users”.

They will also be required to remove illegal or unacceptable content within set timescales, block users who post such material and ensure those banned are prevented from creating new accounts to continue to encourage suicide or self-harm.

The crackdown, to be unveiled in a White Paper on Monday, follows the death of Molly, 14, who took her life after visiting self-harm sites. Her father, Ian, blamed Instagram for contributing to her suicide.

The code of practice will be one of at least 10 to protect children and other users from online harms including terrorism, child sexual exploitation, serious violence, hate crime, harassment, disinformation, cyberbullying and children accessing inappropriate content.

They will be underpinned by a statutory duty of care enforced by the regulator with powers to penalise breaches with heavy fines, prosecuting or fining named directors, barring sites from appearing in searches and, as a last resort, blocking firms from accessing UK users.

The White Paper also gives the regulator powers to force social media firms to open their books and reveal how their algorithms target content at users, a technology that has been blamed for directing self-harm and suicide material at vulnerable children like Molly.

Ian Russell said it was vital that the “algorithms were stopped”. He said: “This is essential if young and vulnerable people are to be prevented from finding content that might amplify their mental ill health and lead to suicide.

“It is time for the Government to stand up to the tech companies and if they can do that I very much hope that in the near future the internet will start becoming a safer place for all.”

The White Paper will cite terrorism and child sexual exploitation as priority areas where government will have powers to direct the regulator on what it wants in the codes.

Firms will be expected proactively to prevent new and known child abuse images appearing on their sites, identify and act against grooming, search out and disable accounts of suspected paedophiles and block them from searching for images.

Declaring that there should be no safe space online for terrorists, the White Paper will propose firms should block would-be extremists from searching for terrorist material. They will also be required to prevent extremist material appearing on their sites and take down any that does within a set timescale.

The White Paper targets disinformation by requiring firms to ensure their algorithms do not drive users towards extreme and unreliable material in order to keep them online and boost advertising revenues.

They will be expected to promote authoritative and diverse news content to prevent users only accessing material that confirms their own views.

Social media firms will be required by law to do more to bar underage users, as it is proposed that their terms and conditions, which set out age limits, should become legally enforceable.

They will also have to submit reports showing what action they took to combat online harms and how effectively they respond to complaints.
