The Daily Telegraph

Duty of care ‘must carry criminal sanctions’

NSPCC writes blueprint for new law to keep children safe on social media

By Charles Hymas, Home Affairs Editor

SOCIAL media companies that breach new “duty of care” laws should face criminal sanctions punishable by unlimited fines, Britain’s biggest child protection charity has said.

The NSPCC today publishes its blueprint for a statutory duty of care on social media firms enforced by a regulator with powers to fine them millions of pounds and sanction criminal investigations if they fail to prevent children being harmed online.

Under a new offence, modelled on corporate manslaughter laws, social media firms would be prosecuted for a “gross” breach of their duty of care if they had not introduced adequate procedures to protect children from harms including sex abuse, self-harm, abuse and bullying.

The firms would have to appoint named executives who would be personally liable for ensuring their duty of care was upheld. If found guilty of a breach, they could be banned as directors for up to 15 years using current disqualification laws.

The NSPCC said the sanctions were needed to “embed” regulatory compliance in the firms to prevent a repeat of tragedies such as the death of Molly Russell, 14, who committed suicide after viewing self-harm images. Her father blamed Instagram for contributing to her death.

It is the first time that plans for criminal rather than civil sanctions have been spelt out and comes amid growing support for The Daily Telegraph’s campaign for a statutory duty of care.

Last week the boss of Instagram became the first tech chief to back the duty of care concept, which was also endorsed by the Government’s chief medical officer, the Children’s Commissioner, Church of England bishops, senior MPs, charities and child experts.

The reforms will be backed today by Ruth Moss, whose daughter Sophie, 13, took her own life after studying suicide guides on social media, and by research that shows overwhelming public support for new laws to crack down on the tech giants.

Meanwhile, a Government-commissioned review by Dame Frances Cairncross recommended that Facebook and Google should face new curbs on their commercial power over news providers to safeguard quality journalism. The review of the media also demanded that the BBC stop trying to compete with commercial news websites for stories which were not clearly in the public interest.

The NSPCC proposals to the Government, developed with the law firm Herbert Smith Freehills, will be seen as a critical test of how tough ministers are in their White Paper plans for new laws to tackle online harms.

Andy Burrows, the NSPCC associate head of online child safety, said: “Unless we have regulation that is capable of protecting children in the way we know is necessary, then we will see further tragedies with children coming to harm. This is why our proposal focuses very much on instituting cultural change so that harms are designed out of social networking sites rather than chasing harm after it has taken place.”

The regulator would have civil sanctions including fines of up to 4 per cent of annual global turnover, or €20 million [£17 million], whichever is higher, for firms that consistently breached their duty of care. It could also order them to publish any finding of a breach on their home pages, spelling out how children were placed at risk.

The regulator could force a firm to hand over data needed for investigations and raid premises to seize computer files and equipment if there was evidence it was obstructing an inquiry or trying to cover up its failings.

Police and families have previously been refused data about victims or offenders and been forced to turn to the courts. Molly Russell’s parents are seeking access to her Instagram posts to understand why she took her life.

The regulator could issue enforcement orders to require the firms to make changes to better protect children, such as tougher age checks, measures to block groomers or the revamping of algorithms that drive self-harm images into children’s accounts.

The NSPCC wants social media networks to be forced to provide more information to the public in their annual “transparency” reports about the risks to which children have been exposed.

This would include “red flag” alerts of incidents where children’s safety had been put at risk. The NSPCC cited a case in which Apple removed Tumblr from its app store because of child abuse images, but no one was told.

“The regulator should require platforms to adhere to a legally enforceable, expansive duty of care that requires them to identify reasonably foreseeable risks,” the NSPCC said in its report, Taming the Wild West Web.

“After a decade of inaction, it’s time to introduce statutory regulation on social networks,” it added.
