The Daily Telegraph

Duty of care law for social media firms wins support of the Health Secretary

By Charles Hymas, Home Affairs Editor

A NEW statutory duty of care to make social media firms responsible for the content on their sites has been backed by the Health Secretary.

Matt Hancock has, however, rejected calls for them to be treated as publishers and held responsible for content in the same way as newspapers.

“The truth is they’re something in between,” said Mr Hancock. “The whole concept would collapse if we make them legally liable for everything on their site. I would not be able to send out a tweet because they would be legally responsible for it.

“We need something new, we need something in between. That’s where this concept of a duty of care comes from. You can’t deliver a lot of this against the social media companies because, by the nature of the technology, they have the algorithms.

“What you can do is to require them to act in a socially responsible way. My sense is that they now want to do that but they need help and support.”

A halfway-house publishing model was first recommended by the Commons’ culture committee last summer with its proposal for a new category of tech company that was neither a “neutral” platform nor a publisher.

Theresa May’s spokesman yesterday confirmed Mr Hancock’s proposal was the Government’s position, rebuffing calls earlier in the week by Jackie Doyle-Price, the suicide prevention minister, for social media firms to be treated as publishers.

Mr Hancock said it was no longer enough to “politely” ask the social media firms to change: “We need to ensure there’s enforcement so that society as expressed through Government can judge what is and isn’t right to be on social media in order to have this duty of care. So many times people say they feel powerless in the face of this social media. We are not powerless. We can and will legislate.”

He welcomed Instagram’s move to ban graphic self-harm images but urged all other platforms to follow suit so that there could be no repeat of what happened to Molly Russell, 14, who took her own life after viewing self-harm images.

Baroness Kidron, founder of the internet campaign group 5Rights, said regulation was essential, adding: “It’s all very well getting the CEO of Instagram to fly in to do a mea culpa – but Molly died and he gets to go home to his kids.

“The Instagram promise is limited to how they define graphic. Is it limited to self-harm? What about pro-anorexia sites, hate and bullying? Instagram is one part of one company – they didn’t even extend it to all they own.”

The Daily Telegraph has been campaigning for a statutory duty of care to be imposed on social media firms to force them to protect children.

At last, Instagram has taken a decisive step towards making its site safer for children by banning some graphic self-harm content. This is welcome, but it is not enough.

I made this clear when I met Instagram’s head, Adam Mosseri, on Thursday. I explained how exposure to awful images is only one of the immediate dangers facing young people online. One platform making one change is not enough. All of them have a duty of care to their customers and this should be legally enforceable.

What social networks seem to ignore in their race for connectivity and cash are the rights of children to enjoy essential protections from exploitation, abuse or life-threatening danger. In the offline world, from toys to playgrounds, we take that for granted. Businesses must build child safety into their designs and are punished if they do not. Shockingly, online it is currently only an optional extra.

Improving platform algorithms, though part of the solution, is not sufficient. The likes of Instagram need to invest in dedicated, expertly staffed child protection outfits that can cope with the scale of the platform, and properly investigate when worrying content is reported to them. They must also invest in developing artificial intelligence that can detect harmful and dangerous content and behaviour.

We want an independent statutory regulator that will force social networks to design their sites to meet minimum child safety standards. They must be held to account publicly if they fail to do so. Social media companies also need to be more transparent about how they operate, so we can have confidence in their attention to safeguarding. And they should demonstrate proactive steps to tackle obvious grooming behaviour earlier.

Platforms now have the chance to become part of the solution, rather than the problem. They are uniquely placed, for instance, to help us detect child grooming online and better understand how we can disrupt it.

Complaints that these changes may be complicated and costly betray a misplaced sense of priorities. Fifteen voluntary codes over the past decade or so have provided plenty of time to make decisive progress, but not enough has been achieved. Following the heartbreaking death of Molly Russell, it should be abundantly clear that it is worth investing time and money to avoid yet more tragedy.

Society is angry with tech firms whose profits keep growing while their child-protection measures are left wanting. Politicians are waking up to the urgency of the situation and appear serious about using tougher regulation to hold social networks to account for their inaction.

Finally, then, change is in the air, and social media firms need to respond more robustly.

