Sunderland Echo

Social media platforms must do more to protect kids, says Ofcom


Social media platforms must take action to stop their algorithms recommending harmful content to children, and put robust age-checking measures in place to protect them, Ofcom has said.

The regulator has published its draft children's safety codes of practice, which set out how it expects online services to meet their new legal responsibilities to protect children online under the Online Safety Act.

The online safety laws require sites which can be accessed by children in the United Kingdom to take action to protect those younger users by assessing the risk their platform poses to children and then putting in place measures to mitigate those risks - with large fines among the possible penalties for those found to be in breach.

Ofcom, as the new regulator for the sector, has published a range of draft codes of practice in recent months, setting out how platforms should handle different types of content, ahead of the new rules beginning to come into full force, which is expected towards the end of this year.

The latest codes include more than 40 practical measures which Ofcom says will demand a step change from tech firms by compelling safer design and operating practices from the biggest sites.

In particular, the codes will expect services to carry out robust age verification processes to stop children accessing harmful material, as well as ensuring that their recommendation algorithms - such as "For You" pages - do not serve dangerous or potentially harmful content to children.

Under the proposals, platforms which can be accessed by children and have a higher risk of harmful content appearing must configure their algorithms to filter out the most harmful content from children's feeds, and reduce the visibility and prominence of other lower risk, but still potentially harmful, material.

The draft codes also require firms to have content moderation systems and processes in place, and ensure that swift action is taken against harmful content, with search engines expected to have a "safe search" option for use by children.

Speaking on the latest codes, Ofcom chief executive, Dame Melanie Dawes, said: "We want children to enjoy life online. But, for too long, their experiences have been blighted by seriously harmful content which they can't avoid or control. Many parents share feelings of frustration and worry about how to keep their children safe. That must change.

"In line with new online safety laws, our proposed Codes firmly place the responsibi­lity for keeping children safer on tech firms.

"They will need to tame aggressive algorithms that push harmful content to children in their personalised feeds and introduce age-checks so children get an experience that's right for their age."

Sir Peter Wanless, chief executive of children's charity, the NSPCC, said the draft code was a "welcome step in the right direction" towards protecting children online.

He added: "We look forward to engaging with Ofcom's consultati­on and will share our safeguardi­ng and child safety expertise to ensure that the voices and experience­s of children and young people are central to decision-making and the final version of the code."

Child online safety campaigner Ian Russell, the father of 14-year-old Molly Russell who took her own life in November 2017 after viewing harmful material on social media, said more still needed to be done to protect young people from online harms.

In his role as chair of online safety charity, the Molly Rose Foundation, Mr Russell said: "It's over six years since Molly's death, but the reality is that very little has yet changed. In some respects, the risks for teens have actually got worse.

"That's why it's hugely important that the next Prime Minister commits to finish the job to and strengthen the Online Safety Act to give children and families the protection they deserve."

Ofcom has published its draft children's safety codes of practice, which set out how it expects online services to protect children online.
Teen Molly Russell took her own life after viewing harmful online pictures.
