Head teachers back legal duty of care for social media
BRITAIN’S largest headteachers’ union has backed a statutory duty of care to protect children from online harms, saying inaction by social media companies has gone on for too long. Sarah Hannafin, senior policy adviser for the National Association of Head Teachers (NAHT), said social media firms had been too slow to act and only tended to do so after tragedies such as the death of Molly Russell, 14, who took her life after viewing self-harm images.
“Things have been allowed to happen and perpetuate on social media, and there’s been slow action. I find it difficult to understand why. They need to be more proactive,” said Ms Hannafin.
She added: “They need to be on the ball looking for material and having a clearer line on what is acceptable and what is not acceptable, particularly where children and young people are concerned.
“There needs to be a responsibility to remove content quickly that is inappropriate or harmful. If a statutory duty of care means that social media companies will prioritise the safety and wellbeing of young people, then that’s what should happen.”
The Daily Telegraph has been campaigning for a statutory duty of care to force tech giants to do more to protect children from online harms.
Ms Hannafin said the companies should be taking down not only illegal content such as child abuse images but also legal material that could be harmful such as abusive or bullying posts and sexual or violent content.
“There is an illegal level but there’s a second, legal level,” she said.
“Could this content possibly cause harm to young people because they are a specific audience? If the answer is yes, then it needs to come down.”
The NAHT also wants tougher controls to stop children under 13 circumventing age limits and accessing inappropriate material.
“There are ways to improve the system through technology or age verification blocks,” added Ms Hannafin.
She also called for all platforms to have safety features tailored specifically to young people, such as clearly marked icons or buttons to report content, or links to child-protection charities for those who feel threatened.
“This would mean very clear and easy-to-use reporting functions on platforms and ensuring that any tools are able to be used by the youngest children,” said Ms Hannafin.
“Consistency is important and online social media platforms should operate in a similar way, making it easier for children, young people and infrequent users to easily find and use such functions, no matter which platform.”