The social media giants will finally be held responsible for children like Molly Russell
After the suicide of schoolgirl Molly Russell in 2017, her family discovered that the 14-year-old had been viewing graphic content promoting self-harm and suicide on popular social media sites Instagram and Pinterest.
Crucially, rather than having to seek out harmful material herself, it was being actively recommended to her by the apps. In fact, Molly’s email address was still receiving automated recommendations for self-harm and suicide content even after her death.
In the wave of negative publicity that followed, ministers and the chiefs of technology companies made promises to remove self-harm content from these sites. Two years later, the progress has been meagre and the persistent refusal by tech giants to allow any form of independent oversight means the incremental steps so far taken are impossible to assess.
The Age Appropriate Design Code, published today by the UK’s Information Commissioner, now offers children and young people protection from the sector’s insatiable appetite for data. Its 15 provisions mandate that, before engaging with a child, an online service must consider how it will impact them and then act in the child’s best interests – not in its own commercial interest.
Importantly, the code focuses on how children’s data is collected and used and does not dictate what content the services can or cannot host on their platform. Yet, among its provisions are two things that have the power radically to change children’s experiences online.
First, it requires services to uphold their own published rules, terms and conditions, and community guidelines. In plain language, an online service must say what it does and do what it says, or be held accountable – in this case providers face fines of up to 4 per cent of their turnover, which could mean billions for the largest tech companies. If you say you don’t host violent, harmful or suicide-promoting material, then you must not, or you will face enforcement action.
Second, the code makes companies responsible for the content they recommend to children under the age of 18. It says: “If you are using children’s personal data to automatically recommend content to them based on their past usage and browsing history then you have a responsibility for recommendations that you make. This applies even if the content itself is user-generated.”
This is worth quoting in full because it is a hugely significant intervention by the Information Commissioner’s Office (ICO).
Algorithms that use children’s information and viewing history to recommend the next image, video, or song, are designed to keep them using the service for as long as possible. These same algorithms are often found to promulgate disinformation and conspiracy theories, entrench bias and discrimination, or – in Molly’s case – bombard a vulnerable teenager with self-harm and suicide material.
Since her death, Molly’s father, Ian, has campaigned relentlessly to prevent other children getting caught up in cycles of harmful material. A supporter of the code, he tells it as it is: “We must see the end of profit from data gathering being given precedence over the safety of children. We must stop monetising misery.”
Children and their parents have long been left with all of the responsibility but no control.
Meanwhile, the tech sector has, against all reason, been left with all the control but no responsibility.
The code will change this. It is the first piece of regulation anywhere in the world explicitly to prevent children’s data being exploited in ways that undermine their safety and well-being. Many might be shocked that this isn’t the case already – thankfully it is now.
Baroness Kidron introduced the Age Appropriate Design Code into UK legislation. She is the founder and chairman of the 5Rights Foundation, a charity fighting for children’s rights
‘The code is the first in the world to prevent children’s data being exploited in ways that undermine safety’