The Guardian Australia

After the Molly Russell inquest, social media firms face a safety reckoning

- Dan Milmo

What content is safe for a child to view online? Who gets to decide what is OK and what is dangerous? And how much of a role should a government, a company or a parent play in that decision?

These questions were brought into focus by the inquest into the death of 14-year-old Molly Russell. And if there was one point during the two-week hearing when the case for tougher online regulation became overwhelming, it was during Meta executive Elizabeth Lagone's testimony.

The head of health and wellbeing policy at Mark Zuckerberg's company was taken through a selection of the Instagram posts the teenager had viewed in the six months before her death, and deemed many of them to be "safe" for children to view. It was not an opinion shared by many in the room at North London coroner's court.

Molly, from north-west London, died in 2017 after viewing extensive amounts of online content related to suicide, depression, self-harm and anxiety. In what the NSPCC described as a global first, the senior coroner said social media had contributed to Molly's death, ruling that she had died from "an act of self-harm while suffering from depression and the negative effects of online content".

Lagone would be shown a post and then comment on whether it met Instagram guidelines at the time. Many were deemed permissible by Lagone, who would use phrases like "sharing of feelings" as she explained that suicide and self-harm content could be allowed if it represented an attempt to raise awareness of a user's mental state and share their emotions.

The online safety bill is expected to resume its progress through parliament imminently and the new culture secretary, Michelle Donelan, said its provisions for protecting children would be strengthened. The bill places a duty of care on tech platforms to protect children from harmful content and systems.

In July, a written statement from Donelan's predecessor, Nadine Dorries, made clear that the type of content seen by Molly would be covered by the bill. Children must be prevented from encountering content promoting self-harm and legal suicide content, although content about recovery from self-harm or suicidal feelings could be permitted if it is age appropriate. Meta's method for assessing whether this content is allowed or appropriate will be assessed by the communications regulator Ofcom – it won't be the company's call.

Molly's father, Ian, was scathing about the "safe" assessment. Speaking at the end of the two-week hearing, he said: "If this demented trail of life-sucking content was safe, my daughter Molly would probably still be alive and instead of being a bereaved family of four, there would be five of us looking forward to a life full of purpose and promise that lay ahead for our adorable Molly."

Lagone apologised for the fact that some of the content viewed by Molly did break Instagram guidelines at the time, which barred content that promoted or glorified suicide and self-harm.

In questioning Lagone, the Russell family's legal representative, Oliver Sanders KC, said the entire situation was unsafe. "I suggest to you that it is an inherently unsafe environment … dangerous and toxic for 13- to 14-year-olds alone in their bedrooms scrolling through this rubbish on their phones."

"I respectfully disagree," Lagone responded. There were a lot of these tense exchanges, which at one point resulted in senior coroner Andrew Walker asking Lagone to answer "yes or no" whether one batch of content she had seen was safe for children.

The accumulated impact of this back and forth led to Russell raising his voice at one point: "Why on Earth are you doing this?" He said Instagram was choosing to put content "in the bedrooms of depressed children", adding: "You have no right to. You are not their parent. You are just a business in America."

The impact of Lagone's two-day appearance, and of what happened to Molly, was to damage faith that a major social media platform could be relied on to police its content and systems without a wider regulatory framework to ensure it is done properly.

As Ian Russell said afterwards: "It's time for the government's online safety bill to urgently deliver its long-promised legislation."

Meta said its thoughts were with the Russell family and that it was "committed" to ensuring that people's experience of Instagram was a "positive experience for everyone". Pinterest, another platform that had shown Molly harmful content before she died, also said it was "committed to making ongoing improvements to help ensure that the platform is safe for everyone".

If this UK government manages to pass the planned legislation, they and other platforms will soon be answerable to Ofcom on those commitments.

Wider TechScape

Elon Musk has performed a U-turn over his decision to walk away from a $44bn deal to buy Twitter. This week his lawyers sent a letter to the social media platform stating that the world's richest man would, after all, like to buy the business. Twitter is suing Musk in Delaware, the US state where the company is incorporated, and it was likely to succeed in its legal argument that Musk must be forced to go ahead with the deal that he had formally signed up to.

TikTok is going from strength to strength. The Chinese-owned social video app has reported a five-fold surge in turnover to $1bn across its operations in international markets including the UK and Europe last year.

Are virtual influencers set to become a big thing in the US? Digital avatars already have a strong marketing presence in Japan, South Korea and China, says the Hollywood Reporter.

Get ready for an iPhone with a new charging port. The European parliament has voted to introduce a single charging port for mobile phones, tablets and cameras by 2024 in a move that presents difficulties for Apple, whose iPhones use a different power connector.

Why tech jobs are popular: Google UK’s staff earned an average of more than £385,000 each in the 18 months to the end of December 2021.

In the UK, the youth suicide charity Papyrus can be contacted on 0800 068 4141 or email pat@papyrus-uk.org, and in the UK and Ireland Samaritans can be contacted on freephone 116 123, or email jo@samaritans.org or jo@samaritans.ie. In the US, the National Suicide Prevention Lifeline is at 800-273-8255 or chat for support. You can also text HOME to 741741 to connect with a crisis text line counsellor. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org.

Photograph: The Russell family. Molly Russell, pictured here in 2013, died in 2017 after viewing extensive amounts of content online.
Photograph: Beresford Hodge/PA. Elizabeth Lagone, Meta's head of health and wellbeing, arrives at coroner's court in north London on 23 September.
