Scottish Daily Mail

Suicide videos help our users, Instagram tells Molly inquest

- By Jim Norton Technology Editor

AN INSTAGRAM executive has defended allowing users to post self-harm and suicidal content – claiming that it helped them to ‘express themselves’.

Liz Lagone, head of health and wellbeing at Meta, which owns Instagram, Facebook and WhatsApp, was flown in from the US to give evidence yesterday at the inquest of Molly Russell, who took her own life in 2017.

Instagram’s policies at the time allowed the self-harm and suicidal material the 14-year-old was exposed to as long as it did not actively ‘encourage or promote’ it.

Miss Lagone insisted the policy was justified as such content had helped ‘create awareness’, let users ‘share their feelings’, and even enabled ‘cries for help’.

Coroner Andrew Walker warned the inquest that the graphic videos Molly had viewed on the platform were ‘of the most distressing nature and almost impossible to watch’ before playing them in court.

Around a dozen short montages showed people falling off buildings, jumping in front of trains and cutting themselves, as the words ‘fat’, ‘worthless’ and ‘suicidal’ flashed across the screen.

One came from an account that is still on Instagram and has more than 53,000 followers.

Miss Lagone denied Instagram had treated depressed children such as Molly like ‘guinea pigs’ when in 2016 it introduced a new algorithm – software that recommends targeted content to users.

A breakdown of the accounts Molly interacted with in the run-up to her death revealed 7 per cent of those she was recommended to follow were either ‘sad or depressive related’.

But barrister Oliver Sanders KC, representing Molly’s family, said this was ‘just the tip of the iceberg’ as Meta had refused to reveal the names of more than 1,000 that she followed or was followed by. Despite being ordered to hand over information relating to her account by UK courts, Meta has only partly supplied the information on the grounds that it cannot identify the accounts’ owners.

The inquest at North London Coroner’s Court is examining the algorithms used by social media firms. On Thursday, a Pinterest boss expressed regret for the self-harm and suicidal material Molly viewed on the image-sharing site.

However, Miss Lagone defended Instagram’s policy of letting users post such content, claiming it helped them ‘come together for support’ and ‘talk about their own experience’. She said the firm had to ‘consider the unbelievable harm that can be done by silencing struggles’ endured by its users.

‘Just the tip of the iceberg’

But Mr Sanders became visibly frustrated when he had to ask her repeatedly whether a child could differentiate between ‘content that encourages or creates awareness’ of suicide and self-harm.

Instagram has since banned all depictions of suicide or self-injury, no matter the context, after its own experts warned it could ‘unintentionally’ encourage users to carry out such acts.

However, the policy change in 2019 – partly in response to concerns raised by Molly’s family – still allows discussion of the topic if it is not graphic or shows methods.

The inquest continues. For help or support, visit samaritans.org or call the Samaritans for free on 116 123.

Harmful content: Molly Russell. Inset: Liz Lagone at inquest into the teenager’s death
