The Guardian (USA)

Instagram to extend its ban on images of self-harm to cover cartoons

- Alex Hern

Instagram is to extend its ban on depictions of self-harm to cover cartoons and drawings, following an appeal from Ian Russell, whose 14-year-old daughter, Molly, killed herself in 2017.

Molly had been looking at graphic content relating to suicide and self-harm before she died, her father discovered, prompting him to go public earlier this year and campaign against the platform’s rules that allowed that material.

Instagram had already banned graphic images of self-harm in February, following Ian Russell’s protests, and the company says it will extend that ban to unrealistic yet explicit depictions of suicide, and images that “promote” self-harm.

“It will take time to fully implement,” Adam Mosseri, the head of Instagram, the Facebook subsidiary, told BBC News, “but it’s not going to be the last step we take. There is still very clearly more work to do. This work never ends.”

Russell described Instagram’s new commitment as sincere, but said the company needed to act more swiftly. “I just hope he [Mosseri] delivers,” he added.

Speaking about his daughter, Russell told the BBC: “I think Molly probably found herself becoming depressed. She was always very self-sufficient and liked to find her own answers. I think she looked towards the internet to give her support and help. She may well have received support and help, but what she also found was a dark, bleak world of content that accelerated her towards more such content.”

He said the algorithms used by some online platforms “push similar content towards you” based on what you have been looking at.

He said: “I think Molly entered that dark rabbit hole of depressive suicidal content. Some were as simple as little cartoons – a black and white pencil drawing of a girl that said: ‘Who would love a suicidal girl?’ Some were much more graphic and shocking.”

Andy Burrows, the head of child safety online policy at the NSPCC, said the move did not change the fact that the industry as a whole was irresponsible, and called on the government to progress legislation intended to impose a duty of care on social media platforms.

“Molly’s death should be a galvanising moment to act,” Burrows said, “but the reality is while Instagram has taken positive steps the rest of the tech industry has been slow to respond – on self-harm, suicide and other online harms.

“As Ian Russell says there is a pressure of time and there is a price for not moving quickly enough, which is children’s lives. That is why the government needs to introduce a draft bill to introduce the duty of care regulator by next Easter and commit to ensuring it tackles all the most serious online threats to children.”

In the UK and Ireland, Samaritans can be contacted on 116 123 or email jo@samaritans.org or jo@samaritans.ie. In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at www.befrienders.org.

Molly Russell had been looking at content related to suicide and self-harm when she died, her father found. Photograph: Family handout/PA
