The Guardian (USA)

Instagram bans 'graphic' self-harm images after Molly Russell's death

- Sarah Marsh and Jim Waterson

Instagram has announced that it will ban all graphic self-harm images as part of a series of changes made in response to the death of British teenager Molly Russell.

The photo-sharing platform made the decision – which critics said was necessary but long overdue – in response to a tide of public anger over the suicide of the 14-year-old girl, whose Instagram account contained distressing material about depression and suicide.

After days of growing pressure on Instagram culminated in a meeting with health secretary Matt Hancock, the social network’s head Adam Mosseri admitted that the company had not done enough and said that explicit imagery of self-harm would no longer be allowed on the site.

“We are not where we need to be on self-harm and suicide, and we need to do more to protect the most vulnerable,” Mosseri said. “We will get better and we are committed to finding and removing this content at scale.”

The move follows significant public anger over Molly’s death. Her father Ian Russell said he believed Instagram was partly to blame. The family found material relating to depression and suicide when they looked at her account after her death.

Instagram announced a range of further measures, including the removal of non-graphic images of self-harm from the most visible parts of its app and website, which appeared designed to draw a line under what has become a reputational crisis for the brand and its parent company Facebook.

But critics said the changes should have already been made and remained sceptical they would be enough to tackle a problem that some said has grown unchecked for 10 years.

The NSPCC said Instagram had taken “an important step”, but that social networks were still falling short and that legislation would be necessary.

“It should never have taken the death of Molly Russell for Instagram to act,” said chief executive Peter Wanless. “Over the last decade, social networks have proven over and over that they won’t do enough.”

Wanless said it was not enough to wait until “the next tragedy strikes”, urging the government to act without delay and impose a duty of care on social networks, with tough punishments for those who fail to protect their young users.

Others said Facebook had consistently fallen short on self-harm and suicide across its online empire. “The company has failed to prioritise preventing self-harm,” said Jennifer Grygiel, a social media expert and assistant professor of communications at Syracuse University.

“At-risk individuals will not be safe until Facebook takes its role as a global corporation and communications platform more seriously. These changes should have been made years ago.”

Before the meeting, Mr Hancock said: “Social media companies need to do more, in particular, to remove material that encourages suicide and self-harm, so I’m going to be asking other social media companies to act.

“I don’t want people to go on to social media and search for images about suicide to get directed to yet more of that sort of imagery. They need help to not post more about suicide.”

There have been longstanding concerns over how Instagram and other social networks handle content that could be harmful to the mental health of their audiences, particularly young people, but the issue became urgent after Molly’s father said in an interview that Instagram “helped kill my daughter”.

Mosseri accepted that the move was overdue. Asked in an interview with the Daily Telegraph why Instagram had taken so long to tackle the issue, he said: “We have not been as focused as we should have been on the effects of graphic imagery on anyone looking at content.

“That is something that we are looking to correct and correct quickly. It’s unfortunate it took the last few weeks for us to realise that. It’s now our responsibility to address that issue as quickly as we can.”

Speaking on BBC Radio 4’s PM programme, the digital minister, Margot James, said the government would “have to keep the situation very closely under review to make sure that these commitments are made real – and as swiftly as possible”.

Mosseri said some self-harm images would be allowed to remain on Instagram. “I might have an image of a scar and say, ‘I’m 30 days clean,’ and that’s an important way to tell my story,” he said.

“That kind of content can still live on the site but the next change is that it won’t show up in any recommendation services so it will be harder to find.”

Instagram’s decision comes as large social media companies such as Facebook, which owns Instagram, prepare to battle with the British government over the future of internet regulation in the UK.

The government is considering imposing a mandatory code of conduct on tech companies, which could be accompanied by fines for non-compliance, prompting a substantial behind-the-scenes lobbying campaign by social media sites.

The culture secretary, Jeremy Wright, is due to unveil the government’s proposals at the end of this month, helping to spur Facebook into swift action.

In the UK, Samaritans can be contacted on 116 123 or email jo@samaritans.org. In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. In Australia, the crisis support service Lifeline is 13 11 14. Other international suicide helplines can be found at www.befrienders.org.

When Molly Russell’s family looked at her Instagram account after her death, they found distressing material about depression and suicide. Photograph: Family handout/PA
