The Daily Telegraph

Molly’s death made me realise Instagram’s responsibilities

Adam Mosseri, head of the social media firm, tells Charles Hymas how the teenager’s suicide deeply affected him as he announces his platform will ban all graphic self-harm images

November 16


Adam Mosseri, head of Instagram, admits the moment Ian Russell blamed his company for contributing to the suicide of his teenage daughter Molly was overwhelming. “It’s the kind of thing that hits you in the chest and sticks with you,” says the tech chief, who has spent more than a decade at Facebook and now heads Instagram.

“I focused a lot on safety issues and integrity issues and well-being issues at Instagram, and before at Facebook,” he says. “They are issues I have always taken very seriously, but it becomes so much more real, so much more intense, when you have the story of an individual, particularly if something tragic has happened.”

The death of Molly, 14, who took her life after viewing self-harm images online, provoked a tidal wave of criticism of social media for its failure to protect children from online harms. Her father, Ian, said Instagram “helped kill my daughter”.

Mr Mosseri yesterday announced that the company will ban all graphic self-harm images and make it harder for users to find non-graphic images.

It is something parents, charities and the Government believe Instagram and other companies should have done a long time ago – and not just on self-harm, but on cyberbullying, child sex abuse images, harassment and a multitude of other problems.

It is why The Daily Telegraph last year launched its campaign for a statutory duty of care to be imposed on social media firms to do more to protect children from online harms. Within weeks, the Government is to unveil a White Paper that is expected to create a new regulator to force the tech firms to take down illegal material and purge harmful but legal content.

Mr Mosseri offers a “mea culpa” in response to the contention that social media giants have not done enough, fast enough, to protect children.

“I joined the company more than 10 years ago and we were primarily focused on all of the good that came out of connecting people, giving people a voice, helping people connect with loved ones, small businesses,” he says. “But if I am honest, we were under-focused on the risks of connecting so many people. That’s a lesson we have learned over the last few years. That has fundamentally changed how we build, how we design, how we do what we do.

“With a lot of these problems, the reality is they take a long time to get to a good place. I feel good about the fact we are getting better in dealing with these issues, but I don’t feel good about our state [where we are now].”

It is, however, too late to save Molly. Asked if he accepted Mr Russell’s accusation that Instagram had helped kill his daughter, he replies: “I accept that we have room to improve and we have a responsibility to improve.

“There is much more that we can do on issues of self-harm and suicide, and we need to do that. I do want to be careful, because there is a tension between wanting to act and act quickly and the need to act responsibly. I don’t want to do anything that will unintentionally stigmatise any sort of mental health issues. I don’t want to do anything that will put a vulnerable person in a place where they feel unsupported or ashamed if we take that content down.

“There is a tension between speed and responsibility. We are trying to figure out how to navigate that.”

When asked, he reveals that he has “reached out” to Molly’s family. “I have emailed, I have reached out to Molly’s father offering my condolences, acknowledging we have room to improve and giving him a little bit of context on some of the things that we were working on,” he says. Has he had a reply? “I have not yet had a response,” he says.

In a speech at the BBC, the Duke of Cambridge accuses tech firms of not protecting children from the hate and bile of social media.

Would he like to meet the family? “I am always open in general to all these things. It’s important that all options are on the table,” he replies.

“I am here personally [in the UK] because this issue is important to me. I am primarily here not to deliver a message about what we are doing. I am here to talk to policymakers, experts and to anyone else I can learn from on these issues. So all options should be on the table.”

Mr Mosseri reverts to legalese when asked whether he would provide the family and coroner with any data or personal information that could help them understand why Molly decided to take her own life.

“We are always prepared to provide data when we get a formal request from a coroner or law enforcement agency. We haven’t received one of those. We will absolutely consider it and take it seriously,” he says.

Critics will say the social media firms are only acting now because of the threat of legislation. Both the UK Government and the US are considering some form of statutory duty of care. So far, only Germany and Australia have introduced legislation.

Yesterday, Mr Mosseri was engaged in a series of meetings, starting with the NSPCC, a charity campaigning for a duty of care, followed by Jeremy Wright, the Culture Secretary, and finally Matt Hancock, the Health Secretary, one of his firm’s arch critics.

Asked if he supported a statutory duty of care as a concept, he says: “I support that as a concept. We have a responsibility to not only create value for the people who use our platform but also to keep people safe. That idea sounds very much in line with the responsibility to keep people safe.

“What that looks like is a decision for policymakers to make. We think it’s important to collaborate so that they can understand how our systems work and make sure whatever legislation or processes they put in place work, make sense, and can adapt as threats adapt and the world changes.”

Is it time for social media to be regulated? “There is a lot of regulation already. It’s good that we are having these conversations about what more can be done. I personally try to be involved in helping connect with regulators and policymakers, because I think that is important.”

Mr Mosseri is a father of children aged one and three. They already have social media accounts, although he is quick to point out he currently handles them.

Not surprisingly for a senior tech executive, he is not going to stop them using social media. More important, he says, is to instil a questioning and healthily sceptical approach to content online so they are able to manage any risks when they grow up and start engaging.

Nor is it sensible, he says, to try to keep tabs on them without their consent, as they are likely to create shadow accounts to avoid the prying eyes of parents.

More pressing for many parents, however, is how Instagram and other platforms will take action to combat the addictive nature of their technology that has hooked a generation of children. What action will Instagram take?

He endorses as “very reasonable” the Chief Medical Officer’s demand that parents not let children spend more than two hours a day online.

“Last year we launched a series of features that help people understand how much time they spend on the site and you can set specific limits to remind them. That’s a great feature. I am excited that some of the operating systems are building it into all apps.”

This does not, however, address the concern that Instagram and other platforms are addictive “by design”. Would Instagram be prepared to switch off notifications and features that encourage children to stay online for longer? From the man who oversaw the creation of key design functions at Facebook and Instagram, the answer is more equivocal. “We try to design a product that people find valuable. We are not trying to design a product that people use more,” he says.

“It’s important for users to receive notifications for the moments that are most important. We have gotten more conservative on that over the years, and better. If you didn’t get a message or important notification, the product would not work. But it is a thought-provoking question.”

What about switching off notifications at night – as requested by the Chief Medical Officer? Instagram does have a mute button and, says Mr Mosseri, it is probably better handled by the Android and iPhone operating systems, which have the capacity to switch on do-not-disturb functions.

But he raises it as a prospect: “We could consider building it into Instagram. I would be open to that. I don’t have any philosophical reason not to but it might not be the most valuable thing we could do with our time to keep people safe.” He adds Instagram is working to define, detect and prevent cyberbullying.

“We are exploring allowing people to put on some sort of safety mode or to remediate in some way.”

On prevention, Instagram is looking to find a way of stopping people posting in the first place, but the proposal carries no sense of compulsion: “You can imagine if someone posts something hateful or writes something, before it goes through we could ask them, ‘do they want to do that?’”

On taking down child abuse imagery, he says Instagram has been working with the police and other organisations to “bring it down immediately – we have been working on that for a long time, although it doesn’t go away. We will continue to work on it.”

Asked whether the company would take a tougher stand on barring under-13s by requiring more rigorous proof of identity, such as a passport, he says it is complicated. “That said, we try very hard to make sure no one under 13 uses Instagram,” he says.

The question, therefore, remains as to whether Instagram’s push on self-harm will extend to other parts of its platform. Is Molly’s death a turning point?

Peter Wanless, the NSPCC’s chief executive, says: “This is an important step by Instagram towards cracking down on self-harm content that no child should ever be exposed to.

“It shows what can be done, but it should never have taken the death of Molly Russell for Instagram to act. The question is whether it will be enough.

“Over the last decade social networks have proven over and over that they won’t do enough to design essential protections into their services against online harms including grooming and abuse.

“We cannot sit here waiting for the next tragedy to strike. The Government must legislate without delay and impose a duty of care on social networks, with tough punishments if they fail to protect their young users.”

‘If I’m honest we under-focused on the risks of connecting so many people. That’s a lesson we learned over the last few years’

The Instagram chief endorses as ‘very reasonable’ the Chief Medical Officer’s demand that parents not let children spend more than two hours a day online
