Insta to ban self-harm images

The changes were made following a comprehensive review with global experts and academics on youth, mental health and suicide prevention

The Asian Age - Fifth Estate - AGE CORRESPONDENT

Recently in the UK, a father blamed Instagram for the suicide of his 14-year-old daughter, Molly Russell. In the wake of this allegation, the app announced noteworthy changes: it would roll out ‘sensitivity screens’ to block images and videos of self-harm. The UK’s Health Secretary Matt Hancock had also asked tech companies to rid their platforms of content showing people cutting themselves. The new changes go a step further.

In the latest development, Adam Mosseri, head of product at Instagram, said in a blog post that the platform is banning all images of cutting and hiding non-graphic images of self-harm, such as scarring, from search results, hashtags and Explore. “These images won’t be removed entirely, because the platform doesn’t want to stigmatise or isolate people who may be in distress and post self-harm related content as a cry for help. The changes were made following a comprehensive review with global experts and academics on youth, mental health and suicide prevention,” the post said.

The company notes that it is difficult to identify whether an image shows self-harm, encourages it, raises awareness or is a cry for help. “It’s further complicated by the fact that not all people who self-harm have suicidal intentions, and the behaviour has its own nuances apart from suicidality,” Mosseri wrote, adding, “Right now, it’s not hard to find images of cutting or scarring on Instagram. Up until now, we’ve focused most of our approach on trying to help the individual who is sharing their experiences around self-harm. We have allowed content that shows contemplation or admission of self-harm because experts have told us it can help people get the support they need. But we need to do more to consider the effect of these images on other people who might see them. This is a difficult but important balance to get right. It will take time and we have a responsibility to get this right. Our aim is to have no graphic self-harm or graphic suicide-related content on Instagram, while still ensuring that we support those using Instagram to connect with communities of support.”

He added that, in addition to investing in technology that helps with detection, the app also relies on users to report self-harm content.
