The Asian Age

A BLACK BOX WITHOUT EMPATHY

While AI should ideally help predict suicidal intent and protect vulnerable people, the recent suggestion Quora sent a PhD scholar proves we need to rethink algorithms

- NAVEENA GHANATE

Social media platforms are said to be making positive changes in the lives of people around the world. But when it comes to suicide prevention, these platforms aren’t smart enough. A case in point is a tweet by Shehla Rashid, a PhD student at Delhi’s Jawaharlal Nehru University, in which she spoke of Premenstrual Dysphoric Disorder (PMDD), an extreme form of Premenstrual Syndrome (PMS) that can leave one feeling suicidal. A dejected Shehla searched for ways to end her life on Quora, a platform known for its intelligent answers. The next day, she received an email from the site. She wrote, “Quora sends me an email asking if I’m still contemplating suicide, and that they’re here to help! In a world where algorithms will help you end your life if you want to end your life, it’s really important to share information about PMDD. (sic)”

It has become imperative to discuss how algorithms deal with sensitive issues like suicide. Since suicide and anxiety are deeply emotional issues, how can these platforms be made more sensitive?

Security researcher Anivar Aravind said, “Responding to such searches should not be an engineering decision. It needs social or psychological consultation, which is absent in most tech companies. These algorithms are black boxes: except for the company, nobody knows how the product is programmed. The output of the algorithm reflects the sensibility of the product manager who wrote the program. Additionally, it is supplemented by the human bias of the developer or company.”

Sending emails based on searches has, however, been the norm for many platforms. People who have such thoughts often go incognito to search, but Shehla, having given up on everything, logged into her account directly. Because of this, she got an email asking, “Still curious about which would be least painful death? Jumping off a building or jumping off a bridge?” Doctors feel that when a person is asking questions related to suicide, it helps if the responses are more empathetic.

Dr Diana Moneteria of the Hyderabad Academy of Psychology said, “When we do suicide prevention training, we teach people that asking others about suicide doesn’t increase the risk of suicide. But there is a caveat: how you ask makes a difference. If a search engine is sending machine-generated emails with no person involved, the question is ill-advised. A human being would have offered help instead of giving ideas to end a life.”

People are suicidal only because they have a problem that they cannot solve.

“On social media, looking for posts like ‘I want to kill myself’ or doing a Facebook Live are signs of wanting correction or help. It would have helped if the machine said ‘go get some help’ instead of giving options on how to commit suicide,” she says.

Some tech companies have, however, adopted measures to prevent suicides. Facebook, Twitter and Instagram employ artificial intelligence to detect signs of suicide and depression.

Using algorithms, users searching for a banned hashtag or specific words related to self-harm are redirected to support resources. Yet there are no common guidelines on how to deal with such issues, and every tech company handles them in its own way.

“If a search for suicide or killing is detected, the system should identify it, and the response should be backed by a human decision,” said Aravind. While Shehla’s search was on Quora, bigger platforms like Google provide helpline numbers.

Such responses by Quora might at times act as a trigger. Google came up with the idea of providing suicide helpline numbers when a person googles methods of committing suicide. This helpline works: many patients say, ‘I Googled this but got your number.’ — DR OMESH KUMAR ELUKAPELLY, psychologist

