Gulf Times

An intelligent approach to mental health

By Junaid Nabi

A few years ago, towards the end of his life, my father battled severe depression. As a physician and professor, he did not lack access to mental-healthcare. But he had grown up in a society that stigmatised mental illness, and he was unwilling to seek professional help. As a son, it was devastating to watch my father suffer. As a public-health researcher, I gained a new awareness of the myriad systemic failures in the provision of care.

Scientists from around the world are now seeking to address these problems with “Countdown Global Mental Health 2030,” a “multi-stakeholder monitoring and accountability collaboration for mental health” launched in February. But, while this initiative is a positive step, it neglects a key element of an effective solution: advanced technology, especially artificial intelligence (AI).

Globally, the supply of psychiatrists and clinical psychologists is nowhere near sufficient. For example, in Zimbabwe, there are just 25 mental-health professionals for a population of over 16mn. While the country has produced some innovative and useful community-led initiatives, such as the “Friendship Bench,” their scalability is limited.

Lack of access to mental-healthcare is hardly confined to developing countries. In the United States, almost half of the population is unable to access comprehensive mental-healthcare, often owing to financial constraints.

Beyond access, there is the problem of stigma, exemplified by my father’s experience. Clinical evidence indicates that stigma takes two forms. People who seek mental-health care may face public stigma in the form of discrimination and exclusion, owing to endemic misconceptions about mental illness. When those beliefs are internalised, sufferers may also struggle with self-stigma: low self-esteem, low self-efficacy, and unwillingness to pursue productive opportunities.

The consequences of failing to provide adequate care have been severely underestimated. According to one study, mental-health issues are responsible for 32.4% of years lived with disability and 13% of disability-adjusted life years (DALYs) – years of “healthy” life lost to disease, disability, or untimely death.

The economic costs are enormous. According to a 2015 analysis, in the US alone, the total economic burden of mental illness exceeds $210bn annually. More than half of that is attributed to workplace absenteeism and productivity losses; another 5% is attributed to suicide-related costs. Companies’ efforts to circumvent the need for mental-healthcare by reminding workers to practice mindfulness are probably not as helpful as proponents claim.

What could help are AI-based solutions, such as chatbots. By mimicking natural language to sustain a conversation with a human user, these software systems could act as virtual therapists, providing guidance and support to those who have no alternatives. A randomised controlled trial reported by clinical psychologists at Stanford University showed that chatbots were significantly better at reducing the symptoms of depression than an information-only approach.

The sort of provisional mental-healthcare provided by chatbots would be particularly useful in communities with an inadequate supply of trained professionals. At a time of unprecedented access to smartphones in developing economies, Internet-based solutions would be a boon for mental-health accessibility.

Chatbots could also help overcome the stigma problem, because they can engage people who are otherwise reluctant to seek mental-health care. A recent study found that up to 70% of patients are interested in using mobile applications to self-monitor and self-manage their mental health. Once people initiate contact with a chatbot, another study indicates, they tend to express themselves more freely than they would with a human therapist, underscoring the priority people place on maintaining privacy and avoiding judgment when seeking to address a mental-health issue.

It is now up to clinicians, such as psychologists, to collaborate more extensively with AI developers. Several US universities have already launched programmes that connect experts from the clinical sciences with software engineers. These partnerships should be expanded to include universities, especially in countries with a large unmet need for mental-healthcare, in order to support the development of linguistically and culturally appropriate virtual therapists.

Involving more diverse actors in the development of algorithms would also help to address the issue of racial and gender discrimination that has cropped up in AI research. Researchers should use fully representative test groups, while taking care to adhere to stringent privacy and accountability protocols.

Of course, such initiatives cost money. Venture capital companies now spend $3.2bn annually on global health research and development. They should expand the scope of their investments to include AI-enabled technologies for mental-healthcare delivery. They could also fund competitions among socially conscious technology entrepreneurs, in order to spur further innovation in this area.

To be sure, AI-enabled mental-health interventions would not – and should not – replace human psychologists or psychiatrists. A chatbot cannot, after all, project real empathy. What it can do is screen for high-risk individuals, such as those with suicidal ideation, and potentially avert destructive behaviour in the short term.

Demand and need often drive innovation. Unfortunately, that has not been true of mental-healthcare. It is time to invest in long-term, cost-effective, and scalable solutions that build mental-healthcare capacity. That effort must include expanded support for traditional services. But it should also take advantage of cutting-edge technologies like AI. – Project Syndicate

Junaid Nabi is a public-health researcher at Brigham and Women’s Hospital and Harvard Medical School, Boston.
