Los Angeles Times

A chatbot may be your next therapist. Can it really help?

By Elisabeth Rosenthal

Elisabeth Rosenthal, a physician, is a senior contributing editor at KFF Health News and the author of “An American Sickness: How Healthcare Became Big Business and How You Can Take It Back.”

In the last few years, 10,000 to 20,000 apps have stampeded into the mental health space, offering to “disrupt” traditional therapy. With the frenzy around AI innovations like ChatGPT, chatbots claiming to provide mental health care are on the horizon.

The numbers explain why: Pandemic stresses led to millions more seeking treatment. At the same time, there has long been a shortage of mental health professionals in the United States; half of all counties lack psychiatrists. Given the Affordable Care Act’s mandate that insurers offer parity between mental and physical health coverage, there is a gaping chasm between demand and supply.

For entrepreneurs, that’s a market bonanza. At SXSW in March, where a number of health startups displayed their products, there was a near-religious conviction that AI could rebuild healthcare, offering up apps and machines that could diagnose and treat all kinds of illness, replacing doctors and nurses.

Unfortunately, in the mental health space, evidence of effectiveness is not there yet. Few of the many apps on the market have independent outcomes research showing that they help; the vast majority haven’t been scrutinized at all by the Food and Drug Administration. Though marketed to treat conditions such as anxiety, ADHD and depression, or to predict suicidal tendencies, many warn users (in small print) that they are “not intended to be medical, behavioral health or other healthcare service” and “not an FDA cleared product.”

There are good reasons to be cautious in the face of this marketing juggernaut.

Decades ago, Joseph Weizenbaum, an MIT professor considered one of the fathers of artificial intelligence, predicted that AI would never make a good therapist, though it could be made to sound like one. In fact, his original AI program, created in the 1960s, was a psychotherapist named ELIZA, which used word and pattern recognition combined with natural language processing to sound like a therapist:

Woman: Well, my boyfriend made me come here.

ELIZA: Your boyfriend made you come here?

Woman: He says I’m depressed much of the time.

ELIZA: I’m sorry to hear that you’re depressed.

Woman: It’s true, I am unhappy.

ELIZA: Do you think coming here will help you not to be unhappy?

Though hailed as an AI triumph, ELIZA’s “success” terrified Weizenbaum, whom I once interviewed. Students would interact with the machine as if ELIZA were an actual therapist, he said, when what he’d created was “a party trick.”
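How little machinery that party trick requires is easy to show. Below is a minimal sketch, in Python, of ELIZA-style word and pattern recognition; the rules, reflections and phrasings are hypothetical illustrations, not Weizenbaum’s original script:

```python
import random
import re

# A few illustrative ELIZA-style rules: a regex to match in the user's
# statement, plus response templates that echo captured text back.
# (Hypothetical rules for illustration, not Weizenbaum's original script.)
RULES = [
    (re.compile(r"my (.+) made me come here", re.I),
     ["Your {0} made you come here?"]),
    (re.compile(r"i'?m (depressed|sad|unhappy)", re.I),
     ["I'm sorry to hear that you're {0}."]),
    (re.compile(r"i am (.+)", re.I),
     ["Do you think coming here will help you not to be {0}?"]),
]

# Flip first person to second person so echoed fragments read naturally.
REFLECTIONS = {"my": "your", "i": "you", "me": "you", "am": "are"}

def reflect(fragment: str) -> str:
    """Swap pronouns in a captured fragment ("my boyfriend" -> "your boyfriend")."""
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

def respond(statement: str) -> str:
    """Return the first matching rule's template, filled with reflected text."""
    statement = statement.rstrip(".!?")
    for pattern, templates in RULES:
        match = pattern.search(statement)
        if match:
            template = random.choice(templates)
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please tell me more."  # default deflection when nothing matches

if __name__ == "__main__":
    print(respond("Well, my boyfriend made me come here."))
    print(respond("He says I'm depressed much of the time."))
    print(respond("It's true, I am unhappy."))
```

Every reply is a canned template keyed to surface patterns in the user’s wording; the program matches and echoes, but understands nothing.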

He foresaw the evolution of far more sophisticated programs like ChatGPT. But “the experiences a computer might gain under such circumstances are not human experiences,” he told me. “The computer will not, for example, experience loneliness in any sense that we understand it.”

The same goes for anxiety or ecstasy, emotions so neurologically complex that scientists have not been able to pinpoint their neural origins. Can a chatbot achieve transference, the empathic flow between patient and doctor that is central to many types of therapy?

“The core tenet of medicine is that it’s a relationship between human and human — and AI can’t love,” says Bon Ku, head of the Health Design Lab at Thomas Jefferson University and a pioneer in medical innovation. “I have a human therapist and that will never be replaced by AI.” Instead, he’d like to see AI used to reduce practitioners’ tasks like record keeping and data entry to “free up more time for humans to connect.”

While some mental health apps may ultimately prove worthy, there is evidence that some can do harm. One researcher noted that some users faulted these apps for their “scripted nature and lack of adaptability beyond textbook cases of mild anxiety and depression.”

It will be tempting for insurers to offer up apps and chatbots to meet the mental health parity requirement. After all, that would be a cheap and simple solution, compared with the difficulty of offering a panel of actual therapists, especially since many take no insurance because they consider insurers’ payments too low.

Perhaps seeing the flood of AI hitting the market, the Department of Labor announced last year it was ramping up efforts to ensure better insurer compliance with the mental health parity requirement.

The FDA likewise said late last year that it “intends to exercise enforcement discretion” over a range of mental health apps, which it will vet as medical devices. But so far, none has been approved. And only a very few have gotten the agency’s breakthrough device designation, which fast-tracks review and studies on devices that show potential.

These apps mostly offer what therapists call structured therapy — where patients have specific problems and the app can respond with a workbook-like approach. For example, Woebot combines exercises for mindfulness and self-care (with answers written by teams of therapists) for postpartum depression. Wysa, another app that has received a breakthrough device designation, delivers cognitive behavioral therapy for anxiety, depression and chronic pain.

But gathering reliable scientific data about how well app-based treatments function will take time. “The problem is that there is very little evidence now for the agency to reach any conclusions,” said Dr. Kedar Mate, head of the Boston-based Institute for Healthcare Improvement. Until we have that research, we don’t know if app-based mental health care does better than Weizenbaum’s ELIZA. AI may certainly improve as the years go by, but for now we should not allow insurers to claim that providing access to an app is anything close to meeting the mental health parity requirement.

Allison Hong illustration; Getty Images / Los Angeles Times
