The Denver Post

Can an algorithm prevent suicide?

“If it’s going to get me more support that I need, then I’m OK with it,” one Vietnam vet said

By Benedict Carey

At a recent visit to the Veterans Affairs clinic in the Bronx, Barry, a decorated Vietnam veteran, learned that he belonged to an exclusive club. According to a new AI-assisted algorithm, he was one of several hundred VA patients nationwide, of 6 million total, deemed at imminent risk of suicide.

The news did not take him entirely off guard.

Barry, 69, who was badly wounded in the 1968 Tet offensive, had already made two previous attempts on his life. “I don’t like this idea of a list, to tell you the truth — a computer telling me something like this,” Barry, a retired postal worker, said in a phone interview. He asked that his surname be omitted for privacy.

“But I thought about it,” Barry said. “I decided, you know, OK — if it’s going to get me more support that I need, then I’m OK with it.”

For more than a decade, health officials have watched in vain as suicide rates climbed steadily — by 30% nationally since 2000 — and rates in the VA system have been higher than in the general population. The trends have defied easy explanation and driven investment in blind analysis: machine learning, or AI-assisted algorithms that search medical and other records for patterns historically associated with suicides or attempts in large clinical populations.

Doctors have traditionally gauged patients’ risks by looking at past mental health diagnoses and incidents of substance abuse, and by drawing on experience and medical instinct. But these evaluations fall well short of being predictive, and the artificially intelligent programs explore many more factors, such as employment and marital status, physical ailments, prescription history and hospital visits. These algorithms are black boxes: They flag a person as at high risk of suicide, without providing any rationale.

But human intelligence isn’t necessarily better at the task. “The fact is, we can’t rely on trained medical experts to identify people who are truly at high risk,” said Dr. Marianne S. Goodman, a psychiatrist at the Veterans Integrated Service Network in the Bronx, and a clinical professor of medicine at the Icahn School of Medicine at Mount Sinai. “We’re no good at it.”

Deploying AI in this way is not new; researchers have been gathering data on suicides through the National Health Service in Britain since 1996. The U.S. Army, Kaiser Permanente and Massachusetts General Hospital have each separately developed an algorithm intended to predict suicide risk.

However, the VA’s program, called Reach Vet, which identified Barry as at high risk, is the first of the new U.S. systems to be used in daily clinical practice, and it is being watched closely. How these systems perform — whether they save lives and at what cost, socially and financially — will help determine if digital medicine can deliver on its promise.

Doctors who have worked with Reach Vet say that the system produces unexpected results, both in whom it flags and whom it does not.

To some of his therapists, Chris, 36, who deployed to Iraq and Afghanistan, looked very much like someone who should be on the radar. He had been a Marine rifleman and saw combat in three of his four tours, taking and returning heavy fire in multiple skirmishes. In 2008, a roadside bomb injured several of his friends but left him unscathed. After the attack he had persistent nightmares about it and received a diagnosis of post-traumatic stress. In 2016, he had a suicidal episode; he asked that his last name be omitted to protect his privacy.

“I remember going to the shower, coming out and grabbing my gun,” he said in an interview at his home near New York City. “I had a Glock 9-millimeter. For me, I love guns, they’re like a safety blanket. Next thing I know, I’m waking up in cold water, sitting in the tub, the gun is sitting right there, out of the holster. I blacked out. I mean, I have no idea what happened. There were no bullets in the gun, it turned out.”

The strongest risk factor for suicide is a previous attempt, especially one with a gun. Yet Chris’ name has not turned up on the high-risk list compiled by AI, and he does not think it ever will.

“At the time, in 2016, I was going to school for a master’s, working full time,” he said. “Our two kids were toddlers; I was sleeping no more than a few hours a night, if that. It was too much. I was sleep-deprived all the time. I had never been suicidal, never had suicidal thoughts; it was a totally impulsive thing.”

The VA model integrates 61 factors in all, including some that are not obvious, such as arthritis and statin use, and produces a composite score for each person. Those who score at the very top of the range — the top 0.1% — are flagged as high risk.

“The risk concentration for people in the top 0.1% on this score was about 40 times,” said John McCarthy, the director of data and surveillance for suicide prevention in the VA Office of Mental Health and Suicide Prevention. “That is, they were 40 times more likely to die of suicide” than the average person.
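The arithmetic behind that flagging step and the “40 times” comparison can be sketched in a few lines. The Python example below is a hypothetical illustration with made-up scores and outcomes, not the Reach Vet model itself; the only details borrowed from the article are the top-0.1% cutoff and the idea of a risk-concentration ratio (the death rate among flagged patients divided by the overall rate).

```python
# Hypothetical sketch, not the VA's actual Reach Vet code.
import numpy as np

rng = np.random.default_rng(0)
n_patients = 1_000_000

# Stand-in for the composite score a model would produce from its input factors.
risk_score = rng.random(n_patients)

# Fabricated outcomes whose probability rises with the score, purely so the
# ratio below has something to show; real outcome data would replace this.
death_prob = 0.00005 + 0.01 * risk_score**20
died = rng.random(n_patients) < death_prob

# Flag the top 0.1% of scores as "high risk".
cutoff = np.quantile(risk_score, 0.999)
flagged = risk_score >= cutoff

# Risk concentration: how much more common the outcome is among flagged
# patients than in the population overall (the article reports roughly 40x
# for the real system on real data).
concentration = died[flagged].mean() / died.mean()
print(f"Flagged {int(flagged.sum()):,} of {n_patients:,} patients; "
      f"risk concentration ~{concentration:.0f}x")
```

With synthetic data the exact ratio is meaningless; the point is only that the flag is a percentile cutoff on a single composite score, and the reported figure is a ratio of outcome rates between the flagged group and everyone else.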

Bridget Matarazzo, the director of clinical services at the Rocky Mountain Mental Illness Research Education and Clinical Center for Veteran Suicide Prevention, said of Reach Vet, “My impression is that it’s identifying some folks who were previously on providers’ radar, but also others who were not.”

Late in 2018, a VA team led by McCarthy presented the first results of the Reach Vet system. Over a six-month period, with Reach Vet in place, high-risk veterans more than doubled their use of VA services. By contrast, in a comparison group tracked for six months before Reach Vet was installed, the use of VA services stayed roughly the same.

The Reach Vet group also had a lower mortality rate over that time — although it was an overall rate, including any cause of death. The analysis did not detect a difference in suicides, at least up to that stage. “It’s encouraging, but we’ve got much more to do to see if we’re having the impact we want,” McCarthy said.

Ronald Kessler, a professor of health care and policy at Harvard Medical School, said: “Right now, this and other models predict who’s at highest risk. What they don’t tell you is who is most likely to profit from an intervention. If you don’t know that, you don’t know where to put your resources.”

For doctors using the system, however, it has already prompted some rethinking of how to assess risk. “You end up with a lot of older men who are really struggling with medical problems,” Goodman said. “They’re quietly miserable, in pain, often alone, with financial problems, and you don’t see them because they’re not coming in.”

Bryan Anselm, © The New York Times Co. Dr. Marianne S. Goodman, a psychiatrist at the Veterans Integrated Service Network in the Bronx, is pictured outside her home in Wyckoff, N.J., in November. “The fact is, we can’t rely on trained medical experts to identify people who are truly at high risk,” Goodman said, as the Department of Veterans Affairs has turned to machine learning to help identify vets at risk of taking their own lives.
