Modern Healthcare

UPMC pilots machine learning, telehealth to inform patient transfers

By Jessica Kim Cohen

THOUSANDS OF PATIENTS each year are transferred between UPMC’s hospitals for high-acuity, complex medical care. To ensure patients are fully informed before those transfers, the Pittsburgh health system is now piloting a new care process aided by a machine-learning tool.

While it’s sometimes necessary to transfer patients for more specialized care, a transfer can come with unintended consequences, such as moving the patient far from their family and other support systems. That is a particularly difficult prospect for patients who are close to death and may not want to spend their final days in the hospital. It’s important for clinicians to discuss such decisions with patients to make sure they understand the severity of their illness and align next steps with what the patient wants.

So to ensure those conversations are taking place, researchers at UPMC and the University of Pittsburgh School of Medicine developed a machine-learning algorithm that predicts mortality for patients who may be transferred to another hospital for a higher level of care. Patients deemed at highest risk are flagged for more in-depth discussions about their care goals.

Researchers published a study validating the algorithm, dubbed Safe Non-elective Emergent Transfers, or SafeNET, in the journal PLOS One earlier this month.

The SafeNET algorithm evaluates 14 variables, including age and vital signs, to assess a patient’s risk of death.

If a patient is deemed at high risk, it kicks off two processes: a three-way conversation among an emergency department physician, an intensive-care unit physician at the prospective transfer facility and a palliative-care clinician; and a telehealth palliative-care consultation with the patient and family members to discuss goals, expectations and options for next steps.
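The study and this article do not publish SafeNET’s model or its cutoff; the following is a minimal Python sketch of the general pattern described above, in which a risk score triggers goals-of-care conversations rather than deciding care. The field names, the 0.30 threshold and the workflow labels are assumptions for illustration, not UPMC’s implementation.

```python
# Hypothetical illustration only: SafeNET's actual inputs, model and threshold
# are not described here beyond "14 variables, including age and vital signs."
from dataclasses import dataclass


@dataclass
class TransferCandidate:
    """A patient being considered for transfer to a higher level of care."""
    patient_id: str
    mortality_risk: float  # assumed model output in [0, 1]


# Assumed cutoff; the article does not state what probability SafeNET flags.
HIGH_RISK_THRESHOLD = 0.30


def review_before_transfer(candidate: TransferCandidate) -> list[str]:
    """Return the workflows to trigger before a transfer decision is made.

    Mirrors the process the article describes: a high-risk flag does not
    make the care decision; it triggers a "pause" for detailed conversations.
    """
    if candidate.mortality_risk < HIGH_RISK_THRESHOLD:
        return []  # no extra steps; the usual transfer workflow proceeds
    return [
        "three-way ED / receiving-ICU / palliative-care physician conversation",
        "telehealth palliative-care consult with patient and family",
    ]


if __name__ == "__main__":
    patient = TransferCandidate(patient_id="example-001", mortality_risk=0.42)
    for step in review_before_transfer(patient):
        print("Trigger:", step)
```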

Dr. Daniel Hall, medical director for high-risk populations and outcomes at the UPMC Wolff Center and an author on the study, stressed that the algorithm doesn’t make patient-care decisions. It’s meant to trigger a “pause,” during which physicians and patients talk in detail before deciding whether to proceed with a transfer.

The SafeNET algorithm is currently being piloted in three EDs at UPMC. Since November, the algorithm has flagged 11 patients who had the highest probability of dying. After conversations with the palliative-care team, four of the patients ultimately decided to continue with ICU-level care and seven decided not to be transferred.

The seven patients “decided, all things considered—their goals of care, their personal values, what’s important to them—to stay locally,” said Dr. Karl Bezak, medical director for palliative care at UPMC Presbyterian and Montefiore hospitals. Instead of higher-acuity care farther from home, some of those patients chose options like at-home hospice.

The mortality risk score isn’t discussed with the patient; it’s just used to identify which patients should have the conversati­ons.

Algorithms like SafeNET could prove a promising way to remind physicians to loop in palliative-care services before making care decisions, said Lori Bishop, vice president of palliative and advanced care at the National Hospice and Palliative Care Organization. Often, hospitals don’t have a standard approach for identifying patients who could benefit from palliative care, she said.

Including palliative-care clinicians in decision-making helps to make sure care is patient-centered. “Sometimes, medicine can be a ‘runaway train’ because we make the assumption you want everything done possible until you die,” Bishop added. “What we’ve found is that people don’t always want that option, and sometimes regret that their time was spent in hospitals.”

Health systems like UPMC have built mortality risk-assessment tools for various uses. Researchers at Geisinger Health also published a study in February finding that a machine-learning algorithm they developed could predict mortality within a year based on echocardiogram videos of the heart, which could help inform physicians’ treatment decisions.

When integrating decision-support tools that use artificial intelligence into clinical care, it’s important to make sure the tools are developed and tested on high-quality data from diverse populations, as well as evaluated for possible biases, said Satish Gattadahalli, director of digital health and informatics in advisory firm Grant Thornton’s public sector business. He also highlighted the need to subject algorithms to peer review and to design systems so clinicians understand how the tools make recommendations, rather than treating the algorithm as a “black box.” “Make sure there are sufficient guardrails,” Gattadahalli said. ●

Photos: Dr. Daniel Hall; Dr. Karl Bezak
