Modern Healthcare

Population health tools may exacerbate health disparities

By Jessica Kim Cohen

THE EQUITY of the healthcare industry’s population health management efforts has been called into question after researchers found that a widely used algorithm sold by Optum dramatically underestimated the health needs of the sickest black patients.

The unexpected results for Optum have put a spotlight on the entire industry, which relies heavily on programs that predict which patients may benefit from additional and more comprehensive care.

Risk-prediction algorithms like Optum’s, which use cost as an indicator to pinpoint high-risk patients, are ubiquitous in the industry, said Cynthia Burghard, a research director in value-based IT transformation strategies at IDC Health Insights, a division of market research firm International Data Corp.

It’s “a really simple tool that any analytics vendor offers,” she said.

In fact, these types of algorithms analyze data from roughly 200 million patients in the U.S. each year, according to industry estimates cited in the study in the journal Science. And they’re only becoming more popular with the shift to risk-based contracts, according to Dr. Shaun Grannis, vice president of data and analytics and director of the center for biomedical informatics at the Regenstrief Institute.

“Folks want to understand where their costs are going to be,” he said. He was not affiliated with the study.

There’s nothing wrong with predicting, and proactively intervening in, care for high-cost patients. But if these tools are relied on uncritically, they can exacerbate existing healthcare disparities.

“The overall goal of the algorithms can be very positive and helpful if directed toward improving patient health outcomes,” said Dr. Marshall Chin, a primary-care physician at UChicago Medicine and associate director of the health system’s MacLean Center for Clinical Medical Ethics. “The choice of cost reflects perhaps misplaced priorities.”

That decision to predict patients’ risk scores based on cost is what led to the racial disparities for Optum’s algorithm, which was found to assign healthier white patients the same risk score as black patients who had poorer lab results, according to the recent research published in Science. The algorithm had specifically excluded race as a variable.

The algorithm is part of an analytics tool called Impact Pro, which Optum markets as a way to help healthcare organizations identify individuals who will benefit most from population health management programs.

But the algorithm in question doesn’t predict patients’ future health conditions; it predicts how much patients will cost the hospital in the future, and uses that as a proxy for who would benefit from additional care-management services. That creates a disparity, since black patients generally use healthcare services at lower rates than white patients.

Annual care for black patients with chronic conditions cost about $1,800 less than that for comparable white patients, according to the study, which looked at a patient population at one unnamed academic health system using the algorithm. Essentially, that meant “healthier white patients were ‘cutting in line’ ahead of sicker black patients” to get more intensive care management, said Dr. Ziad Obermeyer, the study’s lead author and acting associate professor in health policy and management at the University of California at Berkeley.

There is a relatively simple fix, according to the study authors, one that hits at the heart of the question of how to design and use clinical algorithms. Rather than having an algorithm predict how much a hospital would spend on patients, the researchers adjusted it to predict health conditions. As another alternative, they tweaked the algorithm to predict patients’ avoidable costs, rather than total costs.

“Both of those alternative algorithms actually had far less bias,” Obermeyer said. “The problem we found wasn’t anything to do with what’s going on in the black box of the algorithm. The problem was what the algorithm was told to do.” Obermeyer said the research team has been in communication with Optum to experiment with possible versions of the algorithm. Optum hasn’t commented on whether it will add the researchers’ adjustments to its product.
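To make the label distinction concrete, here is a minimal sketch on purely synthetic data of the effect the researchers describe: two models trained on the same features, one with cost as the label and one with a health measure. It is not Optum’s model; the feature names, coefficients, and noise levels are assumptions for illustration, and the study’s $1,800 access gap is borrowed only as a simulation parameter.

```python
# Illustrative simulation only -- not Optum's model. Assumes a hypothetical
# access gap: at equal illness, group 1 generates ~$1,800 less annual cost.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 50_000
group = rng.integers(0, 2, size=n)       # 1 = group facing access barriers
illness = rng.poisson(2.0, size=n)       # true chronic-condition burden

# Claims-derived features; race itself is excluded, as in the real algorithm.
biomarker = illness + rng.normal(0, 0.7, n)                          # clean health signal
prior_cost = 3000 * illness - 1800 * group + rng.normal(0, 800, n)   # access-contaminated
X = np.column_stack([biomarker, prior_cost])

future_cost = 3000 * illness - 1800 * group + rng.normal(0, 800, n)

cost_model = LinearRegression().fit(X, future_cost)   # label = total cost
health_model = LinearRegression().fit(X, illness)     # label = health need

def top_decile(scores):
    """Indices of the 10% of patients ranked highest-risk."""
    return np.argsort(scores)[-n // 10:]

# The cost-label ranking under-selects group 1 relative to true illness;
# the health-label ranking moves closer to the truly sickest decile.
print("group-1 share, truly sickest decile:   ", group[top_decile(illness)].mean())
print("group-1 share, cost-label top decile:  ", group[top_decile(cost_model.predict(X))].mean())
print("group-1 share, health-label top decile:", group[top_decile(health_model.predict(X))].mean())
```

The point of the sketch is that the bias enters through the label, not the model: a well-fitted cost predictor faithfully reproduces the access gap baked into spending, while the same features pointed at a health target inherit far less of it.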

Optum, for its part, has stressed that its algorithm fulfills its intended purpose. A company spokesman highlighted that Impact Pro has multiple features, of which one part is an algorithm that forecasts expenditure costs. Optum’s tool also identifies gaps in care, which are often driven by social determinants of health.

“We appreciate the researchers’ work, including their validation that the cost model within Impact Pro was highly predictive of cost, which is what it was designed to do,” he wrote in an email. Optum did not respond to a request for comment on how much it charges healthcare organizations for Impact Pro.

Burghard, the analyst with IDC Health Insights, agreed. The algorithm did what it was designed to do: it “spit out a list of people based on cost,” she said.

If a hospital decides those high-cost patients are the population to target, “of course there’s bias. You miss all the people who didn’t get care, or who don’t have access to care,” she said. “To me, at this point in my musings, it’s not about the algorithm or the tool. It’s about the human decision (and how) to use that tool.”

On the surface, using cost as a proxy to predict health risk can seem reasonable—health is complex, and there’s no one variable to measure someone’s health, Obermeyer noted. But while cost may sound like a race-neutral measure on paper, there are social and historical disparities that shape it, such as, in this case, black patients generally using healthcare services at lower rates.

So even if the algorithm worked as intended, it raises ethical questions, according to Chin. “I think that healthcare organizations and hospitals have to look in their hearts and ask: What is their mission?” he said. “If you are thinking about the ultimate mission of patient care, you’ll be directed to metrics being high-quality care and the best possible patient health outcomes.”

He acknowledged that it’s understandable why a healthcare organization might decide to focus efforts on tackling high-cost patients. To truly encourage a focus on high-quality patient care, he said, the industry would need to think about realigning incentives so that patient outcomes are rewarded, rather than cost savings. “Hospitals are under a lot of cost pressures,” Chin said. “The financial margins for a lot of hospitals are small.”

A population health team at Partners HealthCare System in Boston was confronted with a decision on how to address disparities when it tested Optum’s algorithm a few years ago.

The team had mapped patients who were getting high risk scores from the algorithm, and found many were concentrated in some of the region’s wealthier neighborhoods. “That made us uncomfortable with just using the tool,” said Christine Vogeli, director of evaluation and research for population health at Partners HealthCare and a study co-author. While Partners continued to use the tool as part of its care-management program, it supplemented the algorithm’s findings with additional clinical information.

And while the system used tools like Optum’s algorithm to help corral an initial list of patients who might benefit from enhanced care management, it decided primary-care physicians would be responsible for determining which patients would be offered enrollment in the program.

For the care-management program, all patients designated as high risk by Optum’s algorithm would be flagged as candidates. But the team also took into account factors like whether a patient had multiple chronic conditions and their patterns of healthcare utilization, such as whether they had missed appointments or frequently visited the emergency department.

Developing that process required a conscious focus on patient needs, according to Vogeli. “We’re not just interested in patients who are high-cost, we’re interested in patients who have a need for more intensive care-management services,” she said.
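The article doesn’t publish Partners’ actual criteria, but a minimal sketch of that kind of layered screen, with hypothetical field names and thresholds, might look like this:

```python
# Hypothetical sketch of layered candidate screening, in the spirit of the
# Partners approach described above. Fields and thresholds are invented;
# the article does not publish Partners' actual criteria.
from dataclasses import dataclass

@dataclass
class Patient:
    risk_score: float         # output of the vendor's risk algorithm
    chronic_conditions: int   # count of active chronic conditions
    missed_appointments: int  # past 12 months
    ed_visits: int            # emergency department visits, past 12 months

def flag_for_review(p: Patient) -> bool:
    """Flag a candidate for primary-care physician review -- the physician,
    not the algorithm, decides who is offered enrollment."""
    if p.risk_score >= 0.8:                 # high vendor risk score
        return True
    if p.chronic_conditions >= 3:           # multiple chronic conditions
        return True
    if p.missed_appointments >= 2 or p.ed_visits >= 4:  # utilization patterns
        return True
    return False

# Example: a lower-scoring patient with heavy ED use still gets reviewed.
print(flag_for_review(Patient(0.4, 1, 0, 5)))   # True
```

The design choice the sketch captures is that the vendor’s score is one input among several, and the output is a review list for physicians rather than an enrollment decision.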

Partners HealthCare stopped using Optum’s algorithm to inform its care-management program earlier this year, switching to a different tool that only uses information about patients’ chronic conditions.

Even when the system had used Optum’s algorithm, only about 15% of patients designated as possible candidates for the care-management program were identified solely based on having a high risk score, Vogeli said. The program serves about 14,000 patients at a given time, the bulk of whom were flagged via Partners HealthCare’s review of their chronic conditions and healthcare use.

“Healthcare organizations need to be very savvy about how they use these tools,” Vogeli said. ●

ISTOCK/MODERN HEALTHCARE ILLUSTRATION
