Joint Commission to shelve hospital Top Performer program until 2017
A total of 1,043 hospitals made the Joint Commission’s 2015 Top Performer list, an annual award that recognizes facilities for high marks on a suite of 49 accountability measures. That’s about 180 fewer high achievers than last year.
However, in a telling sign of the chaotic state of healthcare quality measurement, the accreditation body also announced during the release of its annual report that it will suspend the popular award for at least one year.
“Due to the evolving national performance measure environment—particularly within the Centers for Medicare and Medicaid Services,” the program will be put on hiatus, CEO Dr. Mark Chassin said in the report. “In 2017, we will return with a refreshed program.” The commission has been issuing the awards each fall since 2010.
The science of healthcare performance measurement is “all over the place,” safety leaders have said. Researchers in health policy, quality and safety, and organizations that represent hospitals have urged scrutiny of the metrics that rate, rank and financially penalize U.S. hospitals.
“The stakes are getting much higher,” said Dr. Peter Pronovost, professor and director of the Armstrong Institute for Patient Safety and Quality at Johns Hopkins Medicine.
For example, as much as 6% of a hospital’s base operating pay from Medicare could be on the line by 2017 through combined federal quality incentive programs. “When you ratchet up the stakes, you’d better make sure that what you are measuring is accurate,” Pronovost said.
The Joint Commission said it is reevaluating the current landscape, and part of its goal is to make sure the accountability measures it uses remain closely aligned with the CMS’ reporting programs. But it appears that is becoming more challenging.
The federal government has “gone in a different direction” and is increasingly relying on billing data, Chassin said in an interview with Modern Healthcare.
During a news conference, he noted that the Joint Commission does not use measures derived from hospital billing data because they don’t accurately identify complications, and they don’t provide insight on the severity of patients’ conditions. “We don’t believe those are valid measures of quality,” he said.
The report noted changes to the CMS’ Hospital Inpatient Quality Reporting program, as well as the agency’s retirement of topped-out measures.
The latter concerns Chassin. “Taking the spotlight off of very valid measures of quality is not an appropriate policy position,” he said. “When you take the spotlight off, performance deteriorates. Why would you want to take that risk?”
It’s a very delicate issue, said Pronovost, noting that it is also expensive to collect data. He suggested randomly rotating measures in and out of the cycle to ensure accountability, as well as coming up with more robust ways of evaluating the effectiveness of metrics across the board.
The proliferation of ratings groups and the wildly different conclusions they have generated are an ongoing concern as transparency becomes more prevalent in healthcare. A rating scheme now can be created by “anyone with a computer,” said Dr. Robert Wachter, interim chairman of the department of medicine at the University of California, San Francisco. “But the result for patients may be cacophony,” he said. “Sometimes less is more.”
The commission said another reason to pause the program is that the way data are collected is changing. The group introduced a flexible reporting option for the current calendar year, allowing hospitals to choose which measures to report on based on the procedures they perform.
Critics have said the commission’s top performance award focuses too much on process measures (such as how many heart attack patients received aspirin), rather than outcome measures (e.g., how many patients died or had complications). Data suggest that processes are easier to improve than outcomes.
However, Chassin countered that outcomes cannot be improved if processes are not changed. He also said that while many of the process measures are evidence-based, some outcome metrics used by other ratings groups are “so fundamentally flawed” that they cannot be used to judge performance.
This year’s list of recognized hospitals is based on 2014 data submitted by 3,315 facilities, which were evaluated on accountability measures related to care for pediatric asthma, heart attacks, perinatal care, pneumonia, psychiatry, stroke, surgery, substance use, tobacco treatment and venous thromboembolism.
Nearly one-third of Joint Commission-accredited hospitals won the award this year. While 180 fewer made the list compared with last fall, Chassin said the drop was anticipated as more required metrics were added to the list. A total of 650 hospitals made the list for the second consecutive year and 117 facilities have been on the list for five straight years.
While the temptation to add more measures and introduce new ratings is great, Wachter encouraged “taking a bit of a breather and trying to separate out the wheat from the chaff” in the ratings maze. Others who issue such lists may want to do the same, he said, to understand whether they are truly adding unique value.