Modern Healthcare

The Human Factor

To improve patient safety, hospitals urged to adjust for how staff use new technology

- By Sabriya Rice

When clinical staff at a MedStar Health hospital near Washington misunderstood a confusing pop-up box on a digital blood-sugar reader in 2011, they mistakenly gave insulin to a patient with low blood sugar, which caused her to go into a diabetic coma. Hospital staff had earlier made a seemingly minor customization to the glucometer, which led to the error.

In 2013, a patient admitted to Northwest Community Hospital in Arlington Heights, Ill., did not receive his previously prescribed psychiatric medicine for nearly three weeks during a hospital stay because the pharmacy’s computer system was programmed to automatically discontinue orders for certain types of drugs after a predetermined time. There was no alert programmed into the system to let the patient’s care team know the drug order had been suspended.

Experts say these types of adverse events and near-misses are common, and they often happen when new technology is introduced without adequate analysis of how staff will interact with new devices. But reporting of such events is sporadic, and there are few measures in place to help healthcare providers learn from others’ mistakes. And it’s not always the technology that is problematic, safety leaders say, but how thoroughly new tools are tested, understood by users and integrated into the care-delivery process.

“We have a cascade of gadgets and equipment that’s just raining down on the healthcare system,” said Rosemary Gibson, a senior adviser to the Hastings Center, a healthcare ethics research group. Productivity demands are forcing physicians, nurses and other clinical staff to work faster, and when that directive is coupled with new devices and equipment, “even the most competent people in the world can’t do that safely,” she said.

Recent studies have found that rapid implementation of new medical technology—electronic health records, patient monitoring devices, surgical robots and other tools—can lead to adverse patient events when it is not thoughtfully integrated into workflow. The right processes require understanding the devices and the users. Testing in controlled environments often does not adequately consider the “human factor,” or how people interact with technology in high-pressure, real-life situations.

From 2011 to 2013, human-factor issues were the most frequently identified root causes of “never-events” such as medication errors and treatment delays, according to a Joint Commission report. “It’s the interface of the human with the technology that creates a problem,” said Dr. Ana Pujols-McKee, the commission’s chief medical officer.

Responding to these growing concerns, as well as their own alarming experiences, some hospitals and health systems, such as MedStar, have established human-factors research teams. These teams investigate what could go wrong in the deployment of new technologies and recommend ways to minimize their threat to patient safety. Human-factors engineers scrutinize new devices from a human and technical perspective, often testing them in simulation scenarios as close to reality as possible.

Complex systems hide root causes

A growing number of studies point to the need for better surveillance of patient-safety events associated with technology integration. In June, researchers at the Veterans Health Administration Center for Innovations in Quality, Effectiveness and Safety in Houston reported that complicated and confusing electronic health records pose a serious threat to patient safety. The more complex a system, the more difficult it is to trace the root cause of a mistake. They said the problem is not just technological complexity, but how people use the system. Often, such events happen under the radar, and when they are reported, they are often attributed to user or programming error.

A Food and Drug Administration report on device recalls this year said radiology devices such as linear accelerators and CT scanners were the most frequently recalled devices. But for the most part, “the problems have not been with the technology in itself, but rather with clinical use of the technology,” according to the report. Software issues, system compatibility, user interfaces and clinical-decision support accounted for more than two-thirds of radiology device recalls.

Some experts recommend mandatory training for each newly introduced device or technology, while others call for more transparency to allow hospitals to quickly share usability issues and solutions.

“The problem is not always the tool,” said Dr. David Chang of the University of California San Diego. Chang coauthored a recent article in JAMA Surgery that found a brief but significant increase in prostatectomy surgery errors associated with the initial rapid expansion of surgical robot use. “The people using it, that’s the part many are not paying attention to,” he said. A national surveillance system would help physicians learn from each other’s experiences, he said.

MedStar Health, a 10-hospital not-for-profit system, launched its National Center for Human Factors in Healthcare in 2010 to address safety issues associated with new technology deployment. The 2011 glucometer incident was among the first events it investigated. The center works with MedStar hospitals, as well as medical-device and health-information technology developers, to discover problems and determine what changes in the healthcare environment or the products will produce safe and effective outcomes. Any clinical staffers who might potentially touch a particular piece of equipment could find themselves in the center’s simulation lab, including surgeons, anesthesiologists, nurses, paramedics and other medical technicians.

Over the past year, the MedStar team has evaluated dozens of devices, including health IT software, infusion pumps, patient beds and wound-treatment devices. About half of the projects were researched for manufacturers, while the other half examined new or existing devices the health system had flagged as posing potential hazards.

At the center’s two simulation labs, mannequins with automated voices serve as patients and are outfitted with sensors that send cues to staff monitors indicating the success or failure of a process. The sensors beep when there are sudden changes in the patient’s blood pressure or heart rate. Clinical staff who participate in the lab simulations wear a headpiece that tracks their eye movements, which helps human-factors engineers analyze where safety issues are cropping up on the devices being tested.

In one simulation last week, staff at MedStar’s center demonstrated how an error could easily occur with a cardiac defibrillator used by the system’s hospitals. The mechanical patient called for a nurse, played by paramedic Cheryl Camacho, who summoned the attending physician, played by another paramedic, Les Becker. He decided the patient’s heart was in distress and ordered a synchronized shock to be delivered at a low level using a defibrillator, a process that helps re-establish normal heart rhythms in a patient with an arrhythmia or in cardiac arrest.

The nurse pushed a button to put the device into synchronized shock mode so the energy would hit the patient’s chest at a less-vulnerable moment for the heart. Another button was pushed to issue the jolt. The patient did not improve, so Becker immediately ordered a more powerful shock. Less than a minute later, the second jolt was issued, but between the first and second defibrillation, the machine defaulted back to a non-synchronized shock mode, which could have made a real patient’s heart stop beating.

“We know that even well-trained doctors who know how to use it right will naturally make that error,” the center’s director, Dr. Terry Fairbanks, said following the simulation. “We can’t depend on doctors remembering. We need to design the device so that it signals to the doctor that it has changed modes.”
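The failure mode in that simulation is a silent state change. The following is a minimal sketch in Python, assuming an invented Defibrillator class that does not describe any vendor’s firmware: the device drops back to unsynchronized mode after each discharge, and the human-factors fix Fairbanks describes is to make that reset impossible to miss.

```python
# Hypothetical sketch, not any real device's firmware: a defibrillator that
# silently resets to unsynchronized mode after each shock, plus a variant
# that announces the mode change so the clinician must re-arm it deliberately.

class Defibrillator:
    def __init__(self, announce_mode_change=False):
        self.synchronized = False
        self.announce_mode_change = announce_mode_change

    def set_synchronized(self):
        self.synchronized = True

    def shock(self, joules):
        mode = "synchronized" if self.synchronized else "UNSYNCHRONIZED"
        print(f"Delivering {joules} J, {mode}")
        # The behavior seen in the simulation: the device reverts after firing.
        self.synchronized = False
        if self.announce_mode_change:
            # Human-factors fix: surface the silent state change to the user.
            print("ALERT: device has reverted to unsynchronized mode; "
                  "press SYNC again before the next shock.")


device = Defibrillator(announce_mode_change=True)
device.set_synchronized()
device.shock(50)    # first, synchronized shock at a low level
device.shock(100)   # without re-arming, this one would be unsynchronized
```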

Fairbanks and his colleagues rely on MedStar’s frontline providers to discover problems like this and report them to the center for human-factors testing. But sometimes clinical staff are anxious about reporting problems because they blame themselves. “If you don’t work on opening up the culture, they might keep it quiet,” he said. “Then you don’t learn about where opportunities are to design out the mistake.”

In investigating the 2011 glucometer incident, Fairbanks and his human-factors team reconstructed the following chain of events: A nurse technician had taken a blood sugar reading for the patient, who had been admitted through the emergency department with an initial diagnosis of low glucose. The technician was surprised to see a message on the digital device that read: “critical value, repeat lab draw for >600.” That seemed to indicate the patient’s blood sugar had soared to a dangerously high level. The technician showed the pop-up message to a nurse, who agreed with the technician’s reading. They repeatedly checked the patient’s blood sugar using the device and kept getting the same apparently high blood-sugar result.

“It never came to mind that the glucometer was incorrect,” the nurse said in a video MedStar officials posted on YouTube in March. The video was shown to bring awareness to MedStar’s “no-blame culture,” which staff said helped them uncover the root cause of the adverse event.

What Fairbanks found was that the blood-sugar reading on the device was not technically incorrect. The problem was that the pop-up warning visually blocked the device’s true reading indicating that the patient’s blood sugar was critically low. The pop-up had been customized by hospital staff to send an alert when a patient’s blood-sugar levels reach a critical point. Since an extremely low blood-sugar level is relatively rare, the device had been customized to launch a pop-up warning about critically high levels.

It’s not uncommon for medical-device and health-IT users to make minor customizations to ensure that clinical terms, concepts and displays conform to the expectations and practices of that particular hospital’s staff.
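A minimal sketch of the glucometer failure mode, assuming invented thresholds, message wording and a render_screen function that do not describe the actual device: a pop-up configured around one scenario ends up covering the reading that matters in another.

```python
# Hypothetical illustration only; the thresholds and wording are invented.
CRITICAL_HIGH = 600   # mg/dL; the site customized an alert for high values
CRITICAL_LOW = 40     # mg/dL; no matching low-value message was configured

def render_screen(glucose_mg_dl):
    """Return the text the clinician sees for one reading."""
    if glucose_mg_dl > CRITICAL_HIGH or glucose_mg_dl < CRITICAL_LOW:
        # The pop-up fires on any critical value, but it reuses the
        # high-value wording and hides the number underneath it.
        return "critical value, repeat lab draw for >600"
    return f"Glucose: {glucose_mg_dl} mg/dL"

print(render_screen(23))   # critically LOW, yet the screen implies a high value
```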

But as MedStar learned from the glucometer incident, such customization can be tricky. Insulin was administered to the patient, causing her already low blood sugar to drop even further. She slipped into a diabetic coma and was taken to the intensive-care unit, Fairbanks said. She recovered and the hospital issued an apology. The nurse, who was initially suspended for allowing the patient to receive insulin, returned to her job. All MedStar hospitals now use an updated model of the glucometer that does not include a pop-up message.

Simulation labs improve performanc­e

There are no available data on how many hospitals and systems employ a similar investigative approach to patient-safety risks associated with new technologies and how they are deployed in clinical settings. But the Society for Simulation in Healthcare, which supports using simulation to improve performance and reduce errors, has identified 165 simulation centers in the U.S. Many, however, focus on training clinical staff on new procedures and devices rather than working out human interaction problems with new technology.

Still, more hospitals are assembling multidisciplinary teams to evaluate significant technological changes such as EHR implementation. But some problems don’t get flagged until after they cause patient-safety risks.

That’s what happened with the patient at Northwest Community Hospital who did not receive his prescription psychiatric medicine, Clozapine, for nearly three weeks in 2013. Hospital officials found that the computerized prescription system automatically discontinued the drug order after seven days because it had a default “automatic stop” value for certain high-risk drugs. There was no programmed cue to alert the medical team to either resubmit the order or to cancel the patient’s prescription medication. The hospital had been using that computerized system since 2009.
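A minimal sketch of an automatic-stop rule like the one described above, in Python, with hypothetical dates, field names and a notify hook that were not part of the hospital’s actual system: the order simply expires after its stop value unless something actively prompts the care team.

```python
# Hypothetical sketch of an "automatic stop" rule; details are invented.
from datetime import date, timedelta

AUTO_STOP_DAYS = {"clozapine": 7}   # example stop value for a high-risk drug

def still_active(drug, start_date, today, notify=None):
    """Return True if the order should still be dispensed today."""
    stop_after = AUTO_STOP_DAYS.get(drug)
    if stop_after is None:
        return True
    expired = today > start_date + timedelta(days=stop_after)
    if expired and notify is not None:
        # The missing piece in the 2013 incident: prompt the care team to
        # either resubmit the order or cancel it explicitly.
        notify(f"{drug} order auto-stopped; renew or cancel it.")
    return not expired

start = date(2013, 3, 1)   # hypothetical start date
print(still_active("clozapine", start, date(2013, 3, 5)))                  # True
print(still_active("clozapine", start, date(2013, 3, 20), notify=print))   # alert, then False
```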

Following a review of 15 drugs with such stop values, the hospital has removed most of them, keeping only a few based on the manufacturers’ recommendations, said its pharmacy director, Jason Alonzo. High-risk drugs for which the stop values were removed are now reviewed daily by hospital pharmacists.

“What we’ve all learned is that the technology will do exactly what we tell it to do,” said Kimberly Nagy, chief nursing officer at Northwest Community. Nagy said it is difficult to tell whether other patients had been affected; the 2013 incident was the first to bring the issue to the hospital’s attention.

Mary Logan, president of the Association for the Advancement of Medical Instrumentation, which develops standards for medical-device manufacturers, said hospitals should standardize the way they purchase new technologies and get key users involved before making the buying decision.

“This is where a lot of organizations make mistakes,” she said. “The team that does the technology assessment should not be driven by the one person who wants the shiny object.”

That means having a wide range of clinical staff on hospital value-analysis committees. Those committees, she said, should first ask two key questions: What problem are we trying to solve? And, is a particular technology going to solve it?

While Logan’s group focuses on devices such as ventilators, infusion pumps, monitors and pacemakers, the same principles apply to any new technology, she said.

If the tool requires customization, the staffers programming the tool should understand that even small changes or upgrades could have unintended consequences and produce patient-safety risks.

Even if a new technology unquestionably offers improved quality of care, the Joint Commission’s Pujols-McKee cautions that there should be heightened awareness about how to safely implement it in the complex healthcare setting. “Oftentimes, the thought is, if we have the technology we’re safer,” she said. “But that is incorrect.”

Staff at MedStar Health’s National Center for Human Factors in Healthcare use a simulation lab to demonstrate how a patient-safety event could easily occur with a cardiac defibrillator. The center looks for potential usability and human-factors problems...
