The Guardian (USA)

From oximeters to AI, where bias in medical devices may lurk

- Nicola Davis, Science correspondent

The UK health secretary, Sajid Javid, has announced a review into systemic racism and gender bias in medical devices, in response to concerns that such bias could contribute to poorer outcomes for women and people of colour.

Writing in the Sunday Times, Javid said: “It is easy to look at a machine and assume that everyone’s getting the same experience. But technologies are created and developed by people, and so bias, however inadvertent, can be an issue here too.”

We take a look at some of the gadgets used in healthcare where concerns over racial bias have been raised.

Oximeters

Oximeters estimate the amount of oxygen in a person’s blood, and are a crucial tool in determining which Covid patients may need hospital care – not least because some can have dangerously low levels of oxygen without realising.

Concerns have been raised, however, that the devices work less well for patients with darker skin. NHS England and the Medicines and Healthcare products Regulatory Agency (MHRA) say pulse oximeters can overestimate the amount of oxygen in the blood of such patients.

Javid told the Guardian last month that the devices were designed for Caucasians. “As a result, you were less likely to end up on oxygen if you were black or brown, because the reading was just wrong,” he said.

Experts believe the inaccuracies could be one of the reasons why death rates have been higher among minority ethnic people, although other factors may also play a role, such as working in jobs with greater exposure to other people.

Respirator masks

Medical-grade respirators are crucial to help keep healthcare workers safe from Covid because they offer protection to the wearer against both large and small particles that others exhale.

In order to offer the greatest protection, however, filtering facepiece (FFP) masks must fit properly, and research has shown they do not fit as well on people from some ethnic backgrounds.

“Adequate viral protection can only be provided by respirators that properly fit the wearer’s facial characteristics. Initial fit pass rates [the rate at which they pass a test on how well they fit] vary between 40% and 90% and are especially low in female and in Asian healthcare workers,” one review published in 2020 notes.

Another review, published in September, found that studies on the fit of such PPE largely focused on Caucasian or single ethnic populations. “BAME people remain underrepresented, limiting comparisons between ethnic groups,” it said.

Spirometers

Spirometers measure lung capacity, but experts have raised concerns that there are racial biases in the interpretation of data gathered from such gadgets.

Writing in the journal Science, Dr Achuta Kadambi, an electrical engineer and computer scientist at the University of California, Los Angeles, said Black or Asian people are assumed to have lower lung capacity than white people – a belief he noted may be based on inaccuracies in earlier studies. As a result, “correction” factors are applied to the interpretation of spirometer data – a situation that can affect the order in which patients are treated.

“For example, before ‘correction’ a Black person’s lung capacity might be measured to be lower than the lung capacity of a white person,” Kadambi writes.

“After ‘correction’ to a smaller baseline lung capacity, treatment plans would prioritise the white person, because it is expected that a Black person should have lower lung capacity, and so their capacity must be much lower than that of a white person before their reduction is considered a priority.”
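To make the arithmetic of that quote concrete, here is a minimal sketch of how such a race-based “correction” factor could change who gets flagged for treatment. The 0.85 factor, the 5-litre baseline and the 80%-of-expected referral threshold are illustrative assumptions, not figures from Kadambi’s article:

```python
# Illustrative sketch only: the numbers below are hypothetical,
# chosen to show the mechanism Kadambi describes.

EXPECTED_CAPACITY_L = 5.0  # assumed expected lung capacity, litres
REFERRAL_THRESHOLD = 0.8   # flag patients below 80% of expected capacity

def flag_for_treatment(measured_l: float, race_correction: float = 1.0) -> bool:
    """Return True if a patient's reading falls below the referral threshold.

    A race_correction below 1.0 lowers the expected baseline, so the
    patient must record a much lower reading before being flagged.
    """
    expected = EXPECTED_CAPACITY_L * race_correction
    return measured_l < expected * REFERRAL_THRESHOLD

# Two patients with the same measured capacity of 3.8 litres:
print(flag_for_treatment(3.8))                        # white patient: True (flagged)
print(flag_for_treatment(3.8, race_correction=0.85))  # Black patient: False (not flagged)
```

With identical readings, only the patient measured against the uncorrected baseline is prioritised – the effect Kadambi describes.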

Another area Kadambi said may be affected by racial bias is remote plethysmography, a technology in which pulse rates are measured by looking at changes in skin colour captured by video. Kadambi said such visual cues may be biased by subsurface melanin content – in other words, skin colour.
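As a rough illustration of how remote plethysmography works, the sketch below estimates a pulse rate from the average green-channel brightness of video frames of skin: blood-volume changes modulate reflected green light, and melanin absorbs in the same range, which is the source of the bias Kadambi points to. The frame format, frame rate and simple FFT approach are assumptions for illustration; real systems are considerably more sophisticated:

```python
import numpy as np

def estimate_pulse_bpm(frames: np.ndarray, fps: float = 30.0) -> float:
    """Estimate pulse rate from a stack of RGB video frames of skin.

    frames: array of shape (num_frames, height, width, 3).
    """
    # Average green-channel intensity per frame gives a 1D signal
    # that rises and falls with blood volume in the skin.
    signal = frames[:, :, :, 1].mean(axis=(1, 2))
    signal = signal - signal.mean()  # remove the constant offset

    # Find the dominant frequency within a plausible heart-rate band.
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)  # in Hz
    band = (freqs >= 0.7) & (freqs <= 4.0)             # roughly 42-240 bpm
    dominant_hz = freqs[band][np.argmax(spectrum[band])]
    return dominant_hz * 60.0
```

Darker skin returns a weaker signal relative to noise, making the dominant peak harder to pick out reliably – one way the melanin bias can surface.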

Artificial intelligence systems

AI is increasingly being developed for applications in healthcare, including to aid professionals in diagnosing conditions. There are concerns, however, that biases in the data used to develop such systems mean they risk being less accurate for people of colour.

Such concerns were recently raised in relation to AI systems for diagnosing skin cancers. Researchers revealed that few freely available image databases that could be used to develop such AI are labelled with ethnicity or skin type. Of those that did have such information recorded, only a handful of images were of people recorded as having dark brown or black skin.

It is an issue Javid has acknowledged. Announcing new funding last month for AI projects to tackle racial inequalities in healthcare, such as the detection of diabetic retinopathy, he noted that one area of focus would be the development of standards to make sure datasets used in developing AI systems are “diverse and inclusive”.

“If we only train our AI using mostly data from white patients it cannot help our population as a whole. We need to make sure the data we collect is representative of our nation,” he said.

Photograph: Grace Cary/Getty Images. Some research suggests that oximeters work less well for patients with darker skin.

Photograph: Justin Tallis/AFP/Getty Images. A woman blows into a spirometer.
