The Daily Telegraph

Why computers are distinctly non-pc when making decisions

- By Henry Bodkin

ROBOTIC artificial intelligence platforms that are increasingly replacing human decision makers are inherently racist and sexist, experts have claimed.

Programmes designed to “pre-select” candidates for university places or to assess eligibility for insurance cover or bank loans are likely to discriminate against women and non-white applicants, according to their research.

Prof Noel Sharkey, co-director of the Foundation for Responsible Robotics, said more women needed to be encouraged into the IT industry to redress the automatic bias. Just nine per cent of the engineering workforce in the UK is female, with women making up only 20 per cent of those taking A-level physics.

“We have a problem,” Prof Sharkey told Today on BBC Radio 4. “We need many more women coming into this field to solve it.”

His warning came as it was revealed a prototype programme developed to shortlist candidates for a UK medical school had discriminated against women, black and other ethnic minority candidates.

He added that researchers at Boston University had demonstrated the inherent bias in AI algorithms by training a machine to analyse text collected from Google News.

When they asked the machine to complete the sentence “Man is to computer programmers as woman is to X”, the machine answered “homemaker”.
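Tests like this work by simple vector arithmetic over word embeddings: each word is a list of numbers, and the analogy “man is to programmer as woman is to X” is solved by finding the word nearest to programmer − man + woman. A minimal sketch, using invented toy vectors rather than the real Google News embeddings the researchers used, might look like:

```python
import numpy as np

# Toy 2-d word vectors, invented purely for illustration -- the actual
# study used high-dimensional embeddings trained on Google News text.
vectors = {
    "man":        np.array([1.0, 0.0]),
    "woman":      np.array([0.0, 1.0]),
    "programmer": np.array([1.2, 0.1]),
    "homemaker":  np.array([0.2, 1.1]),
    "bridge":     np.array([-1.0, -1.0]),
}

def analogy(a, b, c):
    """Solve 'a is to b as c is to X' by nearest cosine neighbour."""
    target = vectors[b] - vectors[a] + vectors[c]

    def cosine(u, v):
        return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

    # Exclude the query words themselves from the candidate answers.
    candidates = {w: v for w, v in vectors.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("man", "programmer", "woman"))  # prints "homemaker"
```

The point of the demonstration is that the embedding itself encodes the association: nothing in the arithmetic mentions gender roles, yet the nearest neighbour reflects whatever patterns the training text contained.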

Maxine Mackintosh, a leading expert in health data, said the problem was mainly the fault of the skewed data fed into robotic platforms.

“These big data are really a social mirror – they reflect the biases and inequalities we have in society,” she told the BBC. “If you want to take steps towards changing that you can’t just use historical information.”
