The New Zealand Herald

Social media’s racist facial recognition software

Facial recognition failing to ‘see’ darker people risks horrific legal consequences

- Juha Saarinen comment

I don’t want to become the Herald’s social media correspondent, but over the weekend educator and PhD student Colin Madland sent out some fascinating tweets about how Zoom’s and Twitter’s facial recognition works.

Or rather, how poorly it works if you’re black. Madland noticed that his black colleagues lost their heads when Zoom virtual backgrounds were enabled, and there was no way to fix it.

The problem doesn’t happen with people who have lighter skin. Next, Madland spotted that Twitter would crop the previews of photos containing a black person and a white person so that the former did not show up.

Other Twitter users tried it and, sure enough, the black person was “disappeared”. One harrowing example used former United States President Barack Obama and US Senator Mitch McConnell, and the same thing happened: no Obama in the preview, only “Moscow Mitch”, as some Americans refer to him.

Even stock photos got the same treatment, with white models showing up in the preview but not black ones.

Twitter owned up to the gaffe, to its credit, but this is nothing new, and it’s a problem that keeps, err, cropping up no matter how many times it’s raised.

In 2009, HP got into hot water after it was discovered that its webcams were only able to track white people, and not black people.

That was blamed on contrast issues in certain lighting situations, an explanation few people believed, funnily enough.

The reality is that facial recognition algorithms used by software and, nowadays, artificial intelligence and machine learning, are still biased against people with dark skin.

This despite a vast increase in computing power, algorithm refinement and learning from years of use of facial recognition. Last year, the US National Institute of Standards and Technology checked out facial recognition code from over 50 companies.

In one test with photos of white women and of black women, NIST found that the latter were misidentified at a rate 10 times as high as the former.

The software was from French biometric tech company IDEMIA, which builds, among other things, those cool passport kiosks you might bump into at airport immigration.

Biased software is a serious problem because facial recognition is being rolled out everywhere at a rate of knots. Police want to use it to go through mugshot databases, which could have absolutely horrific consequences with high misidentification rates.

Getting it wrong has real consequences: facial recognition systems have already put innocent people behind bars, and missed the guilty ones.

Facial recognition tech that erases or crops out people of certain ethnicities could also lead to innocent people not having an alibi from surveillance cameras.

That the problem still exists after all these years shows developers aren’t prioritising a fix for it. Startups train their tech on big data sets, many of which are freely available. How many startups check to ensure that the data sets are sufficiently diverse?

The last few years have seen ethics commissions and groups set up to work on how to avoid bias, discrimination and outright errors in algorithms and tech. They tend not to be very diverse, manned (literally) by people from wealthy European and North American countries.

Not having the experience and voices of people from a wide variety of ethnicities and backgrounds suggests the software that will control important parts of our lives will continue to be biased, unfortunately.

Facial recognition tech, and the AI that learns from it, won’t go away. It’ll increasingly be used to verify people’s identities as part of biometrics tech for transactions, travel, security, health services and policing. It’s even used to determine “liveness”, to ensure users are actually human.

Ignoring the issue is likely to create a massive problem globally, with millions of people potentially being locked out of services and markets because the technology doesn’t recognise their faces. And that’s quite frankly a ridiculous thought.

This is of course a business opportunity. There are companies in Africa that have recognised the gap in the market created by the bias; here’s hoping they’ll be able to move quickly and create better solutions than their Western and Asian counterparts who rely on people being “contrasty” enough.

Zoom and Twitter are among those sites where black people have struck facial recognition problems.
