The Guardian Australia

Information commissioner warns firms over ‘emotional analysis’ technologies

- Alex Hern

The information commissioner has warned companies to steer clear of “emotional analysis” technologies or face fines, because of the “pseudoscientific” nature of the field.

It’s the first time the regulator has issued a blanket warning on the ineffectiveness of a new technology, said Stephen Bonner, the deputy commissioner, but one that is justified by the harm that could be caused if companies made meaningful decisions based on meaningless data.

“There’s a lot of investment and engagement around biometric attempts to detect emotion,” he said. Such technologies attempt to infer information about mental states using data such as the shininess of someone’s skin, or fleeting “micro expressions” on their faces.

“Unfortunately, these technologies don’t seem to be backed by science,” Bonner said. “That’s quite concerning, because we’re aware of quite a few organisations looking into these technologies as possible ways to make pretty important decisions: to identify whether people might be fraudsters, or whether job applicants are worthy of getting that role. And there doesn’t seem to be any sense that these work.”

Simply using emotional analysis technology isn’t a problem per se, Bonner said – but treating it as anything more than entertainment is. “There are plenty of uses that are fine, mild edge cases … if you’ve got a Halloween party and you want to measure who’s the most scared at the party, this is a fun interesting technology. It’s an expensive random number generator, but that can still be fun.

“But if you’re using this to make important decisions about people – to decide whether they’re entitled to an opportunity, or some kind of benefit, or to select who gets a level of harm or investigation, any of those kinds of mechanisms … We’re going to be paying very close attention to organisations that do that. What we’re calling out here is much more fundamental than a data protection issue. The fact that they might also breach people’s rights and break our laws is certainly why we’re paying attention to them, but they just don’t work.

“There is quite a range of ways scientists close to this dismiss it. I think we’ve heard ‘hokum’, we’ve heard ‘half-baked’, we’ve heard ‘fake science’. It’s a tempting possibility: if we could see into the heads of others. But when people make extraordinary claims with little or no evidence, we can call attention to that.”

The attempted development of “emotional AI” is one of four issues that the ICO has identified in a study of the future of biometric technologies. Some are simple regulatory matters, with companies that develop similar technologies calling for further clarity on data protection rules.

But others are more fundamental: the regulator has warned that it is difficult to apply data protection law when technology such as gaze tracking or fingerprint recognition “could be deployed by a camera at a distance to gather verifiable data on a person without physical contact with any system being required”. Gathering consent from, say, every single passenger passing through a station would be all but impossible.

In spring 2023, the regulator will be publishing guidance on how to use biometric technologies, including facial, fingerprint and voice recognition. The area is particularly sensitive, since “biometric data is unique to an individual and is difficult or impossible to change should it ever be lost, stolen or inappropriately used”.

The regulator will be publishing guidance on how to use biometric technologies in spring 2023. Photograph: izusek/Getty Images/iStockphoto
