Vocable (English edition)

Nowhere to hide


What is at stake with facial recognition.

Facial recognition as a means of identifying people is gathering pace. Its use is no longer confined to security but is gradually spreading into our everyday lives: on social networks, to unlock our phones… Does this technology pose risks to our privacy?

The human face is a remarkable piece of work. The astonishing variety of facial features helps people recognise each other and is crucial to the formation of complex societies. So is the face’s ability to send emotional signals, whether through an involuntary blush or the artifice of a false smile. People spend much of their waking lives, in the office and the courtroom as well as the bar and the bedroom, reading faces, for signs of attraction, hostility, trust and deceit. They also spend plenty of time trying to dissimulate.

2. Technology is rapidly catching up with the human ability to read faces. In America facial recognition is used by churches to track worshippers’ attendance; in Britain, by retailers to spot past shoplifters. This year Welsh police used it to arrest a suspect outside a football game. In China it verifies the identities of ride-hailing drivers, permits tourists to enter attractions and lets people pay for things with a smile. Apple’s new iPhone will use it to unlock the homescreen.

3. Although faces are peculiar to individuals, they are also public, so technology does not, at first sight, intrude on something that is private. And yet the ability to record, store and analyse images of faces cheaply, quickly and on a vast scale promises one day to bring about fundamental changes to notions of privacy, fairness and trust.

PRIVACY

4. Start with privacy. One big difference between faces and other biometric data, such as fingerprints, is that they work at a distance. Anyone with a phone can take a picture for facial-recognition programs to use. FindFace, an app in Russia, compares snaps of strangers with pictures on VKontakte, a social network, and can identify people with a 70% accuracy rate. Facebook’s bank of facial images cannot be scraped by others, but the Silicon Valley giant could obtain pictures of visitors to a car showroom, say, and later use facial recognition to serve them ads for cars.

5. Even if private firms are unable to join the dots between images and identity, the state often can. China’s government keeps a record of its citizens’ faces; photographs of half of America’s adult population are stored in databases that can be used by the FBI. Law-enforcement agencies now have a powerful weapon in their ability to track criminals, but at enormous potential cost to citizens’ privacy.

6. The face is not just a name-tag. It displays a lot of other information—and machines can read that, too. Again, that promises benefits. Some firms are analysing faces to provide automated diagnoses of rare genetic conditions far earlier than would otherwise be possible. Systems that measure emotion may give autistic people a grasp of social signals they find elusive.

DISCRIMINATION

7. But the technology also threatens. Researchers at Stanford University have demonstrated that, when shown pictures of one gay man and one straight man, the algorithm could attribute their sexuality correctly 81% of the time. In countries where homosexuality is a crime, software which promises to infer sexuality from a face is an alarming prospect.

8. Less violent forms of discrimination could also become common. Employers can already act on their prejudices to deny people a job. But facial recognition could make such bias routine, enabling firms to filter all job applications for ethnicity and signs of intelligence and sexuality. Nightclubs and sports grounds may face pressure to protect people by scanning entrants’ faces for the threat of violence—even though, owing to the nature of machine-learning, all facial-recognition systems inevitably deal in probabilities.

THE GOOD AND THE BAD

9. In democracies, at least, legislation can help alter the balance of good and bad outcomes. European regulators have embedded a set of principles in forthcoming data-protection regulation, decreeing that biometric information, which would include “faceprints”, belongs to its owner and that its use requires consent—so that, in Europe, unlike America, Facebook could not just sell ads to those car-showroom visitors. Laws against discrimination can be applied to an employer screening candidates’ images. Suppliers of commercial face-recognition systems might submit to audits, to demonstrate that their systems are not propagating bias unintentionally. Firms that use such technologies should be held accountable.

10. Such rules cannot alter the direction of travel, however. Cameras will only become more common with the spread of wearable devices. Efforts to bamboozle facial-recognition systems, from sunglasses to make-up, are already being overtaken; research from the University of Cambridge shows that artificial intelligence can reconstruct the facial structures of people in disguise. Google has explicitly turned its back on matching faces to identities, for fear of its misuse by undemocratic regimes. Other tech firms seem less picky. Amazon and Microsoft are both using their cloud services to offer face recognition; it is central to Facebook’s plans.

(Jim Wilson/The New York Times) An image of masks used by Apple while developing the Face ID feature to unlock the iPhone X.
