Nowhere to hide



What is at stake with facial recognition.

Facial recognition as a means of identifying people is accelerating rapidly. Its use is no longer confined to security: it is gradually spreading into our everyday lives, on social networks, to unlock our phones… Does this technology pose risks to our privacy?

The human face is a remarkable piece of work. The astonishing variety of facial features helps people recognise each other and is crucial to the formation of complex societies. So is the face’s ability to send emotional signals, whether through an involuntary blush or the artifice of a false smile. People spend much of their waking lives, in the office and the courtroom as well as the bar and the bedroom, reading faces, for signs of attraction, hostility, trust and deceit. They also spend plenty of time trying to dissimulate.

2. Technology is rapidly catching up with the human ability to read faces. In America facial recognition is used by churches to track worshippers’ attendance; in Britain, by retailers to spot past shoplifters. This year Welsh police used it to arrest a suspect outside a football game. In China it verifies the identities of ride-hailing drivers, permits tourists to enter attractions and lets people pay for things with a smile. Apple’s new iPhone will use it to unlock the homescreen.

3. Although faces are peculiar to individuals, they are also public, so technology does not, at first sight, intrude on something that is private. And yet the ability to record, store and analyse images of faces cheaply, quickly and on a vast scale promises one day to bring about fundamental changes to notions of privacy, fairness and trust.


4. Start with privacy. One big difference between faces and other biometric data, such as fingerprints, is that they work at a distance. Anyone with a phone can take a picture for facial-recognition programs to use. FindFace, an app in Russia, compares snaps of strangers with pictures on VKontakte, a social network, and can identify people with a 70% accuracy rate. Facebook’s bank of facial images cannot be scraped by others, but the Silicon Valley giant could obtain pictures of visitors to a car showroom, say, and later use facial recognition to serve them ads for cars.

5. Even if private firms are unable to join the dots between images and identity, the state often can. China’s government keeps a record of its citizens’ faces; photographs of half of America’s adult population are stored in databases that can be used by the FBI. Law-enforcement agencies now have a powerful weapon in their ability to track criminals, but at enormous potential cost to citizens’ privacy.

6. The face is not just a name-tag. It displays a lot of other information—and machines can read that, too. Again, that promises benefits. Some firms are analysing faces to provide automated diagnoses of rare genetic conditions far earlier than would otherwise be possible. Systems that measure emotion may give autistic people a grasp of social signals they find elusive.


7. But the technology also threatens. Researchers at Stanford University have demonstrated that, when shown pictures of one gay man and one straight man, an algorithm could attribute their sexuality correctly 81% of the time. In countries where homosexuality is a crime, software which promises to infer sexuality from a face is an alarming prospect.

8. Less violent forms of discrimination could also become common. Employers can already act on their prejudices to deny people a job. But facial recognition could make such bias routine, enabling firms to filter all job applications for ethnicity and signs of intelligence and sexuality. Nightclubs and sports grounds may face pressure to protect people by scanning entrants’ faces for the threat of violence—even though, owing to the nature of machine-learning, all facial-recognition systems inevitably deal in probabilities.


9. In democracies, at least, legislation can help alter the balance of good and bad outcomes. European regulators have embedded a set of principles in forthcoming data-protection regulation, decreeing that biometric information, which would include “faceprints”, belongs to its owner and that its use requires consent—so that, in Europe, unlike America, Facebook could not just sell ads to those car-showroom visitors. Laws against discrimination can be applied to an employer screening candidates’ images. Suppliers of commercial face-recognition systems might submit to audits, to demonstrate that their systems are not propagating bias unintentionally. Firms that use such technologies should be held accountable.

10. Such rules cannot alter the direction of travel, however. Cameras will only become more common with the spread of wearable devices. Efforts to bamboozle facial-recognition systems, from sunglasses to make-up, are already being overtaken; research from the University of Cambridge shows that artificial intelligence can reconstruct the facial structures of people in disguise. Google has explicitly turned its back on matching faces to identities, for fear of its misuse by undemocratic regimes. Other tech firms seem less picky. Amazon and Microsoft are both using their cloud services to offer face recognition; it is central to Facebook’s plans.

(Jim Wilson/The New York Times)

An image of masks used by Apple while developing the Face ID feature to unlock the iPhone X.
