Nowhere to hide

Facial recognition (to hide: to conceal oneself, to keep oneself from view)

Vocable (All English)

Life in the age of facial recognition.

The use of facial recognition technology as a form of identification is accelerating. It is no longer restricted to the security services, but is filtering into our daily lives, for example on social network platforms, or to unlock a mobile phone… but does this kind of technology have implications for our right to privacy?

1. The human face is a remarkable piece of work. The astonishing variety of facial features helps people recognise each other and is crucial to the formation of complex societies. So is the face’s ability to send emotional signals, whether through an involuntary blush or the artifice of a false smile. People spend much of their waking lives, in the office and the courtroom as well as the bar and the bedroom, reading faces, for signs of attraction, hostility, trust and deceit. They also spend plenty of time trying to dissimulate.

2. Technology is rapidly catching up with the human ability to read faces. In America facial recognition is used by churches to track worshippers’ attendance; in Britain, by retailers to spot past shoplifters. This year Welsh police used it to arrest a suspect outside a football game. In China it verifies the identities of ride-hailing drivers, permits tourists to enter attractions and lets people pay for things with a smile. Apple’s new iPhone will use it to unlock the homescreen.

3. Although faces are peculiar to individuals, they are also public, so technology does not, at first sight, intrude on something that is private. And yet the ability to record, store and analyse images of faces cheaply, quickly and on a vast scale promises one day to bring about fundamental changes to notions of privacy, fairness and trust.


4. Start with privacy. One big difference between faces and other biometric data, such as fingerprints, is that they work at a distance. Anyone with a phone can take a picture for facial-recognition programs to use. FindFace, an app in Russia, compares snaps of strangers with pictures on VKontakte, a social network, and can identify people with a 70% accuracy rate. Facebook’s bank of facial images cannot be scraped by others, but the Silicon Valley giant could obtain pictures of visitors to a car showroom, say, and later use facial recognition to serve them ads for cars.

5. Even if private firms are unable to join the dots between images and identity, the state often can. China’s government keeps a record of its citizens’ faces; photographs of half of America’s adult population are stored in databases that can be used by the FBI. Law-enforcement agencies now have a powerful weapon in their ability to track criminals, but at enormous potential cost to citizens’ privacy.

6. The face is not just a name-tag. It displays a lot of other information—and machines can read that, too. Again, that promises benefits. Some firms are analysing faces to provide automated diagnoses of rare genetic conditions far earlier than would otherwise be possible. Systems that measure emotion may give autistic people a grasp of social signals they find elusive.


7. But the technology also threatens. Researchers at Stanford University have demonstrated that, when shown pictures of one gay man and one straight man, their algorithm could attribute the men’s sexuality correctly 81% of the time. In countries where homosexuality is a crime, software which promises to infer sexuality from a face is an alarming prospect.

8. Less violent forms of discrimination could also become common. Employers can already act on their prejudices to deny people a job. But facial recognition could make such bias routine, enabling firms to filter all job applications for ethnicity and signs of intelligence and sexuality. Nightclubs and sports grounds may face pressure to protect people by scanning entrants’ faces for the threat of violence—even though, owing to the nature of machine-learning, all facial-recognition systems inevitably deal in probabilities.


9. In democracies, at least, legislation can help alter the balance of good and bad outcomes. European regulators have embedded a set of principles in forthcoming data-protection regulation, decreeing that biometric information, which would include “faceprints”, belongs to its owner and that its use requires consent—so that, in Europe, unlike America, Facebook could not just sell ads to those car-showroom visitors. Laws against discrimination can be applied to an employer screening candidates’ images. Suppliers of commercial face-recognition systems might submit to audits, to demonstrate that their systems are not propagating bias unintentionally. Firms that use such technologies should be held accountable.

10. Such rules cannot alter the direction of travel, however. Cameras will only become more common with the spread of wearable devices. Efforts to bamboozle facial-recognition systems, from sunglasses to make-up, are already being overtaken; research from the University of Cambridge shows that artificial intelligence can reconstruct the facial structures of people in disguise. Google has explicitly turned its back on matching faces to identities, for fear of its misuse by undemocratic regimes. Other tech firms seem less picky. Amazon and Microsoft are both using their cloud services to offer face recognition; it is central to Facebook’s plans.

(Jim Wilson/The New York Times)

An image of masks used by Apple while developing the Face ID feature to unlock the iPhone X.
