With iPhone X, Apple edges closer to an age of facial recognition tech

Pittsburgh Post-Gazette - National - By Craig Timberg

The Washington Post

A whiff of dystopian creepiness has long wafted in the air whenever facial recognition has come up. Books, movies and television shows have portrayed the technology as mainly a tool of surveillance and social control — aimed by unseen others at you, for their purposes, not your own.

Apple sought to reverse that equation Sept. 12 with the long-anticipated release of its 10th-anniversary smartphone, the iPhone X. It replaces the fingerprint sensor previous generations used for unlocking a user’s device with facial recognition technology, while still keeping others from unlocking the phone without the user’s knowledge.

All users have to do, Apple said at the annual September event dedicated to touting its latest product updates, is look at the iPhone X, which recognizes them as the registered user — even if they are wearing glasses or a hat or are sporting a new beard.

Though not entirely new — several Android smartphones do something similar already — the technology remains novel. Apple’s embrace of it could mark a tipping point in the adoption of facial recognition technology across new areas of our lives — as we shop or communicate with friends and, eventually, as we enter buildings or perhaps turn on our vehicles with a glance rather than a twist of the key.

Many forms of surveillance — cellphone location tracking, social media analytics and the CIA’s reported ability to remotely activate the microphone on an individual’s smart TV — were born of such popular consumer advances. Only later, typically through leaked documents and investigative reports, did it become clear how popular technologies were turned on their users.

“The big danger with facial recognition is that we are targeted everywhere we go and in everything we do,” said Jay Stanley, a senior policy analyst with the American Civil Liberties Union’s Speech, Privacy and Technology Project. “The acceptable uses could soften up the terrain for less acceptable uses.”

The potential for widely deployed facial recognition systems has particularly concerned privacy experts, who have warned about a future in which our faces and other biometrics are used to track our every movement, our political activity, our religious lives and even our romantic encounters.

Recent research at Stanford University, meanwhile, contends that a range of private facts, including an individual’s sexual orientation, could be read through sophisticated analyses of facial images with the help of artificial intelligence.

“We have only one face,” said Clare Garvie, an associate at Georgetown University’s Center on Privacy & Technology and an author of the Perpetual Line-Up, a 2016 report on facial recognition databases collected by governments. “The more comfortable we become with facial recognition, the more complacent we may become.”

What Apple introduced Tuesday was a version of facial recognition technology that iPhone X owners are supposed to use on themselves, for their own purposes and only when they want to. They can always type a numeric pass code instead.

Such caveats have earned the company cautious praise from some privacy experts. They noted that the iPhone X will keep its facial analysis data secure on the device rather than transmitting it across the internet — where it could potentially be intercepted — or collecting it in a database that might allow hackers, spies or law enforcement agencies to gain access to facial records en masse.

The Android devices that use facial recognition also keep the data on the device, although hackers have demonstrated that some of these systems can be tricked by photographs of users — something Apple says cannot happen with the iPhone X.

Many privacy experts also regard facial recognition technology as a relatively simple, safe and reliable way to authenticate the identity of a smartphone’s owner, helping protect the massive troves of personal data kept on devices — a positive privacy impact, in their view.

“I don’t think we should reflexively reject facial recognition. The question should be, by what means and for whose benefit?” said Marc Rotenberg, executive director of the Electronic Privacy Information Center. “Facial recognition has both good uses and bad uses from a consumer perspective.”

Half of U.S. adults already have their images in some federal, state or local facial recognition system through a combination of databases of people who have been arrested or convicted of crimes, along with ledgers of people who hold driver’s licenses, passports and visas, the 2016 Georgetown report found.

Privacy experts have fought to curb the expansion of such databases. Some states, for example, have prohibited driver’s licenses from being used in facial recognition searches by law enforcement. Experts have also sought to limit how and when the databases are used.

They have additionally sought to raise awareness about the massive commercial databases kept by Facebook and Google, both of which in some circumstances use facial recognition technology to identify people depicted in photos that users upload.

Also slowing the spread of the technology have been the daunting technical challenges of accurately analyzing faces in anything less than optimal circumstances. People in low light, wearing hats or glasses, or simply standing at an odd angle to a camera have long challenged facial recognition systems — as have people with darker skin — leading to false positives and false negatives when analyses are made.

Apple’s system appears to solve the technical problems; owners of the iPhone X are supposed to willingly “enroll” their faces from arm’s length, turning their heads so facial contours are captured more fully. Opening the device later takes only a brief glance.

The facial recognition system, dubbed the TrueDepth camera system, includes a front-facing camera, a proximity sensor, an infrared camera and a dot projector that beams more than 30,000 invisible infrared dots onto a user’s face to take measurements. The device then combines all the available data to create what Philip Schiller, Apple’s senior vice president of worldwide marketing, called “a mathematical model of your face.”
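Apple has not published the internals of that matching process, but the general idea behind comparing a “mathematical model of your face” can be sketched in broad strokes: the sensor measurements are reduced to a numeric feature vector at enrollment, and a later capture unlocks the phone only if its vector lands close enough to the stored one. The snippet below is a hypothetical illustration of that threshold-matching idea, not Apple’s algorithm; the vectors, the distance metric and the cutoff value are all assumptions.

```python
import math

# Hypothetical sketch: a face "model" reduced to a small numeric
# feature vector, with unlock decided by a distance threshold.
# None of these numbers come from Apple; Face ID's real pipeline
# (neural networks over depth and infrared data) is not public.

def euclidean_distance(a, b):
    """Straight-line distance between two equal-length vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def matches(enrolled, capture, threshold=0.6):
    # A small distance means "same face" under this toy model;
    # the threshold trades false rejections against false accepts.
    return euclidean_distance(enrolled, capture) < threshold

enrolled_face = [0.12, 0.87, 0.44, 0.31]  # stored at enrollment
same_user     = [0.15, 0.85, 0.42, 0.33]  # slight pose/lighting drift
stranger      = [0.90, 0.10, 0.75, 0.60]  # a different face entirely

print(matches(enrolled_face, same_user))  # small drift still unlocks
print(matches(enrolled_face, stranger))   # large distance is rejected
```

Tolerating small drift is what lets the article’s glasses, hats and new beards still unlock the phone, while a stranger’s face falls far outside the threshold.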

“The chance that a random person in the population could look at your iPhone X and unlock it with their face is about one in a million,” Mr. Schiller said, presenting the new device at Apple’s glitzy new Steve Jobs Theater in Cupertino, Calif.

There also is the question, hotly litigated in recent years, about what power law enforcement agencies have to gain access to data in devices. The Supreme Court ruled in 2014 that authorities require a search warrant to seize and attempt to examine a smartphone.

It would take a separate court order to require a device’s owner to unlock it for police, said Nate Cardozo, a senior staff attorney at the Electronic Frontier Foundation, a civil liberties group based in San Francisco.

Mr. Cardozo expressed less concern than some others that the introduction of facial recognition for device security will erode resistance to other uses of the technology. “People seem to understand on a gut level that when they use biometrics for their own purposes, that’s very different than being part of a database that can be used against them.”
