Apple edges closer to fraught age of facial recognition tech

The Myanmar Times - International Business

A WHIFF of dystopian creepiness has long wafted in the air whenever facial recognition has come up. Books, movies and television shows have portrayed the technology as mainly a tool of surveillance and social control - aimed by unseen others at you, for their purposes, not your own.

Apple sought to reverse that equation Tuesday with the long-anticipated release of its 10th-anniversary smartphone, the iPhone X. It replaces the fingerprint sensor previous generations used for unlocking a user’s device with facial recognition technology, while still keeping others from unlocking the phone without the user’s knowledge.

All users have to do, Apple said at the annual September event dedicated to touting its latest product updates, is look at the iPhone X, which recognizes you as the registered user - even if you are wearing glasses or a hat or are sporting a new beard.

Though not entirely new - several Android smartphones do something similar already - the technology remains novel. Apple’s embrace of it could mark a tipping point in the adoption of facial recognition technology across new areas of our lives - as we shop or communicate with friends, and, eventually, as we enter buildings or perhaps turn on our vehicles with a glance rather than a twist of the key.

Many forms of surveillance - cellphone location tracking, social media analytics and the CIA’s reported ability to remotely activate the microphone on an individual’s smart TV - were born of such popular consumer advances. Only later, typically through leaked documents and investigative reports, did it become clear how popular technologies were turned on their users.

“The big danger with facial recognition is that we are targeted everywhere we go and in everything we do,” said Jay Stanley, a senior policy analyst with the ACLU’s Speech, Privacy and Technology Project. “The acceptable uses could soften up the terrain for less acceptable uses.”

The potential for widely deployed facial recognition systems has particularly concerned privacy experts, who have warned about a future in which our faces and other biometrics are used to track our every movement, our political activity, our religious lives and even our romantic encounters.

Recent research at Stanford, meanwhile, contends that a range of private facts, including an individual’s sexual orientation, could be read through sophisticated analyses of facial images with the help of artificial intelligence.

“We have only one face,” said Clare Garvie, an associate at Georgetown University’s Center on Privacy & Technology and an author of “The Perpetual Line-Up,” a 2016 report on facial recognition databases collected by governments. “The more comfortable we become with facial recognition, the more complacent we may become.”

What Apple introduced Tuesday was a version of facial recognition technology that iPhone X owners are supposed to use on themselves, for their own purposes and only when they want to. They can always type a numeric passcode instead.

Such caveats have earned the company cautious praise from some privacy experts. They noted that the iPhone X will keep its facial analysis data secure on the device rather than transmitting it across the Internet (where it could potentially be intercepted) or collecting it in a database that might allow hackers, spies or law enforcement agencies to gain access to facial records en masse.

The Android devices that use facial recognition also keep the data on the device, although hackers have demonstrated that some of these systems can be tricked by photographs of users - something Apple says cannot happen with the iPhone X.

Many privacy experts also regard facial recognition technology as a relatively simple, safe and reliable way to authenticate the identity of a smartphone’s owner, helping protect the massive troves of personal data kept on devices - a positive privacy impact, in their view.

“I don’t think we should reflexively reject facial recognition. The question should be, by what means and for whose benefit?” said Marc Rotenberg, executive director of the Electronic Privacy Information Center. “Facial recognition has both good uses and bad uses from a consumer perspective.”

Half of U.S. adults already have their images in some federal, state or local facial recognition system through a combination of databases of people who have been arrested or convicted of crimes, along with ledgers of people who hold driver’s licenses, passports and visas, the 2016 Georgetown report found.

Privacy experts have fought to curb the expansion of such databases. Some states, for example, have prohibited driver’s licenses from being used in facial recognition searches by law enforcement. Experts have also sought to limit how and when the databases are used.

They have additionally sought to raise awareness about the massive commercial databases kept by Facebook and Google, both of which in some circumstances use facial recognition technology to identify people depicted in photos users upload.

Also slowing the spread of the technology have been the daunting technical challenges of accurately analyzing faces in anything less than optimal circumstances. People in low light, wearing hats or glasses, or simply standing at an odd angle to a camera have long challenged facial recognition systems, as have people with darker skin - leading to false positives and false negatives when analyses are made.

Apple’s system appears to solve the technical problems; owners of the iPhone X are supposed to willingly “enroll” their faces from arm’s length, turning their heads so facial contours are captured more fully. Opening the device later takes only a brief glance.

The facial recognition system, dubbed the TrueDepth camera system, includes a front-facing camera, a proximity sensor, an infrared camera and a dot projector that beams more than 30,000 invisible infrared dots onto a user’s face to take measurements. The device then combines all the available data to create what Philip Schiller, Apple’s senior vice president of worldwide marketing, called “a mathematical model of your face.”

“The chance that a random person in the population could look at your iPhone X and unlock it with their face is about one in a million,” Schiller said, presenting the new device at Apple’s glitzy new Steve Jobs Theater in Cupertino, California.

There also is the question, hotly litigated in recent years, of what power law enforcement agencies have to gain access to data in devices. The Supreme Court ruled in 2014 that authorities require a search warrant to seize and attempt to examine a smartphone.

It would take a separate court order to require a device’s owner to unlock it for police, said Nate Cardozo, a senior staff attorney at the Electronic Frontier Foundation, a civil liberties group based in San Francisco.

Cardozo expressed less concern than some others that the introduction of facial recognition for device security will erode resistance to other uses of the technology. “People seem to understand on a gut level that when they use biometrics for their own purposes, that’s very different than being part of a database that can be used against them.” – The Washington Post

Photo: The Washington Post - Apple’s new iPhone X has facial recognition technology.
