New iPhone brings face recognition – and fears – to the masses

DESPITE APPLE’S SAFEGUARDS, PRIVACY ACTIVISTS FEAR THE WIDESPREAD USE OF FACIAL RECOGNITION WILL WORK ITS WAY INTO SOCIETY AND BECOME A SURVEILLANCE TECHNOLOGY THAT IS ABUSED, MAKING OUR LIVES MORE TRACKABLE BY ADVERTISERS, LAW ENFORCEMENT AND MAYBE SOMEDAY BY PRIVATE INDIVIDUALS


APPLE WILL let you unlock the iPhone X with your face – a move likely to bring facial recognition to the masses, along with concerns over how the technology may be used for nefarious purposes.

Apple’s newest device, set to go on sale on Friday, is designed to be unlocked with a facial scan and comes with a number of privacy safeguards – the data will be stored only on the phone and not in any databases.

The iPhone X is positioned as a high-end, premium model intended to showcase advanced technologies, such as wireless charging, an OLED display, dual cameras with improved depth sensing, and its unique face recognition unlock system called Face ID.

AI accelerator

The smartphone contains Apple’s A11 Bionic chip, which sports a “neural engine”, an artificial intelligence (AI) accelerator. The chip has two performance-optimised cores that are 25 per cent faster than those of the A10 Fusion, along with four efficiency-optimised cores that are 70 per cent faster than those in the iPhone 7.

Unlocking one’s phone with a face scan may offer added convenience and security for iPhone users, according to Apple, which claims the neural engine behind Face ID cannot be tricked by a photo or a hacker.

While other devices have offered facial recognition, Apple is the first to pack the technology allowing for a three-dimensional scan into a hand-held phone.

But despite Apple’s safeguards, privacy activists fear the widespread use of facial recognition would “normalise” the technology and open the door to broader use by law enforcement, marketers or others of a largely unregulated tool.

“Apple has done a number of things well for privacy but it’s not always going to be about the iPhone X,” said Jay Stanley, a policy analyst with the American Civil Liberties Union (ACLU).

“There are real reasons to worry that facial recognition will work its way into our culture and become a surveillance technology that is abused.”

A study last year by Georgetown University researchers found that nearly half of all Americans are, without their consent, in a law enforcement database that includes facial recognition.

Civil liberties groups have sued over the Federal Bureau of Investigation’s use of its “next generation” biometric database, which includes facial profiles, claiming it has a high error rate and the potential for tracking innocent people.

“We don’t want police officers having a watch list embedded in their body cameras scanning faces on the sidewalk,” said Stanley.

Clare Garvie – the Georgetown University Law School associate who led the 2016 study on facial recognition databases – agreed that Apple is taking a responsible approach but others might not.

“My concern is that the public is going to become inured or complacent about this,” Garvie said.

Advertisers, police, porn stars

Widespread use of facial recognition “could make our lives more trackable by advertisers, by law enforcement and maybe someday by private individuals”, she said.

Garvie said her research found significant errors in law enforcement facial recognition databases, opening up the possibility someone could be wrongly identified as a criminal suspect.

Another worry, she said, is that police could track individuals who have committed no crime simply for participating in demonstrations.

Shanghai and other Chinese cities have recently started deploying facial recognition to catch those who flout the rules of the road, including jaywalkers.

Facial recognition and related technologies can also be used by retail stores to identify potential shoplifters, and by casinos to pinpoint undesirable gamblers.

It can even be used to deliver personalised marketing messages, and could have some other potentially unnerving applications.

Last year, a Russian photographer figured out how to match the faces of porn stars with their social media profiles to “doxx” them, or reveal their true identities.

“Huge problems”

This type of use “can create huge problems”, said Garvie. “We have to consider the worst possible uses of the technology.”

The three top smartphone makers – Apple, Samsung and Huawei – have all come out with artificial intelligence-powered phones.

Apple and Chinese firm Huawei have bet on artificial intelligence capabilities designed to take some of the load off users’ shoulders, showcasing them in their phones’ cameras at glossy launch events. AI can shift everyday tinkering with camera settings into the background, saving people time and minimising annoyance.

The two companies have built specialist machine learning capabilities into the processors that power their phones, which could give third-party app developers all over the world the chance to think up new uses for the technique.

Unlike Huawei, Apple tightly controls its devices end to end, from hardware through the operating system to third-party apps, meaning developers know exactly what they can expect when programming for the iPhone.
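To give a flavour of what that developer access looks like, the sketch below uses Apple’s Core ML and Vision frameworks, introduced alongside iOS 11, to run an image classifier on the device. The “FlowerClassifier” model name is a hypothetical stand-in for illustration only, and whether a given model actually runs on the A11’s neural engine is decided by the system rather than by the app.

    import CoreML
    import UIKit
    import Vision

    // A sketch of on-device inference with Core ML and Vision (iOS 11 and later).
    // "FlowerClassifier.mlmodelc" is a hypothetical compiled model bundled with
    // the app; any image-classification model would be used the same way.
    func classify(_ image: UIImage) {
        guard let cgImage = image.cgImage,
              let modelURL = Bundle.main.url(forResource: "FlowerClassifier",
                                             withExtension: "mlmodelc"),
              let coreMLModel = try? MLModel(contentsOf: modelURL),
              let visionModel = try? VNCoreMLModel(for: coreMLModel) else { return }

        // Vision wraps the Core ML model and hands back classification
        // observations sorted by confidence.
        let request = VNCoreMLRequest(model: visionModel) { request, _ in
            if let top = (request.results as? [VNClassificationObservation])?.first {
                print("Top label: \(top.identifier) (confidence \(top.confidence))")
            }
        }

        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        try? handler.perform([request])
    }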

As for facial recognition technology, Apple’s system uses 30,000 infrared dots to create a digital image which is stored in a “secure enclave”, according to a white paper issued by the company on its security. It said the chances of a “random” person being able to unlock the device are one in a million, compared with one in 50,000 for its Touch ID.
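In practice, third-party apps never see that stored face data; under Apple’s model they ask the operating system to authenticate the user and receive only a success-or-failure answer. The sketch below, based on Apple’s LocalAuthentication framework, shows the general shape of such a request – the function name and the “Unlock your notes” prompt are illustrative assumptions, not Apple’s code.

    import LocalAuthentication

    // A sketch of how an app requests biometric authentication. The matching of
    // the live scan against the enrolled face happens inside the system; the app
    // only learns whether authentication succeeded.
    func authenticateUser() {
        let context = LAContext()
        var error: NSError?

        guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                        error: &error) else {
            print("Biometrics unavailable: \(error?.localizedDescription ?? "unknown")")
            return
        }

        // On the iPhone X, biometryType reports Face ID rather than Touch ID (iOS 11 and later).
        let usesFaceID = context.biometryType == .faceID

        context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                               localizedReason: "Unlock your notes") { success, evaluationError in
            if success {
                print("Authenticated with \(usesFaceID ? "Face ID" : "Touch ID")")
            } else {
                print("Authentication failed: \(evaluationError?.localizedDescription ?? "unknown")")
            }
        }
    }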

Legal battle brewing

Apple’s Face ID is likely to touch off fresh legal battles about whether police can require someone to unlock a device.

Face ID “brings the company deeper into a legal debate” that stemmed from the introduction of fingerprint identification on smartphones, according to ACLU staff attorney Brett Max Kaufman.

Kaufman says in a blog post that courts will be grappling with the constitutional guarantees against unreasonable searches and self-incrimination if a suspect is forced to unlock a device.

US courts have generally ruled that forcing a user to give up a passcode would violate their rights because a passcode is “testimonial” – but the situation becomes murkier when biometrics are involved.

Apple appears to have anticipated this situation by allowing a user to press two buttons for two seconds to require a passcode, but Garvie said court battles over compelling the use of Face ID are likely.

Regardless of these concerns, Apple’s move is likely to bring facial recognition technology into widespread use.

“What Apple is doing here will popularise and get people more comfortable with the technology,” said Patrick Moorhead, principal analyst at Moor Insights & Strategy, who follows the sector.

“If I look at Apple’s track record of making things easy for consumers, I’m optimistic users are going to like this.”

Garvie added it is important to have conversations about facial recognition because there is little regulation governing the use of the technology.

“The technology may well be inevitable,” she said. “It is going to become part of everyone’s lives if it isn’t already.”

Apple CEO Tim Cook speaks during Apple’s special event launching the iPhone X at the Steve Jobs Theatre in Cupertino, California, in this September 12 photo. The iPhone X goes on sale on Friday.
