FACE-OFF WITH TECH

APPLE’S SIGNATURE FEATURE ON ITS NEW iPHONE X, FACE ID, HAS SPARKED OFF CONCERNS ABOUT PRIVACY AND WHAT HAPPENS WHEN TECHNOLOGY GOES ALL WRONG.

Business Today


This isn’t the first time the face is being used for biometrics, but now that Apple has jumped into the fray, you can bank on widespread adoption of the technology. Apple’s implementation of Face ID isn’t easy to mimic. It involves an entire system with an infrared camera, dot projector, proximity sensor, and ambient light sensor. The assembly is called TrueDepth; it projects 30,000 infrared dots to build a 3D model of the face.

People used to laugh at phones that could be fooled into unlocking with a photo of the owner, but Apple’s Face ID has dangers of its own. A person being mugged, for example, could simply be forced to look at the phone to unlock it.

Facial recognition is only one aspect of machine perception. The next step is to have machines interpret what they see. Combining machine perception with machine learning will result in applications that impact everyday lives. This is already happening.

Several airports, including Zurich and Amsterdam, are using biometric face recognition at border control so that no time is wasted physically handing over passports to airport personnel.

But brace yourself: some bizarre uses are also coming up. At a park in Beijing, face recognition is hooked up to a toilet paper dispenser, which rolls out just 27 inches of the precious material per person. You can’t go in for another round before nine minutes are up. Also in China, KFC uses the tech to give food suggestions, and traffic authorities use it to combat jaywalking. Retailers use it to sell you what they think you need.

As the technology becomes more developed, it could prove dangerous in the hands of authoritarian regimes and rogue agencies. The FBI, for example, has been accused of using face recognition data without publicly declaring the privacy implications.

The cleverer it gets, the more dangerous it becomes. Two researchers recently used AI to see if they could detect a person’s sexuality from their face. The algorithm could distinguish between gay and heterosexual men 81 per cent of the time, and between gay and heterosexual women 71 per cent of the time, outperforming human judges. Privacy advocates are right to raise a hue and cry.
