Business Today

FACEOFF WITH TECH

APPLE’S SIGNATURE FEATURE ON ITS NEW iPHONE X, FACE ID, HAS SPARKED OFF CONCERNS ABOUT PRIVACY AND WHAT HAPPENS WHEN TECHNOLOGY GOES ALL WRONG.


This isn’t the first time the face is being used for biometrics, but now that Apple has jumped into the fray, you can bank on widespread adoption of the technology. Apple’s implementation of Face ID isn’t easy to mimic. It involves an entire system with an infrared camera, dot projector, proximity sensor and ambient light sensor. Called TrueDepth, it projects 30,000 dots to build a 3D model of the face.

People used to laugh at phones that could be fooled into unlocking by a photo of the owner, but Apple’s Face ID has dangers of its own. A person being mugged, for example, could simply be forced to look at the phone to unlock it.

Facial recognition is only one aspect of machine perception. The next is to have machines interpret what they see. Putting machine perception together with machine learning will then result in applications that will impact everyday lives. This is already happening.

Several airports, including Zurich and Amsterdam, are using biometric face recognition at border control so that no time is wasted physically handing over passports to airport personnel.

But brace yourself: some bizarre uses are also coming up. At a park in Beijing, face recognition is hooked up to a toilet paper dispenser, which rolls out just 27 inches of the precious material per person. You can’t go in for another round before nine minutes are up. Also in China, KFC uses the tech to give food suggestions and traffic authorities use it to combat jaywalking. Retailers are using it to sell you what they think you need.

As the technology becomes more developed, it could prove dangerous in the hands of authoritarian regimes and rogue agencies. The FBI, for example, has been accused of using face recognition data without publicly declaring the privacy implications.

The cleverer it gets, the more dangerous it becomes. Two researchers recently used AI to see if they could detect a person’s sexuality from their face. The algorithm could distinguish between gay and heterosexual men 81 per cent of the time, and gay and heterosexual women 71 per cent of the time, outperforming human judges. Privacy advocates are right to raise a hue and cry.

