Waterloo Region Record

Facial recognition tech: creepy or cool?

- Craig Timberg

A whiff of dystopian creepiness has long wafted in the air whenever facial recognition has come up. Books, movies and television shows have portrayed the technology as mainly a tool of surveillance and social control — aimed by unseen others at you, for their purposes, not your own.

Apple sought to reverse that equation Tuesday with the long-anticipated release of its 10th anniversary smartphone, the iPhone X. It replaces the fingerprint sensor previous generations used for unlocking a user’s device with facial recognition technology, while still keeping others from unlocking the phone without the user’s knowledge.

All users have to do, Apple said at the annual September event dedicated to touting its latest product updates, is look at the iPhone X, which recognizes them as the registered user — even if they are wearing glasses or a hat or sporting a new beard.

Though not entirely new — several Android smartphones already do something similar — the technology is far from mainstream. Apple’s embrace of it could mark a tipping point in the adoption of facial recognition technology across new areas of our lives — as we shop or communicate with friends and, eventually, as we enter buildings or perhaps start our vehicles with a glance rather than a twist of the key.

Many forms of surveillance — cellphone location tracking, social media analytics and the CIA’s reported ability to remotely activate the microphone on an individual’s smart TV — were born of such popular consumer advances. Only later, typically through leaked documents and investigative reports, did it become clear how popular technologies were turned on their users.

“The big danger with facial recognition is that we are targeted everywhere we go and in everything we do,” said Jay Stanley, a senior policy analyst with the ACLU’s Speech, Privacy and Technology Project. “The acceptable uses could soften up the terrain for less acceptable uses.”

The potential for widely deployed facial recognition systems has particularly concerned privacy experts, who have warned about a future in which our faces and other biometrics are used to track our every movement, our political activity, our religious lives and even our romantic encounters.

Recent research at Stanford, meanwhile, contends that a range of private facts, including an individual’s sexual orientation, could be read through sophisticated analyses of facial images with the help of artificial intelligence.

“We have only one face,” said Clare Garvie, an associate at Georgetown University’s Center on Privacy & Technology and an author of The Perpetual Line-Up, a 2016 report on facial recognition databases collected by governments. “The more comfortable we become with facial recognition, the more complacent we may become.”

What Apple introduced Tuesday was a version of facial recognition technology that iPhone X owners are supposed to use on themselves, for their own purposes and only when they want to. They can always type a numeric passcode instead.

Such caveats have earned the company cautious praise from some privacy experts. They noted that the iPhone X will keep its facial analysis data secure on the device rather than transmitting it across the internet (where it could potentially be intercepted) or collecting it in a database that might allow hackers, spies or law enforcement agencies to gain access to facial records en masse.
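One concrete way to see what that design means in practice — though the article does not go into it — is Apple’s LocalAuthentication framework: a third-party app can ask the system to verify the device owner biometrically, but it receives only a pass-or-fail answer, never the face data itself. A minimal sketch (the function name and prompt string here are illustrative, not from the article):

```swift
import LocalAuthentication

// Ask iOS to authenticate the device owner biometrically (Face ID on the
// iPhone X, Touch ID on earlier models). The app gets back only a Boolean;
// the underlying biometric data stays on the device.
func unlockWithBiometrics(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        // Biometrics unavailable or not enrolled; fall back to a passcode flow.
        completion(false)
        return
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your saved data") { success, _ in
        completion(success)
    }
}
```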

The Android devices that use facial recognition also keep the data on the device, although hackers have demonstrated that some of these systems can be tricked by photographs of users — something Apple says cannot happen with the iPhone X.

Many privacy experts also regard facial recognition as a relatively simple, safe and reliable way to authenticate the identity of a smartphone’s owner, helping protect the massive troves of personal data kept on devices. In that respect, some of them see the technology as a net positive for privacy.

“I don’t think we should reflexively reject facial recognition. The question should be, by what means and for whose benefit?” said Marc Rotenberg, executive director of the Electronic Privacy Information Center. “Facial recognition has both good uses and bad uses from a consumer perspective.”

Half of U.S. adults already have their images in some federal, state or local facial recognition system, through a combination of databases of people who have been arrested or convicted of crimes and records of people who hold driver’s licences, passports and visas, the 2016 Georgetown report found.

Privacy experts have fought to curb the expansion of such databases. Some states, for example, have barred driver’s licence photos from being used in facial recognition searches by law enforcement. Experts have also sought to limit how and when the databases are used.

They have additionally sought to raise awareness about the massive commercial databases kept by Facebook and Google, both of which in some circumstances use facial recognition technology to identify people depicted in photos users upload.

Also slowing the spread of the technology have been the daunting technical challenges of accurately analyzing faces in anything less than optimal circumstances. People in low light, wearing hats or glasses, or simply standing at an odd angle to a camera have long challenged facial recognition systems — as have people with darker skin — leading to false positives and false negatives.

Apple’s system appears to sidestep those technical problems: owners of the iPhone X are supposed to willingly “enrol” their faces at arm’s length, turning their heads so facial contours are captured more fully. Opening the device later takes only a brief glance.

The facial recognition system, dubbed the TrueDepth camera system, includes a front-facing camera, a proximity sensor, an infrared camera and a dot projector that beams more than 30,000 invisible infrared dots onto a user’s face to take measurements. The device then combines all the available data to create what Philip Schiller, Apple’s senior vice-president of worldwide marketing, called “a mathematical model of your face.”
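Apple has not said how that mathematical model is checked against a live capture. In face recognition generally, a common approach is to reduce each face to a numeric feature vector and declare a match when the distance between the enrolled template and the fresh capture falls below a threshold. The sketch below is purely illustrative and is not Apple’s published algorithm; the vector contents and threshold are invented:

```swift
// Illustrative only: compares two face "templates" (numeric feature vectors)
// and accepts the match when their Euclidean distance is under a threshold.
// Face ID's actual representation and matching rule are not public.
func matches(enrolled: [Double], captured: [Double], threshold: Double = 0.6) -> Bool {
    guard enrolled.count == captured.count else { return false }
    var sumOfSquares = 0.0
    for (e, c) in zip(enrolled, captured) {
        let difference = e - c
        sumOfSquares += difference * difference
    }
    return sumOfSquares.squareRoot() < threshold
}
```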

“The chance that a random person in the population could look at your iPhone X and unlock it with their face is about one in a million,” Schiller said, presenting the new device at Apple’s glitzy new Steve Jobs Theater in Cupertino, Calif.
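Read as a per-stranger probability, Schiller’s figure means roughly $p = 10^{-6}$ for a single random face. Assuming independent attempts — a simplification, not something Apple stated — the chance that at least one of $N$ random strangers could unlock a given phone works out to

$$P = 1 - (1 - 10^{-6})^{N} \approx N \times 10^{-6},$$

so even a thousand independent strangers would have only about a 0.1 per cent chance of a false match between them.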

There is also the question, hotly litigated in recent years, of what power law enforcement agencies have to gain access to data on devices. The U.S. Supreme Court ruled in 2014 that authorities generally need a search warrant to examine the contents of a seized smartphone.

It would take a separate court order to require a device’s owner to unlock it for police, said Nate Cardozo, a senior staff attorney at the Electronic Frontier Foundation, a civil liberties group based in San Francisco.

Apple vice-president Phil Schiller stands in front of an image of masks used in developing the Face ID feature to unlock the iPhone X. JUSTIN SULLIVAN, GETTY IMAGES
