Toronto Star

Duping facial recognition technology

U of T researchers design algorithm that helps outsmart surveillance tech

- JULIEN GIGNAC STAFF REPORTER

In the fight for more privacy online, University of Toronto researchers have devised an algorithm to sabotage facial recognition technology.

Called an “adversarial attack,” the algorithm “catastrophically” affects facial recognition detectors by making subtle changes to specific pixels — manipulations small enough they almost can’t be picked up by the human eye, said researcher Joey Bose, a U of T electrical and computer engineering master’s student.

“Machine learning models are very ubiquitous in our world, people use it all the time and it’s very important to understand methods in which they fail because we shouldn’t be so trusting of these algorithms without fully understanding them,” he said.

The impetus for the study was to help people understand the impact of facial detection and how it works, Bose said. He and his supervisor, Parham Aarabi, an associate professor, started their research in January.

“Machine learning models are very black box, so we don’t fully understand the ins-and-outs, which is why research in this field is so vital,” he said.

One example of how facial recognition technology can be used is in advertising — a person’s biometric data can be collected by facial recognition detectors, he said, which can be siphoned and used for targeted advertisements and recommendations.

“We want to empower users to protect their own privacy,” he said. “We are giving agency to people.”

Bose said the prototype, which is currently limited to computers but could be developed into an app, engages in a battle with the detectors.

“Over time, they play a game against each other, where the one fools the other and the other tries to detect the correct face,” he said, adding that the university-created generator can become advanced enough that it can dupe the facial-recognition algorithm with almost 100-per-cent accuracy.

“If you think about the locations of the eyes, the nose and the mouth, those are indicators that there’s a face in an image, so if we perturb the pixels in those regions, chances are it’s easier to fool the detector,” Bose said.
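The pixel-perturbation idea Bose describes can be sketched in a few lines. The toy “detector” below is just a logistic score over pixels, a stand-in for a real face-detection network; the weights, image values and step size `epsilon` are illustrative assumptions, not details from the U of T study. The attack nudges every pixel a tiny step against the gradient of the detection score, in the spirit of fast-gradient-sign methods:

```python
import math
import random

random.seed(0)

# Hypothetical stand-in for a face detector: a logistic score over a
# flattened 8x8 grayscale patch. A real detector is a deep network,
# but the attack idea is the same.
N = 64
w = [random.gauss(0.0, 1.0) for _ in range(N)]   # placeholder weights
image = [random.random() for _ in range(N)]      # placeholder "face" pixels

def detect_score(pixels):
    """Probability the toy detector assigns to 'face present'."""
    z = sum(wi * xi for wi, xi in zip(w, pixels))
    return 1.0 / (1.0 + math.exp(-z))

# Gradient of the logistic score with respect to each pixel:
# d(score)/d(x_i) = score * (1 - score) * w_i
score = detect_score(image)
grad = [score * (1.0 - score) * wi for wi in w]

# Fast-gradient-sign-style step: move every pixel a small amount
# (epsilon) in the direction that LOWERS the detection score, then
# clip back to the valid [0, 1] pixel range. Because epsilon is tiny,
# the altered image looks virtually identical to the human eye.
epsilon = 0.05
adversarial = [
    min(1.0, max(0.0, xi - epsilon * math.copysign(1.0, gi)))
    for xi, gi in zip(image, grad)
]

print(f"original score:    {score:.3f}")
print(f"adversarial score: {detect_score(adversarial):.3f}")
```

An attack like the one in the study differentiates through the detector itself and, as Bose notes, can concentrate the perturbation around the eyes, nose and mouth, where the detector’s evidence for a face is strongest.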

Mark Hayes, a privacy and technology lawyer in Toronto, said facial recognition detectors tend to be rather inaccurate, even by today’s standards. He said this new research could lead to a “technology arms race.”

“As soon as somebody comes out with something like this, the facial recognition people are going to then tweak their algorithms to try to get around the disabling algorithms. It’s a ping-pong back-and-forth,” he said.

Facial recognition, Hayes said, can be a positive for some institutions. Police are going to start using the technology more frequently, so impeding processes to solve crimes could jeopardize important investigations, he said.

“Because there is such pervasive surveillance, law enforcement is increasingly trying to get a hold of these videos that are created and then using facial recognition software to try and match it up with known people that they have,” he said. “I think ultimately there’s going to be a lot of push-and-pull here.”

Brenda McPhail, director of privacy, technology and surveillance for the Canadian Civil Liberties Association, said technology used to subvert surveillance “makes sense.”

She said unbridled surveillance, including facial recognition technology, could “chill” protests and other forms of dissent that are constitutionally protected.

“The consequences can be a severe erosion of democratic participation,” McPhail said, noting that facial recognition is already utilized in predictive policing in Canada — in Vancouver, for example, she said there’s a police program that uses it to determine which areas have high levels of crime.

“With cameras and facial recognition your transient movements through space and time become tracked, fixed and recorded, in ways that can be later used for your benefit or against you. I think that line is very thin, and I think as a society we haven’t come to grips as to where that line should be,” she said.

“Basically what the researchers are doing is creating a tool that if you want to put that picture of your kid up and share it with Grandma and, at the same time, protect them from being tracked over time, you can do that,” McPhail said.

The University of Toronto research proves it is possible to “break” facial recognition detectors, Bose said.

“The next step is to make this attack stronger, better and make it work against multiple different types of detectors,” he said. “This is more of a first step.”

STEVE RUSSELL/TORONTO STAR — Joey Bose designed an algorithm to disrupt facial recognition in photos as part of a University of Toronto study.
