Toronto Star

Face recognition app used by police, until the chief found out

Civil Liberties Association slams use of controversial Clearview AI technology

- KATE ALLEN SCIENCE & TECHNOLOGY REPORTER

Toronto police officers used a controversial facial recognition technology for months, according to a spokesperson, before Chief Mark Saunders became aware of its use and ordered it stopped.

Clearview AI, a U.S. company that provides artificial intelligence-powered facial recognition tools to law enforcement agencies, has been called “reckless,” “invasive,” and “dystopian” by critics. It identifies people by scanning for matches in its database of billions of images scraped from the open web, including social media sites, providing vastly greater search powers than other known facial recognition tools.

“Some members of the Toronto Police Service began using Clearview AI in October 2019 with the intent of informally testing this new and evolving technology,” said TPS spokesperson Meaghan Gray.

“The chief directed that its use be halted immediately upon his becoming aware of it, and the order to cease using the product was given on Feb. 5, 2020.”

Gray said the Toronto Police Service has requested that Ontario’s Information and Privacy Commissioner and the Crown Attorney’s Office work with the force to review the technology’s appropriateness as a tool for law enforcement, “given that it is also used by other law enforcement agencies in North America.”

“Until a fulsome review of the product is completed, it will not be used by the Toronto Police Service,” she said Thursday.

A front-page New York Times story published in January first drew scrutiny to the previously little-known company’s broad powers and effects on privacy. The report detailed how the company claims to have a database of over three billion images scraped from Facebook, YouTube and millions of other websites.

Law enforcemen­t officials who use Clearview AI can run an image of a person against this massive database, pulling up matches collected from across the web. People who have asked to try the technology on themselves pulled up images they didn’t know were online or had never seen before.

Last May, when the Star first revealed that Toronto police were using facial recognition technology, the force said its tool only searched for matches in its own internal database of lawfully acquired mugshots.

At the time, Staff Insp. Stephen Harris of Forensic Identification Services said “there are no plans to expand the TPS’s use of facial recognition beyond our current mugshot database. We are not judicially authorized to do so.”

Toronto police did not respond to further questions Thursday, including whether officers were judicially authorized to use Clearview AI, whether it had been used in investigations or arrests, and why Chief Saunders was not aware of its use.

Brenda McPhail, director of the privacy, technology and surveillance project at the Canadian Civil Liberties Association, called Toronto police’s use of Clearview AI “a remarkable violation of public trust.”

“Clearview AI collects images of people without consent, in violation of the terms of service of the platforms people trust to protect their information — arguably, illegally — and no police force in Canada should be using technology whose lawfulness is open to question,” McPhail said.

“This company allegedly has developed its entire facial recognition system by illegitimately if not illegally scraping images from the public internet,” said Chris Parsons, a senior research associate at the University of Toronto’s Citizen Lab.

If Toronto police or any other Canadian law enforcement agency did that directly, “it would be radically afoul of Canadian privacy legislation. Using services produced by companies predicated on violations of Canadian law seems like an inappropriate technology to adopt,” said Parsons.

According to the New York Times, more than 600 law enforcemen­t agencies use Clearview AI. Earlier this week, the Ontario Provincial Police and the Mounties both declined to answer the Star’s questions about whether they used Clearview AI.

“The OPP has used facial recognition technology for various types of investigations,” OPP spokesperson Carolle Dionne said. “As its use is operational and specific to investigative technique we will not specify further.”

“Generally, the RCMP does not comment on specific investigative tools or techniques,” said RCMP spokesperson Catherine Fortin.

“However, we continue to monitor new and evolving technology.”

In the last month, YouTube, Facebook, Twitter, and LinkedIn have all demanded that the company stop using data scraped from their websites, according to media reports.

“Clearview AI collects images of people without consent.”

BRENDA MCPHAIL, CANADIAN CIVIL LIBERTIES ASSOCIATION

THE NEW YORK TIMES FILE PHOTO: Clearview AI identifies people by scanning the web for visual matches, including on social media sites.
ANDREW FRANCIS WALLACE, TORONTO STAR FILE PHOTO: Chief Mark Saunders ordered officers to stop using controversial facial recognition technology, a police spokesperson said.
