Ottawa Citizen

City police tested controversial tech

- SHAAMINI YOGARETNAM

Ottawa police quietly tested facial recognition software for three months last year, the Citizen has learned.

The pilot, which ended in March of 2019, came before a public dialogue on the use of CCTV cameras in spaces like the ByWard Market after a string of fatal shootings in the tourist hub.

Though Ottawa police at the time spoke of the limitations of the surveillance cameras, they did not publicly reveal that the force had been investigating the use of a controversial tool that would go hand-in-hand with such surveillance.

In a statement to this newspaper, police said that “the Ottawa Police Service has explored the use of facial recognition technology as a tool to help solve crimes by utilizing photographs of persons of interest in criminal investigations and comparing them with existing databases collected per the Identification of Criminals Act, RSC 1985.”

Police likened that to using fingerprints or DNA stored in an existing database to identify people. But Ottawa police said they do not currently use the technology and have no immediate plans to buy or use facial recognition software.

Controversy has swirled around law enforcement’s use of facial recognition technology in recent weeks after a New York Times investigation into a company called Clearview AI. The Times’ story detailed how the company has created a massive database of publicly available images scraped from websites across the internet, including Facebook. Police can then compare images, such as surveillance footage, against the database.

According to its website, “Clearview is a new research tool used by law enforcement agencies to identify perpetrators and victims of crimes.”

The Citizen asked Ottawa police on Jan. 24 whether they had tested or purchased Clearview AI, or any other facial recognition technology. Local police responded to the request only on Wednesday, nearly three weeks later, and provided a follow-up response on Thursday. Ottawa police would not specify whose software the force used.

The New York Times reported that Clearview AI said Canadian police forces were using the technology, prompting questions by the Citizen.

The Royal Canadian Mounted Police would not answer whether it had used the technology.

“Generally, the RCMP does not comment on specific investigative tools or techniques. However, we continue to monitor new and evolving technology,” the RCMP said in a statement.

Ontario Provincial Police have not responded to a similar request by the Citizen.

Toronto police have previously acknowledged using facial recognition technology as a policing tool. On Thursday, the force admitted it had also been “informally” using Clearview AI in 2019 before Toronto police Chief Mark Saunders found out and put an end to the practice.

Clearview boasts that the “technology has helped law enforcement track down hundreds of at-large criminals, including pedophiles, terrorists and sex traffickers.”

Using the company’s facial recognitio­n technology, “law enforcemen­t is able to catch the most dangerous criminals, solve the toughest cold cases and make communitie­s safer, especially the most vulnerable among us.”

Michael Geist is a professor at the University of Ottawa who specializes in technology and privacy law. He says there are “significant concerns” about the use of this kind of technology.

Testing of the technology has been rife with inaccuracies, Geist says, and police are using it in the “absence of a robust legal framework that would govern the use of this kind of technology.”

In two cases in this province, police appear to have been using the technology for many months without any public disclosure that they were doing so.

“Given the inaccuracies that can arise, the potential for people to be falsely suspected or accused based on using faulty technology is a very real possibility,” Geist says.

“We’re talking, of course, here about people’s liberty, about potential harm to their reputations and others based on technology” that experts say is “just not ready for the kinds of policing uses that law enforcement would typically be interested in.”

There’s a “gaping hole” in our privacy laws, Geist says: they don’t adequately address these issues, and they leave people wondering whether their own images will be gathered in a “free-for-all” by law enforcement.

In Canada, police can legally take fingerprints after a person has been arrested and charged. If prints are needed before someone is charged, police need an impression warrant. Police also need a warrant to take someone’s DNA without consent. But there is no expectation of privacy once someone discards, or throws into public space, an item that carries identifying information such as a fingerprint or their DNA. It’s unclear whether publicly posting an image of oneself is akin to throwing it into the world and relinquishing ownership.

According to its website, Clearview searches only public information on the open web and cannot search private or protected information, including information from a private social media account.

The technology cannot provide surveillance; it can only offer comparisons to public images.

The company points out that any results that come from searches using the technology “legally require follow-up investigation and confirmation.”

And while there are concerns that the technology is faulty and may be used to foster suspicion of an innocent person, Clearview says it “helps to exonerate the innocent, identify victims of child sexual abuse and other crimes, and avoid eyewitness lineups that are prone to human error.”

An anonymous testimonial on the website attributed to a detective in a Canadian police organization’s “sex crimes unit” says that “Clearview is hands-down the best thing that has happened to victim identification in the last 10 years. Within a week and a half of using Clearview, made eight identifications of either victims or offenders through the use of this new tool.”

Ottawa police said that “before any implementation of a facial recognition system, OPS would need to engage with the community and experts to ensure the protection of privacy and human rights.”

When asked what the three-month pilot consisted of, Deputy Chief Uday Jaswal said in a statement that the pilot looked at whether the software would assist criminal investigations and, particularly, whether it could solve cases.

“The pilot also served to highlight the technological and procedural challenges that would have to be addressed in order to implement the tool at OPS, in addition to the privacy and ethical challenges.”

Police said that any implementation of facial recognition as a policing tool would need the establishment of common standards, data quality and consistency, and a “long-term technology blueprint.”

“There are also multiple ethical considerations at play,” police said.

Police did not say if the technology led to any arrests or charges.
