City police tested controversial tech
Ottawa police quietly tested facial recognition software for three months last year, the Citizen has learned.
The pilot, which ended in March of 2019, came before a public dialogue on the use of CCTV cameras in spaces like the ByWard Market after a string of fatal shootings in the tourist hub.
Though Ottawa police at the time spoke of the limitations of the surveillance cameras, they did not publicly reveal that the force had been investigating the use of a controversial tool that would go hand-in-hand with such surveillance.
In a statement to this newspaper, police said that “the Ottawa Police Service has explored the use of facial recognition technology as a tool to help solve crimes by utilizing photographs of persons of interest in criminal investigations and comparing them with existing databases collected per the Identification of Criminals Act, RSC 1985.”
Police likened that to using fingerprints or DNA stored in an existing database to identify people. But Ottawa police said they do not currently use the technology and have no immediate plans to buy or use facial recognition software.
Controversy has swirled around law enforcement’s use of facial recognition technology in recent weeks after a New York Times investigation into a company called Clearview AI. The Times’ story detailed how the company has created a massive database of open-sourced images scraped from websites across the internet, including Facebook. Police can then use the database for comparison with things like surveillance images.
According to its website, “Clearview is a new research tool used by law enforcement agencies to identify perpetrators and victims of crimes.”
The Citizen asked Ottawa police whether they had tested or purchased Clearview AI, or any other facial recognition technology, on Jan. 24. Local police only responded to the request on Wednesday, nearly three weeks later, and provided a followup response on Thursday. Ottawa police would not specify whose software the force used.
The New York Times reported that Clearview AI said Canadian police forces were using the technology, prompting questions by the Citizen.
The Royal Canadian Mounted Police would not answer whether it had used the technology.
“Generally, the RCMP does not comment on specific investigative tools or techniques. However, we continue to monitor new and evolving technology,” the RCMP said in a statement.
Ontario Provincial Police have not responded to a similar request by the Citizen.
Toronto police have previously acknowledged using facial-recognition technology as a policing tool. On Thursday, however, the force admitted it had also been “informally” using Clearview AI in 2019 before Toronto police Chief Mark Saunders learned of the practice and put an end to it.
Clearview boasts that the “technology has helped law enforcement track down hundreds of at-large criminals, including pedophiles, terrorists and sex traffickers.”
The company claims that, using its facial recognition technology, “law enforcement is able to catch the most dangerous criminals, solve the toughest cold cases and make communities safer, especially the most vulnerable among us.”
Michael Geist is a professor at the University of Ottawa who specializes in technology and privacy law. He says there are “significant concerns” about the use of this kind of technology.
Testing of the technology, he says, has been rife with inaccuracies, and police are using it in the “absence of a robust legal framework that would govern the use of this kind of technology.”
In two cases in this province, police appear to have been using the technology for many months without any public disclosure that they were doing so.
“Given the inaccuracies that can arise, the potential for people to be falsely suspected or accused based on using faulty technology is a very real possibility,” Geist says.
“We’re talking, of course, here about people’s liberty, about potential harm to their reputations and others based on technology” that experts say is “just not ready for the kinds of policing uses that law enforcement would typically be interested in.”
There’s a “gaping hole” in Canadian privacy law that fails to adequately address these issues, Geist says, leaving people wondering whether their own images will be gathered in a “free-for-all” by law enforcement.
In Canada, police can legally take fingerprints after a person has been arrested and charged. If prints are needed before someone is charged, police need an impression warrant. Police also need a warrant to take someone’s DNA without consent. But there is no expectation of privacy once someone disposes of, or leaves in a public space, an item that carries identifying information such as a fingerprint or DNA. It’s unclear whether publicly posting an image of oneself is akin to throwing it into the world and relinquishing ownership.
According to its website, Clearview searches only public information on the open web and cannot search private or protected information, including information from a private social media account.
The technology cannot provide surveillance; it can only compare faces against public images.
The company points out that any results that come from searches using the technology “legally require followup investigation and confirmation.”
And while there are concerns that the technology is faulty and may be used to foster suspicion of an innocent person, Clearview says it “helps to exonerate the innocent, identify victims of child sexual abuse and other crimes, and avoid eyewitness lineups that are prone to human error.”
An anonymous testimonial on the website attributed to a detective in a Canadian police organization’s “sex crimes unit” says that “Clearview is hands-down the best thing that has happened to victim identification in the last 10 years. Within a week and a half of using Clearview, [we] made eight identifications of either victims or offenders through the use of this new tool.”
Ottawa police said that “before any implementation of a facial recognition system, OPS would need to engage with the community and experts to ensure the protection of privacy and human rights.”
When asked what the three-month pilot consisted of, Deputy Chief Uday Jaswal said in a statement that the pilot was looking at whether the software would assist criminal investigations and particularly whether it could solve cases.
“The pilot also served to highlight the technological and procedural challenges that would have to be addressed in order to implement the tool at OPS, in addition to the privacy and ethical challenges.”
Police said that any implementation of facial recognition as a policing tool would need the establishment of common standards, data quality and consistency, and a “long-term technology blueprint.”
“There are also multiple ethical considerations at play,” police said.
Police did not say if the technology led to any arrests or charges.