City police tested controversial tech

Ottawa Citizen - Front Page - SHAAMINI YOGARETNAM

Ottawa police quietly tested facial recognition software for three months last year, the Citizen has learned.

The pilot, which ended in March 2019, came before a public dialogue on the use of CCTV cameras in spaces like the ByWard Market after a string of fatal shootings in the tourist hub.

Though Ottawa police at the time spoke of the limitations of the surveillance cameras, they did not publicly reveal that the force had been investigating the use of a controversial tool that would go hand-in-hand with such surveillance.

In a statement to this newspaper, police said that “the Ottawa Police Service has explored the use of facial recognition technology as a tool to help solve crimes by utilizing photographs of persons of interest in criminal investigations and comparing them with existing databases collected per the Identification of Criminals Act, RSC 1985.”

Police likened that to using fingerprints or DNA stored in an existing database to identify people. But Ottawa police said they do not currently use the technology and have no immediate plans to buy or use facial recognition software.

Controversy has swirled around law enforcement’s use of facial recognition technology in recent weeks after a New York Times investigation into a company called Clearview AI. The Times’ story detailed how the company has created a massive database of open-sourced images scraped from websites across the internet, including Facebook. Police can then use the database for comparison with things like surveillance images.

According to its website, “Clearview is a new research tool used by law enforcement agencies to identify perpetrators and victims of crimes.”

The Citizen asked Ottawa police on Jan. 24 whether they had tested or purchased Clearview AI, or any other facial recognition technology. Local police only responded to the request on Wednesday, nearly three weeks later, and provided a followup response on Thursday. Ottawa police would not specify whose software the force used.

The New York Times reported that Clearview AI said Canadian police forces were using the technology, prompting questions by the Citizen.

The Royal Canadian Mounted Police would not answer whether it had used the technology.

“Generally, the RCMP does not comment on specific investigative tools or techniques. However, we continue to monitor new and evolving technology,” the RCMP said in a statement.

Ontario Provincial Police have not responded to a similar request by the Citizen.

Toronto police have previously acknowledged using facial-recognition technology as a policing tool. On Thursday, the force admitted it had also been “informally” using Clearview AI in 2019, until Toronto police Chief Mark Saunders found out and put an end to the practice.

Clearview boasts that the “technology has helped law enforcement track down hundreds of at-large criminals, including pedophiles, terrorists and sex traffickers.”

Using the company’s facial recognition technology, the website claims, “law enforcement is able to catch the most dangerous criminals, solve the toughest cold cases and make communities safer, especially the most vulnerable among us.”

Michael Geist is a professor at the University of Ottawa who specializes in technology and privacy law. He says there are “significant concerns” about the use of this kind of technology.

Geist said testing of the technology has been rife with inaccuracies, and that police are using it in the “absence of a robust legal framework that would govern the use of this kind of technology.”

In two cases in this province, police appear to have been using the technology for many months without any public disclosure that they were doing so.

“Given the inaccuracies that can arise, the potential for people to be falsely suspected or accused based on using faulty technology is a very real possibility,” Geist says.

“We’re talking, of course, here about people’s liberty, about potential harm to their reputations and others based on technology” that experts say is “just not ready for the kinds of policing uses that law enforcement would typically be interested in.”

There’s a “gaping hole” in our privacy laws, Geist says: they don’t adequately address these issues, leaving people wondering whether their own images will be gathered in a “free-for-all” by law enforcement.

In Canada, police can legally take fingerprints after a person has been arrested and charged. If prints are needed before someone is charged, police need an impression warrant. Police also need a warrant to take someone’s DNA without consent. But there is no expectation of privacy if someone discards, or throws into public space, an item that contains identifying information such as a fingerprint or DNA. It’s unclear whether publicly posting an image of oneself is akin to throwing it into the world and relinquishing ownership.

According to its website, Clearview searches only public information on the open web and cannot search private or protected information, including information from a private social media account.

The technology can’t provide surveillance; it can only offer comparison with public images.

The company points out that any results that come from searches using the technology “legally require followup investigation and confirmation.”

And while there are concerns that the technology is faulty and may be used to foster suspicion of an innocent person, Clearview says it “helps to exonerate the innocent, identify victims of child sexual abuse and other crimes, and avoid eyewitness lineups that are prone to human error.”

An anonymous testimonial on the website attributed to a detective in a Canadian police organization’s “sex crimes unit” says that “Clearview is hands-down the best thing that has happened to victim identification in the last 10 years. Within a week and a half of using Clearview, [we] made eight identifications of either victims or offenders through the use of this new tool.”

Ottawa police said that “before any implementation of a facial recognition system, OPS would need to engage with the community and experts to ensure the protection of privacy and human rights.”

When asked what the three-month pilot consisted of, Deputy Chief Uday Jaswal said in a statement that it examined whether the software would assist criminal investigations and, in particular, whether it could solve cases.

“The pilot also served to highlight the technological and procedural challenges that would have to be addressed in order to implement the tool at OPS, in addition to the privacy and ethical challenges.”

Police said that implementing facial recognition as a policing tool would require the establishment of common standards, consistent data quality, and a “long-term technology blueprint.”

“There are also multiple ethical considerations at play,” police said.

Police did not say whether the technology led to any arrests or charges.
