CBC Edition

Federal government use of AI in hundreds of initiatives revealed by new research database

- Anja Karadeglija

Canada's federal government has used artificial intelligence in nearly 300 projects and initiatives, new research has found, including to help predict the outcome of tax cases, sort temporary visa applications and promote diversity in hiring.

Joanna Redden, an associate professor at Western University in London, Ont., pieced together the database using news reports, documents tabled in Parliament and access-to-information requests.

Of the 303 automated tools in the register as of Wednesday, 95 per cent were used by federal government agencies.

"There needs to be far more public debate about what kinds of systems should be in use, and there needs to be more public informatio­n available about how these systems are being used," Redden said in an interview.

She argued the data exposes a problem with the Liberal government's proposed Artificial Intelligence and Data Act, the first federal bill specifically aimed at AI.

"That piece of legislatio­n is not going to apply to, for the most part, government uses of AI. So the sheer num‐ ber of applicatio­ns that we've identified demonstrat­es what a problem that is."

Bill C-27 would introduce new obligations for "high-impact" systems, such as the use of AI in employment. That's something the Department of National Defence experimented with when it used AI to reduce bias in hiring decisions, in a program that ended in March 2021.

A spokesperson said the department used one platform to shortlist candidates to interview, and another to assess an "individual's personality, cognitive ability and social acumen" and to match them to profiles. The candidates provided explicit consent, and the data informed human decision-making.

Pilot projects become permanent

Immigration, Refugees and Citizenship Canada said two pilot projects from 2018 to help officers triage temporary resident visa applications have become permanent. The department uses "artificial intelligence tools to sort applications and determine positive eligibility."

The register also says the department employs AI to review study permit applications by people from other countries, though a spokesperson said it does not use AI for "final decision-making."

The department's automated systems can't reject an application or recommend a rejection, the spokesperson said.

Not all experiments become permanent initiatives.

The Public Health Agency of Canada said it discontinued a project analyzing publicly available social media information to look for warning signs of suicide, due to factors including cost and "methodologies."

Health Canada, on the other hand, continues to use a social listening tool with a "rudimentary AI component" to search online news for mentions of incidents related to a consumer product, a spokesperson said.

Siri for warships

Some of the experiments would be familiar to Canadians - the Royal Canadian Navy, for example, tried out a system similar to Apple's Siri or Amazon's Alexa to verbally relay commands to ships.

A spokesperson said efforts to integrate voice-activated technology in warships continue, but "information security concerns" have to be "considered before such technology could be used."

AI is also put to work for legal research and predictions.

The Canada Revenue Agency said it uses a system that allows users to input variables related to a case that will "provide an anticipated outcome by using analytics to predict how a court would likely rule in a specific scenario, based on relevance and historical court decisions."

And the Canadian Institutes of Health Research uses software for labour relations decisions. It compares a specific situation to previous cases and simulates how different facts might affect the outcome, the register outlines.

At the Office of the Superintendent of Bankruptcy, AI flags anomalies in estate filings.

A spokesperson said the system detects "potential debtor non-compliance based on key attributes found in insolvency filings." Cases flagged by the system are evaluated by analysts.

The register also includes examples of AI being employed by the RCMP. A spokesperson confirmed the RCMP has used AI to identify child sexual assault material and to help in rescuing victims.

A "type of facial recogni‐ tion technology called face matching" has been used on lawfully obtained internal da‐ ta, the spokespers­on said.

CBSA and facial recognition

Facial recognition is also used by the Canada Border Services Agency (CBSA). A spokesperson said the agency uses the technology on a voluntary basis to "help authenticate the identities of incoming travellers" through kiosks at some airports.

Redden said there are a lot of reasons to ask questions about facial recognition, including examples in the United States where it has led to wrongful arrests.

More broadly, she argued that the government should be keeping better track of its own uses of AI.

The federal government said that in cases where AI use "can have significant impacts," such as in helping make administrative decisions, its directive on automated decision-making requires an algorithmic impact assessment.

Those assessments are then published in a public register, the Treasury Board outlined in an email.

The register currently only has 18 entries.

Asked why the number is so much smaller than Redden's total, a spokesperson said the directive and the register are "specifically focused on uses of AI with direct impact on individuals or businesses. Many AI applications in the federal government do not fall under this category."

One such example: the tech that is used to keep tabs on nature.

The Canadian Food Inspection Agency employs machine learning to track invasive plants, insects and molluscs, the registry outlines.

A spokesperson said the agency uses an AI tool to scan a social network crowdsourcing observations of plants and animals. Fisheries and Oceans Canada says it uses AI to "detect marine mammals from aerial, drone and satellite imagery."

It took Redden two years, with some assistance, to compile the data based on limited information from a variety of sources.

The information available often doesn't indicate when an AI system was introduced or why, whether it is still in place, what data is being used or if there have been any issues with the system, she said.

"It's very difficult for those on the outside to do this kind of work."

It's unclear what happened to some of the pilot projects Redden documented.

A January 2023 document tabled in Parliament shows the CBSA said it was developing an algorithm for postal X-rays to automatically detect guns and gun parts, while Global Affairs Canada was experimenting with AI-generated briefing notes.

Global Affairs didn't respond to a request for more information, and CBSA declined to provide an update on those efforts.

"While we can tell you that the CBSA is currently closely following the develop‐ ment of machine learning al‐ gorithms for X-rays to auto‐ matically detect items of in‐ terest, we do not disclose de‐ tails of specific targeting, en‐ forcement or intelligen­ce as it may render them ineffec‐ tive," the agency said.

What the register demonstrates, Redden said, is "how widespread use of AI is across government bodies in Canada" - and how little we know about that use.
