Calgary Herald

A MATTER OF ETHICS

Advanced technologies like AI are challenging judicial system in unanticipated ways

- ANITA BALAKRISHNAN

Imagine you're in court, accused of a heinous crime. You didn't do it, but the police don't believe you. Now you're on trial and, after days of building a case against you, the Crown attorney introduces the evidence they expect to prove your guilt beyond a reasonable doubt: a trace of DNA found at the crime scene, analyzed using a new artificial intelligence technology.

The Crown argues it is more likely than not that your DNA matches the profile of the suspect, based on applying the technology, called probabilistic genotyping. The presumption of your innocence now hangs on one question: Will the judge understand how the technology works — and will they understand its limitations when instructing the jury and evaluating the lawyers' cases?

“Most litigators and adjudicators don't understand the technology well enough to know what to challenge or question when (probabilistic genotyping) tools are used,” said a June case study by Jill Presser and Kate Robertson for the Law Commission of Ontario.

The case study was timely. The same month, the Canadian Judicial Council published its revised Ethical Principles for Judges, with a new addition: “Judges should develop and maintain proficiency with technology relevant to the nature and performance of their judicial duties.”

Lawyers who spoke with The Logic said tech literacy isn't just a practical consideration for judges, but an ethical obligation — and could set fairer precedents and lower the costs of accessing the justice system.

The new guidance from the CJC, a body composed of the nation's chief justices and created by Parliament to investigate judicial conduct in response to complaints, suggests the bench agrees its members need to be more tech literate.

Even before the release of the new ethical principles, the National Judicial Institute, a non-profit that designs and evaluates professional education for judges, was revising its curriculum yearly to stay “cutting edge” and address gaps in judges' education. New judges follow a learning program for their first five years on the bench, and also create their own professional-development plans, investing the equivalent of 10 days per year. While “each judge is responsible for their own training,” the CJC says, some national training modules are required for new judges, and local chief justices may organize other training. While the NJI is an independent organization, it does use some CJC content in its training, such as the model instructions that judges give to juries. Some CJC tech-training sessions have drawn as many as 165 participants.

DNA evidence is far from the only technology transforming courtrooms that, until relatively recently, were described as “shocking ... archaic” and of the “dinosaur era.” For better or worse, the COVID-19 pandemic forced judges in Canada and across the world into Zoom trials — and for every amusing “cat lawyer” gaffe, there have been tales of “dreadful” hearings where participants said the use of technology fell short. The genotyping case study is just one of many examples the LCO identified where advanced technologies like artificial intelligence are challenging the justice system in unanticipated ways.

Amy Salyzyn, associate professor at the University of Ottawa's Faculty of Law and president of the Canadian Association for Legal Ethics, has been a longtime proponent of adding a specific duty of technological competence for judges. Over the years she has documented their struggles, ranging from suffering phishing attacks (or simply ignoring emails) to having to consider evidence that may contain doctored audio or, in the future, potentially deepfakes. She pointed out that the CJC's guidance gives no details on how a judge's technological proficiency might be achieved, assessed or enforced. The CJC guidelines say they are meant only as an “aspirational” document, not a “code of conduct that sets minimum standards.”

Nonetheless, Salyzyn said codifying an ethical principle of tech proficiency, more than two decades after the first edition of the principles, is “a very positive development.”

“More and more we are seeing issues with AI and algorithms have been introduced into court ... (and) facial recognition. Of course there's even things like e-discovery, documentation, emojis being put into evidence in court,” she said.

“Sometimes you can bring in experts, and that's appropriate. But (judges) having some literacy with these tools and what's going on is important.”

When it comes to technological proficiency, Salyzyn told The Logic that judges could benefit from a supplementary set of “best practices” that include specific examples. For instance, the CJC doesn't offer specific direction on how judges can avoid what she called the “minefield” of seeing out-of-court evidence on social media, or what kinds of tweets can encourage transparency and engagement without calling into question the impartiality of the court.

In other jurisdictions, the LCO has documented “extraordinary backlash” against what it called the “historically racist, discriminatory or biased data” used in many algorithms that assess bail, sentencing, parole, police surveillance, immigration and child welfare, and the potential for future domestic violence. The LCO has determined there is no centralized accounting of how these technologies are used in Canada.

Lawyer Carole Piovesan, a managing partner at data-focused Toronto law firm INQ, told The Logic that judges are also increasingly encountering technology in corporate-law cases, such as smart contracts executed on a blockchain. The goal of judicial education, she said, is for judges to “know what questions to ask” and when to be critical.

“Judges (are) really adjudicating on some pretty important issues, particularly where there's an absence of clear statutory laws,” said Piovesan. “There might not be a lot of precedent, so you're arguing by analogy. And you're starting to create new law, which is what judges are going to be asked to do very soon, as litigation picks up in the world of technology.”

The efficient use of technologies like online document filing is important beyond the convenience of judges or lawyers, noted Gerald Chan, a lawyer at Stockwoods. Hiring expert witnesses and paying lawyers to sift through paper documents are expensive — too expensive to let many people effectively fight their case. In 2018, a judge in the Ontario Superior Court of Justice decided as much, capping a lawyer's bill because the use of artificial intelligence should have “significantly reduced” counsel's preparation time.

Chan cited the 2014 Supreme Court of Canada decision R. v. Fearon, which explored police searches of cellphones, as an example where society might have benefited from a faster pace of litigation.

“(The decision) was so long after the fact in that case that what the court was considering was a flip phone. The consequences of the decision, obviously, had implications for smartphones,” said Chan, who has co-authored books on digital evidence, digital privacy and litigating artificial intelligence.

“There's always a lag because of how slow litigation moves, but I think that justice participants and judges have an obligation to be literate enough in technological advances that they don't exacerbate the problem.”

Chan said judges will soon be “issue-spotting” in cases posing yet-unanswered questions, like how a lawyer might meaningfully challenge evidence processed by an algorithm. What disclosure is the other side entitled to when it comes to source code? If an algorithm is evolving on its own, rather than functioning the precise way it was developed, is there any cross-examination of a live witness that could really challenge the results?

These are questions that are already coming to the fore with probabilistic genotyping, the type of DNA evidence examined by the Law Commission of Ontario's case study this summer.

The authors flagged a number of challenges the technology raises for the courts. The algorithm's creator or operator may have baked in subjectivity. The maker of the tool may restrict disclosure of source code using non-disclosure and confidentiality agreements. There may be no independent experts outside of government forensic DNA labs to help the court.

Nonetheless, Presser and Robertson wrote, the evidence gleaned from probabilistic genotyping “is often confidently received by judges and juries, possibly because it benefits from the high degree of confidence typically associated with both scientific evidence and artificial intelligence.”

If efficiency and fairness were not reason enough to keep judges up to date on technology, it's easy to imagine a future where technology is used not only to run courtrooms and assess evidence, but also to track and assess judges themselves.

Companies like Blue J Legal are already attempting to predict outcomes of labour and employment cases. Lenczner Slaght, a law firm that specializes in arguing cases in court, now uses data and AI to analyze case outcomes for the Supreme Court of Canada, the Competition Tribunal and a special court for corporate cases.

Salyzyn, the U of Ottawa law professor, noted that judges find these types of analytics helpful, too. For example, one infamous study and an old adage suggest judges are more lenient after lunch — a pattern a judge might want to guard against.

“These kinds of tools can bring data to bear, rather than people's intuitions and narratives about how people act,” she said.

For more news about the innovation economy, visit www.thelogic.co

JASON FRANSON/THE CANADIAN PRESS FILES — Lawyers say tech literacy isn't just a practical consideration for judges, but an ethical obligation — and could set fairer precedents and lower costs of accessing the justice system. The Canadian Judicial Council published its revised Ethical Principles for Judges recommending that judges “maintain proficiency with technology.”
