A MATTER OF ETHICS
Advanced technologies like AI are challenging the justice system in unanticipated ways
Imagine you're in court, accused of a heinous crime. You didn't do it, but the police don't believe you. Now you're on trial and, after days of building a case against you, the Crown attorney introduces the evidence they expect will prove your guilt beyond a reasonable doubt: a trace of DNA found at the crime scene, analyzed using a new artificial intelligence technology.
The Crown argues it is more likely than not that your DNA matches the profile of the suspect, based on applying the technology, called probabilistic genotyping. The presumption of your innocence now hangs on one question: Will the judge understand how the technology works — and will they understand its limitations when instructing the jury and evaluating the lawyers' cases?
“Most litigators and adjudicators don't understand the technology well enough to know what to challenge or question when (probabilistic genotyping) tools are used,” said a June case study by Jill Presser and Kate Robertson for the Law Commission of Ontario.
The case study was timely. The same month, the Canadian Judicial Council published its revised Ethical Principles for Judges, with a new addition: “Judges should develop and maintain proficiency with technology relevant to the nature and performance of their judicial duties.”
Lawyers who spoke with The Logic said tech literacy isn't just a practical consideration for judges, but an ethical obligation — and could set fairer precedents and lower costs of accessing the justice system.
The new guidance from the CJC, a body composed of the nation's chief justices and created by Parliament to investigate judicial conduct in response to complaints, suggests the bench agrees its members need to be more tech literate.
Even before the release of the new ethical principles, the National Judicial Institute, a non-profit that designs and evaluates professional education for judges, had been revising its curriculum yearly to stay “cutting edge” and address gaps in judges' education. New judges follow a learning program for the first five years on the bench, and also create their own professional-development plans, investing the equivalent of 10 days per year. While “each judge is responsible for their own training,” the CJC says, some national training modules are required for new judges, and local chief justices may organize other training. While the NJI is an independent organization, it does use some CJC content in its training, like the model instructions that judges give to juries. Some CJC tech-training sessions have drawn as many as 165 participants.
DNA evidence is far from the only technology transforming courtrooms that, until relatively recently, were described as “shocking ... archaic” and of the “dinosaur era.” For better or worse, the COVID-19 pandemic forced judges in Canada and across the world into Zoom trials — and for every amusing “cat lawyer” gaffe, there have been tales of “dreadful” hearings where participants said the use of technology fell short. The genotyping case study is just one of many examples the LCO identified where advanced technologies like artificial intelligence are challenging the justice system in unanticipated ways.
Amy Salyzyn, associate professor at the University of Ottawa's Faculty of Law and president of the Canadian Association for Legal Ethics, has been a longtime proponent of adding a specific duty of technological competence for judges. Over the years she has documented their struggles, ranging from suffering phishing attacks (or simply ignoring emails) to having to consider evidence that may contain doctored audio or, in the future, potentially deepfakes. She pointed out that the CJC's guidance gives no details on how a judge's technological proficiency might be achieved, assessed or enforced. The CJC guidelines say they are meant only as an “aspirational” document, not a “code of conduct that sets minimum standards.”
Nonetheless, Salyzyn said codifying an ethical principle of tech proficiency, more than two decades after the principles' first edition, is “a very positive development.”
“More and more we are seeing issues with AI and algorithms have been introduced into court ... (and) facial recognition. Of course there's even things like e-discovery, documentation, emojis being put into evidence in court,” she said.
“Sometimes you can bring in experts, and that's appropriate. But ( judges) having some literacy with these tools and what's going on is important.”
When it comes to technological proficiency, Salyzyn told The Logic that judges could benefit from a supplementary set of “best practices” that include specific examples. For instance, the CJC doesn't offer specific direction on how judges can avoid what she called the “minefield” of seeing out-of-court evidence on social media, or what kinds of tweets can encourage transparency and engagement without calling into question the impartiality of the court.
In other jurisdictions, the LCO has documented “extraordinary backlash” against what it called the “historically racist, discriminatory or biased data” used in many algorithms that assess bail, sentencing, parole, police surveillance, immigration, child welfare and the risk of future domestic violence. The LCO has determined there is no centralized accounting of how these technologies are used in Canada.
Lawyer Carole Piovesan, a managing partner at data-focused Toronto law firm INQ, told The Logic that judges are also increasingly encountering technology in corporate-law cases, such as smart contracts executed on a blockchain. The goal of judicial education, she said, is for judges to “know what questions to ask” and when to be critical.
“Judges (are) really adjudicating on some pretty important issues, particularly where there's an absence of clear statutory laws,” said Piovesan. “There might not be a lot of precedent, so you're arguing by analogy. And you're starting to create new law, which is what judges are going to be asked to do very soon, as litigation picks up in the world of technology.”
The efficient use of technologies like online document filing is important beyond the convenience of judges or lawyers, noted Gerald Chan, a lawyer at Stockwoods. Hiring expert witnesses and paying lawyers to sift through paper documents are expensive — too expensive to let many people effectively fight their case. In 2018, a judge in the Ontario Superior Court of Justice decided as much, capping a lawyer's bill because the use of artificial intelligence should have “significantly reduced” counsel's preparation time.
Chan cited the 2014 Supreme Court of Canada decision R v. Fearon, which explored police searches of cellphones, as an example where society might have benefited from a faster pace of litigation.
“(The decision) was so long after the fact in that case that what the court was considering was a flip phone. The consequences of the decision, obviously, had implications for smartphones,” said Chan, who has co-authored books on digital evidence, digital privacy and litigating artificial intelligence.
“There's always a lag because of how slow litigation moves, but I think that justice participants and judges have an obligation to be literate enough in technological advances that they don't exacerbate the problem.”
Chan said judges will soon be “issue-spotting” in cases posing yet-unanswered questions, like how a lawyer might meaningfully challenge evidence processed by an algorithm. What disclosure is the other side entitled to when it comes to source code? If an algorithm is evolving on its own, rather than functioning the precise way it was developed, is there any cross-examination of a live witness that could really challenge the results?
These are questions that are already coming to the fore with probabilistic genotyping, the type of DNA evidence examined by the Law Commission of Ontario's case study this summer.
The authors flagged a number of challenges the technology raises for the courts. The algorithm's creator or operator may have baked in subjectivity. The maker of the tool may restrict disclosure of source code using non-disclosure and confidentiality agreements. There may be no independent experts outside of government forensic DNA labs to help the court.
Nonetheless, Presser and Robertson wrote, the evidence gleaned from probabilistic genotyping “is often confidently received by judges and juries, possibly because it benefits from the high degree of confidence typically associated with both scientific evidence and artificial intelligence.”
If efficiency and fairness were not reason enough to keep judges up to date on technology, it's easy to imagine a future where technology is used not only to run courtrooms and assess evidence, but also to track and assess judges themselves.
Companies like Blue J Legal are already attempting to predict outcomes of labour and employment cases. Lenczner Slaght, a law firm that specializes in arguing cases in court, now uses data and AI to analyze case outcomes for the Supreme Court of Canada, Competition Tribunal and a special court for corporate cases.
Salyzyn, the U of Ottawa law professor, noted that judges find these types of analytics helpful, too. For example, one infamous study and an old adage suggest judges are more lenient after lunch — a pattern a judge might want to guard against.
“These kinds of tools can bring data to bear, rather than people's intuitions and narratives about how people act,” she said.