National Post

AI ‘hallucination’ in B.C. court case a wake-up call

Cases that don’t exist cited by lawyer using AI

- Darryl Greer

Vancouver tech lawyer Ryan Black’s work with video game companies put him in a position to watch the rise of artificial intelligence in the industry.

Now he finds himself on the front lines again as his own profession grapples with the technology.

“The degree to which it was impacting game studios really surprised people,” said Black, who helped the Law Society of British Columbia draft advice for lawyers about the use of AI.

“The generative (AI) revolution kind of has really hit people really hard in terms of, ‘Oh my gosh, we have to really pay attention to this now,’ so I would say that it’s a new thing for a lot of people,” he said, referring to the type of technology that can create arguments and essays based on prompts from a user.

“It doesn’t surprise me that lawyers don’t know a lot about it.”

The rise of generative AI tools like ChatGPT, he said, is a “revolutionary change to the practice of law,” but legal experts say a recent ruling by the B.C. Supreme Court shows lawyers must use the technology cautiously and skeptically.

In a costs ruling released Feb. 20 related to a child custody case, it was revealed that Vancouver lawyer Chong Ke had used ChatGPT to prepare material submitted in the case.

The material included citations to cases that don’t exist, something her opponent in the case called an AI “hallucination.”

Ke told the court that discovering that the cited cases were fictitious was “mortifying,” and she quickly informed the Law Society and admitted a “lack of knowledge of the risks” of using AI to draft court submissions.

“I am now aware of the dangers of relying on AI-generated materials,” Ke said in an affidavit. “I understand that this issue has arisen in other jurisdictions and that the Law Society has published materials in recent months intended to alert lawyers in B.C. to these dangers.”

Ke apologized to the court and her fellow lawyers.

Her lawyer John Forstrom said in an email that the case “has provoked significant public interest, but the substance of what happened is otherwise unremarkable.”

“I’m not sure that the case has any significant implications regarding the use of generative AI in court proceedings generally,” Forstrom said.

“Ms. Ke’s use of AI in this case was an acknowledged mistake. The question of whether or how generative AI might appropriately be employed in legal work did not arise.”

The society is now investigating Ke’s conduct, spokeswoman Christine Tam said in an email.

“While recognizing the potential benefits of using AI in the delivery of legal services, the Law Society has also issued guidance to lawyers on the appropriate use of AI and expects lawyers to comply with the standards of conduct expected of a competent lawyer if they do rely on AI in serving their clients,” Tam said.

The law society’s guidance, issued in late 2023, urges lawyers to seek training in the use of the technology, and to be aware of confidentiality issues around data security, plagiarism and copyright concerns, and potential bias in materials produced by the technology.

Law societies and courts in other provinces and territories have also produced guidance on the use of AI. For instance, the Supreme Court of Yukon said in a June 2023 practice direction that if any lawyer relies on AI “for their legal research or submissions in any matter and in any form,” they must tell the court.

For Black, with the firm DLA Piper, the use of AI is causing a lot of “necessary angst about relying on a tool like this to do any real heavy lifting.”

Black said delivering justice requires the impartiality of a “human peer,” capable of evaluating and making important legally binding decisions.

He said he’s encountered lawyers and judges who range from “completely dialed into it, to completely averse to it, to completely agnostic to it.”

He said he’s been “impressed by the pace of the technology,” but caution and skepticism around any materials generated by it remain essential for lawyers now and into the future.

Reflecting on the Ke case and others like it, Black said tools like ChatGPT are “really good autocorrect tools that do a fantastic job of relating text to other text, but they have no understanding of the world, they have no understanding of reality.”

The judge in the case that involved Ke said it would be “prudent” for her to tell the court and opposing lawyers if any other material she submitted employed AI technology like ChatGPT.

Black said artificial intelligence technology isn’t going away, and any rules developed now will likely need changing due to the “breakneck speed” of its evolution.

“We are for sure now in a world where AI will exist,” he said. “There is no un-ringing this bell as far as I’m concerned.”

PETER MORGAN / THE ASSOCIATED PRESS FILES: The rise of generative AI tools like ChatGPT is a “revolutionary change” for law, says lawyer Ryan Black.
