National Post

Ruling on Air Canada chatbot shows AI pitfalls

- Ian Bickis

TORONTO • A decision on Air Canada’s liability for what its chatbot said is a reminder that companies need to be cautious when relying on artificial intelligence, experts say.

The B.C. Civil Resolution Tribunal decision issued Wednesday showed that Air Canada tried to deny liability when its chatbot gave misleading information about the airline’s bereavement fares.

“In effect, Air Canada suggests the chatbot is a separate legal entity that is responsible for its own actions,” tribunal member Christopher Rivers said in his decision.

“This is a remarkable submission,” he said.

Jake Moffatt brought the challenge after the airline refused him the lower bereavement fare. He had paid full price for a flight and applied for the reduced fare afterward, as the chatbot had implied he could, but Air Canada denied the claim, saying he had to apply before taking the trip.

Rivers said in his decision that it should be obvious Air Canada is responsible for the information on its website, and in this case the airline did not take reasonable care to ensure its chatbot was accurate.

Air Canada said in a statement that it will comply with the ruling and that, since it considers the matter closed, it has no additional information.

While the tribunal decision, which does not set a binding precedent, was fairly low stakes, with about $650 in dispute, it shows some of the ways companies can be tripped up as they increasingly rely on the technology, said Ira Parghi, a lawyer with expertise in information and AI law.

“If an organization or a company decides to go down that road, it has to get it right,” she said.

As AI-powered systems become capable of answering increasingly complex questions, companies have to decide if it’s worth the risk.

“If an area is too thorny or complicated, or it’s not rule-based enough, or it relies too much on individual discretion, then maybe bots need to stay away,” said Parghi.

Laws are still catching up to some of the gaps presented by AI, gaps that pending federal legislation aims to bridge, but in many cases existing law can cover the issues, she said.

“They relied on good old-fashioned tort law of negligent misrepresentation, and got to the right result based on, sort of, very conventional reasoning.”

The argument that a company isn’t liable for its own chatbot is a novel one, said Brent Arnold, a partner at Gowling WLG.

“That’s the first time that I’ve seen that argument,” he said.

To avoid liability while offering a chatbot, a company would have to post prominent language saying it takes no responsibility for the information the bot provides, which would make the tool of questionable use to consumers, said Arnold.

“That’s about as good as the chatbot saying, ‘Hey, why don’t you eat this thing I found on the sidewalk?’ Why would I do that?”

Companies will have to start disclosing more about what is AI-powered as part of the coming legislation, and they’ll also have to test high-impact systems more before rolling them out to the public, he said.

As the rules evolve, companies will have to be careful about both civil and regulatory liability, said Arnold.

In the U.S., the Consumer Financial Protection Bureau issued guidance last year on problems with chatbots, warning that banks risk violating their legal obligations, eroding customer trust and causing consumer harm when deploying the technology.

“When a person’s financial life is at risk, the consequenc­es of being wrong can be grave,” the regulator said.

The CFPB warned of negative outcomes many people will find familiar, including wasted time, inaccurate information, and the frustration of feeling stuck in “doom loops” of chatbot answers with no way to reach a human customer service representative.

While the Air Canada example was straightforward, just how far companies are liable for potential errors has yet to be tested much, said Arnold, as it is still early days for AI systems.

“It will be interesting to see what a Superior Court does with a similar circumstance, where there’s a large amount of money at stake,” he said.

Gabor Lukacs, president of the Air Passenger Rights consumer advocacy group, said the Air Canada ruling delivers justice for the traveller and shows that the B.C. Civil Resolution Tribunal is a forum where passengers can get a fair hearing.

He also noted that Air Canada was called out by the tribunal for providing a boilerplate response that denied every allegation without offering any evidence to the contrary.

Air Canada tried to deny liability when its chatbot gave misleading information, a B.C. tribunal found. (Gavin Young / Postmedia News files)
