Daily Nation Newspaper

AIRLINE HELD LIABLE FOR ITS CHATBOT GIVING PASSENGER BAD ADVICE... what this means for travellers

– BBC

TORONTO - Artificial intelligence is having a growing impact on the way we travel, and a remarkable new case shows what AI-powered chatbots can get wrong – and who should pay. In 2022, Air Canada’s chatbot promised a discount that wasn’t available to passenger Jake Moffatt, who was assured that he could book a full-fare flight for his grandmother’s funeral and then apply for a bereavement fare after the fact.

According to a civil resolution tribunal decision last Wednesday, when Moffatt applied for the discount, the airline said the chatbot had been wrong – the request needed to be submitted before the flight – and it wouldn’t offer the discount. Instead, the airline said the chatbot was a “separate legal entity that is responsible for its own actions”. Air Canada argued that Moffatt should have gone to the link provided by the chatbot, where he would have seen the correct policy.

The British Columbia Civil Resolution Tribunal rejected that argument, ruling that Air Canada had to pay Moffatt $812.02 (£642.64) in damages and tribunal fees. “It should be obvious to Air Canada that it is responsible for all the information on its website,” read tribunal member Christopher Rivers’ written response. “It makes no difference whether the information comes from a static page or a chatbot.” The BBC reached out to Air Canada for additional comment and will update this article if and when we receive a response.

Gabor Lukacs, president of the Air Passenger Rights consumer advocacy group based in Nova Scotia, told BBC Travel that the case is being considered a landmark one, potentially setting a precedent for airlines and travel companies that increasingly rely on AI and chatbots for customer interactions: yes, companies are liable for what their tech says and does.

“It establishes a common sense principle: If you are handing over part of your business to AI, you are responsible for what it does,” Lukacs said. “What this decision confirms is that airlines cannot hide behind chatbots.”

Air Canada is hardly the only airline to dive head-first into AI – or to have a chatbot go off the rails. In 2018, a WestJet chatbot sent a passenger a link to a suicide-prevention hotline for no obvious reason. This type of mistake, in which generative AI tools present inaccurate or nonsensical information, is known as “AI hallucination”. Beyond airlines, major travel companies have embraced AI technology, ChatGPT specifically: In 2023, Expedia launched a ChatGPT plug-in to help with trip planning.

Lukacs expects the recent tribunal ruling will have broader implications for what airlines can get away with – and highlights the risks for businesses leaning too heavily on AI.

How air travellers can protect themselves

In the meantime, how can passengers stand guard against potentially wrong information or “hallucinations” fed to them by AI? Should they be fact-checking everything a chatbot says? Experts say: Yes, and no.

“For passengers, the only lesson is that they cannot fully rely on the information provided by airline chatbots. But it’s not really passengers’ responsibility to know that,” says Marisa Garcia, an aviation industry expert and senior contributor at Forbes. “Airlines will need to refine these tools further [and] make them far more reliable if they intend for them to ease the workload on human staff or ultimately replace human staff.”

Garcia expects that, over time, chatbots and their accuracy will improve, “but in the meantime airlines will need to ensure they put their customers first and make amends quickly when their chatbots get it wrong,” she says – rather than let the case get to small claims court and balloon into a PR disaster.

Travellers may want to consider the benefits of old-fashioned human help when trip-planning or navigating fares. “AI has advanced rapidly, but a regulatory framework for guiding the technology has yet to catch up,” said Erika Richter of the American Society of Travel Advisors. “Passengers need to be aware that when it comes to AI, the travel industry is building the plane as they’re flying it. We’re still far off from chatbots replacing the level of customer service required – and expected – for the travel industry.”

Globally, protections for airline passengers are not uniform: different countries have different regulations and consumer protections. Lukacs notes that Canadian passenger regulations are particularly weak, while the UK, for example, has the Civil Aviation Authority and inherited regulations from the 2004 European Council Directive.

“It’s important to understand that this is not simply about the airlines,” he said. Lukacs recommends passengers who fall victim to chatbot errors take their cases to small claims court. “They may not be perfect, but overall a passenger has a chance of getting a fair trial.”

Photo caption: When Air Canada’s chatbot gave incorrect information to a traveller, the airline argued its chatbot is “responsible for its own actions”.
Pull quote: “What this decision confirms is that airlines cannot hide behind chatbots.” – Gabor Lukacs
