The Rise and Fall of Air Canada’s AI Chatbot


The Gist

  • AI hallucinations. AI chatbots hallucinate anywhere from 3% to 27% of the time, producing incorrect or misleading responses.
  • Air Canada’s legal battle. A Canadian tribunal ruled Air Canada must honor a discount promised by its AI chatbot, highlighting the legal implications of AI errors.
  • Deploying AI wisely. Companies should implement guardrails, fine-tuning, action models, and hallucination prevention to minimize risks in AI-driven customer experiences (see the guardrail sketch below this list).
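
As a rough illustration of the "guardrails" item above, the sketch below shows one common pattern: intercept the chatbot's draft reply and, whenever it touches a policy-sensitive topic, substitute a vetted, human-approved answer. Everything here (the APPROVED_POLICY_ANSWERS table, the guard_reply function, the sample policies) is hypothetical and illustrative, not Air Canada's actual system.

    # Hypothetical guardrail sketch; names and policies are illustrative only.
    APPROVED_POLICY_ANSWERS = {
        # topic keyword -> vetted, legally reviewed response
        "bereavement": (
            "Bereavement fares must be requested before travel. "
            "Please contact an agent to confirm eligibility."
        ),
        "refund": (
            "Refund eligibility depends on your fare type. "
            "An agent can confirm what applies to your booking."
        ),
    }

    def guard_reply(model_reply: str) -> str:
        """Pass the model's reply through unchanged unless it touches a
        policy-sensitive topic; in that case, return the vetted answer."""
        lowered = model_reply.lower()
        for topic, vetted_answer in APPROVED_POLICY_ANSWERS.items():
            if topic in lowered:
                # Never let the model improvise on policy.
                return vetted_answer
        return model_reply

    # Example: a hallucinated reply like Air Canada's gets replaced.
    print(guard_reply(
        "Yes! You can claim a bereavement discount up to 90 days after flying."
    ))

Production systems typically swap the keyword match for a classifier or retrieval against official policy documents, but the principle is the same: any customer-facing claim about policy should come from a source the company is willing to stand behind.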

We all know AI can lie. But is it now costing businesses money? 

Hallucinations are responses generated by artificial intelligence that are incorrect, misleading or downright nonsensical, yet presented by the bots as fact. And according to research, even in settings designed to prevent them, AI-powered chatbots hallucinate anywhere from 3% to 27% of the time.

Yet despite this flaw, more and more customer experience teams are deploying AI to bolster their efforts and improve service. As of 2023, 79% of organizations polled said they’re using artificial intelligence in their CX toolset in some capacity, according to CMSWire’s State of Digital Customer Experience report. 


Air Canada was one such business, having set up an AI chatbot on its website to assist customers with questions and concerns. But after a landmark case forced it to honor a discount that went against its own policy, the company appears to be rethinking that strategy.

Air Canada Forced to Honor AI Chatbot’s Refund Promise 

Vancouver resident Jake Moffatt used Air Canada’s AI chatbot to see if the airline offered bereavement fares following the death of his grandmother. The bot told Moffatt that the company did offer a discount, which he could claim up to 90 days after flying. 

Moffatt booked the flight for $1,200. Yet when he later requested the promised discount, airline support staff told him the chatbot’s responses were wrong and nonbinding. Before a Canadian tribunal, Air Canada argued that the AI chatbot was a “separate legal entity” from the company and that the airline couldn’t be held responsible for what it told customers.

Tribunal member Christopher Rivers, however, disagreed. He determined that the airline must follow through with the chatbot’s promised discount. 

Rivers found that the airline had committed negligent misrepresentation, writing: “It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot.”

Air Canada did not respond to a request for comment. However, as of April 2024, the bot is no longer available on the airline’s website. 
