Air Canada has been ordered to comply with a refund policy that was accidentally created by its chatbot.
The case began when Jake Moffatt, grieving the death of his grandmother, consulted the AI chatbot on Air Canada’s website to understand the company’s bereavement fare policy.
The chatbot told him he could book a flight right away and apply for the discounted bereavement rate retroactively, within 90 days of the ticket being issued.
However, this information was incorrect: Air Canada’s actual policy states that bereavement fares cannot be claimed retroactively once a flight has been booked.
The chatbot specifically said, “If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date the ticket was issued by completing our Ticket Refund Application form.”
This set the stage for the legal battle that followed.
Air Canada argued that the chatbot was, in effect, a separate entity responsible for its own actions, and that the airline therefore could not be held liable for the misinformation. The tribunal met this claim with skepticism.
Christopher Rivers, the tribunal member who decided the case, criticized Air Canada’s position, writing that “Air Canada argues it cannot be held liable for information provided by one of its agents, servants, or representatives — including a chatbot,” yet “does not explain why it believes that is the case.”
He also questioned the airline’s apparent expectation that customers would double-check the chatbot’s answers against other sections of the Air Canada website, calling that expectation unreasonable.
The tribunal ruled in Moffatt’s favor, awarding him CA$650.88 in damages against the CA$1,640.36 he paid in fares, plus interest and tribunal fees.
For those unfamiliar with bereavement fares, they are specially discounted fares offered to passengers who have an urgent need to travel due to the death, or imminent death, of a close family member.
These fares are designed to recognize the high costs of last-minute travel and provide some financial relief in times of personal emergency. Not all airlines offer bereavement rates, and some have specific policies related to them, including eligibility criteria, required proof of emergency (such as a death certificate or letter from the funeral home), and discount ranges.
The case demonstrates the legal and ethical responsibilities businesses take on when deploying AI chatbots in customer service roles, and underscores the need for accuracy and accountability in the information these digital tools provide.
A CA$650 award is nominal compared with the potential exposure if an AI system were to give bad investment advice, instruct someone to make a fraudulent transaction, or the like.
Air Canada previously embarked on an AI “experiment” to reduce call center workload during periods of high demand, such as weather-related flight disruptions.
Mel Crocker, Air Canada’s Chief Information Officer (CIO), envisioned the chatbot eventually handling complex customer service issues. That vision may now need rethinking.
Now, other companies will likely think twice before letting AI chatbots answer policy questions on their behalf without human oversight.