Air Canada found liable after chatbot gave misleading advice
Artificial intelligence is all fun and games until someone gets hurt.
A cautionary tale is making headlines in British Columbia, where a small claims tribunal on Wednesday (Feb. 15) found Air Canada liable after its AI-driven chatbot gave bad advice about bereavement fares, despite the airline's attempt to deny responsibility.
As reported by CBC News, Air Canada apparently tried to hold its own chatbot responsible for misleading a customer, saying its online tool was "a separate legal entity that is responsible for its own actions."
That argument didn’t exactly fly with the adjudicator.
"This is a remarkable submission," Civil Resolution Tribunal (CRT) member Christopher Rivers wrote. "While a chatbot has an interactive component, it is still just a part of Air Canada's website. It should be obvious to Air Canada that it is responsible for all the information on its website. It makes no difference whether the information comes from a static page or a chatbot."
Air Canada was ordered to compensate a grieving grandchild who claimed the ill-informed chatbot told them they could buy full-price air tickets and apply for a bereavement rate afterward.
The airline must pay Jake Moffatt $812 to cover the difference between its bereavement rates and the $1,630.36 they paid for full-price tickets to and from Toronto, purchased after their grandmother died, CBC News reports.
The airline's chatbot, according to a screenshot that was shared, gave Moffatt the following advice: "If you need to travel immediately or have already travelled and would like to submit your ticket for a reduced bereavement rate, kindly do so within 90 days of the date your ticket was issued by completing our Ticket Refund Application form."
When Moffatt contacted Air Canada to get money back, they were told bereavement rates don't apply to completed travel, something that is explained on a different part of Air Canada's website.
Leading up to the small claims case, an Air Canada representative at one point responded to Moffatt and admitted that the chatbot had given wrong information, CBC reports.
Rivers, in his decision, determined that Air Canada “did not take reasonable care to ensure its chatbot was accurate.”
The case appears to be the first major legal decision in Canada relating to misleading advice shared by chatbots, according to the Canadian Legal Information Institute.
Speaking with the Canadian Press, Ira Parghi, a lawyer with expertise in information and AI law, said companies relying on AI systems need to be careful about what they include in their services and make sure the information is accurate.