Air Canada must pay damages after chatbot lies to grieving passenger about discount – Airline tried arguing virtual assistant was solely responsible for its own actions

ylai@lemmy.ml to Not The Onion@lemmy.world – 618 points –
Air Canada must pay after chatbot lies to grieving passenger
theregister.com

If it's integrated into their service, then unless there's a disclaimer the customer has to accept before using the bot, the company is telling the customer that whatever the bot says is true.

If I contract a company to do X and one of their employees fucks shit up, I'll claim damages from the company, and they'll have to deal with the worker internally. The bot is the worker in this instance.

So what you're saying is that companies will start hiring LLMs as "independent contractors"?

No, the company contracted the service from another company, but that's irrelevant. I'm saying that in any case, a company is responsible for any service it provides unless there's a disclaimer, whether that service is a chatbot, a ticketing system, a store, or its workers.

If an Accenture contractor fucks up, the one liable to the client is Accenture. Accenture may then sue the worker, but that's beside the point. If a store mismanages its inventory and sells you the wrong item, or enters incorrect prices, you go after the store chain, not the individual store, and certainly not the worker. If a ticketing system takes your money but sends you an invalid ticket, you complain to the company that operates it, not the one that programmed it.

It's pretty simple actually.