submitted 1 month ago* (last edited 1 month ago) by activistPnk@slrpnk.net to c/Law@europe.pub

cross-posted from: https://slrpnk.net/post/27072322

A chatbot erroneously told a traveler that he qualified for free travel in a particular situation. I don’t recall the exact circumstances, but it was something like a last-minute trip for a funeral. The airline then denied him the free ticket. He sued. The court found that the chatbot represents the company, which is therefore legally bound by the agreements it makes.

It’s interesting to note that agreements are now being presented which you must accept before talking to a chatbot. E.g., from Flixbus:

You are interacting with an automated chatbot. The information provided is for general guidance only and is not binding. If you require further clarification or additional information, please contact a member of our staff directly or check out our terms and conditions and privacy notice.

(emphasis mine)

I’m not in Canada, so for me that disclaimer may well hold. I just wonder whether such an agreement is enforceable in Europe.

this post was submitted on 05 Sep 2025
