[–] CosmicTurtle0@lemmy.dbzer0.com 44 points 5 months ago (18 children)

IIRC, in cases where the central complaint is AI, ML, or other black-box technology, the company in question has never been held responsible, because "we don't know how it works." The AI surge we're seeing now is likely a consequence of those decisions and the crypto crash.

I'd love to see CVS try to push a lawsuit, though.

[–] Natanael@slrpnk.net 31 points 5 months ago (4 children)

In Canada there was a company using an LLM chatbot that had to uphold a claim the bot had made to one of its customers. So there's precedent for forcing companies to take responsibility for what their LLMs say (at least if they're presenting them as trustworthy and representative).

[–] LordPassionFruit@lemm.ee 23 points 5 months ago (3 children)

This was in regard to Air Canada and its LLM chatbot, which hallucinated a refund policy. The company argued it did not have to honour the claim because it wasn't their actual policy and the bot had invented it out of nothing.

An important side note is that one of the cited reasons the Court ruled in favour of the customer is that the company did not disclose that the LLM wasn't the final say on its policy, and that a customer should confirm with a representative before acting on the information. This means the legal argument wasn't "the LLM is responsible" but rather "the customer should be informed that the information may not be accurate".

I point this out because I'm not so sure CVS would have a clear-cut case based on the Air Canada ruling; I'd be surprised if Google didn't have some legalese somewhere stating that they aren't liable for what the LLM says.

[–] Natanael@slrpnk.net 5 points 5 months ago

But it has to be clearly presented. Consumer law and defamation law have different requirements for disclaimers.
