[–] schizo@forum.uncomfortable.business 15 points 1 month ago (1 children)

I suspect that it's going to go the same route as the 'acting on behalf of a company' bit.

If I call Walmart, and the guy on the phone tells me that to deal with my COVID infection I should drink half a gallon of bleach, and I then drink half a gallon of bleach, they're absolutely going to be found liable.

If I chat with a bot on Walmart's website and it tells me the same thing, I'd find it shockingly hard to believe a jury's decision would be any different.

It's probably even more complicated in that while a human has free will (such as it is), the bot is only going to craft its response from the data it was trained on, so if it goes off the rails and starts spouting dangerous nonsense, it's probably an even EASIER case, because that means someone trained the bot that drinking bleach is a cure for COVID.

I'm pretty sure our legal frameworks will survive stupid AI, because they're already designed to deal with stupid humans.

[–] Letstakealook@lemm.ee 2 points 1 month ago (1 children)

Would a court find Walmart liable for your decision to take medical advice from a random employee? I'm sure Walmart could demonstrate that the employee was not acting within the scope of their role, and that no reasonable person would drink bleach just because an unqualified Walmart employee told them to.

[–] schizo@forum.uncomfortable.business 6 points 1 month ago (1 children)

I changed the company names before posting and broke the clarity, sorry.

Imagine I wasn't an idiot and had said Walmart's pharmacy, which is somewhere you'd expect to get that kind of advice.

[–] Letstakealook@lemm.ee 2 points 1 month ago (1 children)

That would make it more plausible. I don't think you're an idiot; I was asking because I was curious whether there was precedent for a jackass conspiracy-minded employee handing out medical advice and creating liability for a business. I wouldn't think it's right, but I also don't agree with other legal standards, lol.

Thankfully there's not: you'd expect someone at a pharmacy to provide reasonable medical advice, or your mechanic to tell you the right thing to do with your car. Once you walk outside the field where a reasonable person would reasonably expect what they're being told to be, uh, reasonable, there's usually no real case for liability.

Buuuuuut, in the US at least, this is entirely civil law, and that means the law is mostly whatever you can convince a jury of, so you can end up with some wacky shit happening.