this post was submitted on 28 Aug 2025
305 points (99.7% liked)

unexposedhazard@discuss.tchncs.de 5 points 9 hours ago (last edited 9 hours ago)

Because US cops will totally do something about it lmao

11111one11111@lemmy.world 12 points 9 hours ago

Woooooosh

I'm pretty sure the OP you replied to is hoping for the five-oh to start taking GPT's advice to commit suicide... meaning they want dead cops, not for cops to intervene. Not the classiest comment, which is why I think it woooooshed right over your head.

unexposedhazard@discuss.tchncs.de -4 points 8 hours ago

Yeah okay, that joke just doesn't work for me even if I get it, because instructions wouldn't make people commit suicide, so it's just odd. It would have worked if the conversation with the LLM itself had made the kid kill themselves.

balder1991@lemmy.world 1 point 7 hours ago

I suppose the issue isn't exactly the instructions, but the encouragement and justification.