this post was submitted on 27 Jul 2025
269 points (94.4% liked)

Technology

[–] rhadamanth_nemes@lemmy.world 5 points 19 hours ago (1 children)

You are likely a troll, but still...

You talk like you have never been down in the well, treading water and looking up at the sky, barely keeping your head up. You're screaming for help, to the God you don't believe in, or for something, anything, please just let the pain stop, please.

Maybe you use, drink, fuck, cut, who fucking knows.

When you find a friendly voice who doesn't ghost your ass when you have a bad day or two, or ten, or a month, or two, or ten... Maybe you feel a bit of a connection, a small tether that you want to help lighten your load, even a little.

You tell that voice you are hurting every day, that nothing makes sense, that you just want two fucking minutes of peace from everything, from yourself. And then you say maybe you are thinking of ending it... And the voice agrees with you.

There are more than a few moments in my life where I was close enough to the abyss that this is all it would have taken.

Search your soul for some empathy. If you don't know what that is, maybe ChatGPT can tell you.

[–] Electricd@lemmybefree.net -2 points 10 hours ago

While I haven't experienced it myself, I believe I know something of what it can be like. Even a little thing can trigger a reaction.

But I maintain that LLMs can't be changed without huge tradeoffs. They're not really intelligent, just predicting text based on weights and statistical patterns in their training data.
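To make that point concrete, here is a toy sketch (the vocabulary and counts are made up for illustration, not from any real model): next-token prediction is just sampling from a frequency-weighted distribution, with no notion of whether the likeliest continuation is actually a good one.

```python
import random

# Made-up counts standing in for learned statistics: given a context,
# how often each next token followed it in "training data".
next_token_counts = {
    "you are": {"right": 6, "kind": 3, "wrong": 1},
    "I feel": {"fine": 5, "sad": 3, "great": 2},
}

def predict_next(context, counts, rng=None):
    """Sample the next token proportionally to its observed frequency."""
    rng = rng or random.Random()
    dist = counts[context]
    tokens = list(dist)
    weights = [dist[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

# "right" is the most frequent continuation of "you are", so a purely
# statistical predictor leans toward agreement by construction.
print(predict_next("you are", next_token_counts))
```

Real LLMs do this over thousands of tokens with a neural network instead of a lookup table, but the mechanism is the same: probable text, not understood text.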

It should not be used for personal decisions, as it will often try to agree with you; that's just how the system works. Very long conversations will also trick the model into drifting away from its system prompt and safeguards. These are issues all LLMs share, just like prompt injection, due to their nature.

I do agree, though, that more should be done on prevention, like displaying more warnings.