Ah, I was hoping this meant ChatGPT would give canned responses, e.g. "Seek help," whenever it detected it was being used for mental health issues, which it should. But no, it's just openly flipping off anyone who asks why their chatbot pushed a person to suicide.
this post was submitted on 27 Jul 2025
Technology
Reminds me of the time when Twitter, back when it was still called Twitter and freshly taken over by Musk, replied to every slightly critical question with a poop emoji.
As far as I know, emailing their public affairs address still does this.
It should also refuse to answer any medical question, any engineering question, any finance question, anything that requires the responsibility of an accredited member of the professional–managerial class to read the question, read the answer, and then decide whether the question is allowed at all, and whether they will rewrite, change, or simply block replies to the question entirely.
Also, such questions should start at $160 USD a pop.