This post was submitted on 23 Oct 2024
I don’t think this is the fault of the AI yet, unless the chat logs are released and they show it literally trying to get him to commit suicide. What it sounds like is a kid who needed someone to talk to and didn’t get it from those around him.
That said, it would be good if cAI monitored for suicidal ideation. Most of these AI companies are pretty hands-off with their AI and what is said.
I don't think it's so cut and dried.
Source: https://nypost.com/2024/10/23/us-news/florida-boy-14-killed-himself-after-falling-in-love-with-game-of-thrones-a-i-chatbot-lawsuit/
Yeah, not cut and dried at all. OP's article didn’t have the chat logs. Looks like it told him not to commit suicide but did demand loyalty. He changed his wording from “I want a painless death” to “I want to come home to you” to get it to say what he wanted.