The warning is a joke. Like printing "Smoking kills" on cigarette packages, except even fewer people care. And I doubt that sentence is going to change anything in a legal battle.
I'm like half convinced.
I think the dynamics are the same as with other things. Sometimes we like to escape reality. That can be done by reading books, watching TV or playing computer games. Or social media, or watching some Twitch streamer daily. I believe the latter is called parasocial interaction. It becomes an issue once it's done excessively, or the lines get blurry, or mental issues get into the mix.
Certainly AI chatbots are more convincing than some regular old book. (Allegedly, already in 1775 young people committed copycat suicides after reading Goethe's "The Sorrows of Young Werther", so it's not a new topic.) But an AI can exploit your individual needs and wants and really get to you. I've read the effects are currently being studied; I skimmed some long papers, but it seems we don't have a final answer yet about what this does psychologically.
I've tried roleplaying with AI. I've also tried loading characters like the famous AI therapist and pop-culture characters. For me, it's pretty clear it's just a game. All of the interaction happens through text on a screen; I can't touch them or talk to them verbally (yet). I've heard from some other people here on Lemmy that to them it's an experience like some pen-and-paper game... And I know how these things work, and that my hypothetical AI girlfriend is just a dream. So I don't think I'm at risk. And I don't think lots of other people are either. But... obviously some people are. This isn't the first article about people getting harmed. And I can see how you wouldn't be able to defend yourself against some chatbot if you have serious issues or a mental condition.
I still think we can't skip all the other factors at play. We need to address (teenage) loneliness, guns, and the lack of a caring, healthy social environment. Proper education, and giving people some knowledge of how these things work and what they are, would certainly help too. It's always the same story: we leave people alone, without education, without a healthy social environment; the people close to them miss how much they're struggling; there are guns lying on the desk...
And after the inevitable has happened, we don't address any of that, but focus completely on one topic that's more symptom than cause. That's why I'm annoyed by the article.
(But I get that there is some risk specific to chatbots that goes beyond other things. And it's probably not just a symptom, but also a contributing factor. We'd need more non-sensationalist information to judge...)