TheAlbatross@lemmy.blahaj.zone 15 points 2 days ago

I think that's a little cynical. I know a few people who work in psych, some in ERs, and it's becoming more common for them to hear about patients who followed advice they got from ChatGPT and harmed themselves. One particularly egregious case involved a patient who had been using the program for therapy, then suddenly pivoted to asking what the tallest buildings in the area were, which, of course, the program answered.

Dindonmasker@sh.itjust.works 8 points 2 days ago

"The highest building will just make you regret your action for longer while falling. May I suggest this building close to your location that is exactly as tall as it needs to be to do the job?" ChatGPT, probably.

TheAlbatross@lemmy.blahaj.zone 11 points 2 days ago

Funny, but the reality is even darker. There are zero safeguards built into the program for these scenarios, so it draws absolutely no connection between the two topics, a connection even a self-styled, unlicensed "life coach" would easily make.