this post was submitted on 03 Mar 2024
132 points (88.4% liked)

Technology
It’s cheap, quick and available 24/7, but is a chatbot therapist really the right tool to tackle complex emotional needs?

[–] Candelestine@lemmy.world 11 points 8 months ago (2 children)

Eventually, yes, I think it will be. Not yet, though; the tech just isn't strong enough atm. But an AI is resistant to the emotional toll, burnout and low pay that a real-life therapist has to struggle with. The AI therapist doesn't need a therapist.

Personally though, I think this is going to be one of the first widespread, genuinely revolutionary things LLMs are capable of. Couple more years maybe? It won't be able to handle complex problems; it'll have to flag and refer those cases to a doctor. But basic mental health maintenance is simpler.

[–] snooggums@midwest.social 25 points 8 months ago (1 child)

That would assume the people designing AI want what is best for the person and not what will make them the most money at the expense of the consumer.

The companies involved in AI are NOT benevolent.

[–] Even_Adder@lemmy.dbzer0.com 5 points 8 months ago (1 child)

You could just run your own. There are plenty of open source models that don't answer to any company.

[–] BakerBagel@midwest.social 8 points 8 months ago (2 children)

Why don't I just give myself therapy? I know way more about what is going on in my head than anyone else does.

[–] Usernameblankface@lemmy.world 3 points 8 months ago

Because what's going on in your own head would taint your treatment plan and cause it to be self-defeating.

[–] Even_Adder@lemmy.dbzer0.com 1 points 8 months ago

Maybe one day that'll actually be possible.

[–] Usernameblankface@lemmy.world 5 points 8 months ago* (last edited 8 months ago) (1 children)

Yes, one thing it absolutely has to be good at is referring patients to human therapists, for anyone who needs something beyond the standard strategies the AI is trained on. It has to be smart enough to know when to give up.

Edit: it would also be great if the AI matched up these difficult cases with therapists who are known to do well with whatever the patient is dealing with, as well as matching according to the patient's personality, communication style, etc., wherever possible.

Edit 2 for clarity above

[–] BakerBagel@midwest.social 0 points 8 months ago (1 child)

Where is the profit in sending someone to a different AI for help?

[–] Usernameblankface@lemmy.world 3 points 8 months ago

I meant referring them to human specialists.