this post was submitted on 29 Apr 2026
221 points (96.2% liked)

Technology

[–] HubertManne@piefed.social 3 points 19 hours ago (2 children)

It does not understand what it's saying. It's fine to summarize some searches or bring up known best practices, but I would not call what it does conversation.

[–] TubularTittyFrog@lemmy.world 2 points 16 hours ago* (last edited 16 hours ago) (1 children)

people fall in love with fictional characters in books and other media, mostly as a product of their imagined interactions with the character.

this isn't any different, it's just an AI version of it. it's still mostly imaginative fantasy at the end of the day, a form of escapism from the real world.

the new yorker had an article about it where a housewife basically had an AI boyfriend who was her version of Geralt from The Witcher. she was using it to cope with a stillbirth five years earlier, and her AI Geralt was the only one who 'really understood her' and her struggles with the trauma. it's all entirely a fiction in her head, but it's a mechanism for self-soothing, one that's relatively harmless compared to, say, doing drugs or divorcing her husband or whatever other coping methods might manifest. it was basically fan-fiction with an AI agent helping her co-write.

[–] HubertManne@piefed.social 1 points 14 hours ago

this I understand. I mean, as a video game or a laugh, sure. but it's not conversation.

[–] Grimy@lemmy.world 0 points 17 hours ago (2 children)

Kind of feels like semantics.

Let's say I give you a discord link and tell you that half the people are bots and half aren't. Realistically, LLMs are at a level where you won't be able to tell which is which.

So what then? You're only having a conversation half the time, but you can't tell when that is? Feels a bit hollow.

This probably happens on Lemmy. You probably have interactions that you qualify as conversations in your head but that are with bots.

[–] chunes@lemmy.world 1 points 4 hours ago

People are in for a rude awakening when we discover that 'next token prediction' is what intelligence means after all.

[–] HubertManne@piefed.social 2 points 14 hours ago (1 children)

back and forths, sure. only some attain the level of a conversation. yeah, bots exist, but social media is not a substitute for the real world. I would not call it semantics; it's my experience talking/chatting with humans and with AI. my big tip for folks who want to get an idea of LLM limits is to engage it on a topic you know very well, or where you can see the effects immediately, like playing a video game. I have been using it with Baldur's Gate and it's been... interesting.

[–] Grimy@lemmy.world 0 points 12 hours ago* (last edited 12 hours ago)

Ya I get what you mean, I'm just saying that to say there's a difference, you would have to be able to see that difference in a blind test.

I understand they have limits, but so do regular people. You don't need to be an expert on a subject to hold a conversation about it.

They aren't intelligent and all that, and they make the stupidest mistakes, but they can more or less hold a convo just as well as the average rando on the internet.

It's definitely hollow but I get why people are getting caught up in it.