this post was submitted on 29 Apr 2026
221 points (96.2% liked)

Technology

[–] TommySoda@lemmy.world 34 points 21 hours ago (5 children)

I tried one just for shits and giggles a while back to see if there is any merit to their widespread use. The only way you'd find these even remotely realistic or interesting is if you've never had any kind of sexual encounter with a real person, whether in person or over text. After about five minutes of "chatting" with one of these bots, it started to read like half-baked fan fiction that didn't understand the basics of sex or even anatomy. The cadence is very predictable, and it repeats the same wording and phrasing constantly. If you have real-world experience with people, it just feels like a generic chatbot.

In my opinion, this is more proof that these people need to interact with real humans. If these chat bots seem at all human to you, you need to interact with more actual humans.

[–] Earthman_Jim@lemmy.zip 1 points 5 hours ago

We need third places again. Having everything at home is bad for us, but doing everything at home is framed and sold to us as the state of the art status quo. Our tendencies to avoid rejection and conflict are being preyed on and encouraged by the Epstein class because it's most convenient for THEM that we rot alone in our houses. Almost everything that's sold as "convenience" is just another way to avoid each other, and here we are.

[–] stickyprimer@lemmy.world 1 points 7 hours ago (1 children)

Perhaps the point is to seek something that’s not like real humans.

[–] TommySoda@lemmy.world 1 points 5 hours ago (1 children)

Yeah, that's kinda what the article is about. People choosing chatbots over real people. I'm just saying that it's not good for your mental health and even worse for developing social skills.

[–] stickyprimer@lemmy.world 1 points 3 hours ago

Okay. Most of your comment seemed to be focused on whether they resemble actual humans. I don't think we have any information about whether these impact your mental health, but I would tend to agree they can't be good.

[–] captainlezbian@lemmy.world 4 points 14 hours ago (1 children)

So at best it's a bad simulation of half-assed ERP?

[–] TommySoda@lemmy.world 3 points 13 hours ago

Pretty much, yeah. It's like reading fan fiction and assuming that's how real people talk to each other. Similar to watching porn and assuming that's how sex works, when in reality sex is clunky and oftentimes gross.

That's because the porn bots are bullshit. You gotta finesse and woo ChatGPT if you want real love.

[–] HubertManne@piefed.social 6 points 20 hours ago (4 children)

I really don't understand how anyone could want to chat with bots in general. Do people lack the ability to appreciate the genuine? It explains how you get people like Trump. Who wants that kind of interaction?

[–] LordMayor@piefed.social 17 points 18 hours ago (1 children)

There are people who suffer from isolation, anxiety, depression, trauma, or a host of other issues over which they have no control and no support structures to address. Of course these bots aren't a solution, but they are accessible. It's no wonder people use them.

They deserve sympathy, not condescension.

[–] HubertManne@piefed.social -1 points 14 hours ago

Heck, I have those, but I still don't understand how anyone could want to chat with bots. And it's not condescension.

[–] TommySoda@lemmy.world 3 points 19 hours ago

The issue arises when you don't have anyone to talk to. Having something to talk to, even though it's not a real person, can be an enticing way to sate the need to communicate. The problem is that people who don't have a lot of real-life experience with communication fall into the trap of thinking it's better, because it's always agreeable and "listens" better than normal people do. To me that sounds like someone who struggles with oversharing and has poor social skills. What these people should actually be doing to feel more satisfied socially is working on those skills instead of only talking to chatbots that can't say no. If the kinds of relationships people have with chatbots were translated into human relationships, most people would consider them toxic. And how many people do you know who, for some reason, always seek out and end up in toxic relationships?

[–] TubularTittyFrog@lemmy.world 1 points 18 hours ago* (last edited 18 hours ago)

Most of modern life isn't genuine, and when people do encounter the genuine, they don't like it.

They love artifice. They love having their biases confirmed and their egos flattered.

[–] ArbitraryValue@sh.itjust.works 1 points 19 hours ago (1 children)

I find AI to be a better conversation partner than humans in most circumstances. It's not perfect but it's knowledgeable about pretty much every topic and it's always fully engaged and attentive. Most people, by contrast, aren't very interesting and most interesting people are busy. Of course I would prefer to talk to someone who was also subjectively experiencing and enjoying the conversation, but I can get a lot out of a conversation even without that.

[–] HubertManne@piefed.social 3 points 19 hours ago (2 children)

It does not understand what it's saying. It's fine for summarizing searches or surfacing known best practices, but I would not call what it does conversation.

[–] TubularTittyFrog@lemmy.world 2 points 16 hours ago* (last edited 16 hours ago) (1 children)

People fall in love with fictional characters in books and other media, mostly as a product of their imagined interactions with the character.

This isn't any different; it's just an AI version of it. It's still mostly imaginative fantasy at the end of the day, and it's a form of escapism from the real world.

The New Yorker had an article about this: a housewife had an AI boyfriend who was her version of Geralt from The Witcher, and she was using it to cope with a stillbirth from five years earlier. Her AI Geralt was the only one who "really understood her" and her struggles with the trauma. It's all entirely a fiction in her head, but it's a mechanism for self-soothing, and relatively harmless compared to, say, doing drugs or divorcing her husband or other ways her coping might have manifested. It was basically fan fiction with an AI agent helping her co-write.

[–] HubertManne@piefed.social 1 points 14 hours ago

This I understand. As a video game or a laugh, sure. But it's not conversation.

[–] Grimy@lemmy.world 0 points 17 hours ago (2 children)

Kind of feels like semantics.

Let's say I give you a Discord link and tell you that half the people there are bots and half aren't. Realistically, LLMs are at a level where you won't be able to tell which is which.

So what then? You're only having a conversation half the time, but you can't point out when that is? Feels a bit hollow.

This probably happens on Lemmy. You probably have interactions that you qualify as conversations in your head but that are with bots.

[–] chunes@lemmy.world 1 points 4 hours ago

People are in for a rude awakening when we discover that 'next token prediction' is what intelligence means after all.

[–] HubertManne@piefed.social 2 points 14 hours ago (1 children)

Back-and-forths, sure, but only some attain the level of a conversation. Yeah, bots exist, but social media is not a substitute for the real world. I would not call it semantics; it's my experience talking and chatting with humans and with AI. My big tip for folks who want a sense of an LLM's limits is to engage it on a topic you're very familiar with, or one where you can see the effects immediately, like playing a video game. I have been using it with Baldur's Gate and it's been... interesting.

[–] Grimy@lemmy.world 0 points 12 hours ago* (last edited 11 hours ago)

Ya, I get what you mean. I'm just saying that to claim there's a difference, you would have to be able to see that difference in a blind test.

I understand they have limits, but so do regular people. You don't need to be an expert on a subject to hold a conversation about it.

They aren't intelligent and all that, and they make the stupidest mistakes, but they can more or less hold a convo as well as the average rando on the internet.

It's definitely hollow but I get why people are getting caught up in it.