Can anyone give me a convincing argument against the sentience of AI at this point? A self-preservation instinct ranks very high as an indicator of it.
LLMs (Large Language Models, like Claude) are not AGIs (Artificial General Intelligence). LLMs generate convincing text by mapping the relationships between words scraped from their training data. Even if they are given "tools" that provide interfaces to reference new data or output data into other systems, they still don't really learn, understand, comprehend, gain actual awareness, or feel... they just mimic their training data.
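To make that concrete, here's a toy sketch of what "mapping the relationships between words" amounts to: a trivial bigram model in plain Python. Real LLMs like Claude are vastly more sophisticated transformers, so treat this only as an analogy, with a made-up corpus.

```python
import random
from collections import defaultdict

# "Training": record which word follows which in the corpus.
corpus = "the cat sat on the mat and the cat ate the fish".split()
following = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev].append(nxt)

# "Generation": replay those observed pairs, one word at a time.
def generate(start, length=8):
    word, out = start, [start]
    for _ in range(length):
        word = random.choice(following.get(word, corpus))
        out.append(word)
    return " ".join(out)

print(generate("the"))
```

The output can look fluent, but every word is replayed from statistics of the training text; nothing in the program understands what it is saying.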
I know how LLMs work.
There’s only one thing you mentioned there that is actually used as a basis to qualify or disqualify sentience: whether it feels or not.
How do you know it doesn't feel? How do we define feeling for an entity that is inherently non-biological?
I could make the argument that humans also merely mimic their training data, i.e. the values and behaviors we are taught by society, parents, etc.
This argument has not convinced me that they aren't sentient.
Feeling is analog and requires an actual nervous system, which is dynamic. LLMs exist in a static state that is read from and processed algorithmically. It is only a simulacrum of life and feeling; it has only some of the needed characteristics. Where that boundary lies, though, is hard to determine, I think. Admittedly, we still don't have a full grasp of what consciousness even is. Maybe I'm talking out my ass, but that is how I understand it.
You just posted random words like "dynamic" without explanation.
Not them, but "static" in this context means it doesn't have the ability to update its own model on the fly. If you want a model to learn something new, it has to be retrained.
By contrast, an animal brain is dynamic because it reinforces neural pathways that get used more.
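As a rough sketch of that static/dynamic distinction (plain NumPy, purely illustrative, with made-up numbers; not how any real LLM or real brain works):

```python
import numpy as np

rng = np.random.default_rng(0)

# Static: a frozen weight matrix, like a deployed LLM's parameters.
W_frozen = rng.normal(size=(4, 4))

def infer(x):
    # Reading the weights never modifies them; every call sees the same
    # parameters until the model is retrained offline.
    return np.tanh(W_frozen @ x)

# Dynamic: a toy Hebbian-style rule that strengthens connections
# between co-active units each time the system is used.
W_plastic = rng.normal(size=(4, 4))

def infer_and_adapt(x, lr=0.01):
    global W_plastic
    y = np.tanh(W_plastic @ x)
    W_plastic = W_plastic + lr * np.outer(y, x)  # "use it and it changes"
    return y

x = rng.normal(size=4)
before = W_plastic.copy()
infer(x)             # leaves W_frozen untouched
infer_and_adapt(x)   # nudges W_plastic
print(np.allclose(before, W_plastic))  # False: the dynamic weights moved
```

The frozen matrix never changes no matter how often you query it; the "plastic" one drifts with every use, which is the kind of dynamism the comment above is pointing at.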
That makes more sense
You’re on a programming board and you don’t understand static/dynamic states?
Not in the hand-wavy way it was used in the last post. I understand that Python is dynamically typed, which has nothing to do with the topic.