this post was submitted on 04 Sep 2025
156 points (96.4% liked)
Technology
Same argument applies for consciousness as well, but I'm talking about general intelligence now.
Well, I'm curious then, because I have never seen, heard, or read anywhere that general intelligence would need some kind of wetware. Why would it? It's just computation.
I have heard and read about consciousness potentially having that barrier, though only as a potential problem, and only if you want conscious robots, of course.
I don’t think it does, but it seems conceivable that it potentially could. Maybe there’s more to intelligence than just information processing - or maybe it’s tied to consciousness itself. I can’t imagine the added ability to have subjective experiences would hurt anyone’s intelligence, at least.
I don't think so. Consciousness has very little influence on the mind; we're mostly just along for the ride. And general intelligence isn't that complicated to understand, so why would it depend on some particular substrate? I think the burden of proof lies on you here.
Very interesting topic though, I hope I'm not sounding condescending here.
Well, first of all, like I already said, I don’t think there’s substrate dependence on either general intelligence or consciousness, so I’m not going to try to prove there is - it’s not a belief I hold. I’m simply acknowledging the possibility that there might be something more mysterious about the workings of the human mind that we don’t yet understand, so I’m not going to rule it out when I have no way of disproving it.
Secondly, both claims - that consciousness has very little influence on the mind, and that general intelligence isn’t complicated to understand - are incredibly bold statements I strongly disagree with. Especially with consciousness, though in my experience there’s a good chance we’re using that term to mean different things.
To me, consciousness is the fact of subjective experience - that it feels like something to be. That there’s qualia to experience.
I don’t know what’s left of the human mind once you strip away the ability to experience, but I’d argue we’d be unrecognizable without it. It’s what makes us human. It’s where our motivation for everything comes from - the need for social relationships, the need to eat, stay warm, stay healthy, the need to innovate. At its core, it all stems from the desire to feel - or not feel - something.
I'm 100% on board with your definitions. But I think you make a small mistake here: general intelligence is about problem solving, reasoning, the ability to build a mental construct out of data, remembering things...
It doesn't, however, imply that it has to be a human doing it (even if the "level" is usually measured against humans) or that a human has to experience it.
Maybe I'm nitpicking, but I feel this is often overlooked, and lots of people conflate, for example, AGI with a need for consciousness.
Then again, maybe computers cannot be as intelligent as us 😞 but I sincerely doubt it.
So IMO, the human mind probably needs its consciousness to have general intelligence (as you said, it probably wouldn't function at all without it, or would function very differently), but I'd argue that's just because we're humans with wetware and all that junk, and it doesn't at all mean consciousness is an inherent part of intelligence itself. And I see absolutely no reason why it must be.
Complicated topic for sure!