this post was submitted on 04 Sep 2025
154 points (96.4% liked)

Technology


cross-posted from: https://programming.dev/post/36866515

Comments

Perspectivist@feddit.uk | 1 point | 1 day ago

The same argument applies to consciousness as well, but I'm talking about general intelligence now.

ExLisper@lemmy.curiana.net | 2 points | 1 day ago (last edited 1 day ago)

I don't think you can define AGI in a way that makes it substrate-dependent. It's simply about behaving in a certain way. A sufficiently complex set of 'if -> then' statements could pass as AGI. The limitations are computing power and the practicality of creating the rules. We already have supercomputers that could easily emulate AGI, but we don't have a practical way of writing all the 'if -> then' rules, and I don't see how creating the rules could be substrate-dependent.
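The 'if -> then' idea can be sketched as a lookup-table agent. This is a toy illustration of the behavioral argument, not a real AGI design; the rule table, function name, and inputs are all invented for the example:

```python
# Hypothetical "rule table" agent: behaviour is just an input -> output
# mapping, independent of whatever substrate evaluates it.
RULES = {
    "What is 2 + 2?": "4",
    "Greet me": "Hello!",
}

def rule_based_agent(observation: str) -> str:
    # Each entry plays the role of one 'if observation -> then response' rule.
    # The practical bottleneck is authoring enough rules (and evaluating them
    # fast enough), not the material the evaluator is made of.
    return RULES.get(observation, "I have no rule for that input.")
```

Any machine that can evaluate this mapping, silicon, vacuum tubes, or anything else, produces identical behaviour, which is the substrate-independence point being made.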

Edit: Actually, I don't know whether current supercomputers could process input fast enough to pass as AGI, but that's still a question of computing power, not substrate. There's nothing to suggest we can't keep increasing computational power without a biological substrate.