this post was submitted on 18 Apr 2026
89 points (81.6% liked)

[–] Simulation6@sopuli.xyz 10 points 4 hours ago (1 children)

AI is not something somebody is going to develop in their mom's basement. AGI is NOT inevitable. The current models may grow sophisticated enough that it is hard to distinguish them from AGI, but they will still be LLMs.
I see the current AI bubble as a bunch of guys digging a hole, realizing they can't get out, and deciding the only way out is to keep digging.

[–] Iconoclast@feddit.uk 3 points 4 hours ago* (last edited 4 hours ago) (1 children)

AI is not something somebody is going to develop in their mom's basement. AGI is NOT inevitable.

Plenty of AI systems have already been developed by private individuals on their personal computers. This is not hypothetical. And I'm not claiming that our first AGI will have anything to do with LLMs.

I view AGI as inevitable because it's the natural end goal of incrementally improving our AI systems over a long enough period of time. As with all human-created technology, we will keep improving it. It doesn't matter how slow the process is - as long as we keep heading in that direction, we will eventually reach the destination. The only things that could stop us, as far as I can see, are either destroying ourselves some other way before we get there, or substrate dependence - meaning general intelligence simply cannot be created without our biological wetware. However, I see no reason to assume that, since human brains are made of matter just like computers are, and I don't think there's anything supernatural about intelligence.

[–] Simulation6@sopuli.xyz 1 points 1 hour ago

The term AI has been greatly diluted over time. I guess I should have said AGI instead.

For your second point, I quote the Spartans: "If." Current tech is hugely expensive.