this post was submitted on 12 Oct 2024
you are viewing a single comment's thread
Why is this getting downvoted?
There are those who fear A.I. ... but it's on its way regardless.
It's really not. It's a fad, like Zip disks or e-waste recycling. Only it's even more expensive, while reducing productivity and quality of work everywhere it's implemented, all for the vague hope that it might eventually get better.
Do you think AI and/or AGI is a possibility at all, given enough time?
Because if the answer is yes, then don't we need people working on it all the time to keep inching towards it? I'm not saying that the current implementations are anywhere close, but they do have their use cases. I'm a software developer, and my boss, the lead engineer (the smartest person I've ever met), has made some awesome tools that save our company of 7 people maybe 100 hours of work a month.
People used to complain about the LHC, and it has made countless discoveries that help other fields.
Powered flight was an important goal, but that wouldn't have justified throwing all the world's resources at making Da Vinci's flying machine work. Some ideas are just dead ends.
Transformer-based generative models do not have any demonstrable path to becoming AGI, and we're already hitting a hard ceiling of diminishing returns on the very limited set of things they actually can do. Developing better versions of these models requires exponentially larger amounts of data at exponentially scaling compute costs (yes, exponentially: to the point where current estimates are that there literally isn't enough training data in the world to get past another generation or two of development on these things).
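To put rough numbers on the data problem, here's a minimal back-of-the-envelope sketch. It assumes the Chinchilla-style rule of thumb of roughly 20 training tokens per model parameter, a guessed stock of about 30 trillion tokens of usable public text, and a handful of hypothetical model sizes; all of these figures are my own illustrative assumptions, not claims from the comment above.

```python
# Back-of-the-envelope sketch of the training-data ceiling.
# Assumptions (illustrative only, not from the comment above):
#   - ~20 training tokens per parameter (Chinchilla-style compute-optimal ratio)
#   - ~30 trillion tokens of usable public text in total
#   - hypothetical model sizes roughly one "generation" apart

TOKENS_PER_PARAM = 20          # assumed compute-optimal tokens-per-parameter ratio
USABLE_PUBLIC_TOKENS = 30e12   # assumed total stock of usable text, in tokens

for params in (70e9, 400e9, 2e12, 10e12):   # hypothetical parameter counts
    tokens_needed = params * TOKENS_PER_PARAM
    share = tokens_needed / USABLE_PUBLIC_TOKENS
    print(f"{params / 1e9:>6.0f}B params -> "
          f"{tokens_needed / 1e12:>6.1f}T tokens "
          f"({share:.2f}x the assumed public text)")
```

On these assumed numbers, a compute-optimal training run blows past the entire stock of public text within roughly one or two more scale-ups, which is the "not enough training data in the world" point.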
Whether or not AGI is possible, it has become extremely apparent that this approach is not going to be the one that gets us there. So what is the benefit of continuing to pile more and more resources into it?
LLMs and GANs in general are to AI and AGI what a hand-pumped well is to the ISS. Sure, they're both technological marvels of their time, but if you want to float in microgravity, there is no possible adjustment you can make to the former to get it to actually behave like the latter.