this post was submitted on 30 Jan 2025
46 points (85.9% liked)
Technology
you are viewing a single comment's thread
LLMs are a dead end
The way I see it, all these general-purpose LLMs and AI products are just stepping stones toward the actual future uses of ML.
Companies are throwing money and research at them for easy gains, but once the bubble pops, most of them will become irrelevant and die off. Once there's no incentive to "move fast and break things", the slow, methodical research to find where ML actually belongs in this world can begin.
In the future, specialized companies will utilize all the research being done today to craft more focused tools that do things that machine learning is actually useful for.
ML tech isn't going away. It just needs to mature to the point where these useless bots aren't worth the effort.
I don't think we're going to throw a little more hardware at one and have it suddenly become an AGI, but that doesn't mean it doesn't have considerable utility.
Also, there are a bunch of "composite" systems built in AI research that combine multiple mechanisms. Even if you're aiming for human-level AI, you may use components in that system that are not themselves fully capable of acting at that level.
Like, okay. Think of our own minds. We've got a bunch of hard-coded vision machinery, which is part of why we're susceptible to optical illusions. Our visual processing system isn't an intelligence on its own, but it's an important part of letting us function as humans in the world.