this post was submitted on 01 Apr 2024
36 points (92.9% liked)
Technology
Idk, I find this hard to believe. I would think the challenge is more about access to the information (gates, bandwidth), fast storage to hold all of it, and improving their models.
When you think about what's available on the internet, there's an enormous amount of human knowledge (and propaganda) out there. With enough deus-ex-machina tech, there's no reason AI shouldn't be able to learn most of anything from the knowledge that's available, given the right trainers.
Yes, it's BS, like most of the AI takes here.
The kernel of truth is scaling laws: model quality keeps improving predictably as you scale up parameters, compute, and training data, which is why the labs are so hungry for ever more data.
Well, that's the rub, right? Garbage in, garbage out. For an LLM, the value is in predicting the next token, but we've seen how racist current datasets can be. Once you filter that out, there isn't a lot of high-quality data left.
So yes, we have a remarkable amount of (often wrong) information to pull from.
Mhm, I wonder when we'll have the resources to build one that can tell truth from lies. I suppose you have to learn to crawl before you learn to walk, but these things are still having trouble rolling over.