This post was submitted on 13 Aug 2025
164 points (97.1% liked)
Technology
you are viewing a single comment's thread
view the rest of the comments
Sloppier compute architecture, needed to drive down costs on a sloppier method of computing.
If it makes AI cheaper, then great, because AI is a massive fucking waste of power, but other than that I am grossed out by this tech and want none of it.
Seems like a win to me
Slop has nothing to do with it. Some problems just aren’t deterministic, and this sort of chip could be a massive performance and efficiency boost for them. They’re potentially useful for all sorts of real-world simulations and detection problems.
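To make that concrete, here's a toy Monte Carlo sketch of the kind of sampling-heavy, non-deterministic workload this targets. It's just plain NumPy on a CPU, with the PRNG standing in for whatever noise source such a chip would actually expose, so treat it as an illustration of the workload, not of the hardware:

```python
import numpy as np

def estimate_pi(samples=1_000_000, seed=None):
    """Monte Carlo estimate of pi from uniform random points in the unit square."""
    rng = np.random.default_rng(seed)
    x = rng.random(samples)
    y = rng.random(samples)
    # Count points that land inside the quarter circle of radius 1.
    inside = np.count_nonzero(x * x + y * y <= 1.0)
    return 4.0 * inside / samples

print(estimate_pi())  # ~3.14; accuracy comes from sample count, not bit precision
```

The answer gets better by drawing more samples, not by making each draw more precise, which is exactly why noisy-but-cheap randomness can be good enough for this class of problem.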
This is literally it. (Granted, I'm sure there are other use cases, but you know they're following those AI dollars.)
When the results don't matter, use "Does Not Matter" chips! They work 3% of the time, 100% of the time. BUT they consume way less power! Great for any random statistics; if the result doesn't match what you want, just press again! Buy "Does Not Matter" chips now!
It makes some sense to handle self-discovered real numbers of infinite precision using analog methods, though I'm curious how they handle noise, since in the real world, unlike the mathematical world, all storage, transmission and calculation carry some error.
That said, my experience way back, from a project I did at Uni with neural networks in their early days, is that they'll even make up for implementation bugs (we managed about an 85% rate of number recognition with a buggy implementation), so maybe that kind of thing is quite robust in the face of analog error.
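As a rough illustration of that robustness (not the original project, just a toy sketch using scikit-learn's digits dataset, which is my choice of stand-in): train a small classifier, perturb its weights with Gaussian noise as a crude proxy for analog imprecision, and watch how the accuracy degrades.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)
clean_weights = [w.copy() for w in clf.coefs_]

rng = np.random.default_rng(0)
for noise_std in (0.0, 0.05, 0.1, 0.2):
    # Add Gaussian noise scaled to each layer's own weight spread,
    # a crude stand-in for analog error in storage or computation.
    clf.coefs_ = [w + rng.normal(0.0, noise_std * w.std(), w.shape)
                  for w in clean_weights]
    print(f"weight noise {noise_std:.2f} -> accuracy {clf.score(X_test, y_test):.3f}")
```

The expectation, at least for small noise levels, is a gradual drop in accuracy rather than a cliff, which matches the "tolerant of sloppy hardware" intuition, though real analog error obviously won't be clean Gaussian noise.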