This is because dedicated consumer AI hardware is a dumb idea. If it's powerful enough to run a model locally, you should be able to use it for other things (like, say, as a phone or PC), and if it's sending all its API requests to the cloud, then it has no business being anything but a smartphone app or website.
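To put it concretely, the entire job of one of these cloud-tethered gadgets boils down to something any phone app could do in a handful of lines. This is only a sketch; the endpoint, key, and response shape below are all made up for illustration:

```python
# Minimal sketch of what a cloud-only "AI gadget" does under the hood.
# The endpoint, API key, and JSON shape are hypothetical, not a real service.
import requests  # standard third-party HTTP library

API_URL = "https://api.example-assistant.com/v1/query"  # hypothetical endpoint
API_KEY = "YOUR_KEY_HERE"                               # hypothetical credential

def ask_assistant(prompt: str) -> str:
    """Forward the user's prompt to the cloud and return the reply text."""
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"prompt": prompt},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["reply"]  # hypothetical response field

if __name__ == "__main__":
    print(ask_assistant("What's the weather like in Berlin?"))
```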
I can't agree with that. ASICs specialize in doing one thing at lightning speed while failing at even the most basic tasks outside that niche. It's like claiming that because your GPU is super powerful, it should be able to run your PC without a CPU.
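To sketch what I mean (using PyTorch purely as an example; nothing here implies any particular framework), the accelerator handles one narrow job while a general-purpose CPU still has to run everything else:

```python
# Rough sketch of the CPU/accelerator division of labor described above.
import torch

# Use the GPU/accelerator if one is present, otherwise fall back to the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

# The accelerator is great at exactly one kind of work: big batches of linear algebra.
a = torch.randn(4096, 4096, device=device)
b = torch.randn(4096, 4096, device=device)
c = a @ b  # matrix multiply runs on the accelerator when available

# Everything else -- branching, I/O, parsing, the OS itself -- still needs a
# general-purpose CPU; the accelerator alone can't run the program.
if c.diagonal().sum().item() > 0:
    print("CPU-side control flow decides what to do with the result")
```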
That's fair; dedicated ASICs for AI acceleration are totally a valid consumer product. I meant more along the lines of standalone devices (like the Rabbit R1 and the Humane AI Pin), not components you can add to an existing device. I should have been clearer.