this post was submitted on 21 Oct 2024
933 points (97.3% liked)

Technology

[–] brucethemoose@lemmy.world 76 points 1 month ago* (last edited 1 month ago) (5 children)

As a major locally-hosted AI proponent, aka a kind of AI fan, absolutely. I'd wager it's even worse than crypto, and I hate crypto.

What I'm kinda hoping happens is that bitnet takes off in the next few months/years, and that running a very smart model on a phone or desktop takes milliwatts... Who's gonna buy into Sam Altman's $7 trillion cloud scheme to burn the Earth when anyone can run models offline on their phones, instead of hitting APIs running on multi-kilowatt servers?

And ironically it may be a Chinese company like Alibaba that pops the bubble, lol.

[–] pennomi@lemmy.world 27 points 1 month ago (4 children)

If bitnet takes off, that’s very good news for everyone.

The problem isn’t AI, it’s AI that’s so intensive to host that only corporations with big datacenters can do it.

[–] cybersandwich@lemmy.world 14 points 1 month ago (2 children)
[–] Starbuncle@lemmy.ca 20 points 1 month ago (1 children)
[–] Saleh@feddit.org 2 points 1 month ago

So will the return of the flag conclude the adventures of resource usage in computers?

[–] brucethemoose@lemmy.world 1 points 1 month ago* (last edited 1 month ago)

What Starbuncle said, but bitnet also turns expensive matrix multiplication into simple addition.

Basically, AI will be hilariously easy to run compared to now once ASICs start coming out, though it will run on CPUs/GPUs just fine.
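For illustration (a minimal sketch, not from the original comment): assuming BitNet-style ternary weights in {-1, 0, +1}, a matrix-vector product reduces to additions and subtractions of the activations, with no weight multiplications at all.

```python
import numpy as np

def ternary_matvec(W, x):
    """Matrix-vector product for ternary weights in {-1, 0, +1}:
    each output element is just a sum of +x[j] and -x[j] terms,
    so no weight multiplications are needed."""
    out = np.zeros(W.shape[0], dtype=x.dtype)
    for i, row in enumerate(W):
        # add activations where the weight is +1, subtract where it is -1
        out[i] = x[row == 1].sum() - x[row == -1].sum()
    return out

# sanity check against an ordinary matmul
rng = np.random.default_rng(0)
W = rng.integers(-1, 2, size=(4, 8)).astype(np.float32)  # ternary weights
x = rng.standard_normal(8).astype(np.float32)
assert np.allclose(ternary_matvec(W, x), W @ x)
```

Real bitnet implementations also quantize activations and pack the ternary weights, but the core saving is the same: the multiply-accumulate units that dominate GPU power budgets can be replaced by plain adders, which is what makes dedicated ASICs so attractive.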
