Technically correct (tm)
Before you get your hopes up: Anyone can download it, but very few will be able to actually run it.
What are the resource requirements for the 405B model? I did some digging but couldn’t find any documentation during my cursory search.
Typically you need about 1GB of graphics RAM for each billion parameters (i.e. one byte per parameter, assuming 8-bit weights). This is a 405B-parameter model. Ouch.
Edit: you can try quantizing it. That reduces the memory required per parameter to 4 bits, 2 bits, or even 1 bit. As you shrink the weights, the quality of the model can suffer, so it's a trade-off; in the extreme case you might be able to run this in under 64GB of graphics RAM.
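To put rough numbers on that, here's a back-of-envelope sketch in Python (weights only; activations and KV cache would add more on top):

```python
# Weight memory = parameter count * bits per parameter / 8.
# Counts weights only; activations and KV cache add overhead on top.

PARAMS = 405e9  # Llama 405B

for bits in (16, 8, 4, 2, 1):
    gb = PARAMS * bits / 8 / 1e9
    print(f"{bits:>2}-bit: ~{gb:,.0f} GB")

# 16-bit: ~810 GB, 8-bit: ~405 GB, 4-bit: ~203 GB,
#  2-bit: ~101 GB, 1-bit:  ~51 GB (the "under 64GB" case)
```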
Or you could run it on CPU and system RAM, at a much slower rate.
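Roughly how much slower? Token generation is largely memory-bandwidth-bound, since each new token streams the weights once, so a crude estimate is bandwidth divided by model size. A minimal sketch; the bandwidth figures below are ballpark assumptions, not measurements:

```python
# Crude throughput estimate: tokens/s ~= memory bandwidth / weight size.
# Assumes generation is memory-bound and weights are read once per token.

model_gb = 405e9 * 4 / 8 / 1e9  # 405B params at 4-bit ~= 203 GB

bandwidths_gbs = {
    "dual-channel DDR4 desktop": 50,    # ballpark GB/s
    "multi-channel DDR4 server": 150,   # ballpark GB/s
    "modern GPU HBM (per card)": 1000,  # ballpark GB/s
}

for name, gbs in bandwidths_gbs.items():
    print(f"{name}: ~{gbs / model_gb:.2f} tokens/s")
```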
Yeah uh let me just put in my 512GB RAM stick…
Samsung do make them.
Good luck finding 512GB of VRAM.
LGA2011 motherboards are quite cheap (https://www.ebay.com/p/116332559). Drop in two Xeon E5-2696 v4 CPUs (44 threads each, 88 threads total) and eight 32GB DDR4 sticks, and it comes out quite cheap. You can also install Nvidia P40 cards with 24GB of VRAM each. You can max out this build for AI for under $2000.
Finally! My dumb dumb 1TB RAM server (4× E5-4640 + 32 × 32GB DDR3 ECC) can shine.