this post was submitted on 18 Jul 2024
490 points (98.4% liked)

Memes
[–] neidu2@feddit.nl 47 points 4 months ago* (last edited 4 months ago) (15 children)

Technically possible with a small enough model to work from. It's going to be pretty shit, but "working".

Now, if we were to go further down in scale, I'm curious how/if a 700MB CD version would work.

Or how many 1.44MB floppies you would need for the actual program and smallest viable model.
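Treating the floppy question as pure arithmetic, here is a quick sketch. The sizes are assumptions for illustration only (an inference binary of a few MB, a sub-MB toy model), not real measurements:

```python
# Back-of-the-envelope floppy count for the question above.
# Both sizes below are guesses, not measured values.
FLOPPY_BYTES = 1_474_560          # capacity of a formatted 1.44MB floppy

def floppies_needed(total_bytes: int) -> int:
    """Round up to whole disks."""
    return -(-total_bytes // FLOPPY_BYTES)

program_bytes = 4 * 1024 * 1024   # assumed ~4MB inference binary
model_bytes = 1 * 1024 * 1024     # assumed ~1MB tiny quantised model

print(floppies_needed(program_bytes + model_bytes))  # 4 disks
```

So under those (optimistic) assumptions, a handful of floppies would do it; the real bottleneck is finding a model that small which still produces anything coherent.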

[–] PixelatedSaturn@lemmy.world 12 points 4 months ago (6 children)

Might be a DVD. A 70b Ollama LLM is like 1.5GB, so you could fit many models on one DVD.

[–] ignotum@lemmy.world 8 points 4 months ago (3 children)

A 70b model taking 1.5GB? That's only about 0.17 bits per parameter?

Are you sure you're not thinking of a heavily quantised and compressed 7b model or something? Ollama's llama3 70b is 40GB from what I can find; that's a lot of DVDs.
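The quantisation arithmetic in this exchange can be checked directly. The 4.7GB figure is the standard single-layer DVD capacity; everything else comes from the numbers in the comments:

```python
import math

GB = 10**9
params = 70e9  # 70 billion parameters

# 1.5GB for 70B parameters would be implausibly tight:
print(round(1.5 * GB * 8 / params, 2))   # 0.17 bits per parameter

# 40GB for 70B parameters is a typical 4-5 bit quantisation:
print(round(40 * GB * 8 / params, 2))    # 4.57 bits per parameter

# Single-layer DVDs (4.7GB each) needed for the 40GB model:
print(math.ceil(40 / 4.7))               # 9 DVDs
```

So the 1.5GB figure works out to under a fifth of a bit per weight, far below anything current quantisation schemes achieve, while 40GB matches a normal ~4.6 bit quant and would span about nine single-layer DVDs.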

[–] PixelatedSaturn@lemmy.world 9 points 4 months ago

Ah yes, probably the smaller version, you're right. Still a very good LLM, better than GPT-3.

[–] 9point6@lemmy.world 7 points 4 months ago

Less than half of a BDXL though! The dream still breathes

[–] Steve@startrek.website 5 points 4 months ago

For some reason, triple-layer writable Blu-ray exists: 100GB each.

https://www.verbatim.com/prod/optical-media/blu-ray/bd-r-xl-tl/bd-r-xl-tl/
