Womble@lemmy.world 2 points 7 months ago

Look into quantised models (like the GGUF format); these significantly reduce the amount of memory needed and speed up computation at the expense of some quality. If you have 16GB of RAM or more you can run decent models locally without any GPU, though your speed will be more like one word a second than ChatGPT speeds.
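
For instance, here's a minimal CPU-only sketch using the llama-cpp-python package (assuming you've already downloaded a GGUF file; the filename below is just a placeholder, swap in whatever model you grabbed):

```python
# pip install llama-cpp-python
# Runs a 4-bit quantised GGUF model entirely on CPU -- no GPU needed.
from llama_cpp import Llama

llm = Llama(
    model_path="./mistral-7b-instruct.Q4_K_M.gguf",  # placeholder path to your downloaded GGUF
    n_ctx=2048,     # context window size
    n_threads=8,    # CPU threads; set this to your core count
)

out = llm("Explain quantisation in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```

A 7B model at 4-bit quantisation needs roughly 4-5GB of RAM, so it fits comfortably in 16GB alongside everything else you're running.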