this post was submitted on 21 Feb 2024
278 points (95.1% liked)

[–] DarkThoughts@fedia.io 0 points 9 months ago (1 children)

Did you miss the first part, where I explained that I couldn't get it to run on my GPU? I only have a 6650 XT anyway, but even that would be significantly faster than my CPU. How much faster I can't say without trying it, but I suspect that with longer chats, and consequently larger context sizes, it would still be too slow to be really usable. Unless you're okay with waiting ages for a response.
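The "too slow to be usable" point comes down to simple arithmetic on generation throughput. A rough sketch, with *assumed* illustrative numbers (not measurements from any real hardware; actual speeds depend heavily on model size, quantization, and context length):

```python
# Back-of-the-envelope latency for a local LLM reply.
# The throughput figures below are assumptions for illustration:
# CPU-only inference on a 7B-class quantized model might manage
# a few tokens/s, while a mid-range GPU might reach tens of tokens/s.
def response_seconds(reply_tokens: int, tokens_per_second: float) -> float:
    """Seconds to generate a reply of `reply_tokens` at a given throughput."""
    return reply_tokens / tokens_per_second

reply = 300  # a few paragraphs of output

cpu_time = response_seconds(reply, 5.0)    # assumed ~5 tok/s on CPU
gpu_time = response_seconds(reply, 35.0)   # assumed ~35 tok/s on GPU

print(f"CPU: {cpu_time:.0f}s, GPU: {gpu_time:.1f}s")
```

At those assumed rates a 300-token reply takes about a minute on CPU versus under ten seconds on GPU, and long chats add prompt-processing time on top of this, which is why large contexts hurt even more.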

[–] Flumpkin@slrpnk.net 1 points 9 months ago

Sorry, I'm just curious in general how fast these local LLMs are. Maybe someone else can give some rough info.
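For anyone wanting rough numbers on their own machine rather than anecdotes: if the model runs under llama.cpp, its bundled `llama-bench` tool reports prompt-processing and generation speed in tokens per second. A sketch, assuming a local GGUF model file (the path and layer count here are placeholders, not from the thread):

```shell
# Benchmark a local GGUF model with llama.cpp's llama-bench.
# -p: prompt tokens to process, -n: tokens to generate,
# -ngl: layers to offload to the GPU (0 = CPU-only, for comparison).
llama-bench -m ./model-7b-q4.gguf -p 512 -n 128 -ngl 0
llama-bench -m ./model-7b-q4.gguf -p 512 -n 128 -ngl 99
```

Comparing the two runs gives a concrete CPU-vs-GPU tokens/s figure for that exact model and quantization.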