this post was submitted on 22 Jan 2024
394 points (94.4% liked)

Technology


"There's no way to get there without a breakthrough," OpenAI CEO Sam Altman said, arguing that AI will soon need even more energy.

[–] Petter1@lemm.ee 3 points 10 months ago* (last edited 10 months ago) (2 children)

Lol, "old M1 laptop"? 3 to 4 years is not old, damn!

(I'm running a MacBookPro5,3 (mid 2009) on Arch, lol)

But it's nice to hear that the M1 (and thus theoretically even the iPad, if you are not talking about the M1 Pro / M1 Max) can already run Llama 2 7B.

Have you tried Mistral AI's model yet? It should be a bit more capable and a bit more efficient, IIRC. And it is Apache 2.0 licensed.

https://mistral.ai/news/announcing-mistral-7b/
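The claim that a base M1 machine (typically 8 GB of unified memory) can fit a 7B-parameter model comes down to quantization arithmetic. A rough sketch (the 4-bit case assumes a llama.cpp-style Q4 quantization; the function name and exact figures are illustrative, not from the thread, and ignore KV-cache and runtime overhead):

```python
# Back-of-the-envelope memory footprint for the weights of a
# 7B-parameter model at different quantization levels.

def weight_memory_gib(n_params: float, bits_per_weight: float) -> float:
    """Approximate GiB needed just to hold the weights."""
    return n_params * bits_per_weight / 8 / (1024 ** 3)

for bits in (16, 8, 4):
    print(f"{bits:>2}-bit: {weight_memory_gib(7e9, bits):.1f} GiB")
# 16-bit: 13.0 GiB  -> too big for an 8 GB M1
#  8-bit:  6.5 GiB  -> tight
#  4-bit:  3.3 GiB  -> fits comfortably
```

So a 4-bit-quantized 7B model leaves headroom even on the smallest M1 configuration, which is why llama.cpp-style setups work there.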

[–] TheRealKuni@lemmy.world 3 points 10 months ago

> But it's nice to hear that the M1 (and thus theoretically even the iPad, if you are not talking about the M1 Pro / M1 Max) can already run Llama 2 7B.

An iPhone XR/XS can run Stable Diffusion, believe it or not.

[–] FractalsInfinite@sh.itjust.works 2 points 10 months ago* (last edited 10 months ago)

> 3 to 4 years is not old

Huh, nice. I got the MacBook Air secondhand, so I thought it was older. Thanks for the suggestion; I'll try Mistral next, perhaps on my phone as a test.