this post was submitted on 23 Nov 2024
166 points (86.1% liked)

Technology


I'm usually the one saying "AI is already as good as it's gonna get, for a long while."

This article, in contrast, quotes the folks building the next generation of AI - and they're saying the same thing.

[–] cron@feddit.org 53 points 13 hours ago (3 children)

It's absurd that some of the larger LLMs now use hundreds of billions of parameters (e.g. llama3.1 with 405B).

This doesn't really seem like a smart use of resources if you need several of the largest GPUs available just to run one conversation.

[–] Cheems@lemmy.world 2 points 1 hour ago

That's capitalism

[–] WalnutLum@lemmy.ml 10 points 10 hours ago

Seeing as the full unquantized FP16 Llama 3.1 405B requires around a terabyte of VRAM (16 bits per parameter, plus context), I'd say way more than several.
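The "around a terabyte" figure checks out with simple arithmetic - a rough back-of-the-envelope sketch in Python (the context/KV-cache term is left out here, since it varies with sequence length, batch size, and attention layout):

```python
# VRAM needed just to hold Llama 3.1 405B's weights at FP16.
# FP16 = 16 bits = 2 bytes per parameter.
params = 405e9           # 405 billion parameters
bytes_per_param = 2      # FP16
weights_gb = params * bytes_per_param / 1e9
print(f"weights alone: {weights_gb:.0f} GB")  # 810 GB before any context
```

Add the KV cache on top of those 810 GB and you land near a terabyte, which is dozens of 80 GB-class GPUs for a single deployment.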

[–] cyberpunk007@lemmy.ca 15 points 12 hours ago (3 children)

I wonder how many GPUs my brain is

[–] blackbelt352@lemmy.world 31 points 11 hours ago

It's a lot. Like a lot a lot. GPUs have about 150 billion transistors, but each transistor makes only a single connection, in what is essentially a 2D layout printed on silicon.

Each neuron makes thousands of connections, and there are roughly 86 billion neurons in a blobby lump of fat and neurons that occupies 3D space. Combine that with the fact that everything actually functions through patterns of multiple neurons firing together, and you get an absurdly high ceiling for how powerful human brains are.

At this point, I'm not sure there are enough GPUs in the world to mimic what a human brain can do.
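To put rough numbers on that comparison - using commonly cited estimates (~86 billion neurons, a few thousand synapses each), not precise measurements:

```python
# Crude synapse-vs-transistor count comparison. All figures are
# ballpark estimates, and a synapse is not equivalent to a transistor;
# this only illustrates the scale gap.
neurons = 86e9               # ~86 billion neurons in a human brain
synapses_per_neuron = 7000   # often-quoted average
synapses = neurons * synapses_per_neuron

gpu_transistors = 150e9      # roughly, for a large modern GPU die

print(f"synapses: {synapses:.1e}")                          # ~6.0e+14
print(f"ratio vs one GPU: {synapses / gpu_transistors:.0f}x")  # 4013x
```

And that still ignores that a synapse is an adaptive analog connection, not a fixed binary switch - so the real gap is even larger than the raw count suggests.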

[–] bobs_monkey@lemm.ee 16 points 12 hours ago (2 children)
[–] cyberpunk007@lemmy.ca 2 points 47 minutes ago (1 children)
[–] bobs_monkey@lemm.ee 1 points 18 minutes ago

You said GPUs, not CPUs and threading capabilities

[–] abcd@feddit.org 1 points 5 hours ago

The Answer to the Ultimate Question of Life, The Universe, and Everything

[–] cron@feddit.org 6 points 12 hours ago (1 children)

I don't think your brain can reasonably be compared with an LLM, just like it can't be compared with a calculator.

[–] GetOffMyLan@programming.dev 11 points 10 hours ago

LLMs are based on neural networks, which are a massively simplified model of how our brains work. So you kind of can, as long as you keep in mind that they're orders of magnitude simpler.
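To show just how simplified: the entire unit that LLMs stack billions of times is a weighted sum fed through a nonlinearity. A minimal sketch (the inputs, weights, and sigmoid choice here are illustrative, not from any particular model):

```python
import math

def artificial_neuron(inputs, weights, bias):
    """One 'neuron': weighted sum of inputs plus a bias, squashed
    through a sigmoid. Compare with a biological neuron's ongoing
    electrochemical dynamics - this is the whole model."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))  # sigmoid activation

out = artificial_neuron([1.0, 0.5], [0.4, -0.2], 0.1)
print(round(out, 3))  # 0.599
```

Everything else in an LLM is arrangements of this unit (plus attention, which is itself built from the same multiply-and-sum operations), which is why the analogy to real neurons only goes so far.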