this post was submitted on 17 Dec 2024
227 points (99.1% liked)

[–] brucethemoose@lemmy.world 61 points 19 hours ago (12 children)

In addition to pushing something before it’s ready and where it’s not welcome, Apple’s own stinginess completely screwed them over.

What do LLMs need to be smart? RAM, both for their weights and for holding real data to reference. What has Apple relentlessly price gouged and skimped on for years? Yeah, I'll give you one guess…

[–] fuckwit_mcbumcrumble@lemmy.dbzer0.com 3 points 14 hours ago (6 children)

LLMs

Emphasis on the first L in LLM. Apple's model is specifically designed to be small enough to work on phones with 8 GB of RAM (the requirement to run this).

The price gouging for RAM was only ever on computers. With phones you got what you got, and you couldn't pay for more.

[–] brucethemoose@lemmy.world 6 points 14 hours ago* (last edited 14 hours ago) (4 children)

Yeah... and it kinda sucks because it's small.

If Apple shipped 16GB/24GB like some Android phones did well before the iPhone 16, it would be far more useful. 16-24GB (aka 14B-32B class models) is the current threshold where quantized LLMs really start to feel 'smart,' and they could've continue-trained a great Apache 2.0 model instead of building a tiny, meager one from scratch.
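To see why 8 GB is limiting, here's a rough back-of-envelope sketch of the RAM needed just to hold quantized weights. The model sizes come from the thread above; the 4-bit quantization level is an assumed, typical figure for illustration, and this ignores KV cache, activations, and OS overhead:

```python
# Rough estimate of RAM needed for quantized LLM weights alone.
# 4 bits/weight is an assumption (a common quantization level);
# real usage adds KV cache, activations, and OS overhead on top.

def weight_ram_gb(params_billion: float, bits_per_weight: float = 4.0) -> float:
    """Approximate decimal GB needed to hold the quantized weights."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

for size in (3, 14, 32):
    print(f"{size}B model @ 4-bit ~= {weight_ram_gb(size):.1f} GB of weights")
# A ~3B model squeezes into an 8 GB phone alongside the OS; the
# 14B-32B class needs the 16-24 GB the comment is talking about.
```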

[–] dependencyinjection@discuss.tchncs.de 3 points 13 hours ago (1 children)

I don’t know how much RAM is in my iPhone 14 Pro, but I’ve never thought ooh this is slow I need more RAM.

Perhaps it'll be an issue with this stupid Apple Intelligence, but I don't care about using that on my next upgrade cycle.

[–] brucethemoose@lemmy.world 2 points 13 hours ago (1 children)

My old Razer Phone 2 (circa 2019) shipped with 8GB RAM, and that (and the 120hz display) made it feel lightning fast until I replaced it last week, and only because the microphone got gunked up with dust.

Your iPhone 14 Pro has 6GB of RAM. It's a great phone (I just got a 16 Plus on a deal), but that will significantly shorten its longevity.

[–] dependencyinjection@discuss.tchncs.de 1 points 13 hours ago (1 children)

I wonder how much more efficient RAM use can be when the manufacturer makes both the software and the hardware? It has to help, right? I don't know what a 16 Pro feels like compared to this, but I doubt I would notice.

[–] brucethemoose@lemmy.world 2 points 13 hours ago* (last edited 13 hours ago)

Your OS uses it efficiently, but fundamentally it also limits what app developers can do. They have to make apps with 2-6GB in mind.

Not everything needs a lot of RAM, but LLMs are absolutely an edge case where "more is better, and there's no way around it," and they aren't the only one.
