"we created the problem of soldered-on ram! now we have the solution: a new standard, for no fucking reason!" -every memory, board, and system company
But the article explains that there is a technical reason.
I still don't understand why this is seemingly not a problem in any other application.
Desktops, servers and even some chonkier laptops manage to work with regular (SO)DIMMs just fine.
I'm guessing regular non-LP DDR works fine socketed in desktops because power is nearly a non-issue. Need to burn a few watts to guarantee signal integrity? We've got a chonky PSU, so no problem. On mobile devices, however, every watt matters.
Plus the smaller chips (like the CPU) are designed for lower voltage and current. They can't handle dialing up the power; they'd melt.
I recently got a mini PC with a processor that has a TDP of 6 W, and it uses run-of-the-mill SODIMMs. The power supply for that is a pretty regular wall-socket power adapter, the same kind you'd see for, say, a media box.
I suspect it's not even a few watts (at 3.3 V, 1 W works out to around 300 mA, which is an insane amount of current for a signal line); more like tens or even hundredths of a watt.
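Just to put numbers on that, a back-of-the-envelope sketch (the 3.3 V here is an illustrative logic level, not an actual DDR signalling voltage):

```python
# Current implied by a per-line power budget: I = P / V
def implied_current_ma(power_w: float, voltage_v: float) -> float:
    return power_w / voltage_v * 1000  # convert amps to milliamps

for power_w in (1.0, 0.1, 0.01):
    print(f"{power_w:>4} W at 3.3 V -> {implied_current_ma(power_w, 3.3):.0f} mA")
# 1.0 W -> ~303 mA (absurd for a signal line)
# 0.1 W -> ~30 mA
# 0.01 W -> ~3 mA (much closer to realistic per-pin levels)
```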
Mind you, what really changes here is voltage rather than current: these things run at a lower voltage, which helps with speed and reduces the power dissipated as heat (so they waste less power and heat up less). That's where signal integrity on longer signal traces becomes more of a problem: lower-voltage signals sit closer to the noise level, and the voltage drop from the resistance of the circuit-board traces eats a higher proportion of the original voltage, so the longer the trace, the more likely it is that whatever reaches the other side is pretty much at the same level as the noise.
Still matches what you wrote, by the way: power = voltage × current, so all else being equal, lower voltage does mean less power consumed. It's just that you were a bit off on the scale of the power involved, and there's more to running at a lower voltage than lower power dissipation: it also means less heat (which follows directly from the lower power) and higher speeds (which is for different reasons).
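To make the trace-length argument concrete, here's a toy calculation. Every number is invented for illustration (real DDR signal integrity involves transmission-line effects, not just DC resistance), but it shows why a fixed resistive drop costs a low-voltage swing proportionally more:

```python
# Toy model: the resistive drop along a trace is a fixed voltage,
# so it eats a larger fraction of a low-voltage swing.
# All values are hypothetical, not from any DDR spec.
OHMS_PER_CM = 0.5   # hypothetical trace resistance per cm
DRIVE_A = 0.02      # hypothetical 20 mA line driver

for swing_v in (1.5, 0.5):           # higher- vs lower-voltage signalling
    for trace_cm in (2, 15, 40):     # soldered-close vs socketed-far routing
        drop_v = OHMS_PER_CM * trace_cm * DRIVE_A
        lost_pct = drop_v / swing_v * 100
        print(f"{swing_v} V swing, {trace_cm:>2} cm trace: "
              f"loses {drop_v:.2f} V ({lost_pct:.0f}% of the swing)")
```

At a 1.5 V swing the 40 cm trace loses about a quarter of the signal; at 0.5 V the same trace loses 80% of it, leaving barely anything above the noise.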
Normal DIMMs work fine, but soldered RAM can just be much faster and in general better. That's not an acceptable compromise on most desktops, but for laptops, which also have to be smaller and need to worry about stuff like battery life, it matters more.
Sounds like there is a bunch of nuance in this topic!
But I want clear black and white distinctions and outrage!!!
Laptops with SODIMM DDR5 not only use much more power, they're also slower than LPDDR5.
Ex: the Intel ThinkPad T16 has 5600 MT/s RAM in SODIMM form, but the AMD version with soldered RAM runs at something like 6400 MT/s (see the bandwidth math below).
Desktops/servers get around this as best they can by just blasting the power away.
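For scale, here's the peak-bandwidth difference those transfer rates imply, assuming a typical dual-channel laptop setup with a 128-bit (16-byte) total bus; real sustained bandwidth is lower:

```python
# Peak DDR bandwidth = transfer rate (MT/s) * total bus width (bytes)
def peak_gb_s(mt_per_s: int, bus_bytes: int = 16) -> float:
    return mt_per_s * 1e6 * bus_bytes / 1e9

for label, rate in (("DDR5-5600 (SODIMM)", 5600),
                    ("LPDDR5-6400 (soldered)", 6400)):
    print(f"{label}: {peak_gb_s(rate):.1f} GB/s peak")
# DDR5-5600 (SODIMM): 89.6 GB/s
# LPDDR5-6400 (soldered): 102.4 GB/s
```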
My understanding is that those either are slower (SODIMMs) or are allowed to use more power (DIMMs) to maintain signal fidelity.