this post was submitted on 04 Nov 2024
278 points (98.3% liked)

[–] solrize@lemmy.world 20 points 2 weeks ago (19 children)

I'm no fan of Apple, but the M4 Max is supposed to be faster than any x86 desktop CPU while using a lot less power. That's per Geekbench 6. I'd be interested in seeing other measurements.

[–] Viri4thus@feddit.org 41 points 2 weeks ago* (last edited 2 weeks ago) (18 children)

Geekbench is about as useful a metric as an umbrella on a fish. Also, the M4 Max will not consume less energy than the competition; that's a misconception arising from the lower SKUs in mobile devices. The laws of physics apply to everyone: at the same reticle size, energy consumption in multi-threaded (nT) workloads is equivalent. Apple's great advantage is that they're usually a node ahead, and eschewing legacy compatibility saves die area, and thus energy, which can be leveraged to reduce power consumption at idle or in single-threaded (1T) workloads. Case in point: Intel's latest mobile CPUs.

[–] independantiste@sh.itjust.works 27 points 2 weeks ago (2 children)

Exactly, the Apple chips excel at low-power tasks and will consume basically nothing doing them. They're also good for small bursty tasks, but for long-lived intensive tasks they behave basically the same as an equivalent x86 chip. People don't seem to know that these chips can easily consume 80-90 W of power when going full tilt.

[–] Buffalox@lemmy.world 5 points 2 weeks ago (2 children)

The new Intel Arrow Lake is supposed to max out at 150 W, but in practice it doesn't. And that's still almost 40% better than previous-gen Intel!
So hovering around 80-90 W max is pretty modest by today's standards.
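A quick back-of-the-envelope check of that "almost 40%" figure, assuming the ~250 W previous-gen ceiling mentioned further down the thread:

```python
# Sanity-checking the "almost 40% better" claim,
# assuming ~250 W for previous-gen Intel and ~150 W for Arrow Lake.
prev_gen_watts = 250
arrow_lake_watts = 150

reduction = (prev_gen_watts - arrow_lake_watts) / prev_gen_watts
print(f"Power reduction: {reduction:.0%}")  # prints "Power reduction: 40%"
```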

[–] Cocodapuf@lemmy.world 6 points 2 weeks ago (2 children)

That's impressive, or should I say scary? 150 W is a lot of heat to dissipate... I hope those aren't laptop chips...

[–] daellat@lemmy.world 8 points 2 weeks ago

The 14900K is an absolute oven

[–] Buffalox@lemmy.world 4 points 2 weeks ago

No, but the M4 Max is claimed to be as fast, and Intel did improve their chip; it's down from 250 W in the previous gen! And the M4 Max is still faster.

[–] independantiste@sh.itjust.works 2 points 2 weeks ago

Oh of course, the Apple chips are faster. That's likely a combination of better efficiency from the newer process node and Apple being able to optimize the chips and power draw much more tightly because they make everything. Apple can also afford to use larger chips because they make a profit on the entire computer, not just the processor itself.

[–] Viri4thus@feddit.org 4 points 2 weeks ago (1 children)

We're condemned to suffer uninformed masses on this. Zen 5 mobile is on N4P at 143 transistors/µm²; the M4 Max is on N3E at 213 transistors/µm². That's a gigantic advantage in power savings and logic per mm² of die. Granted, I don't think the chiplet design will ever reach ARM levels of power gating, but that's a price I'm willing to pay to keep legacy compatibility and expandable RAM and storage. That IO die will always be problematic unless they integrate it into the SoC, but I'd prefer they don't. (Integration also has power-saving advantages; just look at Intel's latest mobile foray.)
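Putting those quoted density figures side by side (a rough sketch; the 143 and 213 numbers are the ones cited above, and real-world densities vary with cell library and design):

```python
# Comparing the quoted logic densities, in transistors per µm²
# (figures as cited above; actual density depends on the cell library used).
zen5_n4p = 143    # TSMC N4P (Zen 5 mobile)
m4_max_n3e = 213  # TSMC N3E (M4 Max)

ratio = m4_max_n3e / zen5_n4p
print(f"N3E offers roughly {(ratio - 1) * 100:.0f}% more logic per mm² than N4P")
# prints "N3E offers roughly 49% more logic per mm² than N4P"
```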

[–] pycorax@lemmy.world 4 points 2 weeks ago

Not to mention, Apple can afford the larger die size per chip since they're vertically integrated and don't have to worry about per-chip cost in the way that Intel and AMD have to when they sell to device manufacturers.
