this post was submitted on 05 Nov 2024
502 points (99.0% liked)

Technology

top 37 comments
[–] Buffalox@lemmy.world 65 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Back in 2017 I predicted the stock price would go over $100 when that happened.
It took about 3-4 years longer than expected, but still, congratulations to AMD on their successful fight back from the brink of bankruptcy.

[–] SnotFlickerman@lemmy.blahaj.zone 35 points 2 weeks ago* (last edited 2 weeks ago) (4 children)

Not to diminish the hard work AMD has put in, but their rise is at least partially related to Intel's ongoing issues with quality assurance (or rather, the lack thereof). It's arguable that AMD holds a stronger position partly because of Intel's weakness over the last 10 years.

[–] Fubarberry@sopuli.xyz 88 points 2 weeks ago (1 children)

Having a usable product while your opponents continually shoot themselves in the foot is a viable market strategy.

[–] SnotFlickerman@lemmy.blahaj.zone 43 points 2 weeks ago (2 children)

Valve Corporation has entered the chat.

[–] bruhduh@lemmy.world 12 points 2 weeks ago (1 children)

What is this strategy called?

[–] grue@lemmy.world 45 points 2 weeks ago (1 children)
[–] bruhduh@lemmy.world 16 points 2 weeks ago

The best strategy there is

[–] frezik@midwest.social 4 points 2 weeks ago

Sony is also really good at this. With the PS2 against the Dreamcast, they walked on stage, said "$299", and walked off. Later, the PS3 was struggling against the Xbox 360, but then the Red Ring of Death issues popped up and they pulled way ahead. Microsoft then tried a bunch of Kinect crap with the next generation, and Sony said "do you want to play games? Buy a PS4. It will play games", and they won that generation outright.

Tons of other problems with Sony, but they are masters of taking advantage of competitors' mistakes.

[–] Buffalox@lemmy.world 32 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

Absolutely. If Intel hadn't been resting on their laurels for 5 years on desktop performance, and had made 6- and 8-core CPUs themselves before Ryzen arrived, Ryzen would not have been nearly as successful. This was followed by the catastrophic Intel 10nm fab failures, which allowed AMD to stay ahead even longer.

So absolutely, AMD has been helped a lot by Intel failing to react in time, and then failing in execution when they did react.
Still, I think congratulations are in order, because Ryzen was such a huge improvement on the desktop and server that they absolutely deserve their success. Threadripper was icing on the cake, and it completely trashed Intel in the workstation segment.

And AMD exposed Intel's weakness in the face of real competition. Arm and Nvidia had already done that in their respective areas, but AMD did it in Intel's core business.

[–] aard@kyu.de 25 points 2 weeks ago (1 children)

For people who weren't looking for a developer workstation back then: Threadripper suddenly brought the performance of a Xeon workstation costing more than $20k for just a bit over $2k.

That suddenly wasn't a "should I really invest that much money" situation, but an "I'd be stupid not to; the productivity increase will pay for it over the next month or so" one.

[–] boonhet@lemm.ee 14 points 2 weeks ago

> productivity increase will pay for that over the next month or so

Found the fellow Rust developer

`cargo build universe`

[–] SnotFlickerman@lemmy.blahaj.zone 15 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

For sure. As someone who has been stuck running Linux on an Intel box after being spoiled by all-AMD for about 6 years, I gotta say, the fact that a lot of AMD stuff "just works" in Linux, while you have to jump through hoops for the same from Intel, is probably a big reason they're picking up in datacenters too. Datacenters don't usually run on fucking Windows Server; they usually run Linux, and AMD just plays better with Linux at the moment. (In my personal experience, anyway.)

[–] Buffalox@lemmy.world 3 points 2 weeks ago

Yes, this too is a real turnaround compared to the "old times". Intel used to be the safe choice; that's definitely not the case anymore.

[–] frezik@midwest.social 3 points 2 weeks ago

Their entire architecture also seems to be just plain behind now. The Ultra 2xx series of processors is not only made at TSMC, but on a better node than AMD is using for the Ryzen 9000 series. But you wouldn't know it from the benchmarks, either for performance or efficiency.

[–] frezik@midwest.social 3 points 2 weeks ago* (last edited 2 weeks ago)

Their market caps crossed paths well before the 14th gen issues. Intel seems to be rushing things specifically because they're trying to catch up to AMD, and they're sacrificing too much to get there.

[–] jagged_circle@feddit.nl 2 points 2 weeks ago

Not security?

[–] trespasser69@lemmy.world 62 points 2 weeks ago (2 children)

Intel's market cap: $98B

AMD's market cap: $230B

[–] Shadywack@lemmy.world 28 points 2 weeks ago (1 children)

What the fuck??? Insert Jumanji meme "What year is it?"

Numbers check out too. Wintel, slayed, and we didn't even notice.

[–] palordrolap@fedia.io 8 points 2 weeks ago (2 children)

The whole ring -3 / MINIX business a while back put a serious amount of FUD into the market and Intel has been on the wane ever since.

This is not necessarily unfounded FUD either. MINIX is literally there, lurking inside all modern Intel processors, waiting to be hacked by the enterprising ne'er-do-well. (NB: This is not to say that there aren't ways to do similar things to AMD chips, only that MINIX is not present in them, and it's theoretically a lot more difficult.)

Then bear in mind that MINIX was invented by Andrew Tanenbaum, someone Linus Torvalds has had disagreements with in the past (heck, Linux might not exist if not for MINIX and Linus' dislike of the way Tanenbaum went about it), and so there's an implicit bias against MINIX in the data-centre world, where Linux is far more present than it is on the desktop.

Thus, if you're a hypothetical IT manager and you're going to buy a processor for your data-centre server, you're ever so slightly more likely to go for AMD.

[–] frezik@midwest.social 8 points 2 weeks ago (1 children)

Note that Linus' disagreement was largely over design decisions and microkernel stuff. Linus actually respects Tanenbaum a great deal. Tanenbaum's book on operating systems is a CS classic and was a direct influence on a young Linus.

[–] palordrolap@fedia.io 1 points 2 weeks ago

Pretty sure my own education had a Tanenbaum book in amongst it, from which I learned a number of things. In another world, one where my brain isn't its own worst enemy, I could well be one of those IT managers. There the FUD would have been the main factor in my decision. Probably. Because I'm not sure I'd be completely happy if it was a Linux buried in the chipset either. Especially one largely outside my control.

[–] Laser@feddit.org 2 points 2 weeks ago

I'd guess this is less about MINIX vs. Linux and more about ultimately having 0 control over or insight into it.

[–] ripcord@lemmy.world 9 points 2 weeks ago* (last edited 2 weeks ago)

Their P/E is 125

One fucking hundred and twenty five.

That's more than twice Nvidia's. It's completely disconnected from reality.
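(For anyone rusty on the metric, the rough arithmetic: P/E is price over trailing earnings, or equivalently market cap over trailing net income. The implied-earnings figure below is just a back-of-envelope estimate using the ~$230B market cap quoted above, not a reported number.)

$$
\text{P/E} = \frac{\text{market cap}}{\text{trailing net income}}
\quad\Rightarrow\quad
\text{implied earnings} \approx \frac{\$230\text{B}}{125} \approx \$1.8\text{B per year}
$$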

[–] schizo@forum.uncomfortable.business 11 points 2 weeks ago (3 children)

Granite Rapids is probably going to win some of that back: a lot of the largest purchasers of x86 chips in the datacenter were buying Epycs because you could stuff more cores into a given amount of rack space than you could with Intel, but the Granite Rapids stuff has flipped that back the other way.

I'm sure AMD will respond with EVEN MORE CORES, and we'll just flop around with however many cores you can stuff into $15,000 CPUs and thus who is outselling whom.

[–] Pogogunner@sopuli.xyz 33 points 2 weeks ago* (last edited 2 weeks ago)

https://www.amd.com/en/products/processors/server/epyc/4th-generation-9004-and-8004-series/amd-epyc-9754.html

Launched June of 2023

128c/256t

$11,900

400 W TDP

https://www.intel.com/content/www/us/en/products/sku/240777/intel-xeon-6980p-processor-504m-cache-2-00-ghz/specifications.html

Launched Q3 of 2024

128c/256t

$17,800

500 W TDP

I don't think Granite Rapids is going to flip it back in Intel's favor.

[–] aard@kyu.de 15 points 2 weeks ago

It's not just cores - it's higher performance per rack unit while keeping power consumption and cooling needs the same.

That allows rack performance upgrades without expensive DC upgrades - and AMD has been killing dual- and quad-socket systems from Intel with single- and dual-socket Epycs ever since launch. Their 128-core part has a bit too high a TDP, but go just a bit lower in core count and you can still run it in a rack configured for the power and cooling needs of over a decade ago.

Granite Rapids has too high a TDP for that - you either upgrade your DC, or you lose performance per rack unit.
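To put loose numbers on that (the 10 kW per-rack power budget is a made-up figure for illustration, and this ignores everything in the server except the CPU TDPs quoted above):

$$
\left\lfloor \frac{10{,}000\ \text{W}}{400\ \text{W}} \right\rfloor = 25\ \text{sockets}
\qquad\text{vs}\qquad
\left\lfloor \frac{10{,}000\ \text{W}}{500\ \text{W}} \right\rfloor = 20\ \text{sockets}
$$

Same cores per socket and same rack power and cooling, but roughly 20% fewer sockets for the Intel part.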

[–] frezik@midwest.social 3 points 2 weeks ago (1 children)

It's not just performance, though. It's also trust. If performance per watt was all that mattered, AMD would have cornered the server market years ago. Intel held on because they were considered rock solid stable--very important in a server. That trust was completely broken by the recent instability issues.

[–] schizo@forum.uncomfortable.business 1 points 1 week ago (1 children)

I didn't think the consumer-level chip immolation carried over to their Xeons?

If it did, holy crap, they're mega-ultra-turbo-plaid levels of screwed.

[–] frezik@midwest.social 1 points 1 week ago

Not quite that, but more that the entire thing brings into question Intel's competence.

[–] jagged_circle@feddit.nl 6 points 2 weeks ago (4 children)

Intel's flagship 128-core Xeon 6980P 'Granite Rapids' processor costs $17,800, making it the company's most expensive standard CPU ever. By contrast, AMD's most expensive 96-core EPYC 9654 processor costs $11,805.

Jesus Christ when did we break 20 cores?

[–] levzzz@lemmy.world 16 points 2 weeks ago (1 children)

Have you actually been living under a rock or something?

[–] T156@lemmy.world 13 points 2 weeks ago (1 children)

CPUs have multiple cores now? Amazing.

[–] Jeffool@lemmy.world 2 points 2 weeks ago (1 children)

I remember reading columns saying that soon, when multiple cores became common, compilers would thread your program for you...

[–] ozymandias117@lemmy.world 1 points 2 weeks ago

We were taught about OpenMP in like 2012 https://en.m.wikipedia.org/wiki/OpenMP

Intel's TBB was also used some, but not as frequently https://en.m.wikipedia.org/wiki/Threading_Building_Blocks
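For anyone who hasn't touched it, this is roughly what the OpenMP version of "the compiler threads it for you" looks like - you still mark the loop yourself with a pragma. A minimal sketch, assuming a compiler with OpenMP support; the file name and build command are just examples:

```cpp
// Minimal OpenMP example: sum a big array using all available cores.
// Build (assumption): g++ -fopenmp -O2 omp_sum.cpp -o omp_sum
#include <cstdio>
#include <vector>
#include <omp.h>

int main() {
    std::vector<double> data(10'000'000, 1.0);
    double sum = 0.0;

    // The pragma tells the compiler/runtime to split the loop iterations
    // across threads; reduction(+:sum) gives each thread a private partial
    // sum and combines them at the end, avoiding a data race.
    #pragma omp parallel for reduction(+:sum)
    for (long i = 0; i < (long)data.size(); ++i) {
        sum += data[i];
    }

    std::printf("sum = %.0f, max threads = %d\n", sum, omp_get_max_threads());
    return 0;
}
```

Still not the fully automatic parallelization those old columns promised, but close enough that a many-core CPU actually gets used.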

[–] deltapi@lemmy.world 6 points 2 weeks ago

I got a new laptop last month for $2,200 US; it has 24 cores (i9-14900HX).

[–] Giooschi@lemmy.world 5 points 2 weeks ago

These are server CPUs, not something you wanna put in your laptop or desktop.

[–] TwanHE@lemmy.world 4 points 2 weeks ago

2016, I believe.