Agreed. Intel's design philosophy seems to be 'space heater that does math' for some reason. That's been true since at least 10th gen, if not before then. I don't know if it's just chasing benchmark wins at any cost, or if they're firmly of the opinion that hot and loud is fine as long as it's fast and no customers will care - which I don't really think is true anymore - or what, but they've certainly invested heavily in CPUs that push the literal limits of physics while trying to cool them.
That really stopped being true in the Skylake era, when TSMC leapfrogged them and Intel was doing their 14nm++++++++ dance. I mean, they did a shockingly good job of keeping that node relevant and competitive, but they were really only relevant and competitive on it until AMD caught up and exceeded their IPC with Ryzen 3000.
Yeah, if gaming is your use case there are exactly zero Intel products you should even be considering. There's nothing that's remotely competitive with a 7800x3d, and hell, for most people and games, even a 5800x3d is overkill.
And of course, those are both last-gen parts, so that's about to get even worse with the 9800x3d.
For productivity, I guess if you're mandated to use Intel, or Intel CPUs are the only validated ones, it's a choice. But 'at the same price' is the problem: there's no case where I'd want to buy Intel over AMD if they cost the same and perform similarly, if for no other reason than I won't need something stupid like a 360mm AIO to cool the damn thing.
Absolutely, the 14nm process was leading when it was new, but the delays and ultimate failure of 10nm caused Intel to fall way behind. Before that, though, from the very beginning of integrated circuits, Intel was the leader in manufacturing. From the late 70s, when Intel made the 8086, they gained an economic advantage that enabled them to stay ahead pretty much consistently in manufacturing.
In 2016 TSMC achieved parity: their 10nm was roughly equivalent to Intel's 14nm, with maybe a slight advantage over Intel. After that, it's well known that TSMC kept improving quickly past the points where Intel had failed, and TSMC became the leader.
I should have written 'always prior to 2016', because that's 8 years ago now. But before that, Intel had stayed on top for half a century, despite the fact that, for instance, the M68000 and Alpha were way better processor designs than anything Intel had.
I agree. The only reason I quote this is because of the insane change in how Intel vs AMD is viewed compared to before Ryzen! Compared to the AMD FX series, the Intel Core and Core 2 chips were so superior it was hard to see how AMD could come back from that. But when Ryzen was presented in late 2016 it was clear to me they had something new and exciting, and they really elevated desktop performance after years of minor iterations from Intel.
Yup, an advantage in this industry doesn't last forever, and a lead in a particular generation doesn't necessarily translate to the next paradigm.
Canon wants to challenge ASML and get back in the lithography game, with a tooling shift they've been working on for 10 years. The Japanese "startup" Rapidus wants to get into the foundry game by starting with 2nm, and they've got the backing of pretty much the entirety of the Japanese electronics industry.
TSMC is holding onto FinFET a little bit longer than Samsung and Intel, as those two switch to gate-all-around FETs (GAAFETs). Which makes sense, because those two never got to the point where they could compete with TSMC on FinFETs, so they're eager to move on to the next thing a bit earlier while TSMC squeezes out the last bit of profit from their established advantage.
Nothing lasts forever, and the future is always uncertain. The past history of the semiconductor industry is a constant reminder of that.
True, but with AMD the problem was that they had serious deficits and were near bankruptcy.
And these technologies are getting more and more expensive. The latest tapeout Apple did, for the M3, is estimated to have cost $1 billion. That's for the tapeout alone!!!
We are at a point where this technology is so prohibitively expensive, that only the biggest global players can play.
The thing is, if Intel doesn't actually get 18A and beyond competitive, it might be on a death spiral towards bankruptcy as well. Yes, they've got a ton of cash on hand and several very profitable business lines, but that won't last forever, and they need plans to turn profits in the future, too.
Yes, that's true. Intel has been behind for almost 8 years, but they've had massive resources to make a comeback, a luxury AMD never had.
AMD was kept out of the market so they barely made money even when they were clearly ahead.
And only now, after 8 years, is Intel posting a fiscal-year deficit. That's not comparable to the situation AMD was in when they made their comeback.
But I agree that unless Intel succeeds on 18A, things are not looking good. Still, there's some way to go yet before they're in a situation similar to where AMD was in 2016 when Ryzen was revealed.