this post was submitted on 19 Oct 2024
384 points (99.0% liked)

[–] Buffalox@lemmy.world 131 points 1 month ago* (last edited 1 month ago) (4 children)

Everybody in the know knows that 64-bit x86 was held back to push Itanium. Intel was all about market segmentation, which is also why the Celeron was crippled in, for instance, how much RAM it supported compared to the Pentium.
Market segmentation has a profit-maximization motive: you are not allowed to use cheap parts for things you are supposed to buy expensive parts for. Itanium was supposed to be the only viable CPU for servers, and keeping x86 at 32 bits was part of that strategy.
That AMD succeeded with 64-bit while Itanium failed was karma Intel deserved.

Today it's obvious how moronic Intel's policy back then was, because even phones got 64-bit CPUs around 2013.
32 bits is simply too much of a limitation for even many pretty trivial tasks. And modern x86 chips are in fact NOT just 64-bit anymore, but hybrids that routinely handle data 256 bits at a time, and some even 512 bits, through instruction extensions that have become standard on both Intel and AMD.
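(For illustration, this is roughly what "handling data 256 bits at a time" looks like in practice: a minimal sketch using the AVX2 intrinsics from <immintrin.h>, assuming a CPU and compiler that support the extension, e.g. built with gcc -mavx2.)

```c
// avx2_add.c -- add eight 32-bit floats in a single 256-bit AVX2 operation.
// Assumes an AVX2-capable CPU; build with: gcc -mavx2 -O2 avx2_add.c
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    float a[8]   = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8]   = {10, 20, 30, 40, 50, 60, 70, 80};
    float sum[8];

    __m256 va = _mm256_loadu_ps(a);        // load 8 floats = 256 bits
    __m256 vb = _mm256_loadu_ps(b);
    __m256 vs = _mm256_add_ps(va, vb);     // one instruction, 256 bits of data
    _mm256_storeu_ps(sum, vs);

    for (int i = 0; i < 8; i++)
        printf("%.0f ", sum[i]);           // prints: 11 22 33 44 55 66 77 88
    printf("\n");
    return 0;
}
```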

When AMD came out with Ryzen, Threadripper and Epyc, with prices scaling roughly proportionally to performance and nothing artificially hampered, it was such a breath of fresh air.

[–] barsoap@lemm.ee 38 points 1 month ago (24 children)

And modern x86 chips are in fact NOT just 64-bit anymore, but hybrids that routinely handle data 256 bits at a time, and some even 512 bits, through instruction extensions that have become standard on both Intel and AMD

On a note of technical correctness: That's not what the bitwidth of a CPU is about.

By your account a 386DX would be an 80-bit CPU because (together with its 387 coprocessor) it could handle 80-bit floats natively, and the MOS 6502 (of C64 fame) a 16-bit processor because it could add two 16-bit integers, albeit one byte at a time. Or maybe the 8086 would be a 32-bit CPU because it could multiply two 16-bit numbers into a 32-bit result?

In reality the MOS 6502 is considered an 8-bit CPU, and the 386 a 32-bit one. The "why" gets more complicated, though: the 6502 had a 16-bit address bus and an 8-bit data bus, the 386DX 32-bit address and data buses, and the 386SX a 24-bit external address bus and a 16-bit external data bus (while being 32-bit internally).

Or, differently put: Somewhere around the time of the fall of the 8 bit home computer the common understanding of "x-bit CPU" switched from data bus width to address bus width.

...as, not to make this too easy, understood by the instruction set, not the CPU itself: modern 64-bit processors use pointers that are 64 bits wide, but their address buses are usually narrower. x86_64 only requires 48 bits of a virtual address to be actually usable; the left-over upper bits must be either all ones or all zeroes (enforced by hardware to keep people from bit-hacking and causing forward-compatibility issues; IIRC all-ones vs. all-zeroes distinguishes kernel from user memory mappings, but it's been a while since I read the architecture manual). Addressable physical memory might be lower still, again IIRC. 2^48^ bytes is 256 TiB; no desktop system can fit that much, and I doubt the processors in there could address it.
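(A small illustration of that "all ones or all zeroes" rule, usually called a canonical address: bits 63..48 must be copies of bit 47. A minimal sketch in C; the user/kernel split shown in the comments is an assumption based on common x86_64 layouts.)

```c
// canonical.c -- check whether a 64-bit value is a canonical x86_64 address,
// i.e. bits 63..48 are sign-extended copies of bit 47.
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

static bool is_canonical(uint64_t addr) {
    // Shift bit 47 up to the sign bit, then arithmetic-shift it back down.
    // Relies on the usual two's-complement behaviour of GCC/Clang/MSVC.
    int64_t sign_extended = (int64_t)(addr << 16) >> 16;
    return (uint64_t)sign_extended == addr;
}

int main(void) {
    printf("%d\n", is_canonical(0x00007fffffffffffULL)); // 1: top of the "all zeroes" half
    printf("%d\n", is_canonical(0xffff800000000000ULL)); // 1: bottom of the "all ones" half
    printf("%d\n", is_canonical(0x0000800000000000ULL)); // 0: inside the non-canonical hole
    return 0;
}
```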

[–] mox@lemmy.sdf.org 22 points 1 month ago (1 children)

Intel was all about market segmentation

See also: ECC memory.

[–] Wispy2891@lemmy.world 1 points 1 month ago (1 children)

Sometimes, for some reason, there's no limit. The cheap i3-8100, for example, can use ECC memory.

[–] IndustryStandard@lemmy.world 2 points 1 month ago

AMD has allowed its processors to use ECC memory since Ryzen, so the jig was up.

[–] frezik@midwest.social 21 points 1 month ago (1 children)

It was also a big surprise when Intel just gave up. The industry was settling in for a David v. Goliath battle, and then Goliath said the David kid was right.

[–] Buffalox@lemmy.world 15 points 1 month ago (1 children)

Yes, I absolutely thought Intel would make their own 64-bit extension and AMD would lose the fight.
But maybe Intel couldn't do that, because AMD had already patented theirs, and whatever Intel did could be called a copy of it.

Anyway, it's great to see AMD finally doing well and turning a profit. I just never expected Intel to fail as badly as they have. Unless they fight their way back to profitability, we may end up in the same boat we were in when Intel had x86 to itself.

But then again, maybe x86 is becoming obsolete, as Arm is getting ever more competitive.

[–] frezik@midwest.social 11 points 1 month ago (1 children)

Right, I think the future isn't Intel v AMD, it's AMD v ARM v RISC-V. Might be hard to break into the desktop and laptop space, but Linux servers don't have the same backwards compatibility issues with x86. That's a huge market.

[–] Valmond@lemmy.world 13 points 1 month ago

I hated so much that you had to choose between virtualization and overclocking, among a lot of other forced-limitation crap from Intel.

A bit like how cheap mobile phones came with too little storage, and buying one with at least a "normal" amount bumped everything else up too (camera, CPU, etc.), including the price of course.

[–] Technus@lemmy.zip 130 points 1 month ago (5 children)

This highlights really well the importance of competition. Lack of competition results in complacency and stagnation.

It's also why I'm incredibly worried about AMD giving up on enthusiast graphics. I have very little hope for Intel Arc.

[–] barsoap@lemm.ee 32 points 1 month ago (1 children)

I expect them to merge enthusiast into the pro segment: it doesn't make sense for them to make large RDNA cards because there are too few customers, just as it doesn't make sense for them to make small CDNA cards. In the future there will only be UDNA, and the high end of gaming and the low end of professional will overlap.

I very much doubt they're going to do compute-only cards, as then you'd lose sales to people who want a (maybe overly beefy) CAD or Blender or whatever workstation, just to save on some DisplayPort connectors. Segmenting the market only makes sense when you're a (quasi-)monopolist and want to abuse that position, that is, if you're Nvidia.

[–] bruhduh@lemmy.world 13 points 1 month ago (5 children)

True. In simple words, AMD is moving towards versatile solutions that satisfy corporate clients and ordinary clients with the same product. Their APUs and XDNA architecture are examples: APUs are used in the PlayStation and Xbox, XDNA and Epyc are used in datacenters, and AMD is unifying its B2B and B2C lineups to simplify manufacturing.

[–] Alphane_Moon@lemmy.world 15 points 1 month ago (2 children)

They honestly seem to be done with high-end "enthusiast" GPUs. There is probably more money/potential in iGPUs and low/mid-range products optimized for laptops.

[–] Technus@lemmy.zip 13 points 1 month ago (2 children)

Their last few generations of flagship GPUs have been pretty underwhelming, but at least they existed. I'd been hoping for a while that they'd actually come up with something to give Nvidia's xx80 Ti/xx90 a run for their money. I wasn't really interested in switching teams just to be capped at the equivalent performance of an xx70 for $100-200 more.

[–] TheGrandNagus@lemmy.world 29 points 1 month ago (2 children)

The 6900XT/6950XT were great.

They briefly beat Nvidia until Nvidia came out with the 3090 Ti. Even then, it was so close you couldn't tell them apart with the naked eye.

Both the 6000 and 7000 series have had cards that compete with the 80-class cards, too.

The reality is that people just buy Nvidia no matter what. Even the disastrous GTX 480 outsold ATI/AMD's cards in most markets.

The $500 R9 290X was faster than the $1000 Titan, with the R9 290 being just 5% slower and $400, and yet AMD lost a huge amount of money on it.

AMD has literally made cards faster than Nvidia's for half the price and lost money on them.

It's simply not viable for AMD to spend a fortune creating a top-tier GPU only to have it not sell well because Nvidia's mindshare is arguably even better than Apple's.

Nvidia's market share is over 80%. And it's not because their cards are the rational choice at the price points most consumers are buying at. It really cannot be stressed enough how much of Nvidia's dominance is a marketing win.

[–] sugar_in_your_tea@sh.itjust.works 11 points 1 month ago* (last edited 1 month ago) (1 children)

Yup, it's the classic name-brand tax. That, and Nvidia also wins on features, like RTX and AI/compute.

But most people don't actually use those features, so most people seem to be buying Nvidia due to brand recognition. AMD has dethroned Intel on performance and price, yet somehow Intel remains dominant on consumer PCs, though the lead is a lot smaller than before.

If AMD wants to overtake Nvidia, they'll need consistently faster GPUs and lower prices with no compromises on features. They'd have to invest a ton to get there, and even then Nvidia would probably still outsell AMD on name recognition alone. Screw that! It makes far more sense for them to stay competitive, suck up a bunch of the mid-range market, and transition the low-end market to APUs. Intel can play in the low-to-mid-range markets, and AMD can slot itself in as a bit better than Intel and a better value than Nvidia.

That said, I think AMD needs to go harder on the datacenter for compute, because that's where the real money is, and it's all going to Nvidia. If they can leverage their processors to provide a better overall solution for datacenter compute, they could translate that into prosumer compute devices. High end gaming is cool, but it's not nearly as lucrative as datacenter. I would hesitate to make AI-specific chips, but instead make high quality general compute chips so they can take advantage of whatever comes after the current wave of AI.

I think AMD should also get back into ARM and low-power devices. The snapdragon laptops have made a big splash, and that market could explode once the software is refined, and AMD should be poised to dominate it. They already have ARM products, they just need to make low-power, high performance products for the laptop market.

[–] pycorax@lemmy.world 3 points 1 month ago (1 children)

I think AMD should also get back into ARM and low-power devices. The snapdragon laptops have made a big splash, and that market could explode once the software is refined, and AMD should be poised to dominate it. They already have ARM products, they just need to make low-power, high performance products for the laptop market.

They don't need to go with ARM. There's nothing inherent in the x86 instruction set that prevents them from making low-power processors; it's just that it doesn't make sense for them to build an architecture for that market when the margins for servers are much higher. Even then, the Z1 Extreme got pretty close to Apple's M2 processors.

Lunar Lake has also shown that x86 can match or beat Qualcomm's ARM chips while maintaining full compatibility with all x86 applications.

[–] pycorax@lemmy.world 13 points 1 month ago (2 children)

Were the 6000 series not competitive? I got a 6950 XT for less than half the price of the equivalent 3090. It's an amazing card.

[–] vithigar@lemmy.ca 9 points 1 month ago

Yes, they were, and that highlights the problem really. Nvidia's grip on mindshare is so strong that AMD releasing cards that matched or exceeded Nvidia's at the top end didn't actually matter, and you still have people saying things like the comment you responded to.

It's actually incredible how quickly the discourse shifted from ray tracing being a performance hogging gimmick and DLSS being a crutch to them suddenly being important as soon as AMD had cards that could beat Nvidia's raster performance.

[–] JaY_III@lemmy.ca 2 points 1 month ago (1 children)

The 6000 series is faster in raster but slower in ray tracing.

Reviews have been primarily pushing cards based on RT since it became available. Nvidia has a much larger marketing budget than AMD, and ever since they've been able to leverage the fact that they have the fastest ray tracing, AMD's share has been nosediving.

[–] pycorax@lemmy.world 4 points 1 month ago (1 children)

I mean I guess? But the question here was about value and no way is RT worth double the price.

It is if that's the main thing you care about.

[–] pycorax@lemmy.world 3 points 1 month ago

Wouldn't be the first time they did this, though. I wouldn't be surprised if they jump back into the high end once they're ready.

[–] chalupapocalypse@lemmy.world 8 points 1 month ago (1 children)

I don't see this happening with both consoles using AMD. Honestly, I could see Nvidia going less hard on graphics and pushing more towards AI and other related stuff, and with the leaked prices for the 5000 series they are going to price themselves out of the market.

[–] sunzu2@thebrainbin.org 7 points 1 month ago

Crypto and AI hype destroyed the prices for gamers.

I doubt we are ever going back, though.

I am on a 5-10 year upgrade cycle now anyway. Sure, new stuff is faster, but stuff from two generations ago still handles everything I need. New features like ray tracing are hardly even worth it. Like, sure, it's cool, but what's the actual value proposition?

If you bought hardware for ray tracing, kinda meh.

That being said, local LLMs are a fun use case.

[–] Buffalox@lemmy.world 5 points 1 month ago (1 children)

Lack of competition results in complacency and stagnation.

This is absolutely true, but it wasn't the case with 64-bit x86. That was a very bad miscalculation, where Intel wanted a bigger share of the more profitable server market.
Intel was extremely busy with profit maximization, so they wanted to sell Itanium for servers and keep x86 for personal computers.

The result, of course, was that 32-bit x86 couldn't compete when AMD made it 64-bit, and Itanium failed, despite HP/Compaq killing off what was at the time the world's fastest CPU, the DEC Alpha, because they wanted to jump on Itanium instead. But Itanium was frankly an awful CPU, based on an idea they couldn't get to work properly.

This was not complacency, and it was not stagnation in the usual sense: Intel actually made real new products and tried to be innovative. The problem was that the product sucked and was too expensive for what it offered.

Why the Alpha was never brought back, I don't understand. As mentioned, it was AFAIK the world's fastest CPU when it was discontinued.

[–] kubica@fedia.io 4 points 1 month ago

Even successful companies take care not to put all their eggs in one basket. Having alternatives is a life saver. We should make sure that we have alternatives too.

[–] TimeSquirrel@kbin.melroy.org 65 points 1 month ago (2 children)

This is like Kodak inventing the digital camera and then sitting on it for the next 20 years. Because it doesn't use film. And Kodak is film.

[–] Buffalox@lemmy.world 16 points 1 month ago* (last edited 1 month ago) (3 children)

This is not entirely fair. Kodak invested a lot in digital photography; I personally bought a $1500 Kodak digital camera around 2002.
But Kodak could not compete with Canon and the other Japanese makers.

To claim Kodak could have made more successful cameras earlier ignores the fact that the technology to make the sensors simply wasn't good enough early on, and digital would never have been an instant hit for whoever came first to market. Early cameras lacked badly in light sensitivity, dynamic range, and sharpness/resolution. That was due to the limits of even world-leading CMOS production at the time; it simply wasn't good enough, and claiming Kodak should have been able to leapfrog everybody doesn't make it true.

To claim Kodak could have beaten, for instance, Canon and Sony ignores the fact that those companies had far more experience in the technologies required to refine digital photography.

Even with the advantage of hindsight, I don't really see a path that would have rescued Kodak. Just like typesetting is dead, with no obvious path by which a typesetting company could have survived.

[–] barsoap@lemm.ee 11 points 1 month ago (3 children)

Kodak isn't dead; they're just not dominating the imaging industry any more. They even multiplied: there's now Kodak Alaris in addition to the original Kodak.

Between them they still dominate analogue film, which still has its uses, and it could even be said that if they hadn't tried to get into digital they might've averted bankruptcy.

There are also horse breeders around that survived the invention of the automobile, and probably also a couple that didn't because their investments in car manufacturing didn't pan out. Sometimes it's best to stick to what you know while accepting that the market will shrink. Last year they raised prices on ordinary photography film because they can't keep up with demand; their remaining factories are running 24/7.

Sometimes it’s best to stick to what you know while accepting that the market will shrink

I argue it's always best to do that. A company dying doesn't mean it failed; it just means it fulfilled its purpose. Investors should leave not because the company is poorly run, but because other technologies are more promising. These companies shouldn't go bankrupt, but merely scale back operations and perhaps merge with other companies to maintain economies of scale.

I honestly really don't like companies that try to do multiple things, because they tend to fail in spectacular ways. Do what you're good at, fill your niche as best you can, and only expand to things directly adjacent to your core competency. If the CEO sees another market that they can capture, then perhaps the CEO should leave and go start that business, not expand the current business into that market.

[–] Buffalox@lemmy.world 2 points 1 month ago

it could even be said that if they hadn’t tried to get into digital they might’ve averted bankruptcy.

Now there's an interesting thought. ;)

There are also horse breeders around that survived the invention of the automobile,

Exactly, and retro film photography is making a comeback. Kind of like vinyl records.

[–] Deluxe0293@infosec.pub 2 points 1 month ago

As a former TKO on the NexPress series, don't sleep on Kodak's presence in the commercial print manufacturing industry either. I would love to still be on the shop floor and have the opportunity to run the Prosper inkjet web press.

[–] CosmicTurtle0@lemmy.dbzer0.com 8 points 1 month ago (1 children)

That may be true, but let's not ignore the huge profit motive for Kodak to keep people on film. That was their money maker.

They had an incentive to keep that technology out of the consumer market.

[–] Buffalox@lemmy.world 4 points 1 month ago (2 children)

They absolutely did, but they knew they couldn't do that forever, because Moore's law applies to CMOS sensors too. Film photography would end as a mainstream product, so they actually tried to compete in digital photography, scanners, and photo printing.
But their background was in chemical photo technologies, and they couldn't turn their know-how there into an advantage with the new technologies, even with the research they'd done and their strong brand recognition.

[–] golli@lemm.ee 3 points 1 month ago* (last edited 1 month ago)

The concept you are describing is called the Innovator's Dilemma, and IMO the most recent example of it is legacy car manufacturers missing the EV transition because it would eat into their margins from ICE vehicles. But I am not sure this is a good example of it.

However, IMO it seems like a great example of what Steve Jobs describes in this video about the failure of Xerox: namely, that in a monopoly position, marketing people drive product people out of the decision-making forums. That seems to be exactly the case here, where the concerns of an engineer were overruled by the higher-ups because they didn't fit their product segmentation.

[–] sunzu2@thebrainbin.org 59 points 1 month ago

$108 billion spent on share buybacks while AMD took the lead 🤡

[–] mox@lemmy.sdf.org 8 points 1 month ago
[–] JakenVeina@lemm.ee 2 points 1 month ago* (last edited 1 month ago)

I decided to split the difference, by leaving in the gates, but fusing off the functionality. That way, if I was right about Itanium and what AMD would do, Intel could very quickly get back in the game with x86. As far as I'm concerned, that's exactly what did happen.

I'm sure he got a massive bonus for this decision, when all the suits realized he was right and he'd saved their asses. /s
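(On the "fusing off the functionality" note: whether a given x86 chip actually exposes 64-bit long mode is reported by CPUID. A minimal sketch, assuming GCC or Clang's <cpuid.h> on an x86 machine; extended leaf 0x80000001, EDX bit 29 is the "LM" flag.)

```c
// longmode.c -- ask CPUID whether this x86 CPU exposes 64-bit long mode.
// Assumes GCC/Clang on x86 or x86_64 (<cpuid.h> is compiler-specific).
#include <cpuid.h>
#include <stdio.h>

int main(void) {
    unsigned int eax, ebx, ecx, edx;

    // Extended leaf 0x80000001: EDX bit 29 is the long-mode (LM) flag.
    if (!__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx)) {
        printf("CPUID extended leaf not available\n");
        return 1;
    }
    printf("64-bit long mode: %s\n", (edx & (1u << 29)) ? "yes" : "no");
    return 0;
}
```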
