this post was submitted on 27 Feb 2024
427 points (98.2% liked)
Technology
Limit their PCIe card production to low-volume reference models and require their software to be open source to break that aspect of the lock-in; those are the two big things. As an alternative to the latter, require them to publish actual platform docs. Right now they're not only providing the only compiler for their cards, one that's deliberately incompatible with everything else, they're also making sure that no one else can get performance out of Nvidia cards without excessive reverse engineering; some things are even locked down hard via firmware signing. Splitting AI off from GPU would be a bonus.
I'm all for open source, but that would basically be like confiscating and giving away that part of the company.
Something we might expect from China, but not from a democratic society.
Is their product the GPU, or is their product the software?
They are basically abusing their customers into doing less with the hardware by obfuscating its functionality.
AFAIK you don't pay extra to use CUDA or drivers, so while software is part of the ecosystem, there is no doubt the product is the hardware. When in doubt, follow the money.
It's both. Jensen himself has said they aren't a GPU company anymore, highlighting their software stack. CUDA was not built in a day.
~~CUDA was not mostly 'built' by them. It was originally built on top of technology acquired from a company called Ageia. Ageia built an ASIC and a physics engine that could run instructions for the ASIC, called "PhysX", and that team ported their toolchain to run on GPUs and other ASICs.~~
CUDA was initially released in 2007, and Ageia was acquired in 2008. It would be extremely dishonest not to credit Nvidia for what CUDA is today.
I get that hating on big corpos is cool on this platform, but there's no need to warp reality just to talk smack about them.
So essentially destroying one of the US' most important companies going into the future. Their chips are so highly valued that the US government is creating sanctions specifically to stop the sale of their high-end chips to hostile nations. I can't imagine the US shooting themselves in the foot like that.
If you think that would destroy Nvidia, you're selling them quite short. Other companies in the market already follow that exact business model: don't produce your own boards, actually document the hardware, have FLOSS drivers.
If you're an Nvidia fanboy, making Nvidia compete on a level playing field by making them play fair of course sounds like a disaster, but then you come here and throw national interest into the mix. How the fuck would Nvidia losing market share to AMD damage US national interest? It would strengthen the US's standing by giving it independent options.
It might even enable Intel to secure a foothold in the market; remember, they're the only one among RGB to actually produce their own chips. In the US.
Nvidia not producing their own boards wouldn't solve anything; it would just complicate matters for Nvidia. Ask Asus or EVGA what their margins are on their Nvidia GPUs. Nvidia opening their stack to the competition was the only half-realistic suggestion.
Why do you think the whole 4090 D debacle happened? The US government has obvious interests in limiting the compute power China has access to. Nobody cares about their gaming GPUs; it's the ML chips that are making waves, and those are of obvious national interest to the US government.
Brush that chip off your shoulder; not sure what's making you so angry. And why are you bringing AMD into the picture? They aren't even the biggest threat to Nvidia's ML hegemony. I was also specifically referring to how dismantling Nvidia would be counterproductive to US interests, not to Nvidia's market share.
Neither AMD nor Nvidia is in the foundry business, so I don't see how that's relevant. Intel is decoupling its foundry, so nothing is stopping either company from porting their chips if need be.
Nvidia could only fuck them over like that because they were able to produce their own boards: If they have to rely on board manufacturers to sell their chips, they have to be nice enough for board manufacturers to actually bother doing that.
That's not the point of contention, this is:
...and market share going to other US companies would hurt that interest in what way exactly?
AMD shmahemde. There's a gazillion US startups in the space which could make it, or not, and/or be bought up by AMD or Intel; both certainly have their eyes and products on the market. The US's national interest is hurt by Nvidia's unfair business practices limiting the innovation brought to market.
Margins wouldn't change. GPUs are brand sellers; OEMs would try to make their margins on other products. E.g. if Asus were to stop producing graphics cards for Nvidia, their mindshare would plummet.
You are the one bringing market share into the discussion. I haven't said anything about Nvidia losing market share hurting US interests.
You're acting as if they're the only actor in the market, but there is competition from multiple sides. You don't dismantle a company purely for having a dominant position.
Yes, yes you do, because that's a market failure. In a free market there are no monopolies; thing is, the real world lacks the perfect information and perfect rationality of actors needed for the market to actually be free, so we have to use regulations to approximate it.