I mean, we do the same thing, for the same reasons, with our government and defense procurement orders these days. This isn’t that weird. It’s only weird in that they’re clearly cutting themselves off from the best high-volume x86 CPU manufacturers that currently exist, but aside from that, the geopolitical and strategic calculus adds up.
x86 is dying, legacy processing. It's all GPUs and ARM now. Apple is leaning hard into ARM to set itself up as a future leader in AI.
Gaming, though. The gaming situation on non-x86 CPUs is passable at best. AFAIK you can't put a 4070 Ti in any non-x86 system right now and have it work. Are there even any commercially available non-x86 systems with PCIe x16 slots?
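For anyone who wants to check what their own box exposes, here's a minimal sketch (Linux only, since it reads sysfs; the class-code prefixes come from the PCI spec, nothing board-specific):

```python
import platform
from pathlib import Path

# Print the CPU architecture, then list any GPU-class PCI devices the
# kernel can see. On Linux, /sys/bus/pci/devices/*/class holds the PCI
# class code: 0x0300xx is "VGA compatible controller", 0x0302xx is
# "3D controller" (common for headless/compute GPUs).
print("arch:", platform.machine())  # e.g. x86_64, aarch64, riscv64

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    pci_class = (dev / "class").read_text().strip()
    if pci_class.startswith(("0x0300", "0x0302")):
        vendor = (dev / "vendor").read_text().strip()  # 0x10de = NVIDIA, 0x1002 = AMD
        print(f"GPU at {dev.name}: vendor {vendor}, class {pci_class}")
```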
The death of x86 is inevitable. I just hope we can still play computer games on cheaper home-built systems afterwards, because having to replace your entire system just to upgrade an integrated, non-upgradable GPU is no longer better or cheaper than consoles. I absolutely fucking doubt even indie developers, let alone anyone else, are going to downgrade their graphics to let games run on cheaper systems when this happens and everything becomes 10x more expensive.
Try an AMD card, much better chances because of the open drivers. There have definitely been people who got dedicated GPUs running on ARM boards over the (not even a) handful of PCIe lanes meant for M.2 storage.
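If you want to see how many lanes a card in that kind of setup actually negotiated, mainline Linux exposes it in sysfs; a quick sketch:

```python
from pathlib import Path

# Show the negotiated vs. maximum PCIe link width for every device.
# A GPU wired through an M.2 slot will usually show x4 or narrower
# here, versus x16 in a full-size slot.
for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    width_file = dev / "current_link_width"
    if not width_file.exists():
        continue  # not a PCIe device, or no link attributes exposed
    cur = width_file.read_text().strip()
    maxw = (dev / "max_link_width").read_text().strip()
    speed = (dev / "current_link_speed").read_text().strip()
    print(f"{dev.name}: x{cur} of x{maxw} at {speed}")
```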
I wouldn't be too sure about ARM, because Qualcomm is definitely eyeing alternatives, and other licensees might not exactly mind no longer being reliant on litigious bastards. That alternative is RISC-V. Most ARM licensees are making chips for products where apps don't really care about the architecture, that is, Android.
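The "apps don't care" point is basically that most Android code ships as bytecode; only bundled native libraries pin you to an ISA. If you're curious which ISA a given native binary targets, here's a minimal sketch reading the ELF header's e_machine field (constants are from the ELF spec; assumes a little-endian ELF, which covers x86, AArch64, and RISC-V on Linux/Android):

```python
import struct

# e_machine is a 16-bit field at byte offset 18 of the ELF header;
# the values below are from the ELF specification.
E_MACHINE = {
    0x03: "x86",
    0x28: "ARM (32-bit)",
    0x3E: "x86-64",
    0xB7: "AArch64",
    0xF3: "RISC-V",
}

def elf_isa(path: str) -> str:
    with open(path, "rb") as f:
        header = f.read(20)
    if len(header) < 20 or header[:4] != b"\x7fELF":
        return "not an ELF binary"
    # "<H" assumes a little-endian ELF (EI_DATA == 1), which holds for
    # all the architectures listed above on Linux/Android.
    (machine,) = struct.unpack_from("<H", header, 18)
    return E_MACHINE.get(machine, f"unknown machine 0x{machine:x}")

print(elf_isa("/bin/ls"))  # e.g. x86-64 on a typical PC, AArch64 on an ARM board
```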
To actually make a dent in the completely entrenched x86 market, we'd probably need chips with dual instruction decoders, i.e. front ends that can decode both x86 and RISC-V into the same back end's micro-ops. I certainly wouldn't put that past AMD; they don't like being joined at the hip with Intel.