this post was submitted on 03 Apr 2026
288 points (99.3% liked)
Technology
you are viewing a single comment's thread
I wish GPUs in AI data centers (or worse, the ones purchased and not installed yet) were more general-purpose than they appear to be. That's the part that makes them AI data centers: the optimized hardware.
I do agree things are complex. And I like reading about the intricacies of that complexity. The overall picture is still a pretty bad one, though.
Ehhhh.
Yes, there are some fairly revolutionary(-ish) chips, but those are few and far between because they tend to be hyper-specialized: built for inference but not training, or optimized only for very small input matrices (common in edge computing, like cameras).
By and large? They really ARE "traditional" GPGPUs that are optimized to hell and back for vector operations and linear algebra. And a lot of the gains come from multiplying their floating point throughput by 2-4x (depending on whether you use half or quarter precision). They aren't as good at double precision as hardware optimized for it, but only a very small subset of users needs that. There will be no issue repurposing the hardware in these data centers.
And the rest is data movement which has always been the real problem.
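To make the 2-4x figure concrete, here's a minimal sketch (illustrative arithmetic, not a benchmark) of where that multiplier comes from: the same vector units and memory buses move more elements per transaction as the floating-point type gets narrower. The dtype names and the `relative_throughput` helper are my own illustration, not anything from a specific GPU's spec sheet.

```python
# Bytes per element for common GPU floating-point types.
BYTES = {"fp64": 8, "fp32": 4, "fp16": 2, "fp8": 1}

def relative_throughput(dtype: str, baseline: str = "fp32") -> float:
    """Elements packed per register/memory transaction, relative to baseline.

    This is the idealized upper bound: halving element width roughly
    doubles peak vector throughput on the same silicon, which is where
    the 2x (half precision) and 4x (quarter precision) multipliers
    come from -- and why fp64 runs at roughly half rate.
    """
    return BYTES[baseline] / BYTES[dtype]

for dt in BYTES:
    print(f"{dt}: {relative_throughput(dt):g}x")
```

Real chips complicate this with dedicated matrix units and precision-specific pipelines, but the packing argument is the first-order reason the same hardware serves both low-precision AI work and ordinary single-precision compute.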
I don’t think most companies will find much value in that, though. None of the infrastructure I work with uses heavy calculations, and if we tried to jam them in, we’d be building a solution in search of a problem.
An email server doesn’t need a GPU, and neither does a file server, a website, or an e-commerce platform.
I suppose they could rent them out as supercomputer time, but I don’t think the return on cost is going to be that good.