this post was submitted on 17 Sep 2024
207 points (96.8% liked)

Technology

[–] InvertedParallax@lemm.ee 17 points 2 months ago (13 children)

I'm considering it, but only just: my 5800X is good enough for most gaming, which is GPU-bound anyway, and I run a dual-Xeon rig for my workstation.

Zen 2-4 took care of a lot of the demand; we all have 8-16 cores now, so what else could they give us?

[–] floofloof@lemmy.ca 5 points 2 months ago (5 children)

They do still seem to be making advances in single-core performance, but whether it matters to most people is a different question. Most people aren't using software that would benefit that much from these generation-to-generation performance improvements. It's not going to be anywhere near as noticeable as when we went from 2 or 4 cores to 8, 16, 24, etc.

[–] InvertedParallax@lemm.ee 5 points 2 months ago (1 children)

Single-thread is really hard: we've basically saturated our L1 working-set size, and adding more cache doesn't help much. Trying to extend the vector length just makes physical design harder, which reduces clock speed. The branch predictors are pretty good, and Apple finally kicked everyone up the ass to widen out-of-order (OOO) execution like they should have.
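To put the L1 working-set point in rough perspective, here's a back-of-the-envelope sketch. The 32 KiB cache and 64-byte line figures are assumed typical values for a modern x86 core, not numbers from the thread:

```python
# Assumed, typical figures for a modern x86 core's L1 data cache;
# the thread doesn't name a specific part, so these are illustrative.
L1D_BYTES = 32 * 1024    # 32 KiB L1 data cache
LINE_BYTES = 64          # 64-byte cache lines

cache_lines = L1D_BYTES // LINE_BYTES   # how many lines fit at once
doubles = L1D_BYTES // 8                # how many 8-byte values fit

print(cache_lines)  # 512
print(doubles)      # 4096
```

Any hot loop whose working set exceeds those few thousand values starts missing to L2, which is why piling on more L1 past the typical working set buys so little.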

Also, software still kind of sucks. It's better than it was, but we need to improve it; the bloat is only just barely being absorbed by silicon gains.

Flash was the epochal change. Maybe we get some new form of hybrid storage, but that doesn't seem likely right now. Apple might do it to cut costs while preserving performance; actually, yeah, I can see them trying to have their cake and eat it too.

Otherwise I don't know. We need a better way to deal with GPUs; there's nothing else that can move the needle except true heterogeneous core clusters, but I haven't been able to sell that to anyone so far. They all think it's a great idea that someone else should do.

[–] floofloof@lemmy.ca 3 points 2 months ago* (last edited 2 months ago) (2 children)

Also, software still kind of sucks. It's better than it was, but we need to improve it; the bloat is only just barely being absorbed by silicon gains.

The incentives are all wrong for this, except in FOSS. It's never going to be a priority for Microsoft, because everyone is used to the (lack of) speed of Windows, and "now a bit faster!" isn't a great marketing line. And hardware companies, which need to keep shifting new boxes, have no interest in software that stops bogging down each generation. So we end up stuck with proprietary bloatware everywhere.

[–] naturlychee@lemm.ee 4 points 2 months ago (1 children)

"what intel gives, microsoft takes away"

dates from the mid '90s; still relevant.

[–] InvertedParallax@lemm.ee 2 points 2 months ago

Let's be fair: MS was vastly outrunning Intel for a long time, and it's only slowed down recently. Now the problem isn't single-thread bloat so much as an absolute lack of multicore scaling for almost all applications except some games, and even then Windows fights as hard as it possibly can to stop you, as AMD just proved yet again.
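The multicore-scaling complaint is essentially Amdahl's law: whatever fraction of a program stays serial caps the speedup, no matter how many cores you add. A minimal sketch; the 90% parallel fraction is an assumed example, not a measured figure:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup is limited by the serial portion."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even a 90%-parallel workload tops out well below the core count:
print(round(amdahl_speedup(0.90, 8), 2))    # 4.71
print(round(amdahl_speedup(0.90, 16), 2))   # 6.4
print(round(amdahl_speedup(0.90, 1e9), 2))  # 10.0 (asymptotic limit, 1/0.1)
```

So doubling from 8 to 16 cores gains that hypothetical workload well under 2x, which is roughly the wall most desktop applications hit.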

[–] InvertedParallax@lemm.ee 1 points 2 months ago

Yes, mostly the applications aren't there; if you need real CPU power (or GPU, for that matter), you're running Linux or in the cloud.

But we are reaching a point where the desktop either gets relegated to the level of an embedded terminal (i.e. an ugly tablet, before it's dropped altogether) or makes the leap to a genuine compute tool, and I fear we're going to see the former.
