
[–] deegeese@sopuli.xyz 11 points 8 months ago

From the description it kind of reminds me of the performance gains of the 1990s, when the industry moved from direct geometry calls to display lists to cut down on how often the GPU sat idle waiting for instructions.
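Roughly, the pattern looked like this. A minimal sketch of the legacy OpenGL display-list idiom; the non-`gl*` names are made up for illustration:

```c
#include <GL/gl.h>

static GLuint scene_list;

/* Record the geometry once; the driver can keep the compiled
 * list close to the GPU. */
void build_scene_list(void) {
    scene_list = glGenLists(1);
    glNewList(scene_list, GL_COMPILE);
    glBegin(GL_TRIANGLES);
    glVertex3f(-1.0f, -1.0f, 0.0f);  /* these immediate-mode calls  */
    glVertex3f( 1.0f, -1.0f, 0.0f);  /* are captured into the list, */
    glVertex3f( 0.0f,  1.0f, 0.0f);  /* not executed per frame      */
    glEnd();
    glEndList();
}

/* Per frame: one call replays the whole recorded batch instead of
 * the CPU re-issuing every glVertex* call. */
void draw_frame(void) {
    glCallList(scene_list);
}
```

One `glCallList` per frame replaces thousands of per-vertex driver calls, which is exactly the "stop making the GPU wait on the CPU" win.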

[–] echo64@lemmy.world 11 points 8 months ago (1 children)

One of the big reasons the PS5 usually beats out the Xbox, despite weaker hardware on paper, is this kind of thinking.

They built a custom SSD controller that the GPU has direct access to. When something needs to happen involving the SSD and the GPU, the GPU can directly access the memory it needs without waiting on the CPU.

Just let the GPU do what it does best without having to wait on the CPU, and everything goes much more smoothly.
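To make the contrast concrete, here's a conceptual C sketch. Every function and type below is a hypothetical placeholder, not Sony's actual API (which isn't public in this kind of detail):

```c
#include <stddef.h>

/* Hypothetical placeholder declarations, for illustration only. */
typedef int io_request_t;
void *cpu_alloc(size_t len);
void  cpu_free(void *p);
void  ssd_read(const char *path, void *dst, size_t len);
void  gpu_upload(void *gpu_dst, const void *src, size_t len);
io_request_t ssd_dma_to_gpu(const char *path, void *gpu_dst, size_t len);
void  io_wait(io_request_t req);

/* Traditional path: the CPU sits in the middle of every transfer. */
void load_asset_via_cpu(const char *path, void *gpu_dest, size_t len) {
    void *staging = cpu_alloc(len);
    ssd_read(path, staging, len);        /* SSD -> system RAM, CPU waits  */
    gpu_upload(gpu_dest, staging, len);  /* system RAM -> VRAM, CPU waits */
    cpu_free(staging);
}

/* PS5-style path as described above: the I/O complex moves the data
 * itself; the CPU only queues the request. */
void load_asset_direct(const char *path, void *gpu_dest, size_t len) {
    io_request_t req = ssd_dma_to_gpu(path, gpu_dest, len);
    /* ...CPU is free to do other work here... */
    io_wait(req);                        /* synchronize only when needed */
}
```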

[–] pastermil@sh.itjust.works 10 points 8 months ago (1 children)
[–] echo64@lemmy.world 4 points 8 months ago

Eh, in a way DMA is the CPU offloading work to another chip. What Sony did is make the GPU able to use DMA without needing the CPU.

[–] mindbleach@sh.itjust.works 0 points 8 months ago (4 children)

At some point... do you need the CPU? There's stuff it will be better at, yes, and more power is always better. But the GPU can run any code.

The whole computer outside the video card could be reduced to a jumped-up southbridge.

[–] zalgotext@sh.itjust.works 18 points 8 months ago (1 children)

GPUs are ridiculously, ludicrously good at doing an absolute shit-ton of very simple, non-dependent calculations simultaneously. CPUs are good at... well, everything else. So yes, you do still need the CPU.
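The textbook way to see the split, in plain C (my own example): an independent-iteration loop that a GPU can spread across thousands of threads, versus a pointer chase where each step depends on the previous one:

```c
#include <stddef.h>

/* GPU-friendly: every iteration is independent, so all of them
 * can run at the same time. */
void saxpy(size_t n, float a, const float *x, float *y) {
    for (size_t i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];  /* no iteration depends on another */
}

/* CPU territory: each step needs the result of the last one,
 * so there is nothing to parallelize. */
struct node { struct node *next; int value; };

int sum_list(const struct node *n) {
    int total = 0;
    for (; n != NULL; n = n->next)  /* strictly serial pointer chase */
        total += n->value;
    return total;
}
```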

[–] aBundleOfFerrets@sh.itjust.works 13 points 8 months ago (1 children)

GPUs are really terrible at the kind of multitasking required to run an OS

[–] KeenFlame@feddit.nu 2 points 8 months ago (1 children)

Buncha dry students here giving you shit. It is not a stupid question.

Someday we might not need a CPU. The biggest hurdle probably isn't even the chip architecture; it's that the software would need to be remade, and that's not exactly something you do in a day.

[–] Socsa@sh.itjust.works 3 points 8 months ago* (last edited 8 months ago)

Right, GPGPU is a thing. You can do branch logic on a GPU and you can do SIMD on a CPU. But in general, logic and compute have somewhat orthogonal requirements, which means you end up with divergent designs if you start optimizing in either direction.

This is a software architecture and conceptual problem as well. You simply can't do conditional SIMD. You can compute both branches in parallel and "branch" when the tasks join (which is a form of speculative execution), but that's rarely more efficient than defining and dispatching compute tasks on demand once you get to the edges of the performance curve.
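A minimal C/SSE sketch of that "compute both sides, then select" pattern (my example, not from the thread): there is no per-lane `if`, so both results are computed for all four lanes and a mask picks between them:

```c
#include <smmintrin.h>  /* SSE4.1, for _mm_blendv_ps */

/* Lanewise: out[i] = x[i] > 0 ? x[i] * 2 : -x[i] */
__m128 conditional_double_or_negate(__m128 x) {
    __m128 mask     = _mm_cmpgt_ps(x, _mm_setzero_ps()); /* lanes where x > 0 */
    __m128 if_true  = _mm_mul_ps(x, _mm_set1_ps(2.0f));  /* both "branches"   */
    __m128 if_false = _mm_mul_ps(x, _mm_set1_ps(-1.0f)); /* always execute    */
    return _mm_blendv_ps(if_false, if_true, mask);       /* select per lane   */
}
```

Both multiplies always run regardless of the data; the win is that no lane ever diverges, which is the trade-off being described.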

[–] mindbleach@sh.itjust.works 0 points 8 months ago* (last edited 8 months ago)

Fuck me for playing what-if, apparently.

Not like this news is explicitly about upending the typical CPU-GPU relationship.