this post was submitted on 17 Sep 2024
207 points (96.8% liked)

Technology

you are viewing a single comment's thread
[–] cm0002@lemmy.world -1 points 2 months ago (2 children)

What are you using your computer for?? Just web browsing or something‽ I just upgraded from an i5-6600k/1060 setup and for like the past year and some change I've been hitting 100% CPU usage with just a few programs open, not even gaming lol

And that was with a CPU 3 generations newer lmao

[–] alsimoneau@lemmy.ca 2 points 2 months ago (1 children)

Gaming, working (data processing, physical modelling).

The trick is to use a lower overhead OS than Windows.

[–] cm0002@lemmy.world 2 points 2 months ago (1 children)

Gaming is one thing, a lot is GPU bound anyways, probably the same with "physical modeling"

But you cannot tell me your "data processing" would not be greatly sped up by a newer proc (assuming it's not also GPU bound). Does it work? Sure, but if it takes 2 hours to process now versus under 30 minutes on something newer, that's just a waste of time, resources and money. It's incredibly inefficient.

On the flip side, if all your work is GPU bound no wonder a 3rd gen proc from 2012 is keeping up lol

[–] alsimoneau@lemmy.ca 1 points 2 months ago

My modelling is CPU bound since it's a model written in Fortran by physicists (me included). The fact is I wouldn't get a 4x boost, and a model that runs overnight would still run overnight. When I actually need performance I use a 1000-core compute cluster for multiple days, so that would never run on any consumer CPU anyway.

For the data processing, the real bottleneck is disk access and my scripting speed, so the CPU doesn't really need to be amazing.
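A quick way to sanity-check where a pipeline like that spends its time is to time the read separately from the compute. A minimal sketch (the data file here is a stand-in generated by the script itself, not one of the actual datasets):

```python
import os
import time
from pathlib import Path

def timed(label, fn):
    """Run fn once, print wall-clock time, and return its result."""
    start = time.perf_counter()
    result = fn()
    print(f"{label}: {time.perf_counter() - start:.3f}s")
    return result

# Stand-in for a real data file (hypothetical; any large file works).
Path("sample.dat").write_bytes(os.urandom(10_000_000))

# If "read" dominates, the job is disk-bound and a faster CPU buys little;
# if "process" dominates, it's CPU-bound and a newer chip would help.
data = timed("read", lambda: Path("sample.dat").read_bytes())
checksum = timed("process", lambda: sum(data))
os.remove("sample.dat")
```

If the read time dwarfs the processing time, a faster processor changes almost nothing about the total.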

[–] OfficerBribe@lemm.ee 1 points 2 months ago* (last edited 2 months ago) (1 children)

Sounds like some bad software or something extra CPU intensive then. I use an R5 2600 on W11 and it handles everything I need with ease: web browsing (which, depending on the pages and tab count, can be quite demanding), at least 3 VMs at the same time (2 Windows, 1 Linux), gaming, video transcoding. All of that is not happening at once, but I can't remember the last time I checked Task Manager to see what was using my CPU.

[–] cm0002@lemmy.world 1 points 2 months ago (1 children)

The R5 2600 is not only newer and faster than my old i5, it also has a LOT more threads (12 vs 4) and 2 extra cores

Making it excellent for multithreaded workloads (VMs) while leaving room for workloads that aren't optimized for multithreading

I have an RTSP client program running all the time displaying a handful of camera feeds. It had a ~45-55% average CPU usage even with GPU decoding/encoding enabled on it.

That same piece of software on my much newer 7600, changing absolutely nothing else software-wise (I just dropped in the SSD from my old build), barely cracks 5%

iCUE (Corsair's RGB control software; yes, I know there are open source alternatives, I just never got around to switching lol) had a similar story: ~30-40% before and barely 4% now
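Those before/after percentages are essentially one process's CPU time as a share of wall time. A stdlib-only sketch of the same measurement (the busy/idle workloads below are illustrative stand-ins, not the RTSP client or iCUE):

```python
import time

def cpu_percent(fn):
    """Run fn and report its CPU time as a percentage of wall time.
    Near 100% means one core kept fully busy; near 0% means the process
    spent the interval waiting (on disk, GPU, network, or a sleep)."""
    wall0, cpu0 = time.perf_counter(), time.process_time()
    fn()
    wall = time.perf_counter() - wall0
    return 100.0 * (time.process_time() - cpu0) / wall

# Illustrative stand-ins: a compute loop vs. a wait.
busy = cpu_percent(lambda: sum(i * i for i in range(2_000_000)))
idle = cpu_percent(lambda: time.sleep(0.5))
print(f"busy: {busy:.0f}%  idle: {idle:.0f}%")
```

A decode-heavy client that offloads to the GPU should sit near the "idle" end of this scale; high readings on an old CPU mean the work really was landing on its cores.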

[–] OfficerBribe@lemm.ee 1 points 2 months ago

I thought the comment was about the R5 1600, which is close to my R5 2600, and that those Intels were close to it in performance. I checked their specs and see they are not; I had also thought the i5-6600K was 4/8. In that case, yeah, the upgrade probably was more than noticeable.