this post was submitted on 05 Dec 2023
What is the right proportion? The 7800 XT uses 25% more power than the 4070 (250 W vs. 200 W). That seems like a substantial difference to me.
Are you measuring power actually used, or are you just looking at TDP figures on the marketing material? You can't directly compare those marketing numbers on products from different gens, much less different companies.
To really understand what's going on, you need to look at something like watts per frame.
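A watts-per-frame comparison is just average board power divided by average frame rate at identical settings. A minimal sketch, using entirely made-up numbers for illustration (not measurements of any real card):

```python
# Watts per frame = average board power (W) / average frame rate (FPS).
# Lower is more efficient. All figures below are hypothetical.
def watts_per_frame(avg_power_w: float, avg_fps: float) -> float:
    return avg_power_w / avg_fps

# Two imaginary cards benchmarked at the same settings:
card_a = watts_per_frame(250.0, 100.0)  # 2.50 W per frame
card_b = watts_per_frame(200.0, 90.0)   # ~2.22 W per frame
print(f"Card A: {card_a:.2f} W/frame, Card B: {card_b:.2f} W/frame")
```

Note the card drawing more total watts can still be the more efficient one if it delivers proportionally more frames.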
I'm getting the numbers from GamersNexus' power consumption chart from their review of the card.
Ok, then those numbers are at full load running a benchmark, assuming you're talking about charts like this. Actual power usage in games could be a fair amount lower.
It could be, depending on the game. It's still a good indicator that the 7800 XT would run hotter than the 4070 in typical cases.
The numbers here are the maximum wattage used, if I recall correctly. So most of the time when you're gaming it's probably going to be close to those numbers.
No, it's TDP, like with CPUs. So a 200W GPU needs a cooler rated to dissipate 200W worth of thermal load (and that's not an exact science; AMD and NVIDIA calculate it differently). Actual power usage can be higher than that under full load, and it can be lower during normal, sustained usage.
So the wattage rating doesn't tell you much about expected power usage unless you're comparing two products from the same product line (e.g. RX 6600 and 6700), and sometimes between generations from the same company (e.g. 6600 and 7600), and even then it's only a rough idea.
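The point above can be illustrated with a toy comparison, where two hypothetical cards share the same rated TDP but draw different average power in practice (all figures invented for illustration):

```python
# Entirely hypothetical figures showing why rated TDP != measured draw.
cards = {
    "Card X": {"tdp_w": 200, "measured_avg_w": 185},  # undershoots its rating
    "Card Y": {"tdp_w": 200, "measured_avg_w": 215},  # exceeds its rating
}

for name, d in cards.items():
    delta = d["measured_avg_w"] - d["tdp_w"]
    print(f"{name}: rated {d['tdp_w']} W, measured {d['measured_avg_w']} W ({delta:+d} W)")
```

Same sticker number, up to 30 W apart in this made-up case, which is why reviews that measure actual draw are more useful than spec sheets.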
Oh ok thanks
You think a 50-watt difference will noticeably heat up your room? You must have a tiny room then, or the difference will be hardly measurable.
It is already hot enough that I don't want to add more heat to it. Also yes I have a tiny room.
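A back-of-the-envelope estimate for the room-heating question: treat the extra 50 W as heating a sealed box of air and divide by the air's heat capacity. This deliberately ignores heat loss through walls, doors, and ventilation, so it's an upper bound; real rooms leak heat and the steady-state rise is much smaller. The room size is an assumption:

```python
# Upper-bound estimate of how fast an extra 50 W warms a small sealed room.
# Ignores all heat loss, so real-world warming is far slower.
AIR_DENSITY = 1.2          # kg/m^3, roughly sea-level air
AIR_SPECIFIC_HEAT = 1005.0  # J/(kg*K)

def temp_rise_per_hour(extra_watts: float, room_volume_m3: float) -> float:
    joules_per_hour = extra_watts * 3600.0
    heat_capacity = AIR_DENSITY * room_volume_m3 * AIR_SPECIFIC_HEAT  # J/K
    return joules_per_hour / heat_capacity

# Assumed: a small ~20 m^3 room (about 8 m^2 floor, 2.5 m ceiling).
print(f"{temp_rise_per_hour(50.0, 20.0):.1f} K/hour worst case")
```

Even a few kelvin per hour in the no-loss worst case suggests 50 W is not nothing in a tiny, poorly ventilated room, which matches the comment above.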