this post was submitted on 07 Mar 2024
17 points (69.8% liked)

Technology

Technological feat aside:

Revolutionary heat dissipating coating effectively reduces temperatures by more than 10%

78.5C -> 70C: (78.5 - 70) / 78.5 = 0.108 ≈ 10%, right?!

Well, not really. Celsius is an arbitrary temperature scale. The same temperatures in Kelvin give:

351.65K -> 343.15K: (351.65 - 343.15) / 351.65 = 0.024 ≈ 2% (???)

That's why you shouldn't take percentages of temperature changes. A more entertaining version: https://www.youtube.com/watch?v=vhkYcO1VxOk&t=374s
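A quick Python sketch of the problem (the function name is mine, the numbers are the ones above): the "percentage" depends entirely on where the scale puts its zero.

```python
def pct_drop(old, new):
    """Relative drop, using the scale's own zero as the reference point."""
    return (old - new) / old

old_c, new_c = 78.5, 70.0                      # Celsius
old_k, new_k = old_c + 273.15, new_c + 273.15  # same temperatures in Kelvin

print(f"Celsius: {pct_drop(old_c, new_c):.1%}")  # 10.8%
print(f"Kelvin:  {pct_drop(old_k, new_k):.1%}")  # 2.4%
```

Same physical change, different number, because the zero of each scale is arbitrary.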

[–] 7heo@lemmy.ml 3 points 8 months ago* (last edited 8 months ago) (2 children)

I would argue that what makes sense when considering temperature percentages with respect to dissipation is the difference between the old and new temperatures, divided by the difference between the old temperature and that of the system at rest.

Which is then a ratio of offsets, rather than a ratio of one offset and a difference with an arbitrarily defined origin.

In this case, it is fair to assume the system's temperature at rest is around 292K, or 19C.

Which would give: (78.5C - 70C) / (78.5C - 19C) = 14.29%, or (351.65K - 343.15K) / (351.65K - 292.15K) = 14.29%.
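The same ratio of offsets in code (function name is mine): it comes out identical in Celsius and Kelvin, because the arbitrary origin of the scale cancels out.

```python
def cooling_fraction(old, new, ambient):
    """Drop achieved by the cooling, relative to the original rise over ambient."""
    return (old - new) / (old - ambient)

in_celsius = cooling_fraction(78.5, 70.0, 19.0)
in_kelvin = cooling_fraction(351.65, 343.15, 292.15)

print(f"{in_celsius:.2%} vs {in_kelvin:.2%}")  # 14.29% vs 14.29%
```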

[–] conciselyverbose@sh.itjust.works 4 points 8 months ago (1 children)

GamersNexus presents their temperature testing in terms of difference from room temperature, so this is probably how they'd do this comparison.

I'm not sure they'd see a reason to cover ram temperature unless it was approaching actual risk of harm or enabled higher clocks, though. Comparing cases or CPU coolers by temperature makes sense. Comparing GPUs when they're using the same chip and cooling performance is a big part of the difference between models? Sure. But RAM? Who cares.

[–] 7heo@lemmy.ml 1 points 8 months ago* (last edited 8 months ago)

I mean, I also don't really care about the temperature of my RAM unless it stops it from working. RAM overclocking isn't that useful, and unstable RAM sucks ass.

However, it doesn't matter what the component is: the original difference over ambient is the heat rise that operating the component produced, and the drop from the old to the new temperature is essentially the amount of that heat the specific cooling solution was able to handle. No matter the component.

Dividing the latter by the former gives the amount of cooling the solution provided, relative to the heat generated by operating the component. This works for any component and any cooling solution. Cooling further than ambient can be desirable for some use cases (that's why chillers exist), and that will simply give a percentage over 100.
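A sketch of that sub-ambient case with the same ratio (the 10C figure is made up for illustration): a chiller that pulls the component below ambient simply reports more than 100%.

```python
def cooling_fraction(old, new, ambient):
    """Drop achieved by the cooling, relative to the original rise over ambient."""
    return (old - new) / (old - ambient)

# Hypothetical chiller pulling the component from 78.5 C down to 10 C,
# i.e. below the 19 C ambient: the ratio exceeds 1.
print(f"{cooling_fraction(78.5, 10.0, 19.0):.1%}")  # 115.1%
```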