frezik

joined 1 year ago
[–] frezik@midwest.social -1 points 5 months ago

Doesn't matter. Having a buffer means either the buffer must be full before drawing, or you get screen tearing. It wasn't like racing the beam.

[–] frezik@midwest.social 9 points 5 months ago (1 children)

60Hz is what any NTSC TV would have had for consoles. Plenty of older computers, too. Lots of people gamed that way well into the 2000s.

Incidentally, if you do the same calculation above for PAL (50Hz), you end up at 10ms, or about 1.7ms more lag than NTSC. Many modern LCDs have response times under 2ms (which is on top of the console's internal frame timing, matched to NTSC or PAL). The implication for retro consoles is that the lag difference between NTSC CRTs and modern LCDs is about the same as the difference between NTSC and PAL CRTs.
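To make the comparison concrete, here's a quick sketch (the helper name is mine; the half-frame mid-screen convention is the standard one described elsewhere in this thread):

```python
def mid_screen_lag_ms(refresh_hz: float) -> float:
    """Mid-screen measurement: half of one refresh period, in ms."""
    return 1 / refresh_hz / 2 * 1000

pal = mid_screen_lag_ms(50)    # 10.0 ms
ntsc = mid_screen_lag_ms(60)   # ~8.3 ms
print(f"PAL vs NTSC difference: {pal - ntsc:.1f} ms")  # 1.7 ms
```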

[–] frezik@midwest.social 23 points 5 months ago* (last edited 5 months ago) (3 children)

Of course there are buffers. Once RAM got cheap enough to hold a buffer representing the whole screen, everyone did that. That was in the late 80s/early 90s.

There's some really bad misconceptions about how latency works on screens.

[–] frezik@midwest.social 31 points 5 months ago* (last edited 5 months ago) (4 children)

They don't have zero latency. It's a misconception.

The industry standard way to measure screen lag is from the middle of the screen. Let's say you have a 60Hz display and hit the mouse button to shoot the very moment it's about to draw the next frame, and the game manages to process the input before the draw starts. The beam starts to draw, and when it gets to the middle of the screen, we take our measurement. That works out to 1/60/2 s ≈ 8.3ms.
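The arithmetic is easy to check (a minimal sketch; the function name is mine):

```python
def mid_screen_lag_ms(refresh_hz: float) -> float:
    """Time from the start of a refresh until the beam reaches
    the middle of the screen: half of one refresh period."""
    return 1 / refresh_hz / 2 * 1000  # seconds -> milliseconds

print(f"{mid_screen_lag_ms(60):.1f} ms")  # 60 Hz: 8.3 ms
print(f"{mid_screen_lag_ms(90):.1f} ms")  # 90 Hz: 5.6 ms
```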

Some CRTs could do 90Hz, or even higher, but those were really expensive (edit: while keeping a high resolution, anyway). Modern LCDs can do better than any of them, but it took a long time to get there.

[–] frezik@midwest.social 3 points 5 months ago (1 children)

They primarily use evaporative cooling. Way less energy use, but no, it doesn't get returned.

[–] frezik@midwest.social 4 points 5 months ago (1 children)

Datacenters moved to using evaporative cooling to save power. Which it does, but at the cost of water usage.

Using salt water, or anything significantly contaminated like grey water, would mean sediment gets left behind that has to be cleaned up at greater cost. So yes, they generally do compete with drinking water sources.

There's no way nuclear gets built out in less than 10 years.

[–] frezik@midwest.social 3 points 5 months ago

Unlike purchasing things for imaginary gods, carbon credits could work in theory. At least well enough to be part of the solution. That is, if they were properly regulated around strategies that actually absorb carbon, and everyone were forced to be honest and transparent.

Which none of them do, of course.

[–] frezik@midwest.social 2 points 5 months ago

Compared to Porsche, or BMWs over an i3 size? No.

[–] frezik@midwest.social 3 points 5 months ago* (last edited 5 months ago) (2 children)

They aren't very good, and they probably can't be. The laws of physics limit what they can carry relative to their enormous size. The Hindenburg was the largest of them, but including passengers and crew together, it carried fewer than 100 people. They scale really, really poorly.

We can improve on old dirigibles somewhat with lighter weight materials and engines. We're ultimately limited by the volume of the lifting gas, and we're just not going to add that much more capacity. Even if someone figured out a vacuum dirigible (which would be very vulnerable to a puncture), it'd only improve things marginally. It's an interesting engineering challenge, though.
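A back-of-envelope buoyancy check makes the "marginal" point concrete. The densities below are standard sea-level figures (my numbers, not from the comment above): net lift per cubic meter is just the density of air minus the density of whatever fills the envelope.

```python
AIR_DENSITY = 1.225  # kg/m^3 at sea level, ~15 C
HYDROGEN    = 0.090  # kg/m^3
HELIUM      = 0.179  # kg/m^3

def lift_per_m3(fill_density: float) -> float:
    """Net lift (kg) per cubic meter of envelope volume."""
    return AIR_DENSITY - fill_density

h2  = lift_per_m3(HYDROGEN)  # ~1.135 kg/m^3
vac = lift_per_m3(0.0)       # 1.225 kg/m^3 for a perfect vacuum
print(f"vacuum gain over hydrogen: {(vac / h2 - 1) * 100:.0f}%")  # ~8%
```

Even a perfect vacuum only buys about 8% more lift than hydrogen per unit of volume, which is why the volume of the envelope, not the choice of gas, is the real constraint.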

One place where dirigibles might be useful is wind turbine blades. Blades aren't that heavy, but they can't get much longer if they have to be transported by highway. Constructing the blades on site is another option, so we'll see which one wins.

Science and engineering aren't magic that makes everything better over time, and people need to stop acting like they are. There are physical limits that we can't breach. As another example, we haven't significantly improved on the drag coefficients of designs by Porsche or the Chrysler Airflow back in the 1930s. There was a design Mercedes came up with a while back, based on the boxfish, that did reduce it further, but its frontal cross section is so large that it doesn't matter anyway. (It's also ugly as fuck, but that's a different matter.)
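The drag point comes down to the product Cd × A (drag coefficient times frontal area), which is what drag force is actually proportional to at a given speed. The numbers below are purely illustrative, not measured values for any real car:

```python
def drag_area(cd: float, frontal_area_m2: float) -> float:
    """Cd * A: the quantity aerodynamic drag is proportional to."""
    return cd * frontal_area_m2

sleek = drag_area(0.25, 2.2)  # low-slung body (made-up numbers)
boxy  = drag_area(0.19, 2.9)  # lower Cd but taller body (made-up numbers)
print(sleek, boxy)  # roughly equal: the lower Cd buys nothing
```

A lower drag coefficient on a taller body can leave Cd × A, and therefore drag, essentially unchanged.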

[–] frezik@midwest.social 3 points 5 months ago

There's also an argument out there that companies should stop talking about feature sizes (that are fudged for marketing all the time, anyway) and instead talk about density of components.

Also, if you think Moore's Law is about density of components, then the industry has kept up. However, that's not actually what Moore claimed way back when: https://wumpus-cave.net/post/2024/03/2024-03-20-moores-law-is-dead/index.html

[–] frezik@midwest.social 2 points 5 months ago (1 children)

The angstrom came out of physics, where a convenient unit for atomic-scale lengths was wanted. The chip industry only picked it up once feature sizes got down to that scale.

(Contrary to what a lot of people think, physicists do not strictly follow SI. They bypass it for reasons of convenience all the time.)

[–] frezik@midwest.social 2 points 5 months ago (5 children)

But a lot of those sedans have ranges around 120mi, like the Mini EV or BMW i3. Many of the ones that remain are luxury brands with luxury prices, like the BMW i7 or Porsche Taycan.
