this post was submitted on 16 Jun 2024
632 points (95.3% liked)
Greentext
4430 readers
945 users here now
This is a place to share greentexts and witness the confounding life of Anon. If you're new to the Greentext community, think of it as a sort of zoo with Anon as the main attraction.
Be warned:
- Anon is often crazy.
- Anon is often depressed.
- Anon frequently shares thoughts that are immature, offensive, or incomprehensible.
If you find yourself getting angry (or god forbid, agreeing) with something Anon has said, you might be doing it wrong.
founded 1 year ago
you are viewing a single comment's thread
view the rest of the comments
Can someone please explain why a CRT has 0 blur and 0 latency when it literally draws each pixel one by one using an electron beam running across the screen line by line?
Because it is analog. There are no buffers or anything in between. Your PC sends the image data in analog through VGA, pixel by pixel. These pixels are projected instantly, in the requested color, onto the screen.
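To put numbers on "pixel by pixel": the standard 640×480 @ 60 Hz VGA mode uses a 25.175 MHz pixel clock, so each pixel is on the wire for roughly 40 ns. A quick sketch with the standard timing totals (800 clocks per line, 525 lines per frame, including blanking):

```python
# Rough timing for 640x480 @ 60 Hz VGA scanout (standard timing totals).
PIXEL_CLOCK_HZ = 25_175_000   # 25.175 MHz pixel clock
H_TOTAL = 800                 # 640 visible + blanking/sync = 800 clocks per line
V_TOTAL = 525                 # 480 visible + blanking/sync = 525 lines per frame

pixel_time_ns = 1e9 / PIXEL_CLOCK_HZ           # ~39.7 ns per pixel
line_time_us = H_TOTAL * pixel_time_ns / 1e3   # ~31.8 us per scanline
frame_time_ms = V_TOTAL * line_time_us / 1e3   # ~16.7 ms per frame (60 Hz)

print(f"{pixel_time_ns:.1f} ns/pixel, {line_time_us:.1f} us/line, {frame_time_ms:.2f} ms/frame")
```

So a pixel lights up about 40 ns after it leaves the video card. There is no frame buffer in the monitor to wait on; the only "delay" is wherever the beam happens to be in its scan.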
That makes 0 latency in the monitor, but how much latency is there in the hardware that converts the digital image to analog signals? Isn't the latency just moved to the PC side?
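Not really, and you can estimate why. The video card's DAC (the RAMDAC) converts each pixel value to a voltage in lockstep with the pixel clock, so its pipeline is only a few clocks deep. A back-of-the-envelope comparison (the pipeline depth of 4 clocks is an illustrative assumption, not a measured figure for any specific chip):

```python
# How much latency does digital-to-analog conversion add on the PC side?
# A RAMDAC converts pixels in lockstep with scanout; assume a pipeline a
# few pixel clocks deep (illustrative assumption, not a measured value).
PIXEL_CLOCK_HZ = 25_175_000   # 640x480 @ 60 Hz VGA pixel clock
PIPELINE_DEPTH = 4            # assumed DAC pipeline depth, in pixel clocks

dac_latency_us = PIPELINE_DEPTH * 1e6 / PIXEL_CLOCK_HZ   # ~0.16 us
frame_us = 1e6 / 60                                      # ~16667 us per frame

print(f"DAC pipeline: ~{dac_latency_us:.2f} us vs {frame_us:.0f} us for one buffered frame")
```

Under these assumptions the conversion costs a fraction of a microsecond, roughly 100,000× less than buffering a single frame, which is what digital display pipelines often end up doing.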
I warn you before you dive in: this is a rabbit hole. Some key points (not exact, but in layman's terms):
- You don't see in digital; digital is "code". You see in analog, even on an LCD (think of sound vs. video, it's the same thing).
- Digital-only signals lacked contrast, brightness, color, basically all adjustments. So the signal went back and forth, adding even more latency.
Maybe think of it like a TV's game mode, where all the adjustments are turned off to speed up the digital-to-analog conversions.
Or like compressed video (digital) vs. uncompressed video (analog), where compression means you can send more data over the same link, but latency is added because the video has to be compressed and decompressed at each end.
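That compression analogy can be made concrete: any stage that has to hold frames before it can emit output adds a multiple of the frame time. A small sketch (the buffering counts are hypothetical examples, not figures for any particular codec):

```python
# Latency added by frame buffering in a video pipeline (illustrative numbers).
FPS = 60
FRAME_MS = 1000 / FPS   # ~16.7 ms per frame at 60 fps

def pipeline_latency_ms(buffered_frames: int) -> float:
    """Delay added when a stage must hold this many frames before output."""
    return buffered_frames * FRAME_MS

print(pipeline_latency_ms(0))   # pass-through, nothing buffered: 0 ms added
print(pipeline_latency_ms(2))   # e.g. a 2-frame encoder lookahead: ~33 ms added
```

So an uncompressed pass-through adds nothing, while even a modest two-frame buffer adds more delay than an entire CRT refresh.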