this post was submitted on 16 Jun 2024
632 points (95.3% liked)
Greentext
This is a place to share greentexts and witness the confounding life of Anon. If you're new to the Greentext community, think of it as a sort of zoo with Anon as the main attraction.
Be warned:
- Anon is often crazy.
- Anon is often depressed.
- Anon frequently shares thoughts that are immature, offensive, or incomprehensible.
If you find yourself getting angry (or god forbid, agreeing) with something Anon has said, you might be doing it wrong.
founded 1 year ago
you are viewing a single comment's thread
view the rest of the comments
I miss my CRT
I mean, I have some nostalgia moments too, but while I think OP's got a point that the LCD monitors that replaced CRTs were in many ways significantly worse technically at the time, I also think that in pretty much every respect, current LCD/LED displays beat CRTs.
Looking at OP's benefits:
CRT phosphors didn't just immediately go dark either. Yeah, they were faster than some LCDs at the time, which were very slow and left enormous mouse-pointer trails. But if you've ever seen a flashing cursor on a CRT and noticed that it actually faded out, you know there was some response time.
https://en.wikipedia.org/wiki/Comparison_of_CRT,_LCD,_plasma,_and_OLED_displays
That's not really a function of the display technology. Yeah, a traditional analog CRT television with nothing else involved just spews the signal straight to the screen, but you can stick processing in there too, as cable boxes did. The real problem was "smart" TVs adding stuff like image processing that involved buffering some video.
At the time that people started getting LCDs, a lot of them were just awful in many respects compared to CRTs.
As you moved around, the colors on many types of LCD panels shifted dramatically with viewing angle.
Response times were very slow; moving a cursor around on some LCD displays left a trail as the panel sluggishly updated. Looked kind of like some e-ink displays do today.
Contrast wasn't great; blacks were really murky grays.
Early LCDs couldn't do full 24-bit color depth, and dealt with it by dithering, which was a real step back in quality.
Pixels could get stuck.
But those have mostly been dealt with.
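The dithering trick in that list is easy to sketch. Here's a minimal ordered (Bayer) dither in Python, roughly what early 18-bit panels did to approximate 24-bit color; the function name and the 6-bit target are just illustrative, since the real logic lived in the panel's hardware:

```python
# Minimal sketch of ordered (Bayer) dithering: each 8-bit channel is
# quantized to 6 bits, and a position-dependent threshold decides
# whether a pixel rounds up or down, so flat areas average out to the
# original shade. Names and the 6-bit target are illustrative.

BAYER_4 = [  # classic 4x4 Bayer threshold matrix, values 0..15
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_channel(value, x, y, bits=6):
    """Quantize an 8-bit channel value to `bits` bits at pixel (x, y)."""
    drop = 8 - bits                                  # bits discarded (2)
    step = 1 << drop                                 # quantization step (4)
    threshold = BAYER_4[y % 4][x % 4] * step // 16   # offset 0..step-1
    quantized = min(value + threshold, 255) >> drop  # 6-bit value, 0..63
    return quantized << drop                         # expand back to 0..255
```

Dithering a flat block of value 130 gives a checkerboard of 128s and 132s that averages out to exactly 130, which is why it looked fine from a distance but grainy up close.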
CRTs had a lot of problems too, and LED/LCD displays really address those:
They were heavy. This wasn't so bad early on, but as CRTs grew, they really started to suck to work with. I remember straining a muscle in my back getting a >200lb television up a flight of stairs.
They were blurry. That could be a benefit, in that some software, like games, had graphics optimized for it, letting the blur "blend" adjacent pixels together, which is why old emulators often have some form of CRT video artifact emulation. But in a world where software can be designed around a crisp, sharp display, I'd rather have the sharpness. The blurriness also wasn't even, especially on flat-screen CRTs; it tended to be worse in the corners. And you could only push the resolution and refresh rate so high, and the higher you went, the blurrier things got.
There were scanlines; brightness wasn't even.
You could get color fringing.
Sony Trinitrons (rather nice late-era CRT computer displays) had a faint horizontal line crossing the screen where a thin damper wire was strung to keep the aperture grille from vibrating.
They didn't deal so well with wider aspect ratios (not if you wanted a flat screen, anyway). For movies and such, we're better off with wider displays.
Analog signalling meant that as cables got longer, the image got blurrier.
They used more electricity and generated more heat than LED/LCD displays.
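The CRT-artifact emulation mentioned above mostly boils down to filters like horizontal blur and scanline darkening. A minimal grayscale sketch in Python; the function name and kernel weights are illustrative, and real emulator shaders are far more elaborate:

```python
# Minimal sketch of a CRT-style post-filter: a 1-2-1 horizontal blur
# to "blend" adjacent pixels, plus dimmed alternate rows to fake
# scanlines. A frame is a list of rows of 0..255 ints; names and
# weights are illustrative, not taken from any real emulator.

def crt_filter(frame, scanline_gain=0.6):
    """Blur each row horizontally, then dim every odd row."""
    out = []
    for y, row in enumerate(frame):
        blurred = []
        for x, v in enumerate(row):
            left = row[x - 1] if x > 0 else v             # clamp at edges
            right = row[x + 1] if x < len(row) - 1 else v
            blurred.append((left + 2 * v + right) // 4)   # 1-2-1 kernel
        gain = 1.0 if y % 2 == 0 else scanline_gain       # darken odd rows
        out.append([int(p * gain) for p in blurred])
    return out
```

A hard edge like [0, 100] comes out as [25, 75]: the step gets smeared across two pixels, which is exactly the smoothing that old pixel art was drawn to exploit.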
My Trinitron monitor actually had two of those stabilizing wires. They were very thin, much thinner than even a single scan line, but you could definitely notice them on an all white background.
Apparently the dividing line was 15 inches:
https://en.wikipedia.org/wiki/Trinitron