this post was submitted on 27 Oct 2025
461 points (92.3% liked)
Technology
I can pretty confidently say that 4k is noticeable if you're sitting close to a big TV. I don't know that 8k would ever really be noticeable unless the screen is strapped to your face, à la VR. For most cases, 1080p is fine, and past HD other factors start to matter way more than resolution: bit rate, compression type, dynamic range, etc.
Seriously, articles like this are just clickbait.
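To put numbers on "noticeable if you're sitting close": a quick sketch of the distance beyond which pixels blur together, assuming ~1 arcminute of angular resolution for 20/20 vision and a 16:9 panel (both standard figures, not from the thread):

```python
import math

def max_useful_distance_in(diagonal_in, horizontal_px, acuity_arcmin=1.0):
    """Distance (inches) beyond which a 20/20 eye can no longer
    resolve individual pixels on a 16:9 panel, assuming ~1 arcminute
    of angular resolution."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # 16:9 geometry
    pixel_pitch = width_in / horizontal_px            # inches per pixel
    theta = math.radians(acuity_arcmin / 60)          # 1 arcmin in radians
    return pixel_pitch / math.tan(theta)

# 55" panel: 1080p pixels blend together past ~7 ft, 4k past ~3.5 ft
for px, label in [(1920, "1080p"), (3840, "4k"), (7680, "8k")]:
    print(f"{label}: {max_useful_distance_in(55, px) / 12:.1f} ft")
```

Doubling the horizontal resolution exactly halves the distance, which is why each jump pushes the benefit closer and closer to the couch.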
They also ignore all sorts of use cases.
Like for a desktop monitor, 4k is extremely noticeable vs even 1440p or 1080p.
Unless you're sitting very far away, the sharpness of text and therefore amount of readable information you can fit on the screen changes dramatically.
The article was about TVs, not computer monitors. Most people don't sit nearly as close to a TV as they do a monitor.
Oh absolutely, but even TVs are used in different contexts.
Like the thing about text applies to console games, applies to menus, applies to certain types of high detail media etc.
Complete bullshit articles. The same thing happened when 720p became 1080p. So many echoes of "oh, you won't see the difference unless the screen is huge"… like no, you can see the difference on a tiny screen.
We’ll have these same bullshit arguments when 8k becomes the standard, and for every large upgrade from there.
I agree to a certain extent, but there are diminishing returns, same as with refresh rates. The leap from 1080p to 4k is big. I don't know how noticeable upgrading from 4k to 8k would be for the average TV setup.
For vr it would be awesome though
8K would probably be really good for large computer monitors, due to viewing distances. It would be really taxing on the hardware if you were using it for gaming, but reasonable for tasks that aren't graphically intense.
Computer monitors (for productivity tasks) are a little different, though, in that you are looking at a section of the screen rather than the screen as a whole, as one might with video. So having extra screen real estate can be rather valuable.
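A rough pixels-per-inch comparison makes the monitor case concrete. The 32" panel size here is my own assumption for illustration, not from the thread:

```python
import math

def ppi(diagonal_in, horiz_px, vert_px):
    """Pixels per inch of a rectangular panel."""
    diag_px = math.hypot(horiz_px, vert_px)  # diagonal in pixels
    return diag_px / diagonal_in

# Same hypothetical 32" panel at three resolutions:
for w, h, label in [(1920, 1080, "1080p"), (3840, 2160, "4k"), (7680, 4320, "8k")]:
    print(f'32" {label}: {ppi(32, w, h):.0f} ppi')
```

At typical monitor distances of two feet or so, the jump from ~69 ppi to ~138 ppi is obvious in text rendering, which is why the desktop case is so different from the couch case.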
So, a 55-inch TV, which is pretty much the smallest 4k TV you could get when they were new, has benefits over 1080p at a distance of 7.5 feet... how far away do people sit from their TVs? Am I weird?
And at the size of computer monitors, for the distance they are from your face, they would always have full benefit on this chart. And even working into 8k a decent amount.
And that's only for people with typical vision, for people with above-average acuity, the benefits would start further away.
But yeah, for VR for sure, since the headset's panel resolution directly determines how far away a virtual 4k flat screen can be properly re-created. If your headset is only 4k, a 4k flat screen in VR is only worth it when it takes up most of your field of view. That's how I have mine set up, but I imagine most people would prefer it to be half the size or twice the distance away, or some combination.
So 8k screens in VR will be very relevant for augmented reality, since performance costs there are pretty low anyway. They'd still convey benefits if you are running actual VR games at half the physical panel resolution because performance demands would otherwise be too high: you get some relatively free upscaling. It won't look as good as native 8k, but it helps a bit.
There is also fixed and dynamic foveated rendering to think about. With an 8k screen, even running only 10% of it at that resolution, 20% at 4k, 30% at 1080p, and the remaining 40% at 540p, and even with the overhead of so many foveation steps, you'd get a notable reduction in performance cost. Fixed foveation would likely need to lean toward bigger percentages at higher resolutions, but it has the advantage of not having to move around from frame to frame, so it can benefit from more pre-planning and optimization.
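A back-of-the-envelope cost model for that exact foveation split, assuming each region's pixel cost scales with the square of its render resolution relative to the native 8k panel:

```python
# Rough cost model for the foveation split described above.
NATIVE_8K = 7680 * 4320  # ~33.2 Mpx

regions = [  # (fraction of screen area, vertical render resolution)
    (0.10, 4320),  # fovea at native 8k
    (0.20, 2160),  # near periphery at 4k density
    (0.30, 1080),  # mid periphery at 1080p density
    (0.40, 540),   # far periphery at 540p density
]

# Each region renders at (res / native)^2 of the native pixel density.
rendered_px = sum(area * NATIVE_8K * (res / 4320) ** 2
                  for area, res in regions)

print(f"effective: {rendered_px / 1e6:.1f} Mpx "
      f"({rendered_px / NATIVE_8K:.1%} of native 8k)")
# For comparison, native 4k is ~8.3 Mpx everywhere.
```

Under this model the split comes out to roughly 17.5% of the native 8k pixel count, i.e. less shading work than rendering plain 4k across the whole panel, before counting the per-region overhead.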
A lot of us mount a TV on the wall and watch from a couch across the room.
And you get a TV small enough that it doesn't suit that purpose? Looks like 75 inch to 85 inch is what would suit that use case. Big, but still common enough.
I've got a 55" LCD TV and a 14" laptop. On the couch, the TV screen looks to me about as big as the laptop screen on my belly/lap, and I've got perfect vision; on the laptop I can clearly see the difference between 4k and Full HD, on the TV, not so much.
I think TV screens aren't as good as PC ones, but the TVs' image processors also turn 1080p sources into better images than computers do.
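The "looks about as big" observation can be checked with a little trigonometry. A quick sketch, assuming a laptop distance of about 20 inches (my guess, not stated in the thread) and 16:9 panels:

```python
import math

def angular_width_deg(diagonal_in, distance_in):
    """Horizontal angle a 16:9 screen subtends at the eye."""
    width = diagonal_in * 16 / math.hypot(16, 9)
    return math.degrees(2 * math.atan(width / (2 * distance_in)))

laptop_angle = angular_width_deg(14, 20)  # 14" laptop ~20" from the eyes

# A screen subtends the same angle when distance scales with diagonal,
# so the 55" TV matches the laptop at 20 * 55/14 inches away.
tv_distance_in = 20 * 55 / 14
print(f'both ~{laptop_angle:.0f} deg: 14" at 20 in, 55" at {tv_distance_in / 12:.1f} ft')
```

So a 55" TV at about six and a half feet fills the same angle as a 14" laptop on your lap, which fits the comment: same apparent size, but the laptop packs its pixels into a far smaller physical area.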
Hmm, I suppose the quality of the TV might matter. Not to mention actually going through the settings and making sure it isn't doing anything to process the signal. And also not streaming compressed crap to it. I do visit other people's houses sometimes and definitely wouldn't know they were using a 4k screen for what they're watching.
But I am assuming actually displaying 4k content to be part of the testing parameters.
Yeah, well, my comparisons are all with local files, no streaming compression.
Also, usually when people use the term "perfect" vision, they mean 20/20. Is that the case for you too? Another term for that is average vision; people who have better vision than that have "better than average" vision.
Idk what 20/20 is; I guess you guys use a different scale. My last mandatory vision test at work was 12/10, with 6/7 on I-don't-remember-which color recognition range, but I'm not sure about the latter 'cause it was OK last year and 6/7 the year before also. IIRC the best score for visual acuity is 18/10, but I don't think they test that far during work visits; I'd have to go to an ophthalmologist to know.
I would imagine it's the same scale, just with a base of 10 feet instead of 20. So on yours you would see at 24 feet what the average person sees at 20 feet, assuming a linear relation and no circumstantial drop-off.
I doubt it's feet, but if it's a distance, I guess it doesn't matter much
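It probably isn't a distance at all: the x/10 scores are usually read as a plain acuity ratio, so the conversion is just arithmetic. A sketch under that assumption (the scale interpretation is my guess, consistent with the exchange above):

```python
def tenths_to_snellen(metric_tenths):
    """Convert an 'x/10'-style acuity score to a 20-foot Snellen pair,
    assuming x/10 is a simple acuity ratio (1.0 = normal)."""
    acuity = metric_tenths / 10        # 12/10 -> 1.2x normal acuity
    return 20, 20 / acuity             # e.g. 12/10 -> roughly 20/16.7

num, den = tenths_to_snellen(12)
print(f"12/10 is about {num}/{den:.1f} on the Snellen scale")
```

The ratio is what matters for this thread: eyes with 1.2x acuity resolve a given pixel pitch from 1.2x the distance, so every "noticeable up to X feet" figure stretches by the same factor.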
People are legit sitting 15+ feet away and thinking a 55 inch TV is good enough... Optimal viewing angles for most reasonably sized rooms require a 100+ inch TV and 4k or better.
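The "100+ inch" figure checks out against the commonly cited 30-degree SMPTE viewing-angle recommendation. A quick sketch, taking the 15-foot distance from the comment above:

```python
import math

def diagonal_for_angle_in(distance_ft, angle_deg=30):
    """16:9 diagonal (inches) that fills a given horizontal viewing
    angle; 30 degrees is the commonly cited SMPTE recommendation."""
    width = 2 * (distance_ft * 12) * math.tan(math.radians(angle_deg / 2))
    return width * math.hypot(16, 9) / 16

print(f'15 ft couch: ~{diagonal_for_angle_in(15):.0f}" diagonal for 30 degrees')
```

At 15 feet this comes out to roughly a 111-inch diagonal, so a 55" set at that distance fills only about half the recommended angle.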
There's a giant TV at my gym that is mounted right in front of some of the equipment, so my face is inches away. It must have some insane resolution because everything is still as sharp as a standard LCD panel.
It would be a more useful graph if the y-axis cut off at 10 feet, less than a quarter of what it plots.
Not sure in what universe discussing the merits of 480p at 45 ft is relevant, but it ain't this one. If I'm sitting 8 ft away from my TV, I will notice the difference if my screen is over 60 inches, which is where the vast majority of consumers operate.
Good to know that pretty much anything looks fine on my TV, at typical viewing distances.
The counterpoint is that if you're sitting that close to a big TV, it's going to fill your field of view to an uncomfortable degree.
4k and higher is for small screens close up (desktop monitor), or very large screens in dedicated home theater spaces. The kind that would only fit in a McMansion, anyway.
How many feet away is a computer monitor?
Or a 2-4 person home theater distance that has good fov fill?