This is so much bullshit. 4K does make a difference, especially if you're playing console games on a large TV (65" and up).
HDR 1080p is what most people can live with.
Highly depends on screen size and viewing distance, but no reasonable setup in a normal home will probably ever need more than 8k at the high end, and 4K covers most cases.
Contrast ratio/HDR and per-pixel backlighting type technology is where the real magic is happening.
4k is way better than 1080p, it's not even a question. You can see that shit from a mile away. 8k is only better if your TV is comically large.
I think you overestimate the quality of many humans' eyes. Many people walk around with slightly bad vision and manage fine. Many older folks have bad vision even corrected. I cannot distinguish between 1080p and 4k in the majority of circumstances. Stick me in front of a computer and I can notice, but TVs and computers are viewed from wildly different distances.
And the size of most people's TV versus how far away they are.
4k is perfectly fine for like 99% of people.
Sony Black Trinitron was peak TV.
Sure, but hear me out: imagine having most of your project's source code on the screen at the same time without having to line-wrap.
I've been using "cheap" 43" 4k TVs as my main monitor for over a decade now. I used to go purely with Hisense; they have great colour and PC text clarity, and I could get them most places for $250 CAD. But with this year's model they switched from an RGB subpixel layout to BGR, which is tricky to get working cleanly on a computer, even when forcing a BGR layout in the OS. One trick is to just flip the TV upside down (yes, it actually works), but it made the whole physical setup awkward. I went with a Sony recently for significantly more, but the picture quality is fantastic.
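For anyone hitting the same BGR headache on Linux, here's a rough sketch of the fontconfig route (assuming your desktop honors a user fonts.conf; Windows has a roughly equivalent toggle in the ClearType tuner):

```python
from pathlib import Path

# Sketch of the Linux/fontconfig fix: tell the font rasterizer the
# panel's subpixels run B-G-R instead of R-G-B, so subpixel-antialiased
# text stops colour-fringing. Warning: this overwrites any existing
# user fonts.conf.
FONTS_CONF = """\
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<fontconfig>
  <match target="font">
    <edit name="rgba" mode="assign"><const>bgr</const></edit>
  </match>
</fontconfig>
"""

conf = Path.home() / ".config/fontconfig/fonts.conf"
conf.parent.mkdir(parents=True, exist_ok=True)
conf.write_text(FONTS_CONF)
```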
And then there’s the dev who still insists on limiting lines to 80 chars, so you have all that blank space to the side and have to scroll forever per file, sigh….
Split screen yo
Kind of a tangent, but properly encoded 1080p video with a decent bitrate actually looks pretty damn good.
A big problem is that we've gotten so used to streaming services delivering visual slop, like YouTube's 1080p option which is basically just upscaled 720p and can even look as bad as 480p.
I can still find 480p videos from when YouTube first started that rival the quality of the compressed crap “1080p” we get from YouTube today. It’s outrageous.
Sadly most of those older YouTube videos have been run through multiple re-compressions and look so much worse than they did at upload. It's a major bummer.
Yeah, I'd way rather have higher-bitrate 1080p than 4k. Seeing banding in big dark or light areas of the screen is infuriating.
For most streaming? Yeah.
Give me a good 4k Blu-ray though. High bitrate 4k
I mean yeah I'll take higher quality. I'd just rather have less lossy compression than higher resolution
i'd rather have proper 4k.
> A big problem is that we’ve gotten so used to streaming services delivering visual slop, like YouTube’s 1080p option which is basically just upscaled 720p and can even look as bad as 480p.
YouTube locks the good bitrates behind the premium paywall, and even as a premium user you don't get to select a high bitrate when the source video was low-res.
That's why videos should be upscaled before upload to force YouTube into offering high bitrate options at all. A good upscaler produces better results than simply stretching low-res videos.
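For the simple-stretch route, something like this works as a sketch (assuming ffmpeg with libx264 is on the PATH; the filenames are placeholders, and as said above a proper upscaler beats plain Lanczos):

```python
import subprocess

# Minimal sketch of the "stretch before upload" trick: hand YouTube a
# 2160p file so it transcodes in its higher-bitrate 4k ladder.
def upscale_for_upload(src, dst):
    subprocess.run(
        [
            "ffmpeg", "-i", src,
            "-vf", "scale=3840:2160:flags=lanczos",  # resample to 2160p
            "-c:v", "libx264",
            "-crf", "16",        # near-transparent quality for the upload master
            "-preset", "slow",
            "-c:a", "copy",      # leave the audio stream untouched
            dst,
        ],
        check=True,
    )

upscale_for_upload("clip_1080p.mp4", "clip_2160p_upload.mp4")
```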
This is why I still use 768p as my preferred resolution, despite having displays that can go much higher. I hate that all TVs now try to go as big as possible, artificially inflating the price for no real benefit. I also hate that modern displays aren't as flexible as CRTs were. A CRT can handle pretty much any resolution you throw at it, but modern TVs and monitors freak out if you don't feed them an exact resolution, which means either input lag because the display has to upscale the image, or a potential performance hit if the display forces the connected device to handle the upscaling.

I can pretty confidently say that 4k is noticeable if you're sitting close to a big tv. I don't know that 8k would ever really be noticeable, unless the screen is strapped to your face, a la VR. For most cases, 1080p is fine, and there are other factors that start to matter way more than resolution after HD. Bit-rate, compression type, dynamic range, etc.
So, a 55-inch TV, which is pretty much the smallest 4k TV you could get when they were new, has benefits over 1080p at a distance of 7.5 feet... how far away do people watch their TVs from? Am I weird?
And at computer-monitor sizes and distances from your face, they would always show the full benefit on this chart, and would even get a decent amount out of 8k.
And that's only for people with typical vision; for people with above-average acuity, the benefits would start even further away.
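If anyone wants to sanity-check the chart, here's the back-of-the-envelope version, assuming the usual rule of thumb that 20/20 vision resolves about 1 arcminute (it lands close to the chart's 7.5 ft for a 55" 1080p panel):

```python
import math

# Farthest distance at which a panel's individual pixels are still
# distinguishable to 20/20 eyes; sit any closer and the next
# resolution step up starts to pay off.
def pixel_visibility_limit_ft(diagonal_in, horiz_px, aspect=(16, 9)):
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)  # panel width from diagonal
    pixel_pitch_in = width_in / horiz_px
    one_arcmin = math.radians(1 / 60)
    return pixel_pitch_in / math.tan(one_arcmin) / 12  # inches -> feet

print(f"{pixel_visibility_limit_ft(55, 1920):.1f} ft")  # ~7.2 ft: closer than this, 4k beats 1080p on a 55"
print(f"{pixel_visibility_limit_ft(55, 3840):.1f} ft")  # ~3.6 ft: closer than this, 8k would beat 4k
```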
But yeah, for VR for sure, since headset resolution directly determines how far away a 4k flat screen can be properly re-created. If your headset is only 4k, a virtual 4k flat screen is only worth it when it takes up most of your field of view. That's how I have mine set up, but I would imagine most people would prefer it at half the size or twice the distance, or some combination.
So 8k screens in headsets will be very relevant for augmented reality, since performance costs there are pretty low anyway. They'd still convey benefits if you run actual VR games at half the physical panel resolution because the performance demand would otherwise be too high; you get some relatively free upscaling that way. It won't look as good as native 8k, but it helps a bit.
There is also fixed and dynamic foveated rendering to think about. With an 8k screen, even running only 10% of it at that resolution, 20% at 4k, 30% at 1080p, and the remaining 40% at 540p, you'll get a notable reduction in performance cost even with the overhead of so many foveation steps. Fixed foveated rendering would likely need to lean toward bigger shares at the higher resolutions, but it has the advantage of not moving around from frame to frame, so it can benefit from more pre-planning and optimization.
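Rough numbers for that split, using the percentages above and ignoring the per-step overhead:

```python
# Pixel-budget math for the foveation split described above. Each tier
# is a share of screen area shaded at that resolution's density
# relative to native 8k (4k is 1/4 the pixels per unit area,
# 1080p is 1/16, 540p is 1/64).
NATIVE_8K = 7680 * 4320  # ~33.2 million pixels

tiers = [  # (share of screen area, shading density vs native)
    (0.10, 1),       # fovea at full 8k
    (0.20, 1 / 4),   # 4k ring
    (0.30, 1 / 16),  # 1080p ring
    (0.40, 1 / 64),  # 540p periphery
]

shaded = sum(area * density for area, density in tiers)
print(f"{shaded:.1%} of native shading cost")            # 17.5%
print(f"{shaded * NATIVE_8K / 1e6:.1f} MPix per frame")  # ~5.8 vs 33.2 native
```

So even with fairly generous high-res tiers you're shading under a fifth of the native pixel count, before overhead.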
A lot of us mount a TV on the wall and watch from a couch across the room.
I've got a 55" LCD TV and a 14" laptop. On the couch, the TV screen looks about as big to me as the laptop screen on my belly/lap, and I've got perfect vision; on the laptop I can clearly see the difference between 4k and Full HD, on the TV, not so much.
I think TV screens aren't as good as PC ones, but the TVs' image processors also turn 1080p files into better images than computers do.
Hmm, I suppose the quality of the TV might matter. Not to mention actually going through the settings and making sure it isn't doing anything to process the signal. And also not streaming compressed crap to it. I do visit other people's houses sometimes and definitely wouldn't know they were using a 4k screen for what they're watching.
But I am assuming actually displaying 4k content to be part of the testing parameters.
Seriously, articles like this are just clickbait.
They also ignore all sorts of use cases.
Like for a desktop monitor, 4k is extremely noticeable vs even 1440P or 1080P/2k
Unless you're sitting very far away, the sharpness of text and therefore amount of readable information you can fit on the screen changes dramatically.
Complete bullshit articles. The same thing happened when 720p became 1080p. So many echoes of “oh you won’t see the difference unless the screen is huge”… like no, you can see the difference on a tiny screen.
We’ll have these same bullshit arguments when 8k becomes the standard, and for every large upgrade from there.
The article was about TVs, not computer monitors. Most people don't sit nearly as close to a TV as they do a monitor.
This discussion drives me crazy because it’s the EXACT SAME FUCKING discussion that happened when 1080p screens became available in the 00s. So many people argued “oh it depends how far away you sit but you don’t really notice it” and “oh if the screen size is small your eyes can’t tell”
NO mothafucka, if you have halfway decent eyesight there’s NO WAY you won’t notice a huge change in quality from 720p to 1080p, even on a 6” screen. 1080p to 4k is noticeable on almost ANY size screen (we all just skip 1440p, don’t we?) and as the size of the screen goes up and up, it just gets more and more noticeable.
Edit: Forgot to mention, a big reason I heard people making this argument so much in the ‘00s is because I was in TV and computer sales.
I don't remember that discussion at all... I remember people being super excited for 1080p, but annoyed that there was no content for it because DVDs were still 480p and TV content was similar. Blu-rays were 1080p, but weren't really a thing until the late '00s.
We've had 4k for a decade, and there's still not much content for it. When there is, the difference w/ 1080p isn't so significant as to be worth the cost, as it's usually just upscaled 1080 content. 4k makes a lot of sense for a monitor that's 30" or larger, but for a TV where you're 10-15 feet away it doesn't make nearly as much sense.
I forgot to mention, I sold TVs when 1080p was popularized and HD DVD and Blu-ray came out hahaha. That’s mostly where I heard the “you can’t tell the difference between 720p and 1080p” BS. There was plenty of 1080p content by the end of the ‘00s and people were still making that argument.
Ah, ok. I'm mostly going based on personal experience from the time.
If you read RTINGS before buying a TV and setting it up in your room, you already knew this. Screen size and distance to TV are important for determining what resolution you actually need.
Most people sit way too far away from their 4K TV.
My father in law loves to sit and research.. that's his thing.. made a career out of it yadda yadda yadda..
He asked me about a new TV.. I was like..well have you seen rtings.com?
My MIL had to remind him to eat.. lmfao
He just rabbit holed for days. It was like he clicked a TV tropes link or something.
Anyway, he made a very informed decision and loves his TV. Haha
Given how much time I spend actually looking at the screen while the show/movie is on, it might as well be in ca. 2000 RealVideo 160x120 resolution.
I’m supposed to be watching Haunted Hotel right now. It’s on, but here we are…
The difference between 1080p and 2160p is night and day to me.
With all the menus nowadays, I mainly want sharp text.
This is highly dependent on screen size and viewing distance.
On a computer screen or a phone screen? No, it's not really noticeable. Hell, on some phone screen sizes/distances, you might not even be able to tell 720p vs 1080p.
On a 120"+ projector screen? Yes, it is definitely noticeable.
I have a small home theater and picked up a refurbished 4K LED projector (Epson 3200), coming from an old 1080p DLP (Viewsonic 8200) - massive difference!
As someone with a lowly 1080p projector and a 4k TV, I much prefer the 1080.
Viewsonic PX701HDH. I blow it up to 185" and I cannot see the pixels until I am uncomfortably close to the image.
The quality of the display/projector itself makes a huge difference. With a projector specifically, the bulb itself (and how much life it has left) is going to make a huge difference in picture quality.
My desktop monitor is a 54" 4K TV that I sit about 3' from. It's somewhat difficult for me to pick out individual pixels even when I lean in. My living room TV is 70" 4K, but I sit 15' away from it. There's no way I could tell the difference in 4K and 1080 from pixel density alone. I can however tell the difference between 4K and 1080 streams because of how shitty low bitrates look. 4K streams crush all of the dark colors and leave you with these nasty banding effects that I don't see as often on lower resolution streams.
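Funnily enough the math backs you up, assuming the usual ~1 arcminute limit for 20/20 vision:

```python
import math

# Angular size of one pixel, in arcminutes, for a 16:9 panel at a
# given viewing distance; ~1 arcmin is the rough 20/20 resolution limit.
def pixel_arcmin(diagonal_in, horiz_px, distance_ft):
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # 16:9 panel width
    pitch_in = width_in / horiz_px
    return math.degrees(math.atan2(pitch_in, distance_ft * 12)) * 60

print(f"{pixel_arcmin(54, 3840, 3):.2f} arcmin")   # ~1.17: right at the limit, hence the leaning in
print(f"{pixel_arcmin(70, 3840, 15):.2f} arcmin")  # ~0.30: far below it; resolution alone is invisible there
```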