this post was submitted on 16 Sep 2025
131 points (100.0% liked)
Linux
Well, yeah, given who makes it, but it's what I care about. I couldn't care less about obscure and academic efforts (or the profits of some evil tech companies) except as vague curiosities. HEVC wasn't designed with people like me in mind either, yet it means I can have, oh, 30% more stuff for the same space usage, and the encoders are mature enough that the difference in encode time between it and AVC is negligible on a decently powered server.
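The space savings described here are simple arithmetic. A minimal sketch, assuming (as a rough rule of thumb, not a measured figure) that HEVC reaches similar quality at about 70% of the AVC bitrate; the real ratio varies with content and encoder settings:

```python
# Rough capacity math for re-encoding a library from AVC to HEVC.
# The 0.70 ratio is an assumption (HEVC at ~70% of the AVC bitrate
# for similar quality), not a benchmark result.
HEVC_TO_AVC_SIZE_RATIO = 0.70

def hevc_size_gb(avc_size_gb: float, ratio: float = HEVC_TO_AVC_SIZE_RATIO) -> float:
    """Estimated size of an HEVC re-encode of an AVC file."""
    return avc_size_gb * ratio

def extra_capacity(ratio: float = HEVC_TO_AVC_SIZE_RATIO) -> float:
    """Fraction more content that fits in the same space after re-encoding."""
    return 1.0 / ratio - 1.0

print(hevc_size_gb(10.0))          # a 10 GB AVC file becomes ~7 GB
print(round(extra_capacity(), 2))  # ~0.43, i.e. ~43% more content per TB
```

Note that files 30% smaller actually buy a bit more than 30% extra shelf space: 1/0.7 is roughly a 43% capacity gain.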
Transparency (or great visual fidelity, period) also isn't likely the top concern here, because development is driven by companies that want to save money on bandwidth and perhaps on CDN storage.
Which I think is a shame. Lower bitrates for transparency -should- be the goal: getting streaming content to consumers at very high quality, ideally close to or equivalent to UHD Blu-ray for 4k. Instead we get companies that bit-starve their streams and hop onto these new encoders because they can use fewer bits, leaning on perceptual tricks to maintain a baseline of image quality that passes the sniff test for the average viewer. So instead of quality bumps, we get them using fewer bits and passing the savings on to themselves, with little meaningful upgrade in visual fidelity for the viewer. Which is why it's hard to care about a lot of this stuff when it doesn't really benefit the user.
Yep, fully agree. At least Blu-rays still exist, for now. Building a beefy NAS and collecting full Blu-ray discs lets us brute-force picture quality through sheer bitrate, at least. There are a number of other problems to think about before we even get to the encoder stage, such as many (most?) 4k movies/TV shows being mastered in 2k (aka 1080p) and then upscaled to 4k. Not to mention a lot of 2k Blu-rays are upscaled from 720p! It just goes on and on. As a whole, we're barely using the capabilities of true 4k today. Most of the UHD/4k "quality" craze is being driven by HDR, which has its own share of design and cultural problems. The more you dig into all this stuff, the worse it gets. 4k is billed as "the last resolution we'll ever need," which IMO is probably true, but they don't tell you that the 4k discs they're selling you aren't really 4k.
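The "brute force through sheer bitrate" point is easy to quantify, since file size is just bitrate times duration. A quick sketch; the 80 Mbps and 15 Mbps figures are assumptions chosen only to contrast a disc-like video bitrate with a typical streaming one:

```python
def file_size_gb(bitrate_mbps: float, duration_min: float) -> float:
    """Approximate file size (decimal GB) for a stream at a constant bitrate."""
    bits = bitrate_mbps * 1_000_000 * duration_min * 60
    return bits / 8 / 1_000_000_000

# A 2-hour film at an assumed disc-like 80 Mbps vs a stream-like 15 Mbps:
print(file_size_gb(80, 120))  # 72.0 GB
print(file_size_gb(15, 120))  # 13.5 GB
```

That gap — tens of gigabytes per film — is why "collect the discs and buy more drives" is the only way to get disc-grade bitrate, and why a NAS for a full-bitrate library needs to be beefy.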