this post was submitted on 06 Aug 2024
340 points (92.3% liked)

[–] ProdigalFrog@slrpnk.net 406 points 3 months ago (11 children)

TL;DW: JPEG is getting long in the tooth, which prompted the creation of JPEG XL, a fairly future-proof new compression standard that can compress images to the same file size or smaller than regular JPEG while delivering massively higher quality.

However, JPEG XL support was removed from Chromium-based browsers in favor of AVIF, a standalone image format derived from the AV1 video codec that is decidedly not future-proof: it has some hard-coded limitations and is missing some very nice-to-have features that JPEG XL offers, such as progressive image loading and lower hardware requirements. The result is that JPEG XL adoption will be severely hamstrung by Google’s decision, which is ultimately pretty lame.

[–] Hellinabucket@lemmy.world 238 points 3 months ago (1 children)

This is why Google keeps getting caught up in monopoly lawsuits.

[–] altima_neo@lemmy.zip 148 points 3 months ago (3 children)

Modern Google is becoming the Microsoft of the 90s

[–] Telorand@reddthat.com 59 points 3 months ago (1 children)

And they'll make eleventy bajillion dollars in the meantime, plenty of money to pay their inevitable punitive "fines."

[–] TeoTwawki@lemmy.world 28 points 3 months ago* (last edited 3 months ago)

Hell, old MS's penalty was giving free licenses in markets it never had a grip on, so its "lock 'em in!" model meant the "penalty" actually benefited them!

[–] FartsWithAnAccent@fedia.io 14 points 3 months ago

Which is funny and sad, because Microsoft is also the Microsoft of the 90s.

[–] nutsack@lemmy.world 13 points 3 months ago

Microsoft is still like this

[–] Ghostalmedia@lemmy.world 172 points 3 months ago (1 children)

I tried JPEG XL and it didn’t even make my files extra large. It actually made them SMALLER.

False advertising.

[–] pastermil@sh.itjust.works 25 points 3 months ago (1 children)

I think you took the wrong enlargement pill.

[–] Sabata11792@ani.social 6 points 3 months ago

Just set the pills to wumbo.

[–] reddig33@lemmy.world 69 points 3 months ago (9 children)

JPEG XL isn’t backwards compatible with existing JPEG renderers. If it were, it’d be a winner. We already have PNG and JPEG, and now we’ve got people using the annoying WebP. Adding another format that requires new decoder support isn’t going to help.

[–] MimicJar@lemmy.world 63 points 3 months ago

"the annoying webp" AFAIK is the same problem as JPEG XL, apps just didn't implement it.

It is supported in browsers, which is good, but not in third party apps. AVIF or whatever is going to have the same problem.

[–] ProdigalFrog@slrpnk.net 42 points 3 months ago* (last edited 3 months ago) (3 children)

JPEG XL isn’t backwards compatible with existing JPEG renderers. If it were, it’d be a winner.

According to the video, and this article, JPEG XL is backwards compatible with JPEG.

But I'm not sure that's all that necessary. JPEG XL was designed to be a full, long-term replacement for JPEG. Old JPEG's compression is very lossy, while JPEG XL, at the same computational cost, speed, and file size, outclasses it entirely. PNG is lossless, and thus not comparable, since its file sizes are so much larger.

JPEG XL, at least from what I'm seeing, does appear to be the best full replacement for JPEG (and it's not like they can't co-exist).

[–] reddig33@lemmy.world 27 points 3 months ago (2 children)

It’s only backwards compatible in that it can re-encode existing JPEG content into the newer format without any image loss. Existing browsers and apps can’t render JPEG XL without adding a new decoder.
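(For anyone curious what that lossless re-encoding looks like in practice, here's a rough sketch driving the libjxl reference tools cjxl/djxl from Python. It assumes those tools are installed and on your PATH; the file names are just placeholders.)

```python
import subprocess

# Losslessly wrap an existing JPEG into a JPEG XL container.
# For JPEG input, cjxl stores reconstruction data instead of re-encoding the pixels.
subprocess.run(["cjxl", "photo.jpg", "photo.jxl"], check=True)

# Rebuild the original JPEG from the .jxl for anything that only speaks JPEG.
subprocess.run(["djxl", "photo.jxl", "photo_restored.jpg"], check=True)

# The restored file should be byte-identical to the original.
with open("photo.jpg", "rb") as a, open("photo_restored.jpg", "rb") as b:
    print("bit-exact:", a.read() == b.read())
```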

[–] ProdigalFrog@slrpnk.net 16 points 3 months ago (7 children)

Existing browsers and apps can’t render JPEG XL without adding a new decoder.

Why is that a negative?

[–] seaQueue@lemmy.world 6 points 3 months ago* (last edited 3 months ago) (2 children)

Legacy client support. Old devices running old browser code can't support a new format without software updates, and that's not always possible. Decoding JXL on a 15-year-old device that can't be upgraded isn't good UX. Sure, you can probably work around that with JavaScript decoding for many of them, but it'll be slow and processor-intensive. Imagine decoding JXL on a low-power ARM device, or something like a Celeron from the early 2010s, and you'll get the idea: it will not be anywhere near as fast as good old JPEG.

[–] RamblingPanda@lemmynsfw.com 10 points 3 months ago (1 children)

But how is that different from any other new format? WebP was no different.

[–] seaQueue@lemmy.world 4 points 3 months ago* (last edited 3 months ago)

Google rammed WebP through because it saved them money on bandwidth (and time during page loading) and because they controlled the standard. They're doing the same thing with JPEG now that they control jpegli. Jpegli directly lifts the majority of its features from JPEG XL, and Google controls that standard.

[–] ProdigalFrog@slrpnk.net 3 points 3 months ago* (last edited 3 months ago) (1 children)

That's a good argument, and as a fan of permacomputing and reducing e-waste, I must admit I'm fairly swayed by it.

However, are you sure JPEG XL decoding/encoding is so much more computationally heavy than JPEG that it would struggle on older hardware? This measurement seems to show that it's quite comparable to standard JPEG, unless I'm misunderstanding something (and I very well might be).

That wouldn't help people stuck on an outdated browser (older, unsupported phones?), but for those who can change their OS, like older PCs, a modern Linux distro with an updated browser would still allow that old hardware to decode JPEG XL fairly well, I would hope.

[–] seaQueue@lemmy.world 3 points 3 months ago* (last edited 3 months ago)

Optimized JPEG XL decoding can be as fast as JPEG, but only if the browser supports the format natively. If you're trying to bolt JXL decoding onto a legacy browser, your options are JavaScript and WASM decoding. WASM can be about as fast, but browsers released before roughly 2020 won't support it and have to fall back to JavaScript. Decoding JXL in JavaScript is, let's just say, not fast, and it's not guaranteed to work on legacy browsers and older machines. Additionally, any of these bolt-on mechanisms require shipping the decoder package on page load, so unless you can load it from the user's cache, you pay the bandwidth/time price of downloading and initializing the decoder before images even start to render on the page. Ultimately, bolting on support for the new format often isn't worth the implementation cost, so sites usually implement fallback to the older format as well.
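A rough sketch of what that fallback can look like on the server side (my own illustration, not from the video; the MIME check and file names are assumptions):

```python
def pick_image_variant(accept_header: str, base_name: str) -> str:
    """Choose which image file to serve based on the client's Accept header."""
    # Clients that can decode JPEG XL are expected to advertise image/jxl.
    if "image/jxl" in accept_header:
        return f"{base_name}.jxl"
    # Everyone else falls back to the plain old JPEG.
    return f"{base_name}.jpg"


# An older browser that never mentions image/jxl gets the JPEG...
print(pick_image_variant("image/avif,image/webp,image/*", "photo"))  # photo.jpg
# ...while a client that advertises JPEG XL support gets the smaller .jxl.
print(pick_image_variant("image/jxl,image/avif,image/*", "photo"))   # photo.jxl
```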

WebP succeeded because Google rammed the format through, and they did that because they controlled the standard. You'll see the same thing happen with jpegli next; it lifts the majority of its feature set from JPEG XL, and Google controls the standard.

[–] reddig33@lemmy.world 3 points 3 months ago (2 children)
[–] ProdigalFrog@slrpnk.net 16 points 3 months ago* (last edited 3 months ago) (2 children)

The video actually references that comic at the end.

But I don't see how that applies in your example, since both JPEG and JPEG XL existing in parallel doesn't really have any downsides; it'd just be nice to have the newer option available. The thrust of the video is that Google is kneecapping JPEG XL in favor of their own format, which is not backwards compatible with JPEG in any capacity. So we're getting a brand new format either way, but a monopoly is forcing the worse one.

[–] JackbyDev@programming.dev 3 points 3 months ago

They're confusing backwards and forwards compatibility. The new file format is backwards compatible, but old renderers are not forwards compatible with the new format.

[–] FaceDeer@fedia.io 36 points 3 months ago (1 children)

My understanding is that WebP isn't actually all that bad from a technical perspective; it was just annoying because it started getting used widely on the web before all the various tools caught up and implemented support for it.

[–] ArchRecord@lemm.ee 21 points 3 months ago

I just wish more software would support WebP files. I remember Reddit converting every image to WebP to save on space and bandwidth (smart, imo) but not allowing you to directly upload WebP files in posts because it wasn't a supported file format.

If WebP were just more standardized, I'd love to use it more. It would certainly save me a ton of storage space.
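For local storage at least, batch conversion is easy enough. Here's a minimal sketch using Pillow (assuming a Pillow build with WebP support; the folder and quality setting are just placeholders):

```python
from pathlib import Path
from PIL import Image  # needs a Pillow build compiled with WebP support

src_dir = Path("photos")  # hypothetical folder of images to convert
for src in src_dir.iterdir():
    if src.suffix.lower() not in {".png", ".jpg", ".jpeg"}:
        continue
    dst = src.with_suffix(".webp")
    with Image.open(src) as im:
        if im.mode == "P":
            im = im.convert("RGBA")  # WebP has no palette mode
        # quality=80 is lossy but visually close; method=6 trades CPU time for smaller files.
        im.save(dst, "WEBP", quality=80, method=6)
    print(f"{src.name}: {src.stat().st_size} -> {dst.stat().st_size} bytes")
```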

[–] KairuByte@lemmy.dbzer0.com 17 points 3 months ago (2 children)

So… your solution is to stick with extremely dated and objectively bad file formats? You using Windows 95?

[–] NateNate60@lemmy.world 3 points 3 months ago (1 children)
[–] KairuByte@lemmy.dbzer0.com 19 points 3 months ago (10 children)

For what it is? Nothing.

Compared to something like JPEG XL? It is hands down worse in virtually all metrics.

[–] NateNate60@lemmy.world 15 points 3 months ago (4 children)

I think this might sound like a weird thing to say, but technical superiority isn't enough to make a convincing argument for adoption. There are plenty of things that are undeniably superior, and yet the case for their adoption is weak, mostly (but not solely) because they would be difficult to adopt.

As an example, the French Republican Calendar (and the reformed calendar with 13 months) are both evidently superior to the Gregorian Calendar in terms of regularity but there is no case to argue for their adoption when the Gregorian calendar works well enough.

Another example—metric time. Proposed as part of the metric system around the time it was first gaining ground, 100 seconds in a minute and 100 minutes in an hour definitely make more sense than 60, but it would be ridiculous to say that we should devote resources to switching to it.

Final example—arithmetic in a dozenal (base-twelve) system is undeniably better than in decimal, but it would definitely not be worth the hassle to switch.

For similar reasons, I don't find the case for JPEG XL compelling. Yes, it's better in every metric, but when the difference comes down to a measly one or two megabytes compared to PNG and WEBP, most people really just don't care enough. That isn't to say that I think it's worthless, and I do think there are valid use cases, but I doubt it will unseat PNG on the Internet.

[–] KairuByte@lemmy.dbzer0.com 10 points 3 months ago (1 children)

I’m not under the impression it would unseat PNG anytime soon, but “we have a current standard” isn’t a good argument against it. As images get higher and higher quality, total image sizes are going to keep growing, and we’re going to hit a point where it matters.

This sounds so much like the misquoted “640K ought to be enough for anybody” that I honestly can’t take it seriously. There’s a reason new algorithms, formats, and hardware are developed and released: they improve upon what came before and generally make things better.

[–] NateNate60@lemmy.world 5 points 3 months ago (4 children)

My argument is not "we have a current standard", it's "people don't give enough of a shit to change".

[–] AnUnusualRelic@lemmy.world 7 points 3 months ago (1 children)

You're thinking in terms of the individual user with a handful of files.

When you look at it from a server point of view with tens of terabytes of images, or as a data center, the picture is very different.

Shaving 5 or 10% off of files is a huge deal. And that's not even taking into account the huge leap in quality.
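Some back-of-the-envelope numbers (my own hypothetical figures, not from the thread):

```python
# Hypothetical image-heavy service: 50 TB of image traffic per day.
daily_image_traffic_tb = 50
savings_fraction = 0.10  # a conservative 10% reduction from a better format

saved_per_day = daily_image_traffic_tb * savings_fraction
saved_per_year = saved_per_day * 365

print(f"Saved per day:  {saved_per_day:.1f} TB")   # 5.0 TB
print(f"Saved per year: {saved_per_year:.0f} TB")  # 1825 TB, before any quality gains
```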

[–] TheRealKuni@lemmy.world 4 points 3 months ago (2 children)

Compared to something like JPEG XL? It is hands down worse in virtually all metrics.

The only thing I can think of is that PNG is inherently lossless, whereas JPEG XL can be either lossless or lossy.

[–] hedgehog@ttrpg.network 4 points 3 months ago

I haven’t dug into the test data or methodology myself, but I recently read a discussion thread (on Reddit - /r/jpegxl/comments/l9ta2u/how_does_lossless_jpegxl_compared_to_png): across a 200+ image test suite, lossless PNG compression generated files that were 162% the size of those losslessly compressed with JPEG XL.

However, I also know that some tools compress PNG poorly, and there's no certainty those weren't used.
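If you want to sanity-check that ratio on your own images, something like this works, assuming the cjxl reference encoder is installed (the folder name is a placeholder):

```python
from pathlib import Path
import subprocess

png_total = jxl_total = 0
for png in Path("test_images").glob("*.png"):
    jxl = png.with_suffix(".jxl")
    # -d 0 asks cjxl for mathematically lossless encoding.
    subprocess.run(["cjxl", str(png), str(jxl), "-d", "0"], check=True)
    png_total += png.stat().st_size
    jxl_total += jxl.stat().st_size

print(f"PNG total: {png_total} bytes, lossless JXL total: {jxl_total} bytes")
print(f"PNG is {100 * png_total / jxl_total:.0f}% the size of the JXL files")
```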

[–] cygnus@lemmy.ca 14 points 3 months ago (1 children)

Forgive my ignorance, but isn't this like complaining that a PlayStation 2 can't play PS5 games?

[–] Reverendender@sh.itjust.works 11 points 3 months ago (1 children)

All the cool kids use .HEIF anyway

[–] Imgonnatrythis@sh.itjust.works 10 points 3 months ago

I use jpeg 2000

[–] AnUnusualRelic@lemmy.world 9 points 3 months ago

You can't add new and better stuff while staying compatible with the old stuff. Especially not when your goal is compact files (or you'd just embed the old format).

[–] southsamurai@sh.itjust.works 9 points 3 months ago (1 children)

Isn't that the same as other newer formats though?

There's always something new, and if the new thing is better, adding/switching to it is the better move.

Or am I missing something about the other formats like webp?

[–] reddig33@lemmy.world 5 points 3 months ago (2 children)

You have to offer something compelling for everyone. Just coming out with yet another new standard™ isn’t enough. As pointed out earlier, we already have:

  • JPEG
  • PNG
  • WebP
  • HEIC

What’s the point of adding another encoder/decoder to the table when PNG and JPEG are still “good enough”?

[–] pennomi@lemmy.world 13 points 3 months ago

PNG and JPEG aren’t good enough, to be honest. If you run a content-heavy site, you can see something like a 30-70% decrease in bandwidth usage by using WebP.

[–] dezmd@lemmy.world 13 points 3 months ago (5 children)

Look, it's all actually about re-encumbering image file formats back into corporate-controlled, patented formats. If we would collectively just spend the time, money, and development resources expanding and improving PNG and GIF, formats that are no longer patent-encumbered, we'd all live happily ever after.

[–] fmstrat@lemmy.nowsci.com 7 points 3 months ago

Why wasn't it included? Bias from the influence of AVIF's creators. It's a good story.

[–] seaQueue@lemmy.world 5 points 3 months ago

Google's handling of JXL makes a lot more sense after the jpegli announcement. It's apparent now that they declined to support JXL in favor of cloning many of its features in a format they control.

[–] Moah@lemmy.blahaj.zone 3 points 3 months ago (2 children)

Why wasn't PNG enough to replace jpeg?

[–] ProdigalFrog@slrpnk.net 19 points 3 months ago

PNG is a lossless format, and hence results in fairly large file sizes compared to lossy formats, so they're solving different issues.

JPEG XL is capable of being either lossy or lossless, so it sorta replaces both JPEG and PNG.
