
Chrome Jpegxl Issue Reopened

293 points | 3 months ago | issues.chromium.org
markdog123 months ago

"Yes, re-opening.".

> Given these positive signals, we would welcome contributions to integrate a performant and memory-safe JPEG XL decoder in Chromium. In order to enable it by default in Chromium we would need a commitment to long-term maintenance. With those and our usual launch criteria met, we would ship it in Chrome.

https://groups.google.com/a/chromium.org/g/blink-dev/c/WjCKc...

concinds2 months ago

Context: Mozilla has had the same stance and many devs (including Googlers) are already working on a Rust decoder which has made good progress.

bigbuppo2 months ago

LOL. Google, the "yeah, that thing we bought six months ago? We're killing it off in 30 days" company, demanding "long-term" anything.

rjh292 months ago

That conversation doesn't apply to their core products: Search, Mail, Maps, Chrome, Android. Their commitment to maintaining these services over decades has been amazing. It's everything else that sucks.

professor_v2 months ago

Mail is dropping features left and right, like gmailify. I'm pretty sure they're trying to limit the maintenance costs as much as possible.

PaulHoule2 months ago

I could almost imagine the normal search going away to be replaced by a chatbot.

lonjil2 months ago

long term support is actually being provided by google...

just a different team in a different country :D

most jxl devs are at google research in zurich, and already pledged to handle long term support

malfist2 months ago

Just like google pledges long term support for everything until the next new and shiny comes along.

crazygringo2 months ago

Dupe. From yesterday (183 points, 82 comments):

https://news.ycombinator.com/item?id=46021179

markdog122 months ago

Ah, I think I searched for "jpegxl", that's why there was no match.

wizee2 months ago

JPEG-XL provides the best migration path for image conversion from JPEG, with lossless recompression. It also supports arbitrary HDR bit depths (up to 32 bits per channel) unlike AVIF, and generally its HDR support is much better than AVIF. Other operating systems and applications were making strides towards adopting this format, but Google was up till now stubbornly holding the web back in their refusal to support JPEG-XL in favour of AVIF which they were pushing. I’m glad to hear they’re finally reconsidering. Let’s hope this leads to resources being dedicated to help build and maintain a performant and memory safe decoder (in Rust?).

homebrewer2 months ago

It's not just Google, Mozilla has no desire to introduce a barely supported massive C++ decoder for marginal gains either:

https://github.com/mozilla/standards-positions/pull/1064

avif is just better for typical web image quality, it produces better looking images and its artifacts aren't as annoying (smoothing instead of blocking and ringing around sharp edges).

You also get it for basically free because it's just an av1 key frame. Every browser needs an av1 decoder already unless it's willing to forego users who would like to be able to watch Netflix and YouTube.

lonjil2 months ago

I don't understand what you're trying to say. Mozilla said over a year ago that they would support JXL as soon as there's a fast memory safe decoder that will be supported.

Google on the other hand never expressed any desire to support JXL at all, regardless of the implementation. Only just now after the PDF Association announced that PDF would be using JXL, did they decide to support JXL on the web.

> avif is just better for typical web image quality, it produces better looking images and its artifacts aren't as annoying (smoothing instead of blocking and ringing around sharp edges).

AVIF is certainly better for the level of quality that Google wants you to use, but in reality, images on the web are much higher quality than that.

And JXL is pretty good if you want smoothing, in fact libjxl's defaults have gotten so overly smooth recently that it's considered a problem which they're in the process of fixing.

bawolff2 months ago

> I don't understand what you're trying to say. Mozilla said over a year ago that they would support JXL as soon as there's a fast memory safe decoder that will be supported.

Did they actually say that? All the statements I've seen from them have been much more guarded and vague. More of a "maybe we will think about it if that happens."

wizee2 months ago

I disagree about the image quality at typical sizes - I find JPEG-XL is generally similar or better than AVIF at any reasonable compression ratios for web images. See this for example: https://tonisagrista.com/blog/2023/jpegxl-vs-avif/

AVIF only comes out as superior at extreme compression ratios at much lower bit rates than are typically used for web images, and the images generally look like smothered messes at those extreme ratios.

kps2 months ago

Not everything in the world is passive end-of-the-line presentation. JPEG-XL is the only one that tries to be a general-purpose image format.

asadotzler2 months ago

If that's the case, let it be a feature of image editing packages that can output formats that are for the web. It's a web standard we're talking about here, not a general-purpose image format, so asking browsers to carry that big code load seems unreasonable when existing formats do most of what we need and want for the web.

bananalychee2 months ago

Even though AVIF decoding support is fairly widespread by now, it is still not ubiquitous like JPEG/PNG/GIF. So typically services will store or generate the same image in multiple formats including AVIF for bandwidth optimization and JPEG for universal client support. Browser headers help to determine compatibility, but it's still fairly complicated to implement, and users also end up having to deal with different platforms supporting different formats when they are served WebP or AVIF and want to reupload an image somewhere else that does not like those formats. As far as I can tell, JXL solves that issue for most websites since it is backwards-compatible and can be decoded into JPEG when a client does not support JXL. I would happily give up a few percent in compression efficiency to get back to a single all-purpose lossy image format.
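
To make the complexity concrete, here is a minimal sketch of that kind of format negotiation, assuming pre-generated variants on disk. The paths and variant list are hypothetical, and a real service would also respect q-values and caching, which this ignores:

    # Pick an image variant based on the client's Accept header.
    # Variant paths are hypothetical; a real deployment would do this in a CDN
    # or image proxy and also respect q-values and Vary caching.
    VARIANTS = [
        ("image/jxl", "photo.jxl"),
        ("image/avif", "photo.avif"),
        ("image/webp", "photo.webp"),
        ("image/jpeg", "photo.jpg"),  # universal fallback
    ]

    def pick_variant(accept_header: str) -> tuple[str, str]:
        """Return (mime_type, path) for the best format the client explicitly advertises."""
        accepted = {part.split(";")[0].strip() for part in accept_header.split(",")}
        for mime, path in VARIANTS:
            if mime in accepted:
                return mime, path
        return VARIANTS[-1]  # JPEG when nothing newer is advertised

    # A Chrome-style Accept header (no JXL yet), so this picks the AVIF variant:
    print(pick_variant("image/avif,image/webp,image/png,image/*;q=0.8,*/*;q=0.5"))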

hirako20002 months ago

Even Google Photos does not support avif.

It's almost as if Google had an interest in increased storage and bandwidth. Of course they don't, but as a paying Drive user I'm overcharged for the same thing.

danielheath2 months ago

The killer feature of JXL is that most websites already have a whole bunch of images in JPEG format, and converting those to JXL shrinks them by about 30% without introducing any new artifacts.
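
A rough sketch of that workflow using the libjxl reference tools (cjxl/djxl must be on PATH; flags and defaults can differ between versions, and the file names are hypothetical):

    # Losslessly recompress an existing JPEG to JXL, then verify that the
    # original JPEG bytes can be reconstructed from the JXL file.
    import filecmp
    import os
    import subprocess

    src, jxl, back = "photo.jpg", "photo.jxl", "photo.roundtrip.jpg"

    # cjxl performs lossless JPEG transcoding by default when the input is a JPEG.
    subprocess.run(["cjxl", src, jxl], check=True)

    # djxl can reconstruct the original JPEG from the recompression data.
    subprocess.run(["djxl", jxl, back], check=True)

    saving = 1 - os.path.getsize(jxl) / os.path.getsize(src)
    print(f"JXL is {saving:.1%} smaller than the original JPEG")
    print("byte-exact round trip:", filecmp.cmp(src, back, shallow=False))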

magicalhippo2 months ago

> Mozilla has no desire to introduce a barely supported massive C++ decoder for marginal gains

On a slightly related note, I wanted to have a HDR background image in Windows 11. Should be a breeze in 2025 right?

Well, Windows 11 only supports JPEG XR[1] for HDR background images. And my commonly used tools either did not support JPEG XR (GIMP, for example) or did not work correctly (ImageMagick).

So I had a look at the JPEG XR reference implementation, which was hosted on Codeplex but has been mirrored on GitHub[2]. And boy, I sure hope that isn't the code that lives in Windows 11...

Ok most of the gunk is in the encoder/decoder wrapper code, but still, for something that's supposedly still in active use by Microsoft... Though not even hosting their own copy of the reference implementation is telling enough I suppose.

[1]: https://en.wikipedia.org/wiki/JPEG_XR

[2]: https://github.com/4creators/jxrlib

infinet2 months ago

Another JPEG XR user is Zeiss. It saves both grayscale and color microscope images with JPEG XR compression in a container format. Zeiss also released a C++ library (libczi) using the reference JPEG XR implementation to read/write these images. Somehow Zeiss is moving away from JPEG XR - its newer version of microscope control software saves with zstd compression by default.

greenavocado2 months ago

"Marginal Gains"

Generation Loss – JPEG, WebP, JPEG XL, AVIF : https://www.youtube.com/watch?v=w7UDJUCMTng

edflsafoiewq2 months ago

Marginal gains over AVIF.

(Also I am highly skeptical of the importance of these generation loss tests.)

jlouis2 months ago

Very nice in video workflows, where it's common to write out image sequences to disk.

greenavocado2 months ago

Social media exists

xeeeeeeeeeeenu2 months ago

>avif is just better for typical web image quality,

What does "typical web image quality" even mean? I see lots of benchmarks with very low BPPs, like 0.5 or even lower, and that's where video-based image codecs shine.

However, I just visited CNN.com and these are the BPPs of the first 10 images my browser loaded: 1.40, 2.29, 1.88, 18.03 (PNG "CNN headlines" logo), 1.19, 2.01, 2.21, 2.32, 1.14, 2.45.

I believe people are underestimating the BPP values that are actually used on the web. I'm not saying that low-BPP images don't exist, but clearly it isn't hard to find examples of higher-quality images in the wild.
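
For reference, bits per pixel here is just file size in bits divided by pixel count; a small sketch of the measurement (requires Pillow, and the file names are placeholders):

    # Compute bits per pixel (BPP) for a few saved images.
    import os
    from PIL import Image

    def bits_per_pixel(path: str) -> float:
        with Image.open(path) as im:
            width, height = im.size
        return os.path.getsize(path) * 8 / (width * height)

    for name in ["hero.jpg", "logo.png"]:
        print(name, round(bits_per_pixel(name), 2))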

jnd-cz2 months ago

Can AVIF display 10 bit HDR with larger color gamut that any modern phone nowadays is capable of capturing?

CharlesW2 months ago

> Can AVIF display 10 bit HDR with larger color gamut that any modern phone nowadays is capable of capturing?

Sure, 12-bit too, with HDR transfer functions (PQ and HLG), wide-gamut primaries (BT.2020, P3, etc.), and high-dynamic-range metadata (ITU/CTA mastering metadata, content light level metadata).

JPEG XL matches or exceeds these capabilities on paper, but not in practice. The reality is that the world is going to support the JPEG XL capabilities that Apple supports, and probably not much more.

arccy2 months ago

if you actually read your parent comment: "typical web image quality"

twotwotwo2 months ago

Wanted to note https://issues.chromium.org/issues/40141863 on making the lossless JPEG recompression a Content-Encoding, which would let, say, a CDN deploy it in a way that's fully transparent to end users (if the user clicks Save it would save a .jpg).

(And: this is great! I think JPEG XL has a chance of being adopted with the recompression "bridge" and fast decoding options, and things like progressive decoding for its VarDCT mode are practical advantages too.)
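
A sketch of what the server side of that could look like; the encoding token and header handling here are purely illustrative, not what the linked issue actually specifies:

    # Hypothetical negotiation for JPEG-recompression-as-Content-Encoding:
    # the resource stays image/jpeg for the page, but the transferred bytes
    # are the smaller JXL recompression when the client says it can decode it.
    def response_headers(request_headers: dict[str, str]) -> dict[str, str]:
        if "jxl" in request_headers.get("Accept-Encoding", ""):
            return {
                "Content-Type": "image/jpeg",   # still a .jpg as far as the user is concerned
                "Content-Encoding": "jxl",      # illustrative token, not the real one
                "Vary": "Accept-Encoding",
            }
        return {"Content-Type": "image/jpeg"}   # serve the original JPEG bytes

    print(response_headers({"Accept-Encoding": "gzip, br, jxl"}))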

kps2 months ago

> (in Rust?)

Looks like that's the idea: https://issues.chromium.org/issues/462919304

kllrnohj2 months ago

> and generally its HDR support is much better than AVIF

Not anymore. JPEG had the best HDR support with ISO 21496-1 weirdly enough, but AVIF also just recently got that capability with 1.2 ( https://aomedia.org/blog%20posts/Libavif-Improves-Support-fo... ).

The last discussion in libjxl about this was seemingly taking the stance that it wasn't necessary since JXL has "native HDR", which completely fails to understand the problem space.

lonjil2 months ago

The JXL spec already has gainmaps...

Also, just because there's a spec for using gainmaps with JPEG doesn't mean that it works well. With only 8 bits of precision, it really sucks for HDR, gainmap or no gainmap. You just get too much banding. JXL otoh is completely immune to banding, with or without gainmaps.

kllrnohj2 months ago

> With only 8 bits of precision, it really sucks for HDR, gainmap or no gainmap. You just get too much banding.

This is simply not true. In fact, you get less banding than you do with 10-bit bt2020 PQ.

> JXL otoh is completely immune to banding

Nonsense. It has a lossy mode (which is its primary mode so to speak), so of course it has banding. Only lossless codecs can plausibly be claimed to be "immune to banding".

> The JXL spec already has gainmaps...

Ah looks like they added that sometime last year but decided to call it "JHGM" and also made almost no mention of this in the issue tracker, and didn't bother updating the previous feature requests asking for this that are still open.

spaceducks2 months ago

> Nonsense. It has a lossy mode (which is its primary mode so to speak), so of course it has banding. Only lossless codecs can plausibly be claimed to be "immune to banding".

color banding is not a result of lossy compression*, it results from not having enough precision in the color channels to represent slow gradients. VarDCT, JPEG XL's lossy mode, encodes values as 32-bit floats. in fact, image bit depth in VarDCT is just a single value that tells the decoder what bit depth it should output to, not what bit depth the image is encoded as internally. optionally, the decoder can even blue-noise dither it for you if your image wants to be displayed in a higher bit depth than your display or software supports

this is more than enough precision to prevent any color banding (assuming of course the source data that was encoded into a JXL didn't have any banding either). if you still want more precision for whatever reason, the spec just defines that the values in XYB color channels are a real number between 0 and 1, and the header supports signaling an internal depth up to 64 bit per channel

* technically color banding could result from "lossy compression" if high bit depth values are quantized to lower bit depth values, however with sophisticated compression, higher bit depths often compress better because transitions are less harsh and as such need fewer high-frequency coefficients to be represented. even in lossless images, slow gradients can be compressed better if they're high bit depth, because frequent consistent changes in pixel values can be predicted better than sudden occasional changes (like suddenly transitioning from one color band to another)
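
A small numeric illustration of the precision point, assuming a slow horizontal gradient (numbers rather than an actual image are enough to show where banding comes from):

    # A slow ramp across 4096 pixels: stored as floats every column keeps its
    # own value, but after 8-bit quantisation only 256 levels remain, so the
    # ramp collapses into roughly 16-pixel-wide bands.
    import numpy as np

    width = 4096
    gradient = np.linspace(0.0, 1.0, width)       # ideal continuous ramp
    eight_bit = np.round(gradient * 255) / 255    # quantised to 8 bits per channel

    print("distinct levels, float :", len(np.unique(gradient)))    # 4096
    print("distinct levels, 8-bit :", len(np.unique(eight_bit)))   # 256
    changes = np.nonzero(np.diff(eight_bit))[0]
    print("widest band in pixels  :", int(np.diff(changes).max())) # ~16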

12_throw_away2 months ago

> performant and memory safe decoder (in Rust?).

Isn't this exactly the case that wuffs [1] is built for? I had the vague (and, looking into it now, probably incorrect) impression that Google was going to start building all their decoders with that.

[1] https://github.com/google/wuffs

lonjil2 months ago

WUFFS only works for very simple codecs. Basically useless for anything complex enough that memory bugs would be common.

FerritMans2 months ago

Love this. I've been waiting for Google to integrate this; from my experience with AVIF and JPEG XL, JPEG XL is much more promising for the next 20 years.

masswerk2 months ago

Nice example of how a standard like PDF can even persuade/force one of the mighty to adopt a crucial bit of technology, so that it may become a common standard in its own right (i.e. "cascading standards").

jlouis2 months ago

This is welcome.

AVIF is trying to be a distribution format for the Web. JPEG XL is trying to be a complete package for working with image data. JPEG XL can replace OpenEXR in many workflows. AVIF simply cannot.

There's a lot of power in not having to convert for distribution.

Pxtl2 months ago

> Lossless JPEG recompression (byte-exact JPEG recompression, saving about 20%) for legacy images

Lossless recompression is the main interesting thing on offer here compared to other new formats... and honestly with only 20% improvement I can't say I'm super excited by this, compared to the pain of dealing with yet another new image format.

For example, ask a normal social media user how they feel about .webp and expect to get an earful. The problem is that even if your browser supports the new format, there's no guarantee that every other tool you use supports it, from the OS to every site you want to re-upload to, etc.

F3nd02 months ago

If I remember correctly, WebP was single-handedly forced into adoption by Chrome, while offering only marginal improvements over existing formats. Mozilla even worked on an improved JPEG encoder, MozJPEG, to show it could compete with WebP very well. Then came HEIF and AVIF, which, like WebP, were just repurposed video codecs.

JPEG XL is the first image format in a long while that's actually been designed for images and brings a substantial improvement to quality while also covering a wide range of uses and preserving features that video codecs don't have. It supports progressive decoding, handles very large image sizes seamlessly, allows a potentially large number of channels, is reasonably resilient against generation loss, and more. The fact that it has no major drawbacks alone gives it much more merit than WebP has ever had. Lossless recompression is in addition to all of that.

The difference is that this time around, Google has single-handedly held back the adoption of JPEG XL, while a number of other parties have expressed interest.

Dwedit2 months ago

Having a PNG go from 164.5K to 127.1K as lossless WEBP is not what I'd call "marginal". An improvement of over 20% is huge for lossless compression.

Going from lossless WEBP to lossless JXL is marginal though, and is not worth the big decode performance loss.

JyrkiAlakuijala2 months ago

When I built the WebP lossless format I kept testing design decisions against PNG. The average gain against my Internet PNG test corpus was 42 %, or 26.5 % if I optimized the PNGs with pngcrush and pngout (kzip). I had not yet come up with the ZopfliPNG ideas; those were backports of some WebP lossless ideas into gzip and PNG.

F3nd02 months ago

In the context of the parent comment, 'only 20% improvement' is not super exciting 'compared to the pain of dealing with yet another new image format'.

You raise a good point, though; WebP certainly did (and continues to do) well in some areas, but at the cost of lacking in others. Moreover, when considering a format for adoption, one should compare it with other candidates for adoption, too. And years before WebP gained widespread support in browsers, it had competition from other interesting formats like FLIF, which addressed some of its flaws, and I have to wonder how it compares to the even older JPEG 2000.

lonjil2 months ago

Since the person you replied to mentioned MozJPEG, I have to assume they meant that WebP's lossy capabilities were a marginal improvement.

halapro2 months ago

You're not being fair. WebP has been the only choice for lossy image compression with an alpha channel. Give it some credit.

F3nd02 months ago

Fair point, though not entirely true: you can run an image through lossy compression and store the result in a PNG, using tools like pngquant [1]. Likely not as efficient for many kinds of images, but totally doable.

[1] https://pngquant.org/

7jjjjjjj2 months ago

I think there's a difference here.

If I right click save and get a webp, it was probably converted from JPG. Very very few images are uploaded in webp. So getting a webp image means you've downloaded an inferior version.

JXL doesn't have this issue because conversion from JPEG is lossless. So you've still gotten the real, full-quality image.

Pxtl2 months ago

Let's be realistic - when most users are upset they got a .webp, they're not annoyed because of quality-loss, they're annoyed because they can't immediately use it in many other services & software.

_jzlw2 months ago

This is still a problem with AVIF, too. Image viewers that support the format often don't support animated AVIFs, and even GitHub still for some godforsaken reason treats .avif files in a repo/PR as binary files instead of images. I think Discord just recently started supporting AVIFs, so that's progress.

tempest_2 months ago

20% is massive for those storing those social media images though.

Pxtl2 months ago

I get that there are people who are super excited by this for very good reasons, but for those of us downstream this is just going to be a hassle.

BeFlatXIII2 months ago

> For example, ask a normal social media user how they feel about .webp and expect to get an earful.

I've seen enough software that gets petulant about not supporting WebP (to fight the Google monopoly or whatever) to understand their frustration.

OneDeuxTriSeiGo2 months ago

That's also not the only potential gain. You get 20% gain on baseline compression but you also no longer need to store variants at different sizes since JPEG-XL's progressive decode is essentially equivalent to downscaling in terms of quality.

i.e. you can also serve downscaled & thumbnail versions directly from the original image.
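
A sketch of that idea: keep only a prefix of the progressive file and decode it as the preview. This assumes a decoder that tolerates truncated input (recent djxl builds have an --allow_partial_files flag, but check your version); the prefix size and file names are arbitrary:

    # Serve a "thumbnail" as just the first bytes of a progressive JXL file,
    # then decode the truncated copy to a lower-fidelity preview image.
    import subprocess

    with open("photo.jxl", "rb") as f:
        prefix = f.read(64 * 1024)            # e.g. the first 64 KiB

    with open("photo.preview.jxl", "wb") as f:
        f.write(prefix)

    subprocess.run(
        ["djxl", "--allow_partial_files", "photo.preview.jxl", "photo.preview.png"],
        check=True,
    )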

spider-mario2 months ago

Since the recompression is lossless, you don’t need every tool you use to support it, as long as one of them is one that can do the decompression back to JPEG. This sounds a bit like complaining that you can’t upload .7z everywhere.

Pxtl2 months ago

AFAIK downconverting to jpeg is only an option for legacy jpegs that have been upconverted to jpegxl though. Many jpegxl images likely won't support downconverting if they were created as jxl from the get-go.

Basically, jpeg->jxl->jpeg is perfectly lossless conversion, but a newly-made jxl->jpeg is not, even if it doesn't use modern jxl-only features like alpha channels.

With that in mind I'd actually prefer if those were treated as separate file-formats with distinct file-extensions (backwards-compatible jpeg->jxls vs pure-jxl). The former could be trivially handled with automated tools, but the latter can't.

spaceducks2 months ago

I'm not sure if that will be an issue in practice. in any case, you need a JPEG XL decoder to perform the transition from a recompressed-JPEG-JXL to the original JPEG, so whatever tool is doing this, it can already handle native-JXL too. it could be the conversion happens on the server side and the client always sees JPEG, in which case a native JXL can also be decoded to a JPEG (or if lossless a PNG), though obviously with information loss since JPEG is a subset of JXL (to put it lightly)

spider-mario2 months ago

Well, sure, but wasn’t that the use case we were discussing?

gen2brain2 months ago

I like how even the bonus product (jpegli) is a significant improvement. I am in the process of converting my comic book collection; I save a lot of space and still use JPEG, which is universally supported.

h1fra2 months ago

WebP was a nice new format that's now widely adopted in browsers, yet it's barely supported by websites (for uploads) and other software. It's hard to see this being different.

spaceducks2 months ago

WebP is much more limiting than JPEG XL. in lossy mode WebP has forced 4:2:0 chroma subsampling, supports only 8 bit per channel colors (really only about 7.8 bits, because thanks to WebP being tv-range the values aren't in a 0-255 range but in a 16-235 range, causing even more color banding than 8 bit per channel already has), no HDR, a maximum resolution of 16383 x 16383 making it unsuitable for larger images...

JPEG XL on the other hand supports up to 4099 color channels, a bit depth up to 32 bit per channel (technically up to 64 bit, but this currently isn't used), supports HDR natively, can use splines to compress elements like strands of hair, thin tree branches or line art that are typically hard to compress with DCT, supports patches for compressing repeating image elements, supports thermal, depth and alpha channels, supports losslessly recompressing existing JPEGs saving about 20%, supports CMYK and spot colors for printing, supports layers and selection masks, supports storing raw camera sensor data in bayer patterns, etc.

WebP is just a web delivery format, JPEG XL was designed to support many use cases like web delivery, medical imaging, raw camera sensor data, authoring, multi-spectral imaging... the list goes on. if we support JPEG XL now, chances are it'll be quite a while before we need a new general purpose image format because JPEG XL covers so many current use cases and was designed to accommodate potential future use cases as well.

networked2 months ago

I didn't realize WebP was limited-RGB in addition to 4:2:0. According to RFC 9649, this is accurate. While the ITU-R Recommendation 601 on color is only a "SHOULD" in the RFC, you'd need a custom decoder to break out of limited RGB:

> The VP8 specification describes how to decode the image into Y'CbCr format. To convert to RGB, Recommendation 601 [REC601] SHOULD be used. Applications MAY use another conversion method, but visual results may differ among decoders.


claudiojulio2 months ago

[flagged]

fsflover2 months ago

This comment is of course breaking the HN Guidelines as a shallow dismissal, but the parent is right: After Google killed Ublock Origin and turned Android into a nanny OS, I have no idea why anyone would stick to anything from them. Also Firefox is better in almost every way.

albert_e2 months ago

> Chrome Jpegxl Issue Reopened

> (this is the tracking bug for this feature)

Is it just me -- or is it confusing to use the terms issue / bug / feature interchangeably?

mort962 months ago

It's not really used interchangeably: "bug" is used to mean "entry in the bug tracker database", while "feature" is used to mean what we colloquially think of as a feature of a computer program.

It's arguably a slight abuse of a bug tracking system to also track progress and discussion on features, but it's not exactly uncommon; it's just that many systems would call it an "issue" rather than a "bug".

_jzlw2 months ago

Google's internal issue tracker, Buganizer (which the Chromium Issue Tracker is based on), refers to everything as a "bug". It's confusing, yeah. You get used to it.

crazygringo2 months ago

Not really -- they're all "potential todos" that need to be tracked and prioritized in the same place.

And the difference between a bug and a feature is often in the eye of the beholder. I'll very often title a GitHub issue with "Bug/Feature Request:" since it's often debatable whether the existing behavior was by design or not, and I don't want to presume one way or the other.

So I do consider them all pretty interchangeable at the end of the day, and therefore not really confusing.

gethly2 months ago

jpg -> png -> webp -> avif

Why are we going backward?

hxtk2 months ago

JPEGXL doesn't refer to the same standard as JPEG. JPEGXL competes with AVIF as a next-generation image format. It also has some properties that make it very nice for the web, such as the fact that a truncated (e.g., because the download hasn't completed yet) JPEGXL image is also a reduced-fidelity version of the same image, which with large images gets you much faster LCP compared to AVIF, where the image remains unusable until fully downloaded.

gethly2 months ago

Seems like a very small advantage to warrant all the work needed to make it mainstream. Who would use it as an advanced image format, when a person interested in such a thing would likely already be handling WebP or AVIF?