JPEGXR updates

Based on feedback from many collaborators, we are releasing a few updates to the previously published JPEGXR software. In this post we also present an analysis comparing our JPEGXR implementation to WebP.

  • In the jxrlib Git repository, version 1.1 is now available. This release implements a slightly refined encoder quality mapping: setting the 0–100 quality value in the encoder library now translates to different encoder parameters, which should, in most cases, result in improved perceived image quality. The update also includes several community-contributed extensions to perform pixel format conversions.
  • The Photoshop plugins for Windows and Mac have also been updated. The new plugin includes the refined encoder quality implementation and improved handling of Photoshop alpha layers.

Using our 1.1 JPEGXR encoder, we performed an analysis of JPEGXR following the same methodology as the WebP vs. libjpeg study done by Google. In our analysis we use the same images as in the Google study and encode them with JPEGXR, WebP, and Photoshop’s “Save for Web” JPEG encoder. For each encoder we sweep through the quality values* and plot the rate-distortion curves (bits per pixel vs. SSIM)**.
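The sweep can be sketched as a small driver loop. The `encode_and_score` callback here is a hypothetical placeholder for whatever does the actual work (e.g. shelling out to jxrlib's JxrEncApp or Google's cwebp and scoring the decoded result with an SSIM tool); it returns `(bits_per_pixel, ssim)` for one image at one quality setting.

```python
def rd_curve(encode_and_score, qualities=range(0, 101, 10)):
    """Return a rate-distortion curve as [(quality, bpp, ssim), ...].

    encode_and_score(q) -> (bits_per_pixel, ssim) is a hypothetical
    callback; see the lead-in for what it would wrap in practice.
    """
    return [(q, *encode_and_score(q)) for q in sorted(qualities)]
```

Plotting the resulting `(bpp, ssim)` pairs for each codec gives curves like the figures below.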


Figure 1: Lena


Figure 2: kodim19.png


Figure 3: RGB_OR_1200x1200_061.png

In these graphs there are several interesting things to point out.

  • JPEGXR outperforms WebP in the high end of the quality range and equals or marginally beats WebP in the low end of the quality range.
  • Both WebP and JPEGXR handily outperform JPEG at low-quality settings.
  • WebP does not outperform JPEG at high-quality settings (Photoshop quality slider set to 80 or above), while JPEGXR does.
  • In each of the graphs, I have labeled the points that correspond to Photoshop quality “50” and quality “0” (on a scale of 0-100). Much of the Google analysis reports on quality below “50.” I don’t know how many photos are saved out of tools like Photoshop at such low quality, but I would guess not many. In the above-“50” portion of the quality range, our JPEGXR implementation clearly outperforms WebP.

Since the last blog post, we’ve been asked several times how JPEGXR compares to WebP, so we are providing the above rate-distortion curves. Of course, this is only one factor, albeit a very important one, to consider. As pointed out in the last post, JPEGXR has many other interesting features, such as continuous quality from lossless to lossy, alpha channels, HDR, and compressed-domain operations. And JPEGXR is an international standard; many of these features are not currently available in WebP.

Matt Uyttendaele

* sweep from 0 to 100 in steps of 10

** pixel-count = width*height*3 (from Google’s scripts), SSIM is the average of red, green, blue
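A minimal sketch of the two metrics described in the footnotes. For simplicity it uses a single-window (global) SSIM rather than the usual windowed computation, so it is illustrative only; a real analysis would use a proper SSIM implementation, as the Google scripts do.

```python
def bits_per_pixel(file_size_bytes, width, height):
    # pixel-count = width * height * 3, as in Google's scripts
    return file_size_bytes * 8 / (width * height * 3)

def global_ssim(x, y, L=255, k1=0.01, k2=0.03):
    """Single-window SSIM over one channel (lists of values in [0, 255])."""
    c1, c2 = (k1 * L) ** 2, (k2 * L) ** 2
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return ((2 * mx * my + c1) * (2 * cov + c2)) / \
           ((mx * mx + my * my + c1) * (vx + vy + c2))

def mean_channel_ssim(rgb_ref, rgb_test):
    # SSIM is the average over the red, green, and blue channels
    return sum(global_ssim(r, t) for r, t in zip(rgb_ref, rgb_test)) / 3
```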


13 Responses to JPEGXR updates

  1. Royi Avital says:

    Well Chrome, Safari and Firefox are Open Source projects.
    What you should do is incorporate JPEGXR support into them.

    Once you do that, and have JPEGXR support as part of HTML5, people will use it.

  2. oX Triangle says:

    JPEGXR has no animation, no alpha, and no hardware acceleration, right?

  3. hdview says:

    @oX Triangle
    I encourage you to read the previous post for the JPEGXR feature set:

    alpha – there is good alpha support
    animation – no
    hardware acceleration – no new compression standard (see HEVC) is going to have hardware support out of the gate. The first step is to get browser support; hardware support will follow.

    • oX Triangle says:

      WebP has a free license model
      layers are being added soon
      WebP uses VP8 (important for hardware acceleration)
      WebP is better for draws
      Mozilla wants a PNG/GIF replacement (animations)

      I think WebP wins

      • hdview says:

        @oX Triangle

        RE license: JXR is free

        RE layers/animations: I’d rather focus on the quality of compression – layers and animations can pretty easily be done (in a more versatile way) in another layer of the stack, for example in JavaScript, WebGL, or the video tag

        RE “soon” (hardware/layers/animations): anything is possible for either technology “someday”

        RE better for draws: I don’t know what this means

    • battlebottle says:

      Regarding the lack of animation support, a few weeks back I threw together a concept application implementing an “animated JPEG XR”. It would be quite simple to implement on top of the existing standard, and would degrade gracefully on older decoders.
      Food for thought?

      The same method could be used to add layer support: simply have a bit in the header that describes how the frames in the JXR file should be interpreted by the decoder, either as animation or as layers. That could potentially be a useful feature too, maybe for an advanced profile.
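      The commenter's idea could be sketched as a tiny extension header. This is purely hypothetical and not part of the JPEG XR standard: one byte telling the decoder how to interpret multiple frames, plus a frame count.

```python
import struct

# Hypothetical mode flags -- NOT part of the JPEG XR standard
FRAMES_AS_ANIMATION = 0
FRAMES_AS_LAYERS = 1

def pack_frames_header(mode, frame_count):
    # 1-byte mode flag + 2-byte big-endian frame count
    return struct.pack(">BH", mode, frame_count)

def unpack_frames_header(blob):
    mode, frame_count = struct.unpack(">BH", blob[:3])
    return mode, frame_count
```

      An old decoder that ignores the extension would simply show the first frame, which is the graceful degradation the comment describes.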

  4. Ned says:

    Thanks for this followup.

    > don’t know how many photos are saved out of tools like Photoshop
    > at such low quality [quality 0-50], but I would guess not many.

    As a web developer, I can tell you that *all* web sites that we are making these days use JPEG and WebP compressed images in the 25-45 range (same 0-100 scale used by Photoshop or ImageMagick, etc), because we need the content to look good on high DPI (“Retina™”) screens. The current best practice to achieve that is to double the height and width of the image while drastically over-compressing it, and then making the browser display it at half the intrinsic size. So even if the output image has horrendous compression artifacts when displayed at normal size, when squished in the browser it looks good, and sometimes great.
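    The arithmetic behind this trick can be sketched with a hypothetical helper: what matters is the bits spent per pixel actually displayed, not per pixel stored. A 2x asset shown at half size has 4x the stored pixels per displayed pixel, so at the same encoder setting it would spend 4x the bits per displayed pixel; heavy over-compression wins that budget back while the downscale hides the artifacts.

```python
def bits_per_displayed_pixel(file_bytes, display_w, display_h):
    """Bits spent per pixel actually shown on screen (CSS pixels)."""
    return file_bytes * 8 / (display_w * display_h)
```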

    In this use-case of highly compressed images, WebP really shines. In my tests with the Photoshop plugin, I couldn’t achieve such a drastic level of compression in JPEGXR, even when I set it to the minimum value. The tests I did were with version 1.0 of the plugin. Is version 1.1 available for download somewhere?


  5. hdview says:

    Hi Ned. Thanks for your post. I hadn’t seen the over-compression trick before. I thought that things were heading towards having multiple resolutions available on the server and using something like picturefill to serve the appropriate resolution.

    Oops, the 1.1 update of the PS plugin got hung up in the posting process, I will get it out shortly.

    But 1.1 won’t address your < 0 quality request. You'll notice in the curves above that we tried to tune our quality curve to match "save for web" curve in Photoshop. So for a given quality slider setting, "save for web" and JXR will give the same perceptual quality but the JXR file will be smaller. We wanted to make the slider intuitive to Photoshop users. Given that, I would expect that you also cannot achieve the super low quality that you want with JPEGs out of Photoshop?

    There is a workaround, however: set the quality to '0', then click the "Advanced Mode" button; you can set the advanced Y, U, V quantization sliders below what '0' would set them to.

  6. I’m looking to run some of these tests myself, and the naive run of ImageMagick (using jxrlib) to convert ~100,000 JPEGs showed JXR to be worse than JPEG, at an across-the-board quality level of 85.

    Are there any techniques you’d point out to:
    1) Tune the JXR encoding (using ImageMagick, or another Linux-supporting tool)
    2) Calculate the right quality level to achieve the same SSIM

    I’d like to extend the comparison further, but am not sure what’s the best way to do that beyond equal quality bars.

    • hdview says:

      Unfortunately there is no global ‘quality’ parameter that applies across all imaging tools, libraries, and codecs. A few suggestions:
      – For several images, plot graphs similar to the ones I’ve done above for the ImageMagick JPEG encoder and for jxrlib. This will give you an idea of how each library’s “quality” maps to SSIM.
      – Once you have an approximate mapping of “quality” to SSIM for each library, you can adjust the settings for each codec accordingly.
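      The second step can be sketched by inverting a measured quality→SSIM curve with linear interpolation. The `(quality, ssim)` sample points below are illustrative placeholders, not measurements.

```python
def quality_for_ssim(samples, target):
    """Estimate the quality setting that reaches a target SSIM.

    samples: list of measured (quality, ssim) pairs for one codec,
    with SSIM increasing with quality. Linear interpolation between
    the two bracketing sample points.
    """
    samples = sorted(samples)
    for (q0, s0), (q1, s1) in zip(samples, samples[1:]):
        if s0 <= target <= s1:
            return q0 + (q1 - q0) * (target - s0) / (s1 - s0)
    raise ValueError("target SSIM outside measured range")
```

      Running each codec at its own estimated quality for the same target SSIM gives an equal-quality bar for comparing file sizes.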

  7. Ben says:

    WebP is probably struggling on the high-end metrics due to its poorly chosen limited-range color transform (I assume you tested RGB to RGB) and the fact that it always uses chroma subsampling (4:2:0; Photoshop toggles this off for qualities > 50).

    On the other hand (and this is very subjective), I find WebP’s artifacts at low quality less objectionable than JXR’s, even if their SSIM metrics are close. This might be tweakable encoder-side, but JXR still shows a “JPEG2000-like characteristic” which I dislike a lot.

    JXR lossless compression performance is pretty poor; WebP’s thoroughly beats it, especially on synthetic images such as screen captures. This is important for the alpha channel, where JXR will be at a disadvantage in most cases. Of course, the tradeoff is that the decoder is more complex (WebP uses a completely separate lossless algorithm, unlike JXR).

    JXR decoding speed is too slow: slower than WebP and much slower than JPEG. I don’t think it should be, in principle; maybe it’s an implementation issue in jxrlib. At least encoding is fast enough (though still much slower than JPEG).

    At the current time I think both formats are somewhat flawed: the compression improvements over JPEG are “meh” compared with the really good modern codecs, JXR lacks animation and good lossless compression, WebP lacks in high-end quality and its file format and feature set are kind of duct taped together. The only reason they’re taken seriously is because of the big companies behind giving them a large installed base. Still either one would be an improvement over the current situation of JPEG or PNG, so I’ll take what I can get.
