Closed Bug 680385 Opened 13 years ago Closed 9 years ago

Firefox does not show arithmetic-coded JPEGs

Category: Core :: Graphics: ImageLib
Type: defect    Priority: Not set    Severity: normal
Status: RESOLVED WONTFIX
Reporter: mail2jmsmith    Assignee: Unassigned

User Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.18) Gecko/20110614 Firefox/3.6.18
Build ID: 20110614230723

Steps to reproduce:

Opened an arithmetic coded jpeg


Actual results:

Nothing was shown.


Expected results:

As the arithmetic coding patents for JPEG have now expired, and most web page content is images, users are waiting 6 to 10% longer to see web pages for no reason.

Firefox could grab this opportunity to become the fastest web browser on earth by supporting arithmetic-coded JPEGs and advertising the capability to servers by sending the HTTP header:

Accept-Encoding: gzip,deflate,jpeg

After all, we did not have to wait 10 years to start using gzip compression.
Component: General → ImageLib
Product: Firefox → Core
QA Contact: general → imagelib
Version: 3.6 Branch → Trunk
Could you please attach a reference image?
Here's an example:

http://filmicgames.com/Images/Patents/bedroom_arithmetic.jpg

If you are interested in the background on the expired patents, you can read about it here:

http://filmicgames.com/archives/778
Component: ImageLib → General
Product: Core → Firefox
Version: Trunk → 1.0 Branch
Component: General → ImageLib
OS: Windows XP → All
Product: Firefox → Core
Hardware: x86 → All
Version: 1.0 Branch → Trunk
Assignee: nobody → netzen
Thanks for the info and reference image!
Status: UNCONFIRMED → NEW
Ever confirmed: true
joe: Is this fair game to do? <link to this ticket>
<joe> bbondy: yeah; your best bet will be to implement it in libjpeg-turbo though
It seems that arithmetic coding was added to libjpeg-turbo as of 1.1,
so we may just need to upgrade to that version to add support.

Reference: https://bugzilla.redhat.com/show_bug.cgi?id=639672
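For reference, decoding arithmetic-coded files needs no new API on the libjpeg side: a build that has the arithmetic decoder compiled in handles them through the usual call sequence. A minimal sketch, assuming such a build (decode_jpeg_file is an illustrative name, and error handling is left to the default exit-on-error manager):

/* Sketch: the same libjpeg call sequence decodes Huffman- and
 * arithmetic-coded JPEGs when the library is built with the arithmetic
 * decoder (D_ARITH_CODING_SUPPORTED); no extra flags are needed. */
#include <stdio.h>
#include <stdlib.h>
#include <jpeglib.h>

static int decode_jpeg_file(const char *path)
{
    struct jpeg_decompress_struct cinfo;
    struct jpeg_error_mgr jerr;
    FILE *fp = fopen(path, "rb");
    if (!fp)
        return -1;

    cinfo.err = jpeg_std_error(&jerr);
    jpeg_create_decompress(&cinfo);
    jpeg_stdio_src(&cinfo, fp);

    jpeg_read_header(&cinfo, TRUE);
    /* After the header is read, cinfo.arith_code reports whether the file
     * uses arithmetic entropy coding (SOF9/SOF10) or Huffman coding. */
    fprintf(stderr, "%s: %s entropy coding\n", path,
            cinfo.arith_code ? "arithmetic" : "Huffman");

    jpeg_start_decompress(&cinfo);
    {
        size_t stride = cinfo.output_width * cinfo.output_components;
        JSAMPROW row = malloc(stride);
        while (cinfo.output_scanline < cinfo.output_height)
            jpeg_read_scanlines(&cinfo, &row, 1);  /* pixels discarded here */
        free(row);
    }
    jpeg_finish_decompress(&cinfo);
    jpeg_destroy_decompress(&cinfo);
    fclose(fp);
    return 0;
}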
This will already be fixed by Bug 650899.  I'll test with the reference image in this ticket though, thanks.
Status: NEW → RESOLVED
Closed: 13 years ago
Resolution: --- → DUPLICATE
There is some question of whether we want to support arithmetic coded JPEGs at all. Doing so means we've created a fragmented market, since we'll load images that no other browser does. To do that, there had better be a pretty good reason, and I don't know whether JPEG with arithmetic coding is that good reason.
Status: RESOLVED → UNCONFIRMED
Ever confirmed: false
Resolution: DUPLICATE → ---
Status: UNCONFIRMED → RESOLVED
Closed: 13 years ago
Resolution: --- → DUPLICATE
Apparently Chromium already supports arithmetic-coded JPEGs, if comment 5 on the link I provided is correct (cf. http://filmicgames.com/archives/778#comment-5404). So the market is already fragmented.
I don't think this is a duplicate of bug 650899.  See bug 650899 comment 12.  (Unless those huffman decoder changes I cherrypicked also include arithmetic decoding?  "arith" doesn't appear anywhere in the patch, which suggests to me that they don't.  But maybe that's wrong!)
Status: RESOLVED → REOPENED
Ever confirmed: true
Resolution: DUPLICATE → ---
Assignee: netzen → nobody
Does anyone have good information on the gains that arithmetic coding brings?

This post suggests they're pretty small:
http://cbloomrants.blogspot.com/2011/01/01-10-11-perceptual-metrics-warmup-jpeg.html
What is available in the literature indicates that arithmetic coding yields, on average, a 7-10% size reduction compared to optimized JPEGs.

For instance, in this paper:

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.62.3005&rep=rep1&type=pdf

you can read:

"One common technique is the arithmetic coding option proposed by JPEG [1], as most JPEG images are encoded using Huffman coding. A reduction of file size of about 7-10% can be achieved for lossless rate optimization."
I discovered this bug when GIMP switched the default for saved JPGs to use arithmetic coding in development builds and my JPGs stopped working in Firefox. It's switched back now, but this may come up more often as other programs start to use arithmetic coding by default -- especially if Chrome and other browsers support it.
I don't believe any other browsers support it. I just tried Chrome and Safari and neither did.
Chromium here (on Debian sid) shows arithmetic coded images just fine. I think chrome/chromium uses the system library, so it probably varies by OS.
(In reply to Jeff Muizelaar [:jrmuizel] from comment #14)
> I don't believe any other browsers support it. I just tried Chrome and
> Safari and neither did.

Indeed, Chrome does not support it, there are no plans to support it, and the related code files that come from upstream libjpeg-turbo are removed from the Chrome build.
(In reply to noel gordon from comment #16)
> Indeed, Chrome does not support it, there are no plans to support it, and
> the related code files that come from upstream libjpeg-turbo are removed from
> the Chrome build.

That seems definitive.

I'm resolving this bug again for now. (Please don't read too much into "WONTFIX".) Before reopening, we'd need at least two browser vendors to commit to supporting arithmetic-coded JPEGs. Mozilla could be one of them, but we won't implement this without evidence that it's going to be interoperable in the future.

For now, the right place to go to move forward with this is on the standards mailing lists, not in this bug.
Status: REOPENED → RESOLVED
Closed: 13 years ago → 9 years ago
Resolution: --- → WONTFIX
I think it is a bit strange for the Mozilla team to introduce arithmetic JPEG encoding in mozjpeg [1] and reject support for it in the browser. As WebP (bug #856375) and FLIF (bug #1240692) are also rejected, what more progressive format (better than JPEG) does the Mozilla team suggest migrating to?

[1] https://github.com/mozilla/mozjpeg/blob/5198654f739552ed24c7f014574d1e74ee9ef8ac/usage.txt#L167
Sorry, your information isn't entirely correct (which you would have discovered on your own if you had followed the very links you provided a bit deeper...)

(In reply to Dmitry from comment #18)
> I think it is a bit strange for the Mozilla team to introduce arithmetic JPEG
> encoding in mozjpeg [1] and reject support for it in the browser.

(1) Mozilla never "introduced" arithmetic JPEG encoding in mozjpeg. If you follow "git blame" on the line that you linked to, you'll see that support dates back to https://github.com/libjpeg-turbo/libjpeg-turbo/commit/19e6975e90027db025c0b7264a3efdd466275b47 -- a 2010 change in the original "libjpeg-turbo" library, which mozjpeg seems to have been forked from at some point.

(2) Comment 7 and Comment 17 (which is where this is WONTFIX'ed) didn't "reject its support in the browser". They simply pointed out that it's useless (and perhaps actively harmful) for Mozilla to be the only browser that supports this format. 

> As WebP (bug #856375) and FLIF (bug #1240692) are also rejected

WebP is no longer "rejected" -- if you look at the duplicate-target of the WebP bug that you linked, you'll see that we're actively working on supporting it (with most recent activity 2 weeks ago).

> what more progressive format (better than JPEG) does the Mozilla team suggest migrating to?

I don't know the answer to this (and again, this is best for discussion on a mailing list).  I'm also not clear why there's pressure to migrate.
(In reply to Dmitry from comment #18)
> I think it is a bit strange for the Mozilla team to introduce arithmetic JPEG
> encoding in mozjpeg [1] and reject support for it in the browser. As WebP (bug
> #856375) and FLIF (bug #1240692) are also rejected, what more progressive
> format (better than JPEG) does the Mozilla team suggest migrating to?

The gains from JPEG's arithmetic coding scheme are probably not large enough to justify the incompatibility concerns.

Overall, the space of lossy still-image coding has been pretty neglected, so there are not many candidates that bring a big improvement. However, JPEG encoders keep getting better, so that's probably the best option for now. https://github.com/thorfdbg/libjpeg and https://github.com/google/guetzli/ are recent attempts in this area.
I think that support for arithmetic-coded JPEGs would be much better than support for WebP or some other completely new format. We can convert all existing JPEGs to arithmetic-coded JPEGs losslessly, and it will decrease the size of every file. That is impossible with WebP; we just can't optimize all JPEGs losslessly using WebP.

Similar request for the Edge: https://wpdev.uservoice.com/forums/257854-microsoft-edge-developer/suggestions/11369337-add-support-for-the-arithmetic-coded-jpeg-which-s
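For concreteness, the lossless repacking described above is what jpegtran's -arithmetic switch does (when the tool is built with the arithmetic encoder). Below is a sketch of the same operation through the libjpeg transcoding API; recode_to_arithmetic is an illustrative name, error handling is left to the default exit-on-error manager, and a build with C_ARITH_CODING_SUPPORTED is assumed:

/* Sketch: losslessly re-encode a JPEG with arithmetic entropy coding.
 * The quantized DCT coefficients are copied verbatim; only the entropy
 * coding changes, so the pixels are bit-for-bit identical. */
#include <stdio.h>
#include <jpeglib.h>

void recode_to_arithmetic(FILE *infile, FILE *outfile)
{
    struct jpeg_decompress_struct srcinfo;
    struct jpeg_compress_struct dstinfo;
    struct jpeg_error_mgr jsrcerr, jdsterr;
    jvirt_barray_ptr *coef_arrays;

    srcinfo.err = jpeg_std_error(&jsrcerr);
    jpeg_create_decompress(&srcinfo);
    dstinfo.err = jpeg_std_error(&jdsterr);
    jpeg_create_compress(&dstinfo);

    jpeg_stdio_src(&srcinfo, infile);
    jpeg_read_header(&srcinfo, TRUE);

    /* Read the quantized DCT coefficients without decoding to pixels. */
    coef_arrays = jpeg_read_coefficients(&srcinfo);

    /* Carry over image parameters, then switch the entropy coder. */
    jpeg_copy_critical_parameters(&srcinfo, &dstinfo);
    dstinfo.arith_code = TRUE;          /* emit arithmetic coding */
    dstinfo.optimize_coding = FALSE;    /* not applicable to arithmetic */

    jpeg_stdio_dest(&dstinfo, outfile);
    jpeg_write_coefficients(&dstinfo, coef_arrays);

    jpeg_finish_compress(&dstinfo);
    jpeg_destroy_compress(&dstinfo);
    (void)jpeg_finish_decompress(&srcinfo);
    jpeg_destroy_decompress(&srcinfo);
}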
Similar ticket in the Chromium bugtracker:
https://bugs.chromium.org/p/chromium/issues/detail?id=669501

> Before reopening, we'd need at least two browser vendors to commit to supporting arithmetic-coded
> JPEGs. Mozilla could be one of them, but we won't implement this without evidence that it's going to 
> be interoperable in the future.

It seems that the Chromium developers have nothing against arithmetic coding support. Maybe Mozilla and Google can introduce arithmetic coding together? libjpeg-turbo already supports it, so it will not be hard to implement.
I am agnostic as to whether browsers should support arithmetic-coded JPEGs or not, but I will say that I don't believe that this file format is nearly as much of a panacea as it's being made out to be.  Even if all of the major browsers started supporting arithmetic-coded JPEGs today, it would still be years before web designers were willing to embrace them, because they'd have to wait until the older browser versions were phased out.  And what we do here, and what Google does, is really irrelevant unless Microsoft and Apple are on board as well.  I just tested Chrome and Safari on my Mac, and neither displays arithmetic-coded JPEGs.  My favorite image viewer/converter (GraphicConverter) doesn't even recognize them as JPEGs.  The latest version of Photoshop displays an error when attempting to open them.  Arithmetic-coded JPEGs are part of the official spec, but the de facto reality is that they aren't any more of a standard image format than the SmartScale images that Guido championed in jpeg-8 (which aren't part of the official spec.).  Furthermore, as pointed out here:

https://github.com/libjpeg-turbo/libjpeg-turbo/issues/120

the arithmetic decoder doesn't support suspension.  I have no ability to make it do so, and Mozilla's past experience with the author of that code (Guido) suggests that he isn't going to be willing to help unless Mozilla agrees to switch from libjpeg-turbo back to libjpeg.
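To make the suspension point concrete: in libjpeg terms, suspension means the data source's fill_input_buffer callback may return FALSE when the (network) buffer runs dry, and the decoder must be able to back out and be re-entered once more bytes arrive; a browser's incremental decoder depends on this. A rough sketch of such a source, under the assumption that the caller re-supplies any unconsumed bytes when it resumes (names like attach_partial_data are illustrative, not part of any real API):

#include <stdio.h>
#include <stddef.h>
#include <jpeglib.h>

static void init_source(j_decompress_ptr cinfo) { (void)cinfo; }
static void term_source(j_decompress_ptr cinfo) { (void)cinfo; }

/* Returning FALSE tells libjpeg that no more data is available right now:
 * the jpeg_read_header()/jpeg_read_scanlines() call in progress backs out
 * and returns JPEG_SUSPENDED / 0 instead of blocking. */
static boolean fill_input_buffer(j_decompress_ptr cinfo)
{
    (void)cinfo;
    return FALSE;
}

static void skip_input_data(j_decompress_ptr cinfo, long num_bytes)
{
    struct jpeg_source_mgr *src = cinfo->src;
    size_t n = (num_bytes > 0) ? (size_t)num_bytes : 0;
    if (n > src->bytes_in_buffer)
        n = src->bytes_in_buffer;   /* a real source must remember the rest */
    src->next_input_byte += n;
    src->bytes_in_buffer -= n;
}

/* Point libjpeg at the bytes downloaded so far; call this again (and retry
 * the decode call that suspended) whenever more data arrives.  Any bytes the
 * library has not yet consumed must be presented again. */
void attach_partial_data(j_decompress_ptr cinfo, struct jpeg_source_mgr *src,
                         const JOCTET *buf, size_t len)
{
    src->init_source = init_source;
    src->fill_input_buffer = fill_input_buffer;
    src->skip_input_data = skip_input_data;
    src->resync_to_restart = jpeg_resync_to_restart;  /* library-provided */
    src->term_source = term_source;
    src->next_input_byte = buf;
    src->bytes_in_buffer = len;
    cinfo->src = src;
}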

The arithmetic codec is extremely slow, so you may just be trading off a network bottleneck for a CPU bottleneck.  My testing reveals that it can compress about 13-24% better in the aggregate (average 15% better), but it takes, on average, 5x the CPU time to compress and 6x the CPU time to decompress those images (relative to baseline JPEG.)  That's not a good trade-off, particularly when you consider that progressive JPEG images can produce, on average, 11% better compression (and I'm talking plain vanilla progressive-- not mozjpeg) and require only 3x the CPU time to decompress.  (If I can find funding to integrate the SSE2 progressive Huffman encoder that was submitted to our project, I can bring the compression speed to within 3x of baseline as well.)
And of course it goes without saying that, as connections get faster, the size of JPEGs isn't going to matter as much.  When I first got a cable modem nearly 20 years ago, I was cruising along at 5-10 Mbps.  Now they're about to upgrade me to 300 Mbps for the same price as I was paying in the late 90s.  Web designers do not focus nearly as much on image size as they did in the days of dial-up.  The primary beneficiaries of smaller JPEGs are sites like Facebook, which have to pay for storing those files for billions of people.  But speaking as a photography enthusiast, I don't use Facebook for anything more than casual photo sharing, precisely because I want to be able to store the full-sized unaltered JPEGs online, and I'm willing to pay for the storage necessary to do that (which I do, through Google and Flickr.)

I don't think there's any harm in browsers supporting arithmetic-coded JPEGs, but I also think that the usage of these by web designers/developers will be approximately zero for the foreseeable future.
> When I first got a cable modem nearly 20 years ago, I was cruising along at 5-10 Mbps.

Do you think the internet is that fast everywhere? Huh. I still have a 6 Mbps connection. My parents have a 2 Mbps connection. I know many places where the connection is much worse.
My point is that connections are getting faster everywhere.  Your country's average speed specifically has doubled in the past 2 years, so the baseline JPEGs are, on average, downloading faster now than arithmetic progressive JPEGs would have downloaded 2 years ago.  In our country, the average speed has only increased by about 25% in that same time, but that's still greater than the compression ratio difference between baseline and arithmetic progressive.  What I'm trying to say is-- look at this from a project management point of view.  If it takes a year to get suspension into libjpeg-turbo's arithmetic codec (which is probably optimistic unless someone steps forward to implement it and contribute the code), then it will take longer than that for the feature to get into Firefox and even longer for it to get into other browsers and commercial software packages like Photoshop.  Maybe, if you're lucky, 5 or 10 years down the line, it becomes ubiquitous enough that web designers will start using it, but in that same timeframe, everyone's Internet connection will have sped up by a lot more than 20%.

This isn't really my debate, though.  I'm just trying to be helpful.  If Firefox decides to adopt arithmetic coding, great, but I've given good reasons why I don't think adding that support will be enough to ensure wide adoption of the format in the industry at large.  This is all a moot point until the arithmetic codec supports suspension.
(In reply to Seth Fowler [:seth] [:s2h] from comment #17)

Chromium developers are considering arithmetic coded JPEG support:
https://bugs.chromium.org/p/chromium/issues/detail?id=669501#c7

> Basically, if additional binary size cost is very low, there is minimal additional security risk,
> and we don't need to provide more signals to authors about this, then we'll be more positive about
> taking this. If these factors are untrue, we need to think more carefully.

Also they are ready to discuss it with other vendors:
http://lists.w3.org/Archives/Public/public-whatwg-archive/2016Dec/0032.html

> I posted some questions on that bug, but their answers would probably be relevant for other
> vendors as well.

(In reply to Seth Fowler [:seth] [:s2h] from comment #17)

> Before reopening, we'd need at least two browser vendors to
> commit to supporting arithmetic-coded JPEGs. Mozilla could be one of them,
> but we won't implement this without evidence that it's going to be
> interoperable in the future.

By this logic, no web browser would support anything, and there wouldn't be any web browsers in the first place.

It's not other browsers that determine the requirements, it's the needs of your users. Being one of them, I can tell you that support of arithmetic coding is badly needed. Whenever I put Gimp-created images on the Web, people complain that they don't see them. Then I have to tell them to download the image and open it with a graphics viewer, because Firefox left this bug unresolved for more than 9 years. That's ridiculous.

I don't ultimately have any skin in this game, but as the sole maintainer and principal developer of libjpeg-turbo, I am in a good position to play devil's advocate:

-- Supporting arithmetic decoding in Firefox would require enhancing the arithmetic decoder in libjpeg-turbo to support suspension. I am unwilling to do that work unless someone fully pays for my labor, and the project is not likely to be cheap.
-- The current performance of the arithmetic encoder and decoder is quite poor. Much better performance can be achieved with progressive Huffman encoding/decoding, and the progressive Huffman codec will receive additional performance and fault tolerance enhancements in libjpeg-turbo 2.1.
-- Arithmetic coding provides only a small size advantage (in my testing, ~7-8% on average) over optimized progressive Huffman coding, and that advantage goes away if you are willing to use an asymmetric encoder such as mozjpeg, which exchanges encoding performance (to the tune of about 50-100x) for an incremental decrease in size.

The official builds of libjpeg-turbo have supported arithmetic coding for many years, so I don't really care one way or another. If the community wants arithmetic decoding in browsers badly enough for someone to pay for my labor to implement suspension in the libjpeg-turbo arithmetic codec, then great. Bring it. But also, GIMP doesn't generate arithmetic-coded images unless you tell it to, so I don't buy the argument that arithmetic decoding is necessary to support GIMP. As a web developer, why would you purposefully put an arithmetic-coded image on the web knowing that most browsers can't decode it? That's just creating your own pain in order to try to prove a point.

DRC, thanks for your detailed reply. Could you please share the comparison results and methods you've used to compare arithmetic vs. progressive Huffman encoding? I haven't made such a comparison myself, but if the advantage is 10%, then it is probably not worth the effort to pull that into the browser. Although one may argue that 10% is still a good saving of hard drive space...

Using the five images described in this article,

I compared

TJ_OPTIMIZE=1 tjbench {image} 95 -rgb -quiet -progressive

with

TJ_ARITHMETIC=1 tjbench {image} 95 -rgb -quiet -progressive

(NOTE: you can add -benchtime 0.001 -warmup 0 to speed things up, if you just want to measure the compression ratio rather than the performance. Replace -quiet with -qq to generate spreadsheet-friendly results.)

Compression ratio improvement with progressive arithmetic coding vs. optimized progressive Huffman coding was 2.4-10.2% (average 7.7%.)

For the set of Kodak test images, compression ratio improvement with progressive arithmetic coding vs. optimized progressive Huffman coding was 1.7-8.8% (average 5.2%.)

For the complete set of 8-bit RGB test images (the aforementioned set of five test images I typically use contains two of these), compression ratio improvement with progressive arithmetic coding vs. optimized progressive Huffman coding was 3.0-12.1% (average 7.9%.) The image with 12.1% improvement (spider_web.ppm) was an outlier. Without that image, the improvement was 3.0-10.4% (average 7.7%), which was very similar to the results from the set of five test images I use. (NOTE: that's why I use those images. Even though the origin of the first three images is somewhat odd, those five images have historically proven to represent a good range of "typical" JPEG codec behavior.)

As far as disk space, that's the raison d'etre for mozjpeg. It's intended as an offline tool for recompressing and optimizing existing JPEG images to save space. By combining optimized progressive Huffman coding with other techniques (trellis quantization, etc.), mozjpeg generally produces smaller JPEG files than can be produced using progressive arithmetic coding, and the files it produces are compatible with any browser. However, the tradeoff is that generating those files is extremely slow compared to generating "plain" optimized progressive Huffman images using libjpeg-turbo.

No horse in the race, but I'm curious whether or not arithmetic jpeg decoding time is actually a significant factor relative to the time needed for the rest of the page (e.g. network requests, disk/memory/cache i/o, rendering, flowing, vsync, etc.), using whatever current min-spec PC/mobile hardware and connection a web author would expect a user to have.

For instance, given a page horribly-encrusted with jpeg images, would the real-world time difference between loading it as huffman jpegs and arithmetic jpegs be palpably different to the user?

Let's put some rough (back-of-the-envelope) numbers on that, from my nearly 10-year-old machine (quad-core 2.8 GHz Intel Xeon W3530.) This is old data and was obtained with libjpeg-turbo 1.3.x, but the relative differences should at least be similar with the current version.

For a set of test images, I compared the average decompression performance and compression ratio of baseline Huffman coding, progressive Huffman coding, and progressive arithmetic coding ("bits per pixel" is just 24 bits, the bit depth of the source images, divided by the compression ratio.)

Entropy coding type      Average compression ratio / bits per pixel   Average decompression performance (Megapixels/sec)
Baseline Huffman         13.0:1 / 1.85                                127
Progressive Huffman      14.2:1 / 1.69                                46.4
Progressive Arithmetic   15.3:1 / 1.57                                19.3

For a hypothetical 1-megapixel image, this would amount to the following decompression and transmission times (all in milliseconds) for various networks:

Entropy coding type      Decomp. time   Trans. time (1 Mbps)   Trans. time (10 Mbps)   Trans. time (100 Mbps)   Trans. time (1 Gbps)
Baseline Huffman         7.87           1850                   185                     18.5                     1.85
Progressive Huffman      21.6           1690                   169                     16.9                     1.69
Progressive Arithmetic   51.8           1570                   157                     15.7                     1.57

Combining decompression and transmission time (NOTE: this assumes no pipelining of transmission and decompression, which may be a spherical chicken assumption), we get:

Entropy coding type      Total time (1 Mbps)   Total time (10 Mbps)   Total time (100 Mbps)   Total time (1 Gbps)
Baseline Huffman         1858                  193                    26.4                    9.72
Progressive Huffman      1712                  191                    38.5                    23.3
Progressive Arithmetic   1622                  209                    67.5                    53.4
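
For anyone who wants to reproduce the combined numbers, they appear to follow directly from the first table (transmission time = bits per pixel × 1 Mpixel / bandwidth, decompression time = 1 Mpixel / decode throughput, no pipelining); a quick sketch under that assumption:

/* Back-of-the-envelope check of the tables above for a 1-Mpixel image. */
#include <stdio.h>

int main(void)
{
    /* bits per pixel and decode throughput taken from the first table */
    const struct { const char *name; double bpp, mpix_per_sec; } codecs[] = {
        { "Baseline Huffman",       1.85, 127.0 },
        { "Progressive Huffman",    1.69,  46.4 },
        { "Progressive Arithmetic", 1.57,  19.3 },
    };
    const double mbps[] = { 1.0, 10.0, 100.0, 1000.0 };  /* network speeds */
    const double megapixels = 1.0;

    for (int i = 0; i < 3; i++) {
        /* decompression time: pixels / throughput */
        double decomp_ms = megapixels / codecs[i].mpix_per_sec * 1000.0;
        printf("%-22s  decomp %5.2f ms  totals:", codecs[i].name, decomp_ms);
        for (int j = 0; j < 4; j++) {
            /* transmission time: (bpp * pixels) / bandwidth */
            double trans_ms = codecs[i].bpp * megapixels * 1e6 /
                              (mbps[j] * 1e6) * 1000.0;
            printf("  %7.1f ms @%g Mbps", decomp_ms + trans_ms, mbps[j]);
        }
        printf("\n");
    }
    return 0;
}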

Actual mileage may vary, but for this specific test case, arithmetic coding is so slow that the decompression time outweighs any performance advantage from the compression ratio except on the slowest of networks. On a 1 Mbps network, arithmetic is only 5% faster than progressive Huffman, and on all other networks, it's much slower. Progressive Huffman definitely gives you more "bang for the buck" in terms of compression ratio vs. performance. That is likely to improve, given that progressive Huffman coding is receiving a lot of attention right now in the libjpeg-turbo community, whereas arithmetic coding is receiving exactly none.

NOTE: As a purveyor of remote desktop software that uses libjpeg-turbo (TurboVNC, specifically), I have evaluated whether progressive Huffman coding is even worth it. I ultimately concluded that, no, it wasn't. In real-world performance scenarios, I see pretty much what you see in the last table above-- you can get a barely perceptible performance improvement with progressive Huffman vs. baseline Huffman on very slow networks, but otherwise, baseline is as fast or faster.

Thanks, @DRC, that's exactly the sort of info I was curious to see.

I have to say, if it's that cpu-intensive to unpack the arithmetic-encoded images, then I think it's probably a favor to humanity as a whole not to use it. The added power consumption across the world doesn't seem worth the ~10% of storage space and bandwidth you get back. That not only hits the power grid but also shortens battery life on mobile devices.

Arithmetic coding is good for something like an embedded system (e.g. a game console) where you can't just throw a few more dollars at the problem to create more space, so you have to make it work the hard way, but it doesn't seem pragmatic on the web.

That's my 2¢ anyway.

Not even 10% if you're comparing arithmetic to progressive Huffman. In that case, the incremental storage/bandwidth saving is only about 8%, in exchange for more than doubling the compute time. I absolutely agree that arithmetic coding isn't worth it-- at least not until/unless it can be significantly sped up.

It's also worth noting that, if bandwidth is a concern for JPEG images on a web site, mozjpeg can be used to perform offline near-lossless recompression of existing JPEG images (the lossiness is due only to the use of trellis quantization, to the best of my understanding.) mozjpeg is an "asymmetric" codec, so it trades off extremely slow compression performance (like 30x slower than libjpeg-turbo) for a 15:1 compression ratio, which is almost as good as that of arithmetic coding. Since mozjpeg is generating standard progressive Huffman JPEG images, the decompression performance will be the same as that of any other progressive Huffman JPEG image. It's not a solution for real-time compression, but then again, neither is arithmetic coding at the moment.

DRC, thanks for your informative replies; we all really appreciate them. Am I right in saying that arithmetic JPEG currently provides the best compression ratio compared to other formats/algorithms (like WebP) at comparable output image quality? The comparisons I found (like this one: https://developers.google.com/speed/webp/docs/webp_study) seem to use a traditional Huffman JPEG encoder.

I agree that support for arithmetic encoding should not become a de facto standard for JPEG, partly because of compatibility issues. Although I must admit that having that support in the Mozilla engine would be a plus for applications that use the engine, like Thunderbird. Maybe that is not a good example either, since once I embed an arithmetic-encoded image in an email body, the recipient needs an email client that supports it as well, but I can think of other engine applications where one would like to render HTML with arithmetic-encoded images when rendering time does not matter much but disk savings do.

Maybe support for arithmetic encoding could just be a build option that is disabled by default?
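
For what it's worth, libjpeg-turbo already treats arithmetic support as a build-time option (the WITH_ARITH_DEC / WITH_ARITH_ENC CMake switches), and, if I remember the headers correctly, the resulting D_ARITH_CODING_SUPPORTED / C_ARITH_CODING_SUPPORTED macros end up in the installed jconfig.h, so an embedding application could make its behavior conditional on the build rather than on a new API. A minimal sketch, under that assumption:

/* Sketch (assumption: the arithmetic-coding macros are exported via
 * jconfig.h, which jpeglib.h pulls in) of gating a feature on the build
 * option that already exists, rather than on a new API. */
#include <stdio.h>
#include <jpeglib.h>

int arithmetic_decoding_built_in(void)
{
#ifdef D_ARITH_CODING_SUPPORTED
    return 1;   /* headers/library built with the arithmetic decoder */
#else
    return 0;   /* arithmetic-coded JPEGs will fail with a decode error */
#endif
}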

I haven't tested the lossy modes of webp, but I've done similar research with other codecs in the past (including lossless webp), and my methodology was similar to the methodology used in Google's webp study. (I used the JPEG codec with specific settings as a baseline, measured the perceptual image quality using the structural dissimilarity [DSSIM] metric, dialed in the webp codec so that it barely produced the same DSSIM as the JPEG image, and compared the compression ratios.) If we can assume that the same relative difference in compression ratios (1.17) from my tests above (comparing progressive arithmetic coding to baseline Huffman coding) bears out for the images used in the Google study, then I expect that webp would still compress better than arithmetic-coded JPEG.

You highlighted the compatibility problem exactly. If one piece of software supports arithmetic-coded JPEG images, then that's just going to create cross-compatibility issues unless all popular software supports them. No web developer would ever use such images right now, because no browsers support those images. That doesn't just include browsers that use libjpeg-turbo. It also includes proprietary browsers, such as Safari. It wouldn't surprise me if the underlying proprietary Windows and macOS JPEG codecs supplied by Microsoft and Apple lack support for arithmetic coding, and that could be because arithmetic coding was encumbered by patents until the late 2000s. Even if web developers wanted to use arithmetic-coded JPEG images, PhotoShop (which doesn't use libjpeg-turbo) will not even open an arithmetic-coded JPEG image, much less create one. I think you'd be hard pressed to find any software, other than libjpeg-turbo itself, that enables the creation of arithmetic-coded JPEG images. Firefox and Chrome aren't the only libjpeg-turbo-based programs that have chosen to disable arithmetic coding, even though libjpeg-turbo enables it by default.

The only way to break that chicken-and-egg scenario would be for one popular piece of software to lead by example, to begin supporting arithmetic-coded JPEG images so that other pieces of software could follow suit. However, in terms of web development, it wouldn't do much good for Firefox and Chrome to support arithmetic-coded JPEG images unless proprietary browsers like Safari also supported them, and it wouldn't do much good for proprietary browsers to support them unless PhotoShop also supported them. If there were a solid argument to be made for the use of arithmetic-coded JPEG images, then it might make sense to try to steer that train from the caboose, but right now, that argument doesn't seem to exist. webp probably compresses just as well if not better, and mozjpeg can be used to produce progressive JPEG images with almost the same compression ratio as arithmetic-coded JPEG images, without the compatibility or decompression performance concerns.

As far as enabling arithmetic coding in Firefox, my understanding (from https://github.com/libjpeg-turbo/libjpeg-turbo/issues/120) is that you couldn't do that right now, because libjpeg-turbo's arithmetic decoder doesn't support suspension. That seems like a gating issue for this whole discussion. I've been an independent OSS contractor for 11 years, so I'll take anyone's money to develop any enhancement that makes sense for libjpeg-turbo, but adding suspension to the arithmetic decoder is probably not something that will happen without that money or a very-well-crafted code contribution.
