User Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:184.108.40.206) Gecko/20110614 Firefox/3.6.18
Build ID: 20110614230723

Steps to reproduce:
Opened an arithmetic-coded JPEG.

Actual results:
Nothing was shown.

Expected results:
The image should have been displayed. The arithmetic-coding patents for JPEG have now expired, and since most web page content is images, users are waiting 6-10% longer to see web pages for no reason. Firefox could grab this opportunity to become the fastest web browser on earth by supporting arithmetic-coded JPEGs and advertising it to servers by sending the HTTP header:

Accept-Encoding: gzip,deflate,jpeg

After all, we did not have to wait 10 years to start using gzip compression.
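For anyone triaging similar reports: a JPEG declares its entropy coder in its start-of-frame marker (per ITU-T T.81, SOF markers 0xC9-0xCB and 0xCD-0xCF indicate arithmetic coding, while 0xC0-0xC3 and 0xC5-0xC7 indicate Huffman). A minimal, dependency-free sketch for checking a file; the function name is just illustrative, and it ignores corner cases like 0xFF fill bytes:

```python
# Sketch: detect whether a JPEG uses arithmetic entropy coding by
# scanning its marker segments until a start-of-frame marker appears.

ARITH_SOF = {0xC9, 0xCA, 0xCB, 0xCD, 0xCE, 0xCF}   # arithmetic-coded frames
HUFF_SOF = {0xC0, 0xC1, 0xC2, 0xC3, 0xC5, 0xC6, 0xC7}  # Huffman-coded frames

def uses_arithmetic_coding(data: bytes) -> bool:
    """Return True if the first SOF marker denotes arithmetic coding."""
    if data[:2] != b"\xFF\xD8":                    # SOI marker
        raise ValueError("not a JPEG")
    i = 2
    while i + 4 <= len(data):
        if data[i] != 0xFF:
            raise ValueError("marker expected")
        marker = data[i + 1]
        if marker in ARITH_SOF:
            return True
        if marker in HUFF_SOF:
            return False
        # Every marker we can meet before SOF carries a 2-byte length field
        # that includes the length bytes themselves.
        seg_len = int.from_bytes(data[i + 2:i + 4], "big")
        i += 2 + seg_len
    raise ValueError("no SOF marker found")
```

This is enough to tell, for example, why the reference image attached to this bug renders nowhere: its frame header starts with 0xFFC9 (SOF9) rather than the ubiquitous 0xFFC0/0xFFC2.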
Could you please attach a reference image?
Here's an example: http://filmicgames.com/Images/Patents/bedroom_arithmetic.jpg If you are interested in the background on the expired patents, you can read about it here: http://filmicgames.com/archives/778
Thanks for the info and reference image!
joe: Is this fair game to do? <link to this ticket>
<joe> bbondy: yeah; your best bet will be to implement it in libjpeg-turbo though
It seems that arithmetic coding was added to libjpeg-turbo as of 1.1, so we may just need to upgrade to that version to add support. Reference: https://bugzilla.redhat.com/show_bug.cgi?id=639672
This should already be fixed by Bug 650899. I'll test with the reference image in this ticket, though. Thanks.
There is some question of whether we want to support arithmetic coded JPEGs at all. Doing so means we've created a fragmented market, since we'll load images that no other browser does. To do that, there had better be a pretty good reason, and I don't know whether JPEG with arithmetic coding is that good reason.
Apparently Chromium already supports arithmetic-coded JPEGs, if comment 5 on the link I provided is correct (cf. http://filmicgames.com/archives/778#comment-5404). So the market is already fragmented.
I don't think this is a duplicate of bug 650899. See bug 650899 comment 12. (Unless those huffman decoder changes I cherrypicked also include arithmetic decoding? "arith" doesn't appear anywhere in the patch, which suggests to me that they don't. But maybe that's wrong!)
Does anyone have some good information on gains that arithmetic coding brings? This post suggests it's pretty small: http://cbloomrants.blogspot.com/2011/01/01-10-11-perceptual-metrics-warmup-jpeg.html
What's available in the literature indicates that arithmetic coding yields on average a 7-10% reduction compared to optimized JPEGs. For instance, in this paper: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.62.3005&rep=rep1&type=pdf you can read: "One common technique is the arithmetic coding option proposed by JPEG , as most JPEG images are encoded using Huffman coding. A reduction of file size of about 7-10% can be achieved for lossless rate optimization."
I discovered this bug when GIMP switched the default for saved JPGs to use arithmetic coding in development builds and my JPGs stopped working in Firefox. It's switched back now, but this may come up more often as other programs start to use arithmetic coding by default -- especially if Chrome and other browsers support it.
I don't believe any other browsers support it. I just tried Chrome and Safari and neither did.
Chromium here (on Debian sid) shows arithmetic coded images just fine. I think chrome/chromium uses the system library, so it probably varies by OS.
(In reply to Jeff Muizelaar [:jrmuizel] from comment #14)
> I don't believe any other browsers support it. I just tried Chrome and
> Safari and neither did.

Indeed, Chrome does not support it, there are no plans to support it, and the related code files that come from upstream libjpeg_turbo are removed from the Chrome build.
(In reply to noel gordon from comment #16)
> Indeed, Chrome does not support it, there are no plans to support it, and
> the related code files the come from upstream libjpeg_turbo are removed from
> the Chrome build.

That seems definitive. I'm resolving this bug again for now. (Please don't read too much into "WONTFIX".)

Before reopening, we'd need at least two browser vendors to commit to supporting arithmetic-coded JPEGs. Mozilla could be one of them, but we won't implement this without evidence that it's going to be interoperable in the future. For now, the right place to move this forward is the standards mailing lists, not this bug.
I think it is a bit strange for the Mozilla team to introduce arithmetic JPEG encoding in mozjpeg and then reject support for it in the browser. Since WebP (bug #856375) and FLIF (bug #1240692) were also rejected, what more progressive format (better than JPEG) does the Mozilla team suggest migrating to? https://github.com/mozilla/mozjpeg/blob/5198654f739552ed24c7f014574d1e74ee9ef8ac/usage.txt#L167
Sorry, your information isn't entirely correct (which you would have discovered on your own if you had followed the very links you provided a bit deeper...)

(In reply to Dmitry from comment #18)
> I think this is a bit strange for Mozilla team to introduce arithmetic JPEG
> encoding in mozjpeg , and reject its support in the browser.

(1) Mozilla never "introduced" arithmetic JPEG encoding in mozjpeg. If you run "git blame" on the line that you linked to, you'll see that support dates back to https://github.com/libjpeg-turbo/libjpeg-turbo/commit/19e6975e90027db025c0b7264a3efdd466275b47 -- a 2010 change in the original "libjpeg-turbo" library, which mozjpeg seems to have been forked from at some point.

(2) Comment 7 and comment 17 (which is where this was WONTFIX'ed) didn't "reject its support in the browser". They simply pointed out that it's useless (and perhaps actively harmful) for Mozilla to be the only browser that supports this format.

> As WebP (bug #856375) and FLIF (bug #1240692) are also rejected

WebP is no longer "rejected" -- if you look at the duplicate target of the WebP bug that you linked, you'll see that we're actively working on supporting it (with the most recent activity 2 weeks ago).

> what is more progressive format (better than JPEG) that Mozilla team suggests to migrate to?

I don't know the answer to this (and again, this is best discussed on a mailing list). I'm also not clear on why there's pressure to migrate.
(In reply to Dmitry from comment #18)
> I think this is a bit strange for Mozilla team to introduce arithmetic JPEG
> encoding in mozjpeg , and reject its support in the browser. As WebP (bug
> #856375) and FLIF (bug #1240692) are also rejected, what is more progressive
> format (better than JPEG) that Mozilla team suggests to migrate to?

The gains from JPEG's arithmetic coding scheme are probably not large enough to justify the incompatibility concerns. Overall, the space of lossy still-image coding has been pretty neglected, so there aren't a lot of candidates that bring a big improvement. However, JPEG encoders keep getting better, so that's probably the best option for now. https://github.com/thorfdbg/libjpeg and https://github.com/google/guetzli/ are recent attempts in this area.
I think that supporting arithmetic-coded JPEGs would be much better than supporting WebP or some other completely new format. We can convert all existing JPEGs to arithmetic-coded JPEGs losslessly, and doing so decreases the size of every file. That is impossible with WebP: we simply can't optimize all existing JPEGs losslessly using WebP. Similar request for Edge: https://wpdev.uservoice.com/forums/257854-microsoft-edge-developer/suggestions/11369337-add-support-for-the-arithmetic-coded-jpeg-which-s
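To illustrate the lossless-conversion claim: jpegtran (shipped alongside libjpeg/libjpeg-turbo, when built with arithmetic support) has an `-arithmetic` switch that re-encodes the entropy data without touching the DCT coefficients. A small sketch wrapping it from Python; it assumes a suitably built jpegtran is on PATH, and the function names are illustrative:

```python
import shutil
import subprocess

def arith_transcode_cmd(src: str, dst: str) -> list[str]:
    """Build the jpegtran command line for a lossless Huffman-to-arithmetic
    recompression (requires jpegtran built with arithmetic coding support)."""
    return ["jpegtran", "-arithmetic", "-outfile", dst, src]

def transcode(src: str, dst: str) -> None:
    """Run the transcode, failing clearly if jpegtran is unavailable."""
    if shutil.which("jpegtran") is None:
        raise RuntimeError("jpegtran not found on PATH")
    subprocess.run(arith_transcode_cmd(src, dst), check=True)
```

Running `jpegtran` without `-arithmetic` on the output reverses the conversion, which is what makes this a risk-free storage optimization (decoder support aside).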
Similar ticket in the Chromium bug tracker: https://bugs.chromium.org/p/chromium/issues/detail?id=669501

> Before reopening, we'd need at least two browser vendors to commit to supporting arithmetic-coded
> JPEGs. Mozilla could be one of them, but we won't implement this without evidence that it's going to
> be interoperable in the future.

It seems that the Chromium developers have nothing against arithmetic coding support. Maybe Mozilla and Google can introduce arithmetic coding together? libjpeg-turbo already supports it, so it should not be hard to implement.
I am agnostic as to whether browsers should support arithmetic-coded JPEGs or not, but I will say that I don't believe this file format is nearly as much of a panacea as it's being made out to be. Even if all of the major browsers started supporting arithmetic-coded JPEGs today, it would still be years before web designers were willing to embrace them, because they'd have to wait until the older browser versions were phased out. And what we do here, and what Google does, is really irrelevant unless Microsoft and Apple are on board as well. I just tested Chrome and Safari on my Mac, and neither displays arithmetic-coded JPEGs. My favorite image viewer/converter (GraphicConverter) doesn't even recognize them as JPEGs. The latest version of Photoshop displays an error when attempting to open them. Arithmetic-coded JPEGs are part of the official spec, but the de facto reality is that they aren't any more of a standard image format than the SmartScale images that Guido championed in jpeg-8 (which aren't part of the official spec).

Furthermore, as pointed out here: https://github.com/libjpeg-turbo/libjpeg-turbo/issues/120 the arithmetic decoder doesn't support suspension. I have no ability to make it do so, and Mozilla's past experience with the author of that code (Guido) suggests that he isn't going to be willing to help unless Mozilla agrees to switch from libjpeg-turbo back to libjpeg.

The arithmetic codec is also extremely slow, so you may just be trading a network bottleneck for a CPU bottleneck. My testing reveals that it can compress about 13-24% better in the aggregate (15% better on average), but it takes, on average, 5x the CPU time to compress and 6x the CPU time to decompress those images (relative to baseline JPEG). That's not a good trade-off, particularly when you consider that progressive JPEG images can produce, on average, 11% better compression (and I'm talking plain vanilla progressive, not mozjpeg) and require only 3x the CPU time to decompress.
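The network-vs-CPU trade-off described above can be modeled with some back-of-the-envelope arithmetic. All figures below are hypothetical placeholders except the quoted ~15% size saving and ~6x decode slowdown:

```python
def total_time(size_bytes: float, mbps: float, decode_s: float) -> float:
    """Download time plus decode time, in seconds."""
    return size_bytes * 8 / (mbps * 1e6) + decode_s

SIZE = 1_000_000          # hypothetical 1 MB baseline JPEG
BASE_DECODE = 0.05        # hypothetical 50 ms baseline decode time

arith_size = SIZE * 0.85          # ~15% smaller (figure from this comment)
arith_decode = BASE_DECODE * 6    # ~6x slower decode (figure from this comment)

# The smaller file wins on slow links; the extra decode time dominates on
# fast ones, i.e. the network bottleneck becomes a CPU bottleneck.
for mbps in (2, 20, 200):
    base = total_time(SIZE, mbps, BASE_DECODE)
    arith = total_time(arith_size, mbps, arith_decode)
    print(f"{mbps:>3} Mbps: baseline {base:.3f}s  arithmetic {arith:.3f}s")
```

With these placeholder numbers the crossover sits somewhere in the single-digit-Mbps range, which matches the intuition in the surrounding comments: arithmetic coding helps most exactly where connections are slowest.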
(If I can find funding to integrate the SSE2 progressive Huffman encoder that was submitted to our project, I can bring the compression speed to within 3x of baseline as well.)
And of course it goes without saying that, as connections get faster, the size of JPEGs isn't going to matter as much. When I first got a cable modem nearly 20 years ago, I was cruising along at 5-10 Mbps. Now they're about to upgrade me to 300 Mbps for the same price I was paying in the late 90s. Web designers do not focus nearly as much on image size as they did in the days of dial-up. The primary beneficiaries of smaller JPEGs are sites like Facebook, which have to pay for storing those files for billions of people. But speaking as a photography enthusiast, I don't use Facebook for anything more than casual photo sharing, precisely because I want to be able to store the full-sized unaltered JPEGs online, and I'm willing to pay for the storage necessary to do that (which I do, through Google and Flickr). I don't think there's any harm in browsers supporting arithmetic-coded JPEGs, but I also think that their usage by web designers/developers will be approximately zero for the foreseeable future.
> When I first got a cable modem nearly 20 years ago, I was cruising along at 5-10 Mbps.

Do you think the internet is that fast everywhere? Huh. I still have a 6 Mbps connection. My parents have a 2 Mbps connection. I know many places where the connection is much worse.
My point is that connections are getting faster everywhere. Your country's average speed specifically has doubled in the past 2 years, so baseline JPEGs are, on average, downloading faster now than arithmetic progressive JPEGs would have downloaded 2 years ago. In our country, the average speed has only increased by about 25% in that same time, but that's still greater than the compression ratio difference between baseline and arithmetic progressive.

What I'm trying to say is: look at this from a project management point of view. If it takes a year to get suspension into libjpeg-turbo's arithmetic codec (which is probably optimistic unless someone steps forward to implement it and contribute the code), then it will take longer than that for the feature to get into Firefox, and even longer for it to get into other browsers and commercial software packages like Photoshop. Maybe, if you're lucky, 5 or 10 years down the line it becomes ubiquitous enough that web designers will start using it, but in that same timeframe, everyone's Internet connection will have sped up by a lot more than 20%.

This isn't really my debate, though. I'm just trying to be helpful. If Firefox decides to adopt arithmetic coding, great, but I've given good reasons why I don't think adding that support will be enough to ensure wide adoption of the format in the industry at large. And this is all a moot point until the arithmetic codec supports suspension.
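The "speeds doubled, savings stay around 20%" argument in the first paragraph can be checked with a quick calculation (file size and link speeds are hypothetical placeholders; only the doubling and the ~20% saving come from the discussion):

```python
def transfer_time(size_bytes: float, mbps: float) -> float:
    """Seconds to download size_bytes over a link of mbps megabits/second."""
    return size_bytes * 8 / (mbps * 1e6)

SIZE = 1_000_000                      # hypothetical 1 MB baseline JPEG
GAIN = 0.20                           # assume arithmetic coding saves ~20%
speed_then, speed_now = 10.0, 20.0    # Mbps: speed doubled over 2 years

baseline_now = transfer_time(SIZE, speed_now)          # 0.40 s
arith_then = transfer_time(SIZE * (1 - GAIN), speed_then)  # 0.64 s

# A plain baseline JPEG today downloads faster than the arithmetic-coded
# version would have two years ago, because 1/2 < 1 - 0.20.
assert baseline_now < arith_then
```

The general condition is simple: doubling the link speed beats any compression gain below 50%, so a fixed ~20% saving loses ground every time bandwidth grows by more than 25%.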
(In reply to Seth Fowler [:seth] [:s2h] from comment #17)

Chromium developers are considering arithmetic-coded JPEG support: https://bugs.chromium.org/p/chromium/issues/detail?id=669501#c7

> Basically, if additional binary size cost is very low, there is minimal additional security risk,
> and we don't need to provide more signals to authors about this, then we'll be more positive about
> taking this. If these factors are untrue, we need to think more carefully.

They are also ready to discuss it with other vendors: http://lists.w3.org/Archives/Public/public-whatwg-archive/2016Dec/0032.html

> I posted some questions on that bug, but their answers would probably be relevant for other
> vendors as well.