Closed Bug 856375 Opened 11 years ago Closed 8 years ago

Implement WebP image support - take 2

Categories

(Core :: Graphics: ImageLib, defect)


RESOLVED DUPLICATE of bug 1294490

People

(Reporter: gal, Assigned: shay)

Details

(Whiteboard: [SUMMARY in comments 105-112][parity-chrome][parity-opera][fuzzing:queue:cdiehl][parity-safari])

Attachments

(3 files, 1 obsolete file)

      No description provided.
What about a deal: if Google implements APNG, we implement WebP ? ;)
Attached patch Add webp 0.2.1 support (obsolete) — Splinter Review
Updated version of Vikas Arora's patch from bug 600919.
(using libwebp 0.2.1)
I've taken the patch from bug 600919 and updated it both to a recent libwebp and to build with the current codebase, and verified it to work with Linux x86-64 and B2G emulator builds.
Feedback would be much appreciated.
Could it be possible to add image/webp to the Accept header?
cf. http://www.igvita.com/2012/12/18/deploying-new-image-formats-on-the-web/
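For illustration, the client side of that proposal is just an extra media type in the request's Accept header. A minimal sketch (the URL is a placeholder and the exact header value is an assumption modelled on the one proposed for Chrome in crbug 169182):

```python
import urllib.request

# Advertise WebP support to the server via the Accept header.
# The URL and the header value are illustrative, not from this bug.
req = urllib.request.Request(
    "http://example.com/hero-image",
    headers={"Accept": "image/webp,*/*;q=0.8"},
)

# A negotiating server could now answer with Content-Type: image/webp.
print(req.get_header("Accept"))
```

A server that ignores the header simply serves the original format, so the change degrades gracefully for existing deployments.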
The patch was bitrotted. Here's a rebased version of it and a Try run to boot.
https://tbpl.mozilla.org/?tree=Try&rev=e8e5bbbb67d6

Also, please make sure your patches follow the guidelines below so that they have all the needed commit information in them.
https://developer.mozilla.org/en-US/docs/Mercurial_FAQ#How_can_I_generate_a_patch_for_somebody_else_to_check-in_for_me.3F
Attachment #731587 - Attachment is obsolete: true
Also, this patch doesn't include any tests. This should have tests similar to the other image formats.
Component: Graphics → ImageLib
OS: Mac OS X → All
Hardware: x86 → All
CC'ing the right people who actually know about that stuff...
WebP hasn't even hit 1.0 yet. The monumental task of becoming a full-fledged web image format to eventually be used by the entire world is not going to be instantaneous. It's just Google's pet project for now, and yes, it would be nice if something could replace JPEG ASAP, but everyone needs to slow down here and let this go through the needed development processes in peace.

I'm sorry to say, but due to the incomprehensible level of emotion attached to a new file format, this bug really should be set to restricted commenting too, so as not to let it turn into another unreadable mess.
Google is going to release WebP 0.3.0 next week (it will have animation support).
Then they are planning to develop specs for "progressive WebP".
I would suggest waiting a little bit.
Whiteboard: [parity-chrome]
Can we please, please not get into blame, recrimination, accusation or anything beyond implementation progress on this thread?

Yes, WebP 0.3.0 is currently at RC7 and expected to be finalised soon: https://groups.google.com/a/webmproject.org/forum/#!msg/webp-discuss/ZcAhh1-A290/xs-PfWl300AJ
(In reply to Dave Garrett from comment #11)
> WebP hasn't even hit 1.0 yet.

0.2, 0.3, 1.0 etc. are just numbers. WebP is stable and it's already supported by Google Chrome, Opera, the Android browser and Maxthon.

Mozilla could support the current WebP implementation. When a new WebP specs version is released, a new browser version can support the new specs, just like Chrome does.
It's not just numbers. WebP 0.3.0 adds animation (competing with GIF and APNG); that's a significant change in implementation. Worth waiting a few days?

By the way, APNG specs have not changed since 2008 but Chromium devs replied to my patch with "we don't want the attack surface" and "we don't want the maintenance surface" arguments.
I'm already at work on supporting 0.3 features (but took a slight break to address #7 and #8.)
@Shay -- it would be helpful if your changes also included modifying the Accept header for image and HTML requests.  See http://www.igvita.com/2012/12/18/deploying-new-image-formats-on-the-web/ for background on why, and http://crbug.com/169182 for the equivalent change being made for Chrome.
Comment on attachment 731595 [details] [diff] [review]
Add webp 0.2.1 support

Review of attachment 731595 [details] [diff] [review]:
-----------------------------------------------------------------

::: image/decoders/nsWEBPDecoder.cpp
@@ +16,5 @@
> + * The Original Code is mozilla.org code.
> + *
> + * The Initial Developer of the Original Code is
> + * Netscape Communications Corporation.
> + * Portions created by the Initial Developer are Copyright (C) 2001

I don't believe a word of it. Not that you should land MPL1.1'd code anyway.
Attachment #731595 - Flags: review-
Just to be clear, no decision on adopting webp has been made. The only thing that has changed is that we've just received some more interest from large non-google web properties which we never really had before.
(In reply to :Ms2ger from comment #18)
> I don't believe a word of it. Not that you should land MPL1.1'd code anyway.

Is it possible that code from ancient image decoders was used as a template? Note the "Original Code"/"Initial Developer". At least one of the other patches in the old bug said it was based on an old BMP decoder and that patch contained that license block AFAIK.
@Jeff: That's great to hear.  We've been working with a number of large third-party sites as well that are keenly interested in WebP; I'll follow-up with you off-bug.
To further clarify, the new interest in webp is for lossy+alpha, e.g. for everything.me. Probably that means for application icons.

See Jeff's comment on the dev-platform thread, https://groups.google.com/d/msg/mozilla.dev.platform/8LI6ZSZQFAo/4dpeHTq5LGsJ
(In reply to Stephen Konig from comment #21)
> @Jeff: We've been working with a number of large third-party sites as well
> that are keenly interested in WebP; I'll follow-up with you off-bug.

It would be open and informative if you could follow-up on-bug or on the dev-platform thread though.
>To further clarify, the new interest in webp is for lossy+alpha

JPEG2K? Are there other reasons for not going with something that has proper tool support and is more mature?
For WebGL I would need a format with lossy+alpha. The most important thing is that all browsers that support WebGL support it. Tool support is not that important as the format can be converted from a lossless format such as png automatically when publishing to WebGL.
We have several different reasons to prefer Webp over existing options:

*We compress PNGs we do not create ourselves*
When we take existing PNG icons (e.g. icons from Play marketplace, etc) that we didn't create, where the designer decides how to design the icon and is not restricted to a specific alpha mask template, we cannot take those PNG-24 icons and compress them while preserving the alpha mask (at least not in a reasonable effort). With Webp we can just grab these icons, resize and compress them, and voila they can be re-used in their resized, compressed form without worries.

*Webp has proven to be superior to JPG with our existing icons*
We took our existing JPG icons, compressed them with webp instead of JPG, and reached the same visible result (not empirical, but reviewed by our design team, who have proven to have little tolerance for bad pixels) with ~30% fewer bytes. This means we can save ~10KB in our API responses for little work done.
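As a sanity check on those numbers (with an assumed figure for the per-response icon payload, which isn't given here), a ~30% reduction lines up with the quoted ~10KB saving on roughly 33KB of JPEG icons per response:

```python
# Hypothetical payload: total JPEG icon bytes in one API response.
jpeg_bytes_per_response = 33_000
webp_reduction = 0.30  # ~30% fewer bytes, per the comparison above

saved = jpeg_bytes_per_response * webp_reduction
print(f"~{saved / 1000:.0f}KB saved per response")  # ~10KB
```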

We've already moved forward with integrating webp into our backend system; it will serve the Android platform to begin with, but we'd be very happy to enjoy the fruits of this work on FxOS as well and are willing to keep working on the patch.

Btw - Webp 0.3.0 has been released.
Google is discussing adding Animated WebP support to Chrome here: <https://groups.google.com/a/chromium.org/forum/?fromgroups=#!topic/blink-dev/Y8tRC4mdQz8>
When WebP was released, I looked at some possible advantages for wikipedia.org if thumbnails were served as WebP files instead of JPEG files. Given the fact that roughly 80-90% of the size of a random Wikipedia article is due to thumbnails, any reduction in size for those thumbnails would have a huge impact on the overall volume of traffic for those users.

Wikipedia has an open bug requesting the addition of WebP thumbnails for users (https://bugzilla.wikimedia.org/show_bug.cgi?id=25611). Since the bug was filed, WebP added support for lossless image compression, hence PNG thumbnails could be replaced as well.

There is an experiment available to registered users of the English language Wikipedia. By adding the line "importScript('User:Magnus Manske/webp.js');" to your own common.js file, i.e. http://en.wikipedia.org/wiki/User:$username/common.js, you can replace jpeg thumbnails with webp ones when hovering over the image with your mouse. The experiment is not meant to save bandwidth but rather to allow fast comparison of the difference in size and quality between the two thumbnails. On average, WebP thumbnails are much smaller but still of sufficient quality to perform their duty as thumbnails.
(In reply to Joey Simhon from comment #26)
> We have several different reasons to prefer Webp over existing options:
> 
> *We compress PNGs we do not create ourselves*
> we cannot take those
> PNG-24 icons and compress them while preserving the alpha mask (at least not
> in a reasonable effort).

Thanks for contributing to the discussion. Are you talking about a lack of tool support here, or a format limitation? For example you want to reduce the 24-bit icon to a limited palette, but png doesn't allow partial transparency with indexed colour images?
As a CDN we've found it to be quite powerful to automatically encode images to WebP (given the customer's choice, of course). We have just implemented this practice and are beginning to collect data on bandwidth reduction, and would like to ultimately prove cost savings. We see this not only as a benefit to the end-user (faster load times, lower bandwidth costs), but also as a benefit to CDN customers, as their expenses will also be lower. For sites with significant CDN expenses (or any CDN costs for that matter), any immediate reduction in those costs is an easy-to-obtain benefit. We recently wrote an article discussing the use of the "Accept" header to automatically choose the image's response content-type (http://blog.netdna.com/developer/how-to-reduce-image-size-with-webp-automagically/); hopefully it can help others develop a process to reduce their CDN costs. This being said, if Firefox were to support WebP it would help to improve cache hits and response times, along with lowering CDN expenses.
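Server-side, the negotiation described in that article boils down to checking the request's Accept header before choosing a variant. A hypothetical helper (an illustrative sketch, not NetDNA's actual code) might look like:

```python
def negotiate_image_type(accept_header: str, original_type: str = "image/png") -> str:
    """Pick the Content-Type to serve based on the client's Accept header."""
    # Strip quality parameters like ";q=0.8" and compare bare media types.
    accepted = [part.split(";")[0].strip() for part in accept_header.split(",")]
    if "image/webp" in accepted:
        return "image/webp"  # client advertises WebP: serve the smaller variant
    return original_type     # everyone else gets the original image

print(negotiate_image_type("text/html, image/webp, */*"))  # image/webp
print(negotiate_image_type("*/*"))                         # image/png
```

The same URL can then be cached per Accept value (via a Vary: Accept response header) so WebP-capable and legacy clients each get a correct cache hit.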
Adam, nice site. I use Comodo Dragon and noticed that it does not yet support the Accept header. It is still set to just */*. If anyone else needs to hack their request headers for either Chrome or Dragon to test out Adam's link (comment #30), here's how I was able to do it:

1) Grab the change http request header extension:
https://chrome.google.com/webstore/detail/change-http-request-heade/ppmibgfeefcglejjlpeihfdimbkfbbnm?hl=en

2) Click Options, then click Add, then type Accept for the Name. Then under Auto Setting Rules click Add, then type for Name the word Accept, then for the value enter the following:

text/html, application/xml, application/xhtml+xml, image/png, image/webp, image/jpeg, image/gif, image/x-xbitmap, */*

3) Next to "Condition(s)" click the Add button, and for entry one choose URL, then Include, then type http. Click the Save button.

4) Clear your cache in case you've opened his blog before, otherwise it will grab the old .png you fetched before from your browser cache. Then, open Adam's netdna blog link:
http://blog.netdna.com/developer/how-to-reduce-image-size-with-webp-automagically/

5) Right click the picture of the lion cub sitting on the CDN logo and click Inspect. Right click on the .png in the inspector. Click "Open link in resources panel" and then right click it in the list on the left and click "Reveal in network panel." Left click on the png in the list and it will display the network traffic log for that image.

6) Scroll down to "Response headers" and then click "View source" and you will see a display very similar to the one pictured in his blog post. You should now see in the response:

Content-Type: image/webp

This indicates that although the browser looks like it fetched a .png it was given to the browser as a .webp, thereby saving bandwidth. Very clever. I love it, thanks Adam.
I'd like to key in to this discussion about Web-P.

Personally I think the greatest source of demand for a new image format for the web comes from the absence of an image format that can do both lossy compression and alpha transparency. This is something that would be pretty important in canvas/web-gl games, and in fact is something flash has supported natively since the early days; it has always been a frequently used feature on that platform.

Another source of demand I think comes from at least some desire on the web for an image format that supports more color formats. Formats like RGB565 or Float RGBE (Radiance) could be potentially desirable in web-gl applications (HDR?), and formats such as Float RGB128 could be used as a clever way of storing data such as a depth map, a height map, or simply a float array: an efficient way to send large amounts of data to the browser.

Better lossless compression is desirable, but I personally don't think it's even the most demanded feature, merely a nice-to-have. Any new web image format, though, should seek to offer superior compression to JPEG.

I think any new image format that can provide these features to the web is worth putting some effort into choosing and implementing as demand for the above features, I believe, is relatively high right now and growing by the day as web applications grow more rich. I think there are two potential contenders to fill this hole: Web-P and JPEG-XR.

I'm omitting Jpeg2000 here because of the computational intensity required to encode/decode it which, I believe, alone is enough to make it inappropriate as a web image format.

That leaves us with Web-P and Jpeg-XR as currently suitable formats (there may be others, but I'm unaware of them).

My understanding with Web-P at the moment is that it does not support extra color depth, it only supports YUV420 and RGBA32 for lossy and lossless respectively. I'm not aware if this situation is going to be improved in future iterations of Web-P.

Here's a list of pros and cons for Web-P and Jpeg-XR:
Web-P:
Pros:
Completely free and open
Superior lossy compression to Jpeg-XR
Supports lossy + transparency
Supports lossless mode
Supports animation

Cons:
Not a finalised specification
Not a standardised specification
No broad color depth support

Jpeg-XR:
Pros:
Completely free and open
Finalised specification
Standardised specification (JPEG group/ISO/ITU)
Supports lossy + transparency
Supports lossless mode
Supports broad range color depth and pixel formats
Relatively simple compression algorithm (lossy and lossless compression use the same algorithm)

Cons:
Inferior lossy compression to Web-P
No animation support
Created by Microsoft and all the legal anxiety that implies.

The legal situation with Jpeg-XR is a pretty moot point at this stage as Microsoft grants patent immunity to anyone creating an implementation of Jpeg-XR for any purposes (via the community promise). They also now have a BSD licenced Jpeg-XR library available for use by anyone who wants to implement it.

I'm not trying to start a fan boy war between Web-P and Jpeg-XR, but it's my personal opinion by simply assessing the two formats as they stand today that Jpeg-XR exhibits some strong advantages over Web-P as a candidate for a future image format for the web. I have nothing against the Web-P effort, but I would feel any new image format for the web should support all the features I've listed above at least. There should not be any room for demand for more features in the more distant future that would require yet another image format, as far as we can help that. The unfortunate fact regarding Web-P is that it is an incomplete specification today. It seems at least prudent that any new image format that the web should choose to adopt in the future should at least first be standardised by some reliable authority.

If Web-P were to add support for the most desirable pixel formats and become finalised and standardised as Jpeg-XR is today, I think it would be the superior choice, but I think that process would take many years at least, and I'm not sure it would ever happen should we decide to wait for it. Basically I think Jpeg-XR should be looked at more seriously based on its technical merits, even though it was originally developed by a less popular company than Google. But mostly the point I want to make is that *a* format should be chosen for support, and in a timely manner. There is more that the web can benefit from here than just smaller lossy images.
How does WebP compare to jpeg-xr in computational complexity?
(In reply to myutwo33 from comment #32)
> Not a finalised specification

WebP 0.1 is finalized.
WebP 0.1.2 is finalized.
...
WebP 0.3.0 is finalized.

There are new versions because the format is evolving, and that's a good thing; I don't want "yet another dead specification".

> No broad color depth support

I think that's merely a "nice to have".
 
> No animation support

That's a big issue for JPEG-XR when everybody seemed to want animation support in bug 600919.


Moreover, WebP has a larger market as:
JPEG-XR support -> IE 9, IE 10 (IE Mobile?)
WebP support -> Chrome (Desktop + Mobile), Opera (Desktop + Mobile + Mini), Maxthon, Android browser

A better image format is particularly important for mobile browsers; WebP has a better mobile browser support and a better lossy compression than JPEG and JPEG-XR. Hint: http://my.opera.com/chooseopera/blog/on-a-horse-opera-turbo-to-the-rescue
(In reply to ekerazha from comment #34)
> > No animation support
> 
> That's a big issue for JPEG-XR when everybody seemed to want animation
> support in bug 600919.

Thanks for pointing that out. I remember that was a gigantic show stopper for webp in the previous bug report that prevented Mozilla from implementing it. Now suddenly JPEG-XR doesn't have it and that seems okay to everyone. I find that absolutely fascinating. I think there's some serious bias going on here. I wonder if some of the JPEG-XR proponents here are literally being paid by Microsoft to throw some wrenches in the wheels.
> I remember that was a gigantic show stopper for webp 
> in the previous bug report that prevented Mozilla from implementing it. 

You remember wrong. Nobody even requested WebP animation, much less called it a "gigantic show stopper".

> Now suddenly JPEG-XR doesn't have it and that seems okay to everyone.

Because it *is* okay. It's not essential for a JPEG-replacement format to have animation.
(In reply to Max Stepin from comment #36)
> Because it *is* okay. It's not essential for a JPEG-replacement format to
> have animation.

It's also not essential for a JPEG-replacement to have alpha channel support (JPEG doesn't have it), but it was a "gigantic show stopper" for WebP. It's obvious that we aren't talking about a JPEG-replacement anymore. We are talking about a JPEG+PNG+GIF replacement.
(In reply to ekerazha from comment #34)
> WebP 0.1 is finalized.
> ...
> WebP 0.3.0 is finalized.
I'm comfortable having an open debate about this, but I'm not sure a new image format for the web should come in the form of a "living standard". It seems to beg for incompatibilities among decoder implementations, at least more so than for a stable standard. But I can see a lot of advantages to this as well. Perhaps I don't understand Google's versioning system, but specifications versioned below v1.0 seem like they're incomplete, or not intended for production use. I'm not entirely sure what Google's own view on this is.
> There are new versions because it's evolving itself and that's a good thing,
> I don't want "yet another dead specification".
By "yet another dead specification" do you mean specifications like JPEG and PNG? Because I believe those formats have benefited enormously from being standardised for years and providing the user with confidence that any application claiming to support JPEG or PNG files can open any JPEG or PNG file. Evolving standards are great too, but I'm unsure if they are great for image formats. I'm willing to have a discussion about it though.
> > No broad color depth support
> I think that's merely a "nice to have".
I would disagree. I don't think it's very important today, but I think it will become more important in the near future as web-gl evolves and web applications become more rich.
> > No animation support
> 
> That's a big issue for JPEG-XR when everybody seemed to want animation
> support in bug 600919.
I agree.
> Moreover, WebP has a larger market as:
> JPEG-XR support -> IE 9, IE 10 (IE Mobile?)
> WebP support -> Chrome (Desktop + Mobile), Opera (Desktop + Mobile + Mini),
> Maxthon, Android browser
That's true. It's effectively Blink vs Trident right now, but it's true that web-p has far more support. Really though, the only big players supporting either format are the companies that developed them. I believe whichever format Mozilla chooses to support could go some way towards resolving the stalemate.

> A better image format is particularly important for mobile browsers; WebP
> has a better mobile browser support and a better lossy compression than JPEG
> and JPEG-XR. Hint:
> http://my.opera.com/chooseopera/blog/on-a-horse-opera-turbo-to-the-rescue
True again. That's certainly a great example of where the superior image compression in web-p pays dividends.

(In reply to blakesteel from comment #35)
> (In reply to ekerazha from comment #34)
> > > No animation support
> > 
> > That's a big issue for JPEG-XR when everybody seemed to want animation
> > support in bug 600919.
> 
> Thanks for pointing that out. I remember that was a gigantic show stopper
> for webp in the previous bug report that prevented Mozilla from implementing
> it.
Source?
> Now suddenly JPEG-XR doesn't have it and that seems okay to everyone.
Who's everyone? Who's "okay" with it?
> I find that absolutely fascinating. I think there's some serious bias going on
> here. I wonder if some of the JPEG-XR proponents here are literally being
> paid by Microsoft to throw some wrenches in the wheels.
Both Web-P and Jpeg-XR are formats created by multi billion dollar corporations that a lot of people don't like, yet both of these formats may represent a good way to improve the value of the web for everybody. If that's going to happen, everyone is going to have to decide which one is best for the web unanimously and all work together to support it. Making completely baseless accusations that people supporting image format X are being paid by corporation Y are completely unhelpful to the discussion.

Of course another outcome is the conclusion that neither of these image formats is suitable to become a broadly supported web image format yet. This seems to be the stance Mozilla are adopting right now, and it may be the right decision for some time to come.
> It's also not essential for a JPEG-replacement to have alpha channel support
> (JPEG doesn't have it), but it was a "gigantic show stopper" for WebP.

No. Two things are absolutely essential for any JPEG-replacement format: 
1. Better compression than JPEG. 
2. Alpha channel support.

It's no coincidence that JPEG2000 and JPEG-XR had both.

Other stuff ranges from nice-to-have to not-really-needed.
Hey, guys.

Please keep in mind that we abandoned bug 600919 because we were getting a lot of bugspam in there.  I appreciate that the discussion here is technical and not (yet) a flame war, but I assure you that the imagelib owners are well aware of the trade-offs between WebP and JPEG-XR.  Please remember that every time you comment in this bug, more than 97 people get an e-mail.

The discussion here makes it materially less likely that we'll take either WebP or JPEG-XR, because people really hate cc'ing themselves to bugs with this much chatter.  I know it's counter-intuitive and frustrating, but by far the best thing to do if you want us to add one of these image formats is to stop commenting in this bug.

Thanks.
(In reply to myutwo33 from comment #38)
> Perhaps I don't understand Google's versioning system,
> but specifications versioned below v1.0 seem like they're incomplete, or not
> intended for production use. I'm not entirely sure what Google's own view on
> this is.

They are just numbers. Google ships its own mainstream browser with WebP support enabled by default, it's not an experimental feature which you activate from chrome://flags

> Evolving standards are great too, but I'm unsure if they are great for
> image formats.

I think the important thing is backward compatibility. Correct me if I'm wrong, but I think WebP versions are backward compatible, you just miss the latest features if you use the implementation of an older version.

> I don't think it's very important today, but I think it
> will become more important in the near future as web-gl evolves and web
> applications become more rich.

Many (most?) people use 6-bit TN LCD screens which don't even cover the sRGB color space. In my opinion it's still a "nice to have".

> Really though the only big players support
> either format are the companies that developed them.

Opera supports WebP too with Presto (although they are going to switch to Blink).
(In reply to ekerazha from comment #41)
> They are just numbers. Google ships its own mainstream browser with WebP
> support enabled by default, it's not an experimental feature which you
> activate from chrome://flags

Those format versions make it hard to determine what "WebP support" even means.

It's fragmented into "WebP 0.2.1 with alpha channel support" and "WebP 0.3.0 with animation support" and you'd have to version-sniff for Android/Chromium to use them properly.

Right now the latest Chromium doesn't have WebP 0.3.0 support.
As I understand from blink-dev, they are considering implementing it behind a flag for now. It could be months before it's in stable (and enabled by default instead of behind a flag).

> I think the important thing is backward compatibility. Correct me if I'm
> wrong, but I think WebP versions are backward compatible, you just miss the
> latest features if you use the implementation of an older version.

Unfortunately, it's not backwards compatible.
I tried gif2webp and Chromium can't show the result. 
Backwards compatible means you would hope to see the first frame of animated WebP, but no, it can't show the first frame.
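To make the version-sniffing burden mentioned above concrete, here is an illustrative (and deliberately simplified) mapping from Chrome major versions to WebP features; the cut-off numbers are assumptions for the sketch, not authoritative:

```python
def webp_features_for_chrome(major_version: int) -> set:
    """Illustrative map of Chrome versions to supported WebP features."""
    features = set()
    if major_version >= 9:
        features.add("lossy")      # original WebP
    if major_version >= 23:
        features.add("alpha")      # libwebp 0.2.x era
    if major_version >= 32:
        features.add("animation")  # libwebp 0.3.x era
    return features

print(webp_features_for_chrome(25))  # lossy + alpha, but no animation yet
```

A site wanting to serve alpha or animated WebP safely would need this kind of per-version table (or capability detection) rather than a single "supports WebP" check.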
Awesome, so this is already turning into a dumping ground like the last one. needinfo? me if there's another patch that needs pushing to Try.
(In reply to Max Stepin from comment #42)
> Backwards compatible means you would hope to see the first frame of animated
> WebP, but no, it can't show the first frame.

Backwards compatibility and graceful degradation are different concepts, just like "JPEG replacement" and "JPEG2000/JPEG-XR replacement" are different concepts. If you use a new feature (i.e. animation), it's normal that it could be unsupported in implementations of older versions, because animation support doesn't exist there. If version 0.3.1 or 0.4.0 improves the animation feature, then backwards compatibility for animated images will be necessary, but animation support never existed before 0.3.0, so there is nothing to be compatible with. Seeing the first frame is graceful degradation, not backwards compatibility, and backwards compatibility doesn't necessarily involve graceful degradation.
Facebook has already started experimenting with WebP to serve images in the browsers that support it (Chrome and Opera), and for a site with hundreds of billions of photos, a few KB saved per image adds up quickly.
Also Google added WebP accept-header support to Chrome
Whiteboard: [parity-chrome] → [parity-chrome][fuzzing:queue:cdiehl]
(In reply to Marilyn Munster from comment #45)
> Also Google added WebP accept-header support to Chrome
Let's try to not spread rumors here. Chrome added a flag disabled by default. See https://code.google.com/p/chromium/issues/detail?id=169182 . Comment 44 there:
"This switch is off by default. It will be turned on later once we verify that this change doesn't break any sites."
How do decode times for comparable webp and jpeg encoded images compare?
(In reply to David Bruant from comment #46)
I am sorry you feel that way (check revision 148318),
but on the latest Canary build with Fb,
when I save images they have a webp extension.
Now why that is I do not know, but the browser is sending the Accept header, and yes, Fb has implemented webp for users with browsers that support it, so
please verify your facts before you accuse.
(In reply to Ralph Giles (:rillian) from comment #33)
> How does WebP compare to jpeg-xr in computational complexity?

I've compared the Windows version of Google's libwebp with Microsoft's recently released jxrlib. The results largely favour web-p, with jpeg-xr taking between 20% and 200% longer to decode the same image encoded to roughly the same quality.

I remember reading quite a while back, though I can't find the source now, that both web-p and jpeg-xr claimed to be roughly 15-20% more computationally complex than jpeg. So either Microsoft is wrong, or, perhaps more likely, jxrlib isn't particularly well optimised yet. I'll compare them with libjpeg at some point in the future too.
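A comparison like this can be reproduced with a small timing harness; the sketch below uses only the standard library, with placeholder decode functions standing in for real libwebp/jxrlib bindings (which are not assumed to be installed):

```python
import timeit

def best_decode_time(decode, data, repeats=5, number=20):
    """Best per-call time over several runs, to dampen scheduler noise."""
    timer = timeit.Timer(lambda: decode(data))
    return min(timer.repeat(repeat=repeats, number=number)) / number

# Placeholders: swap in real libwebp / jxrlib decode calls to measure them.
def fake_webp_decode(data):
    return bytes(reversed(data))

def fake_jxr_decode(data):
    return bytes(reversed(bytes(reversed(data))))  # intentionally does more work

payload = b"\x00" * 100_000
t_webp = best_decode_time(fake_webp_decode, payload)
t_jxr = best_decode_time(fake_jxr_decode, payload)
print(f"jxr/webp time ratio: {t_jxr / t_webp:.2f}")
```

Taking the minimum of several repeats is the usual way to approximate the noise-free cost of a CPU-bound operation like image decoding.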
Ars has a little article on the topic here:
http://arstechnica.com/information-technology/2013/04/chicken-meets-egg-with-facebook-chrome-webp-support/

It seems to be a good concise read for anyone who would like a good summation of the state of things. It also mentions that apparently Facebook tried WebP out a bit and ticked off a bunch of users because they couldn't share downloaded files or saved URLs anymore as only Google Chrome and Opera (which is soon to be Chrome-based) support it.

(Ars directly linked to both WebP bug reports here, by the way)
(In reply to Marilyn Munster from comment #48)
> (In reply to David Bruant from comment #46)
> I am sorry you feel that way(check Revision 148318)
> 
> but on the latest CANARY build & Fb
Latest canary build is different than mainstream Chrome in terms of user adoption. That difference matters *a lot* when it comes to considering changes to the Accept header.
I've just published some results we (Everything.me) found when evaluating WebP for our use. 

You can find those on our Github repository: https://github.com/EverythingMe/webp-test
Inspecting and optimizing your image assets will likely yield the highest rate of return.

Case in point, the new data compression proxy for Chrome applies dozens of different content optimizations, but image optimization almost invariably comes out at the top. End result? On average, data usage is reduced by 50%!

https://developers.google.com/chrome/mobile/docs/data-compression

http://www.webpagetest.org/video/compare.php?tests=130125_6N_KZA%2C130125_NH_KZ8&thumbSize=200&ival=100&end=full

http://www.ebaytechblog.com/2013/02/22/a-picture-is-worth-a-thousand-words/
Subscribed.  I am the Facebook engineer examining the adoption of webp as a serving format.  We are very excited about the new format and keeping a close eye on the community to monitor adoption.  It is entirely likely we will be serving webp images in some capacity in the short term after we address user complaints from our limited testing.  It goes without saying, we would love to see webp support coming to Firefox soon.
(In reply to Bryan Alger from comment #54)
> Subscribed.  I am the Facebook engineer examining the adoption of webp as a
> serving format.  We are very excited about the new format and keeping a
> close eye on the community to monitor adoption.  It is entirely likely we
> will be serving webp images in some capacity in the short term after we
> address user complaints from our limited testing.  It goes without saying,
> we would love to see webp support coming to Firefox soon.

Is there any way to join the waiting list to test the webp implementation?
I would love to see how much better it really is overall,
and also to give feedback to help solve the quirks & user complaints.
http://blog.webmproject.org/2013/05/vp9-codec-nears-completion.html

"We’ll freeze the VP9 bitstream on June 17, allowing Chrome and Chrome OS to enable VP9 by default."

I don't think Mozilla should add WebP support until Google switches WebP to use the VP9 encoder for lossy compression still images. VP9 shows promising results for video encoding and the quality improvements can be adapted to WebP.
There is no need or plan for WebP to migrate to the VP9 spec.  WebP will remain based upon VP8.
(In reply to Dave Garrett from comment #50)
> It also mentions that apparently Facebook tried WebP out a bit and ticked off a bunch of users because they couldn't share downloaded files or saved URLs anymore as only Google Chrome and Opera (which is soon to be Chrome-based) support it.

It was because the files were saved as .webp and most users did not know what that was or where to find tools to edit or view those files. To be fair, if it were JPEG-XR or any of the other formats proponents have mentioned on the two bug threads regarding webp, it would have caused the same anguish, as a majority of tools, smartphones, etc. do not yet support those either. The specific issue Ars brought up is grandma not knowing how to download a relative's face to her iPhone. It's a valid issue, but not the fault of the format itself.

(In reply to NVD from comment #56)
> I don't think Mozilla should add WebP support until Google switches WebP to
> use the VP9 encoder for lossy compression still images. VP9 shows promising
> results for video encoding and the quality improvements can be adapted to
> WebP.

The VP9 bitstream mentioned by Google is for the video format, not the picture format.

(In reply to NVD from comment #58)
> Then there's no point in supporting WebP and encouraging clueless masses to
> convert from lossy compression JPEG to lossy compression WebP just to save a
> few kilobytes while destroying image quality is stupid to say the least.

If you actually compare the image quality between the two, it's an enormous gain in webp's favor over traditional JPEG. Faces become clearer instead of looking like 8-bit video game characters. There is a lot less distortion. I cite my comments from the previous webp bug thread specifically, where I refuted points made by Mozilla engineers who cited an anti-webp blog post by an x264 developer. I even used his own data.
Large serving platforms like Facebook and Google are interested in WebP due to the benefit it provides users.  WebP is no free lunch; it trades CPU on the server for network savings for the user.  The user experience improves since the resource downloads faster than an equivalent-quality JPEG.  As an engineer from Google put it, this is about "making the web fast."  Speaking for myself and not any organization, I am really excited to see progress in this direction, and to see it released open source.  For a photo-heavy website, a ~30% reduction in serving size probably doesn't mean much to Joe Smith on his 4G smartphone in a developed nation, but you will really, really appreciate it when you are on public WiFi at a coffee shop in SE Asia, experiencing low bandwidth and heavy packet loss and trying to check up on friends back home.

There are software tricks that can be done to fix the user experience that have been outlined elsewhere.  A smart proxy in front of a caching node can examine the Accept & User-Agent headers and decide to serve up a JPEG rather than a WebP image when a Chrome user shares a link to some resource with a friend using an unsupported browser.  You could modify the frontend to pull down a more widely accepted format (JPEG) when saving, until tooling support is in place on all devices.  I am happy to discuss offline and will refrain from polluting this bug report further.
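The proxy trick described above boils down to simple content negotiation on the Accept header. Here is a minimal, hypothetical sketch (the function name and the JPEG fallback choice are my assumptions, not anyone's production code):

```javascript
// Sketch: pick an image format for a request based on its Accept header.
// WebP is served only when the client explicitly advertises image/webp;
// everything else falls back to JPEG so shared links stay viewable.
function chooseImageFormat(acceptHeader) {
  // Split the header into media types, dropping any ;q= parameters.
  const advertised = (acceptHeader || "")
    .split(",")
    .map(part => part.trim().split(";")[0].toLowerCase());
  return advertised.includes("image/webp") ? "image/webp" : "image/jpeg";
}
```

For example, a request sending `Accept: image/webp,*/*;q=0.8` would get WebP, while a browser sending only `image/png,image/*;q=0.8` would get the JPEG fallback.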
By that logic, such platforms as Facebook should've started serving JPEG XR to Internet Explorer users years ago.
Years ago, mobile devices weren't so widespread. Now we need a better compression format and WebP has a better lossy compression (at similar quality) than both JPEG and JPEG-XR and many mobile browsers already support it (the Android browser, Chrome, Opera). Firefox has been weighed, it has been measured and it has been found wanting.
Does anyone know if there's a good mechanism for detecting support for individual versions of WebP? Say a browser supports WebP 0.2 and has image/webp in its Accept header. If a website takes this as a sign that it's okay to send it an image encoded with WebP 0.3, the browser could end up with an image it can't decode. In that case it would have been far more desirable to send the browser a JPEG instead.

I know WebP images are somewhat backwards compatible with older decoders, but Google says future versions of WebP will support progressive decoding. I imagine those images will fail to decode with 0.3 decoders or older. I can only imagine there are other similar breaking changes down the road, so being able to pin WebP support to specific versions is important.

Modernizr has some pretty neat ways it could detect different versions of WebP using JavaScript tests, but is there a better method of detecting that?
https://github.com/Modernizr/Modernizr/tree/master/feature-detects/img
To those who are asking: This bug is as close to a "waiting list" as you can get. Just be patient. Eventually this will land somewhere, maybe a test build first or branch or just on Trunk, and when it does you'll hear where you can get it to test here.
I try to maintain a webp-enabled and updated version of mozilla-central (the patch is the same as the one in this bug)

You can find it at https://github.com/EverythingMe/mozilla-central

Assuming you have a configured build environment, the following will get you a Firefox build with webp support:

git clone --branch webp git@github.com:EverythingMe/mozilla-central.git
cd mozilla-central
./mach build
The only realistic way is to host both WebP+JPEG. Or maybe three-way WebP+JPEG+JPEG XR if you really want to save on bandwidth. But that would hurt you on storage. Regardless, ignoring IE and Safari users does not seem to be a good way to solve your problems.
(In reply to Max Stepin from comment #79)
> The only realistic way is to host both WebP+JPEG. Or maybe three-way
> WebP+JPEG+JPEGXR if you really want to save on bandwidth. But that would
> hurt you on storage. Regardless, the idea of ignoring IE and Safari users
> does not seem to be a good way to solve your problems.

First of all, Apple has no excuse for not including webp in Safari, as it is webkit based. That being said, there is a plugin for Safari that adds webp support. That is, if Apple finds it too technically challenging to enable the already-present webp code by flipping a compilation symbol from false to true.

IE support is an issue, I agree. Microsoft will probably only add support for webp if Firefox implements it first. They don't have a lot of respect for Safari, but they fear falling behind Mozilla. I wouldn't doubt that they have a covert project at MS for webp and already have webp support in an alpha build so that they're ready in case they need to be. That's Microsoft's modus operandi, they adopt open source standards when it's clear they have to and not before.

I completely agree with you that Doug in comment #78 should try to not lose potential viewers/customers by serving up jpg/png when IE users visit if he can. But, it's up to him how he wants to handle his server. If he doesn't have the resources to do that, then they're just not there to be had. So, I understand where he's coming from.

One possible solution for him is to add the webp javascript decoder when he notices someone visiting with IE: http://webpjs.appspot.com/ This is a javascript webp decoder. The browser on the client's side does the decoding in javascript to draw directly into an html5 canvas. The user can then save this as a .png file. I've tested this on IE10 and it works. It will also render on android, iphone, Firefox and Safari.
(In reply to myutwo33 from comment #63)
> Does anyone know if there's a good mechanism for detecting support for
> individual versions of WebP?

One, server-driven. I don't know if people forgot about it or what, but MIME types have parameters. We could define a "version" parameter that would state the version and you could communicate to the server which versions you support. Another approach is to define parameters based on the functionality, so for example a client can tell that it wants "image/webp", but not, let's say, "image/webp;animation=y" when it doesn't support animation.

Another approach is to make a standard for client-driven negotiation and define a standard way of telling the client that it needs to choose a representation. For example, if there are multiple representations of a resource, the server could send the client a 300 response with text/uri-list message-body in the syntax of:

# metadata
URI
# metadata
URI

and the client would then choose one it deems is the best option. Since this response could be cached, the next request for the resource would be redirected on the client side, or cached as a redirect.
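As an illustration of the client side of that flow, here is a hedged sketch of parsing such a text/uri-list body. Treating each `#` comment line as metadata for the URI that follows it is my assumption about how the proposal above would carry metadata; the format itself only defines `#` lines as comments:

```javascript
// Sketch: parse a text/uri-list style body into { metadata, uri } entries,
// pairing each "#" comment line with the URI on the following line.
function parseUriList(body) {
  const entries = [];
  let metadata = null;
  for (const raw of body.split(/\r?\n/)) {
    const line = raw.trim();
    if (!line) continue;                 // skip blank lines
    if (line.startsWith("#")) {
      metadata = line.slice(1).trim();   // remember metadata for the next URI
    } else {
      entries.push({ metadata, uri: line });
      metadata = null;
    }
  }
  return entries;
}
```

The client could then inspect each entry's metadata (for example a media type) and pick the representation it supports, caching the choice as a redirect.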
Will WebP image support land in Firefox 24 ESR?
Otherwise a lot of people who use ESR will be left high and dry.
Any time frame for when this will land?
Is the addition of WebP image support such a huge burden?
Or is it a chicken-and-egg debate?
(In reply to ElevenReds from comment #83)
probably not, no clue, yes and also yes
Hi,

This is a patch that adds WebP decoding/encoding support to Firefox. It is based on version 0.3.0 of libwebp.

Thanks,
Matheus Morais.
(In reply to Matheus Morais from comment #85)
You should probably be basing it off of 0.3.1 now that 0.3.1 is released. There are some nice new benefits in that build.
http://www.androidpolice.com/2013/07/15/the-new-web-play-store-is-insanely-fast-here-is-why-analysis/

Just in case anyone missed it, the updated Google Play Store website is now serving images in webp format. Unscientifically, it does feel quite a bit faster on Chrome. This bug is actually linked in that article.
(In reply to Ibrahim Jadoon from comment #88)
> http://www.androidpolice.com/2013/07/15/the-new-web-play-store-is-insanely-
> fast-here-is-why-analysis/
> 
> Just in case anyone missed it, the updated Google Play Store website is now
> serving images in webp format. Unscientifically, it does feel quite a bit
> faster on Chrome. This bug is actually linked in that article.

FWIW, that article seems to suggest that they are serving lossy webp vs lossless png which makes any comparison largely meaningless.
(In reply to Jeff Muizelaar [:jrmuizel] from comment #89)
> (In reply to Ibrahim Jadoon from comment #88)
> > http://www.androidpolice.com/2013/07/15/the-new-web-play-store-is-insanely-
> > fast-here-is-why-analysis/
> > 
> > Just in case anyone missed it, the updated Google Play Store website is now
> > serving images in webp format. Unscientifically, it does feel quite a bit
> > faster on Chrome. This bug is actually linked in that article.
> 
> FWIW, that article seems to suggest that they are serving lossy webp vs
> lossless png which makes any comparison largely meaningless.

Here is a quick comparison of a couple of store images:
PNG	WebP
176K	24K
143K	8K
186K	17K
65K	11K


What's interesting is that Google is apparently using jpg images for at least some of the phone screenshots, while it is using png exclusively for larger screenshots.
There are upsides and downsides, but the upsides massively outweigh the downsides.

Let's take a look at some differences using four random samples of Play Store images, shall we?

    WebP (52 KB) vs PNG (827 KB) = WebP is about 16x smaller (wow)
    WebP (11 KB) vs PNG (62 KB) = WebP is about 6x smaller
    WebP (42 KB) vs PNG (345 KB) = WebP is about 8x smaller
    WebP (132 KB) vs PNG (1,281 KB) = WebP is about 10x smaller

Pretty impressive, isn't it?
Upsides

1. WebP images occupy a fraction of the comparable PNG ones - the savings are as crazy as 90% or more in some of my tests. I'm talking a 1.3MB image now being 132KB. Yup, that's going to save a lot of bandwidth and load way faster.

2. Because Google went with the lossy WebP variant rather than lossless, we should compare it to JPEG, which was their other alternative. WebP offers better compression and therefore doesn't degrade as much as JPEG at the same file size. While there is a very mild difference in sharpness and detail between PNG and WebP variants of the same images, it's not something most of you will ever notice, especially since you don't get to see both formats side-by-side.


Downsides

1. WebP images are not supported by every browser - in fact, if you're not using Chrome or Opera, your browser does not currently support WebP

2. WebP comes in both lossy and lossless variants, and Google went with the lossy one for more savings. This means that you may notice some image quality degradation, but in my experience, it's so small, most of you will be just fine.
@Thellel - Please refrain from making non-technical arguments on this bug. They are not helping anyone, and especially not helping the WebP cause.

@ElevenReds - as Jeff Muizelaar said, the comparison in this case is between lossless PNGs and lossy WebP. While converting these PNGs to lossy WebP provides great savings, a conversion to JPEG would too.
Note that there are cases where such a comparison would make sense, for example real-life PNG32 that includes a meaningful alpha channel. The PNGs in question are PNG32, but their alpha channel is practically blank, meaning they can be converted to JPEG without losing transparency.
We’ve recently added WebP support to Cloudinary (image management SaaS) and we are really happy with the benefits this format brings to our customers and their web (and mobile) visitors.

Our customers frequently ask us how they can save bandwidth and how they can improve their website’s speed. Once we help them overcome the common pitfalls (Using PNG instead of JPEG for photos, delivering full size images and resizing on the browser-side instead of using Cloudinary to resize before delivery, etc.), we are basically left with “Try reducing the JPEG quality to a bare minimum level”.

In this regard, WebP is a huge win. It’s smaller in size (lossy and lossless), and to our naked eye it appears to degrade more gracefully in higher compression levels than JPEG. This directly translates to saved bandwidth costs and improved visitor experience, both great incentives for using WebP.

It’s great that the Chrome/Opera/Android4 users are already enjoying these savings. We are anxiously waiting to see WebP supported (+ “image/webp” in Accept) on Firefox as well.

Here’s a short introductory article we’ve written on WebP usage with Cloudinary - http://cloudinary.com/blog/how_to_support_webp_images_save_bandwidth_and_improve_user_performance
Follow-up from comment #78, my real-world experience for your consideration:

In need of storage and bandwidth, and wanting to improve my site for visitors with supporting browsers, in July I converted all the images on my image-heavy site (24K of them, averaging 1.6MB each as JPEGs) to WebP without leaving JPEG dupes to redirect non-supporting browsers to. Soon the majority of my visits were from WebP-supporting browsers. Bad bot traffic halted; legroom on space and bandwidth; not as bad a hit to traffic as anticipated. Google Images quickly dropped the JPEGs and put up the WebPs, interestingly including serving them in SERPs to Firefox and IE users, not just Chrome. No complaints about WebP quality, only a few complaints about not being able to save the images to use for other purposes; little site-veteran attrition; victory. Enough free bandwidth now to play with WebM. Though I could not find a javascript decoder for Firefox that was good enough to make my often very high-res images viewable without problems.

More specific to this bug thread's purpose, a site regular tipped me off that someone made portable Firefoxes with native WebP and was hosting Windows binaries and its source, https://code.google.com/p/lawlietfox/, so I encouraged visitors attempting to access the site with Firefox who were unwilling to use Chrome to try this modded Firefox and report any stability complaints (versions 17, 22 and 23). Interest was significant and after over a month I received only reports of success and gratitude, none of stability problems or any other complaints about this modded Firefox.

So, if any of you suspect that whatever this man, and others like the Everything.me developer, did to the source to enable WebP in Firefox creates technical problems for users, I hope my experiment offers some data to mitigate that concern a little. As for an absence of demand for you to implement WebP, I submit that over time more and more people, from guys like me up to marquee sites like Facebook (comment #54), will plead with you to implement it, as will your users who found out about WebP somehow and would rather ask you to implement it first before switching browsers. Otherwise, how much longer does this have to go on before you finally hit a WONTFIX point?

Cheers gents.
Doug Simmons
(In reply to Doug Simmons from comment #101)
> Follow-up from comment #78, my real-world experience for your consideration:
 
The patches in the link you posted are diffed against firefox versions up to 22 only
https://code.google.com/p/lawlietfox/source/browse/trunk/

Can you make a patch against current mozilla-central and verify it builds?
By the way Inka3D (www.inka3d.com) now supports webp because webp supports compressed rgba images.
I went through the entire discussion to figure out the current state of this. What I have found (please correct me if I'm wrong or missed something) is:

* no decision on adopting webp has been made as of April 2013
* it is kind of a chicken-egg-problem
* patches (that are bit-rotted by now) have been provided
* some people have requested new patches
* countless people have requested WebP, including a request by Facebook and some interest (chicken-egg-problem) from Wikipedia 

(See below for a full summary with links.)


Could we *please* get an official update on the state of the decision, especially given the countless additional requests for WebP after the last "no decision yet" announcement? (i.e. an answer to the question "Would a high-quality patch implementing the current state of WebP be merged?")

If the answer is "no, not at this time/maybe/ask again later", it is pretty pointless to request/create patches now (since they will likely bitrot). In this case, it would be helpful to mention what has to happen in order to be able to make a decision so we can avoid comments that do not help in making that decision.

If the answer is "yes" (i.e. developers can be reasonably sure their patch will be used and not left to bitrot), I'm sure someone will be willing create a current high-quality patch and we can finally get this bug closed. Also, the "plz implement webp" comments will most likely be significantly reduced.




-----------------------

For people wanting a summary, here is what I extracted from the discussion so far. Names included only where they seemed necessary to follow the discussion and see who is in charge - please do NOT blame these people for anything, especially since I may have confused/missed something:

* WebP was requested in Bug 600919

* Patches were provided by various people

* In April 2011, a decision not to implement WebP at its then-current state was posted in Bug 600919, Comment 26 by Joe Drew, which led to WONTFIXing of that bug. The reasons for that were explained in http://muizelaar.blogspot.de/2011/04/webp.html (by Jeff Muizelaar) and it was said that the decision may be reevaluated as WebP develops.

* A heated discussion/bugspam ensued

* In September 2012, Bug 600919, Comment 122 claimed that all the missing features were implemented at that time.

* In November 2012, a joint decision by Joe Drew, Jeff Muizelaar and "other learned people" confirmed that WebP should not be included in Bug 600919, Comment 129. No specifics (e.g. missing features) were given at that time (though see below), only that the conclusion is unlikely to change unless "the market changes so that implementing WebP is more than a nice-to-have, or WebP becomes more compelling"

* In  Comment 146, Jeff Muizelaar clarified that what was impacting the decision at that point was "Largely the likelihood of Microsoft and Apple also adding support."

* In March 2013, after significantly more Bugspam, the bug was restricted

* The bug was reopened by Andreas Gal two weeks later and the discussion was redirected to this bug for technical reasons.

* Shay Elkin (to whom the bug is currently assigned) provided a patch, which was negatively reviewed due to a weird-looking license block in Comment 18. The license block may have been ok as pointed out in Comment 20.

* In April 2013 in Comment 19, Jeff Muizelaar pointed out that "no decision on adopting webp has been made", but that more interest by non-google web properties was received. More of such interest was strongly voiced throughout the following months in this bug (see below)

* In Comment 43, Ryan VanderMeulen said that he should be notified if a new patch is ready for testing (and presumably unsubscribed due to the bugspam)

* In Comment 66, Shay Elkin (assignee) said that he is maintaining a webp-enabled version in his github repository.

* Several comments mentioned issues with a patch; it is unclear whether the repo mentioned above is more current

* Comment 102 requested a current patch.

Requests for WebP in Firefox were voiced in numerous comments. Notable ones are e.g. interest by Facebook voiced in Comment 54 and an experiment by Wikipedia in Comment 28.
I couldn't agree more. I can confirm, as an employee of Netflix, that massive adoption of WebP is underway internally. Many Netflix-ready devices will be using WebP and there is a big movement to get it in use on the web site. Of course, this is being hindered by *some* browser vendors, but we are proceeding anyhow in the hope of optimizing for browsers that can support it. Early tests are very promising.
(In reply to Jan from comment #105)
Comment 105 is a pretty good neutral summary of events to this point, I think. I'd put it in the whiteboard, but I think we've stopped doing that because nobody ever reads the comments mentioned in them.

The things I'd add to your summary, which I mentioned in the other bug a while back, is that people need to understand three things here:

1) This is noisy as hell. I'm not just talking about these messy bugs, either. There is ALWAYS some new file format that some group is passionately demanding, and most of them are not worth bothering with. Attempts to improve JPEG have largely gone nowhere. There are multiple competing "new" image formats and WebP is just one of the latest. It probably has the best chance yet to actually go somewhere, but it's still "yet another web format".

2) WebP is an experimental file format made and pushed by one company. The request here is to permanently and forever add Google's pet image format to Firefox and eventually the whole Internet. This is a much bigger deal than people treat it. How long did it take to transition from GIF to PNG? We're still stuck with GIFs for animated crap for the foreseeable future. The people involved here worry about this way more than the people just requesting something less crappy than JPEGs for the hundredth time. They would really like to implement something that has everything they want and has been able to do it reliably for a while. They don't want to jump in quickly to an experimental file format just because Google and Mozilla could get it out to a wide audience fast.

3) WebP is based on VP8, the video codec behind WebM. Google bought On2, open-sourced its VP8 codec, made a new media container, got Mozilla to support it too, and everyone cheered. Google promised to phase out h.264 support and make WebM the new Internet video standard. They didn't. Google Chrome still supports h.264, and (Google) YouTube doesn't really use WebM that much, but it does use h.264. Everyone who cheered for WebM thus became increasingly confused. My personal opinion is that this has effectively killed off WebM. Mozilla was forced to implement h.264 support via OS codec support (as software patent nonsense prevents them from supporting it natively).

I am not involved in nor privy to the decision making process here. However, I can tell you that the people involved are not "mad" or "playing politics" or ignoring this. They're just really really cautious. WebP could be the savior of image formats or it could be another soon to be dead Google project. I'm pretty confident it will be somewhere in the middle, but it will take a large amount of effort and commitment from Mozilla to implement, test, and *evangelize* WebP. They probably plan to do something "soon", but due to the significance of everything involved they haven't decided when that is yet and dealing with this mess is just not at the top of everyone's long todo list. Nothing anyone says in this bug will change that.

WebP is probably a good idea that will probably happen at some point.

(and now I've contributed yet another long blob of text that will eventually get lost in this din ;)
There is another argument against WebP support that has just occurred to me: VP9. If Google does switch WebM from VP8 to VP9, what about WebP? Google probably doesn't even really know that answer. Mozilla would probably like to quietly kill the VP8 decoder at some point, especially if it is superseded by VP9. Waiting until WebP gets upgraded to VP9 and only supporting that would probably be a good idea if this is to be the case (and I don't know if it is). Nobody wants a brand-new file format with two different codecs, one worse than the other, both of which must nonetheless be supported in case someone encodes things wrong. Nobody wants to implement a new format that will be upgraded right away, either. Again, caution is needed here no matter how much we would like to see JPEG replaced.
(Not an official Mozilla response, just a comment from one engineer who is not even a specialist of image codecs).

There are image codec specialists at Mozilla who are looking carefully at image codecs. They need time to carefully evaluate as many codecs as possible, using the right methodology, and not run into a pitfall (which there are many). The silence is just the result of nobody wanting to speak prematurely on such an important, high-profile subject. If you want to accelerate things, run your own comparisons, but at this stage, make sure that they are very careful, scientific comparisons --- don't bother saying "image format X is 30% smaller at equal quality" without making sure that you know what you mean by "equal quality" and that that is really the quality metric that matters for you. Complicated subject. I'm glad there are smarter people than me to take care of it, and that they are taking their time to do this right.
(In reply to nmrugg from comment #109)
I do not entirely disagree with your assessment, actually. Yeah, we are almost past the chicken-and-the-egg problem and there is talk of other big companies wanting to push WebP, which is very good progress. It is still Google's pet project being just pushed through Chrome and it's still experimental.

> It is already in 40% of browsers according to caniuse.com and increasing all the time.

This is statistics abuse. That "40%" is Chrome + Android Browser + Opera vs. IE, Firefox, Safari, the iOS browser, Opera Mini, the BlackBerry Browser, and IE Mobile. That's just Google's desktop browser, plus Google's mobile browser, plus what is now a spin-off of Google's desktop browser. This is all still just Google supporting WebP in its products and those based on them.
Google plans for Animated WebP specs and implementation are still unclear:
http://news.cnet.com/8301-1023_3-57594084-93/blink-leaders-reject-animated-webp-images-in-chrome-for-now/

Apparently they had offline discussions about it, and reached some conclusions, but never published them.
(In reply to Max Stepin from comment #112)
> Google plans for Animated WebP specs and implementation are still unclear:
> http://news.cnet.com/8301-1023_3-57594084-93/blink-leaders-reject-animated-
> webp-images-in-chrome-for-now/
> 
> Apparently they had offline discussions about it, and reached some
> conclusions, but never published them.

It turned out the GIF implementation was a moving target, suitable
for further tuning[1] before running the tests again, and
the gif->webp converter was fine-tuned accordingly[2].
Detailed numbers should be available soon...

[1] https://codereview.chromium.org/23646005/
[2] https://gerrit.chromium.org/gerrit/67163
Whatever; might as well try it. I've noted the recent comments above in the whiteboard so people can find something useful in this din if they know where to look. Bugzilla needs a way to mark certain comments as more useful. (there's a bug sitting around somewhere requesting that already)
Whiteboard: [parity-chrome][fuzzing:queue:cdiehl] → [SUMMARY in comments 105-112][parity-chrome][fuzzing:queue:cdiehl]
(In reply to Benoit Jacob [:bjacob] from comment #110)
> There are image codec specialists at Mozilla who are looking carefully at
> image codecs. They need time to carefully evaluate as many codecs as
> possible, using the right methodology, and not run into a pitfall (which
> there are many). The silence is just the result of nobody wanting to speak
> prematurely on such an important, high-profile subject.
Awesome! Where can their work be followed?
Where can people go to contribute their time to help improve the evaluation methodology?
A few comments:

Re #111/107: I suppose it depends on your definition, but I would not classify WebP as "still experimental."  There is nothing upon which we're waiting to decide whether to continue supporting the format, nor is there anything substantive about the format itself that's subject to change at this point.  From a technical point of view I would classify it as stable.

I would also not classify it as a "pet project."  WebP isn't about Google's ego; it's about making the web faster for all users of all sites on all devices.  The fact that Netflix, Facebook and others are investing in this should demonstrate that.

The discussion about WebM adoption vis-a-vis H264 is really not pertinent to WebP.  The history behind WebP and VP8 adoption is lengthy and complex but not really relevant here.  While WebP was born out of VP8, ultimately it stands on its own.

Re #108: There are no plans or intentions to change WebP based on VP9.
(In reply to Stephen Konig from comment #116)

Hi Stephen,

I see you're part of the Web-P team and I'm wondering could you answer a few questions regarding the format?

1: At present, is there any convenient or standardised way to detect what version of WebP a web browser supports? Does Chrome have a way of preventing a web server from detecting that Chrome can render WebP and then sending it a future WebP v0.5 file when the browser only supports v0.3?

2: Is there any guarantee that an image compressed with a future WebP codec (v0.5) will decode okay on current decoders? I understand progressive decompression is planned as a future WebP feature; I gather those images could never be decoded with today's codecs.

3: Are there any plans to finalise the WebP spec at some point? To me the value of the image format is *massively* diminished when it can't be reliably opened by all software claiming to support WebP because the spec is in constant flux. I find it hard to imagine the format receiving wide adoption before it has stabilised and perhaps been standardised by ISO or ITU.

The image format looks fantastic to me generally, but these points have always stuck out to me as large and almost silly shortcomings that are completely solvable.
Re #117: 1) UA detection is the only way to guard against this fully.  Chrome recently added image/webp to its Accept header for image and HTML requests though, post 0.3.0 -- so you can use that to detect that minimum level of compliance.  However, as Chrome auto-updates (as do Opera and Firefox), as a practical matter it's not as much of a concern as it might appear.

2) We do guarantee that future encoders will be backwards compatible within the same feature set (so for example a lossy WebP encoded with 0.3.0 will decode with 0.2.0).  Of course if any new features are introduced in a future version of the encoder, those won't be supported by older versions of the decoder (e.g. an animated WebP encoded with 0.3.0 will not decode with 0.2.0).

3) As I mentioned earlier, I would consider the features currently released and supported by WebP to be final--WebP is not in flux as some might suggest.
Stephen, I'm sure you're aware that libwebp 0.3.1 is only three months old, but it's already missing some new features, because those features are only about a month old?

There is no WEBP_MUX_BLEND in libwebp 0.3.1, right?

It makes this look like work-in-progress, not stable...
@Max:
The WEBP_MUX_BLEND / WEBP_MUX_NO_BLEND option was considered even before v0.3.1 was released, it was just not finalized at the time: https://groups.google.com/a/webmproject.org/d/msg/webp-discuss/fD_nrJibs_4/rjNb9wVpucoJ
The decision was deferred pending evaluation at the time. But now, it IS finalized.

Also, animated images produced by 0.3.1 would behave as if they were using WEBP_MUX_BLEND, so the next version of libwebp will still be backward compatible with 0.3.1.

The takeaway is that the specification of all current features -- lossy, lossless, transparency, ICC/EXIF/XMP support and animation -- is finalized right now.
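The feature-set distinction discussed above is visible in the container itself: the FourCC of the first chunk after the RIFF header tells a decoder which feature set a file uses. A minimal sketch in Python (the chunk names come from the public WebP container format; the `classify_webp` helper itself is hypothetical, not part of libwebp):

```python
def classify_webp(data: bytes) -> str:
    """Classify a WebP file by its first chunk FourCC.

    'VP8 ' = simple lossy, 'VP8L' = lossless,
    'VP8X' = extended (alpha/animation/metadata).
    """
    if len(data) < 16 or data[0:4] != b"RIFF" or data[8:12] != b"WEBP":
        raise ValueError("not a WebP file")
    fourcc = data[12:16]
    return {
        b"VP8 ": "lossy",
        b"VP8L": "lossless",
        b"VP8X": "extended",
    }.get(fourcc, "unknown")

# A fabricated 16-byte header, just for demonstration:
header = b"RIFF" + (4).to_bytes(4, "little") + b"WEBP" + b"VP8L"
print(classify_webp(header))  # lossless
```

A server could use this kind of sniffing to check uploaded WebP files against the feature set it intends to serve to older decoders.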
(In reply to Stephen Konig from comment #118)
>>1) as a practical matter it's not much of a concern as it might appear
Perhaps not in Chrome, Opera and Firefox, but in IE, the Android browser and Safari Mobile this is a very real problem, as they can't always be updated at all. It's important that web developers have a means to gracefully fall back to supported versions of WebP, unless everyone agrees to ONLY use 0.3 on the web as a kind of standard.

>>2)We do guarantee that future encoders will be backwards compatible within the same feature set
The changing feature set is the problem. Progressive downloading, for example, seems like a pretty poor feature to be missing in an image format aiming to replace JPEG on the web for performance reasons. No progressive download support means a higher perceived download time for the image. An image format that Google hopes all web browser vendors will add to their codebases and maintain forever should certainly be expected to have such a basic feature out of the gate. It's especially bad when there's no method of detecting whether a browser can decode progressively encoded WebP images or just v0.3 WebP images. The only safe thing for a web server to do in that case is to only ever serve WebP 0.3 images and never offer WebP images with progressive decoding or any other features not part of the 0.3 feature set. If there were a standardised way of querying what specific version of WebP a browser supports, that would mitigate most of these problems, I think; I understand the Accept header natively supports this exact scenario, but I gather Google is not doing this in Chrome?

>>3) As I mentioned earlier, I would consider the features currently released and supported by WebP to be final
If WebP v0.3 is final and it's missing progressive downloading, then that sounds pretty poor to me, frankly. But even disregarding this, I think the lack of standardisation effort in WebP is by far its biggest issue. WebP provides some modest gains to consumers in certain areas by enabling faster website load times; that is great. However, if a consumer is on Facebook or any other site and saves an image so they can email it to a friend, that is currently an *awful* experience, one I personally think outweighs the good the format provides consumers by an order of magnitude. I'll break down what happens when someone does this:

They save the image to hard disk, and when they view it in the file explorer on any OS it's an unrecognised format. Well, let's assume Chrome associates with WebP and generates thumbnails for it; that fixes that problem. Now the user emails that image to his friend. But the friend doesn't have software installed to open it, has no idea what's going on, and between the two of them they probably have a frustrating time figuring out why the friend can't see the image. Well, maybe that friend has Chrome installed on his PC too and that opens the picture; great! Or maybe in Windows 9 Windows supports WebP 0.3 out of the box and can open WebP images in Windows Photo Viewer; even better, right? But what if the image emailed was WebP 0.4 with progressive download encoding? Now the friend receives the image, has an application that can open it (Chrome), but still can't see it, because the software associated with it (Windows Photo Viewer) will try to open it and fail. Today I can open Photoshop CS6, save a PNG and a JPEG image, and open them in MS Paint on Windows XP RTM, and there's some honest-to-God value in being able to do that. I know with confidence that any software that says it can open PNG and JPEG files can open ANY PNG or JPEG file. Even JPEG 2000 and JPEG XR support this, and while they are not widely used, it does make them FAR more useful formats: the few places that do use them all use the exact same bitstream specification, can read the same files, and write files that can be opened by one another. I know WebP is aimed at the web, and these problems aren't considered as serious within that domain, but adding a new image format to the web is a big deal, and there's no reason why it shouldn't get all the fundamentals right that PNG and JPEG have.
If we're designing a whole new format from scratch, then there's no reason it should have glaring defects that we'll all have to struggle with for decades after it's adopted. Now, I'm not against WebP, and I'd like what I'm writing to be considered constructive feedback, because all the issues I'm talking about can be fixed. WebP could easily be finalised with all the basic features that should be expected from it if Google wanted, and I personally believe that should be done soon, with any extra work on the format moved into a completely new file format with its own file extension (.webp2). WebP could and should be set in stone before it's pushed much harder as an adoptable format.
(In reply to Stephen Konig from comment #116)
> Re #111/107: I suppose it depends on your definition, but I would not
> classify WebP as "still experimental."

For the purposes of this discussion:  not yet standardized == experimental

> I would also not classify it as a "pet project."  WebP isn't about Google's
> ego; it's about making the web faster for all users of all sites on all
> devices.

I was not using the phrase in a way meant to be derogatory. Everyone has their own pet projects that they focus on by themselves. This is probably one of the better ones.

> The discussion about WebM adoption vis-a-vis H264 is really not pertinent to
> WebP.  The history behind WebP and VP8 adoption is lengthy and complex but
> not really relevant here.  While WebP was born out of VP8, ultimately it
> stands on its own.

Not pertinent to the format but very pertinent to this discussion here. I doubt anyone is holding any grudges, but people should really be wary of a still image format based on a self-sabotaged video format. Again, I think that WebP can be a good thing in its own right, but the history here implies the need for caution just as much as the history of GIFs implies the need for caution.

> Re #108: There are no plans or intentions to change WebP based on VP9.

That's a nebulous answer I read as roughly equivalent to "maybe" and definitely not a "no". The uncertainty here is what concerns me. If VP9 can produce better files then I'd rather have a VP9 based still image format, standardize it, and push it vigorously. If not, then getting a concrete "this is it" on using VP8 in its current form for WebP would be a big must have here, in my opinion. Frankly, I don't even think WebM should be upgraded to VP9. A whole new file extension is probably a better idea so as to differentiate the two and allow for VP8 to be phased out and replaced by VP9+Opus.

(In reply to Stephen Konig from comment #118)
> Re #117: 1) UA detection is the only way to guard against this fully.

Any new feature that requires UA sniffing to properly use needs to be fled away from in terror. WebP may be created for the web but it's intended to replace JPEG. This means cameras and every other device under the sun, which will be almost never updated. The first widely adopted version of this will need to essentially be "set in stone" as just mentioned above and prior versions will need to be explicitly desupported.
(In reply to myutwo33 from comment #121)
> (In reply to Stephen Konig from comment #118)
> >>1) as a practical matter it's not much of a concern as it might appear
> Perhaps not in Chome, opera and firefox, but in IE, android browser and
> safari mobile this is a very real problem as they can't always be updated at
> all.

This is the Mozilla bugtracker.
(In reply to Dave Garrett from comment #122)
> That's a nebulous answer I read as roughly equivalent to "maybe" and
> definitely not a "no". The uncertainty here is what concerns me. If VP9 can
> produce better files then I'd rather have a VP9 based still image format,
> standardize it, and push it vigorously. If not, then getting a concrete
> "this is it" on using VP8 in its current form for WebP would be a big must
> have here, in my opinion. Frankly, I don't even think WebM should be
> upgraded to VP9. A whole new file extension is probably a better idea so as
> to differentiate the two and allow for VP8 to be phased out and replaced by
> VP9+Opus.

We have new versions of HTML (HTML 4.0, XHTML 1.0, HTML 5 etc.), new versions of CSS (CSS 2, CSS 2.1, CSS3 etc.), new versions of ECMAScript (3, 5, 5.1 etc.). We can also have new versions of image formats. The world progresses, do you want the same image format for the next 1000 years? There will be a VP9, maybe a VP10, VP11 and so on... if you always wait for the next standard you will never implement anything.
(In reply to myutwo33 from comment #121)
> [snip] I know with confidence that any software that says it can open PNG
> and JPEG files can open ANY PNG or JPEG file.

Unfortunately this is far from true: 16-bits-per-component PNGs are not always supported, IE6 was never able to support PNG transparency correctly, and Adobe Photoshop CS5 and earlier can't display some PNG8+tRNS images: http://imageoptim.com/

GIF is not perfectly supported by Apple CoreGraphics and therefore Safari:
http://www.lcdf.org/gifsicle/changes.html

And even worse, only about one third of the 1992 JPEG spec is really implemented:
http://www.w3.org/Graphics/JPEG/itu-t81.pdf

Now check whether Firefox is able to display a hierarchical or lossless JPEG, or even a 12-bits-per-component or an arithmetic-coded one.

JPEG optimization scripts like JPEGrescan have to bypass some scan configurations to ensure universal decodability (Adobe Photoshop has notable bugs at this level).
(needinfo?, please see the last four paragraphs)

(In reply to ekerazha from comment #124)
> We have new versions of HTML (HTML 4.0, XHTML 1.0, HTML 5 etc.), new
> versions of CSS (CSS 2, CSS 2.1, CSS3 etc.), new versions of ECMAScript (3,
> 5, 5.1 etc.). We can also have new versions of image formats. 

All of these formats are 99% backwards compatible, i.e. they will not cause a fatal error when encountered in an old browser. The only exceptions are new elements in HTML5, and only in IE (fixable with a shim; block styles can be set with CSS); browsers from 15 years ago(!) should be able to display HTML5 properly! New objects were introduced in ES5 (almost all can be shimmed); there will be new syntax that leads to a parse error in ES Harmony, though, but there are already on-the-fly transcoders for that which can be added with a simple bootstrap script. Most "new" CSS will of course fail, but not fatally: it will only degrade the display while leaving the function intact (if done right).

Now, image formats are a different thing, because changes to them are mostly backwards incompatible, with a few exceptions, of course. But even standardized images may fail, as Frédéric points out in comment 125. However, image software mostly produces images compatible with a broad audience (again, with the exception of some optimizers).

The problem with viewing and (directly) sharing WebP is a big one as long as most browsers don't support it (support would mean they could act as viewer applications). However, that's another chicken-and-egg problem. There are a few other problems that WebP has not solved yet, and IMO they need to be fixed before releasing it into the wild.

However, I propose the following: 
– Implement WebP in Firefox (current lib version) now, DISABLED BY DEFAULT. 
– If enabled, announce WebP in the Accept header. (maybe) Include the lib version (like 
  image/webp;version=0.3.1, if that's possible).
– (maybe) Enable for Pre-Release (Nightly & Aurora). Auto-disable for Beta & Release.
– Unless the following criteria* are met, WebP remains disabled: 
  1. WebP includes progressive downloading. 
  2. WebP is standardized by ISO/ITU/ECMA/IETF/W3C, or another image-oriented and industry-backed 
     standards organization. Preferable is ISO/ITU, this is no IETF & W3C territory, AFAIK. 
  3. The version string from the accept header is dropped!
  (* Please feel free to amend this list.)

This allows for testing while reducing the (possibly irreversible) impact, and also shows a clear path for others who want to implement the format (i.e. Mozilla is betting on it, but only if others are willing to support the format too, while also demanding an explicit number of improvements).

Also, this allows for loopholes for Firefox OS, where certain apps (or the system) may sidestep the limitations and use WebP (specifically, system icons/images and E.me). This is in line with Mozilla's push for adoption in developing countries where storage is expensive and bandwidth limited. "Regular" (Firefox OS) WebApps may also be able to sidestep that limitation, e.g. by providing programmatic fallback to JPEGs/PNGs for sharing ("hot-swapping" URLs), or Mozilla could allow it only for resources shipped inside packaged apps (i.e. no loading over the network). But it's up to Mozilla to decide the appropriate limitations and exceptions to those.

Is that something you might like? Benoit, Jeff, can you please also check with some Firefox OS people if this is something they are interested in?
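The versioned Accept entry proposed above could be parsed along these lines. To be clear, no browser actually sends a `version` parameter today; both the parameter and the `webp_version_from_accept` helper are purely hypothetical sketches of the proposal:

```python
def webp_version_from_accept(accept: str):
    """Return the advertised WebP version from a hypothetical
    'image/webp;version=X.Y.Z' Accept entry, or None.

    NOTE: this parameter is only a proposal from this thread;
    real browsers send plain 'image/webp' at most.
    """
    for part in accept.split(","):
        media, _, params = part.strip().partition(";")
        if media.strip() != "image/webp":
            continue
        for param in params.split(";"):
            key, _, value = param.strip().partition("=")
            if key.strip() == "version" and value:
                return value.strip().strip('"')
    return None  # no image/webp entry, or no version parameter

print(webp_version_from_accept("text/html,image/webp;version=0.3.1"))  # 0.3.1
```

A server using this would still need a sensible default (e.g. the 0.3.0 feature set) when `image/webp` appears without a version.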
Flags: needinfo?(jmuizelaar)
Flags: needinfo?(bjacob)
(I am just not competent on image codecs, or on the specifics of Firefox OS apps).
Flags: needinfo?(bjacob)
It may be relevant to the discussion: I've developed a *lossy* PNG+alpha encoder that is 100% compatible with the PNG standard and produces files 2-3 times smaller (even 5-7 times smaller if you're OK with noticeable distortion) than typical lossless PNG.

http://pngmini.com/lossypng.html

This makes PNG with alpha much more usable on the web and goes a long way towards closing the gap with WebP. (this is not a scientific comparison: http://pngmini.com/vs-webp/ )
(In reply to Dave Garrett from comment #122)
> (In reply to Stephen Konig from comment #116)
> > Re #108: There are no plans or intentions to change WebP based on VP9.
> 
> That's a nebulous answer I read as roughly equivalent to "maybe" and
> definitely not a "no". 

Then let me be more clear -- "No."  We've evaluated whether adapting WebP to be based off of VP9 would significantly enhance the format and concluded it would not.

> (In reply to Stephen Konig from comment #118)
> > Re #117: 1) UA detection is the only way to guard against this fully.
> 
> Any new feature that requires UA sniffing to properly use needs to be fled
> away from in terror. WebP may be created for the web but it's intended to
> replace JPEG. This means cameras and every other device under the sun, which
> will be almost never updated. The first widely adopted version of this will
> need to essentially be "set in stone" as just mentioned above and prior
> versions will need to be explicitly desupported.

Our recommendation is to use the presence of "image/webp" in the Accept header as a way of determining a minimum level of WebP support (read: 0.3.0), which frees you from having to do any UA detection if you stick to this feature set.  Chrome and Opera both include this (and both auto-update); and Firefox already enumerates all of its supported image types in its Accept header and so we would expect it to include image/webp when/if it supports WebP as well.  Similarly if IE12 comes out and supports WebP we would request that it similarly add image/webp.  In this way all you have to do is check for this string in the Accept header and you can confidently serve any 0.3.0-compatible WebP image.

Now this doesn't cover the case where, say, IE13 adds support for progressive-rendering WebPs (a hypothetical 0.7.0 feature), at which point you'd have to fall back to Accept header + UA sniffing in order to know whether you can serve those.  But this is true of any new version of a Web standard, so I don't know that there is any way to fully avoid it unless we want to just give up on supporting any new enhancements for the Web.  Regardless, this is a Web platform problem and not a WebP problem.
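The Accept-based negotiation recommended above amounts to a few lines of server-side logic. A hedged sketch (the path stem and the `pick_image_variant` helper are illustrative, not from any real server):

```python
def pick_image_variant(accept_header: str, base_name: str) -> str:
    """Serve the .webp variant only to clients that advertise
    'image/webp' in their Accept header; fall back to .jpg.

    base_name is a hypothetical path stem, e.g. 'photos/cat'.
    """
    # Strip quality/parameter suffixes like ';q=0.8' from each entry.
    accepted = {p.split(";")[0].strip() for p in accept_header.split(",")}
    if "image/webp" in accepted:
        return base_name + ".webp"
    return base_name + ".jpg"

print(pick_image_variant("text/html,image/webp,*/*;q=0.8", "photos/cat"))
# photos/cat.webp
```

Any response chosen this way should also carry a `Vary: Accept` header, so shared caches don't hand the WebP variant to a client that never advertised support.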
(In reply to Stephen Konig from comment #130)
> Our recommendation is to use the presence of "image/webp" in the Accept
> header as a way of determining a minimum level of WebP support (read:
> 0.3.0), which frees you from having to do any UA detection if you stick to
> this feature set.
You've explained all this before and the fact that you're repeating it implies to me that you don't understand what it is I'm taking issue with.

I'll try another way of explaining the issue. First, let's be clear on what WebP does for the web: it shaves milliseconds off the load time of web pages that support it fully. That's a great, but relatively limited, improvement. It's not easy for the average web dev to reap this millisecond load-time improvement either, as most will need to host a jpeg/png/gif and a webp version of every image and have some system of serving the webp version only to supporting browsers. This means more server-side storage must be used for images, and nearly double the image cache. It can certainly be done, but it's not an easy optimisation, and most websites will have many, many places where they can optimise for much greater speed improvements with less pain. Remember we live in a world where most websites are powered by PHP, a language and runtime that doesn't support threading and is interpreted rather than compiled.

Now keep that in mind: the benefit of WebP on websites is a small improvement in load times. That's it. It's great, but that's it. Now, let's say a website goes through all the research and trouble needed to host WebP images. In the end, though, the developer finds the JPEG images appear to load faster than the WebP ones, because the JPEGs can render progressively. The WebP benefit is now *entirely* eliminated. This is only one issue with WebP today, and it's one that can completely wipe out the usefulness of the format over existing formats. What I'm trying to point out here is that WebP is competing with existing formats, and in a lot of ways it loses to them. The attitude I'm hearing is that "it's not a big deal that progressive rendering isn't supported, because who needs that?" and "it's fine that each version is fragmented, because some browsers can auto-update". Well, existing formats like JPEG and PNG don't have these problems, and while the benefits of using WebP on the web are slim but good, the disadvantages of trying to use it on the web are big and bad. A page-load improvement of 100 ms for each user is not worth it if 1 in 1000 users somehow ends up not being able to view the image. It's laughable that it's considered acceptable to add new features to WebP over time on the basis that Chrome and Firefox can auto-update. Many mobile OSes cannot update their browsers so easily, but even if this were only an issue for 1% of web users, the meagre real-world benefit provided by WebP would not be worth the bad experience created for that hypothetical 1%, which is probably closer to +50%.

> at which point you'd have to fall back
> to Accept header + UA sniffing in order to know whether you can serve those.
> But this is true of any new version of a Web standard,
Again, no one has to use UA sniffing for JPEG or PNG today, and no one likes ever having to do that for CSS, HTML or JavaScript; in fact almost no one does. Web developers either use libraries like Modernizr to handle that stuff or simply target the oldest browser they expect their website to work on. Even a 1% risk of UA sniffing failing to detect the correct version of WebP a browser supports, and serving an image that won't render, is not worth the hassle. Risking having a website not render at all for a user is not worth the potential loading-time improvement.

> I don't know there
> is any way to fully avoid it unless we want to just give up on supporting
> any new enhancements for the Web.
That's completely silly, because JPEG and PNG never had this issue, and neither do JPEG 2000 or JPEG XR. Support for JPEG 2000 and JPEG XR can be tested by checking the Accept header. That's all. These formats are all standards that do not evolve with breaking changes over time, so testing for support of a specific version is not necessary. This is a problem Google have created for themselves, because they decided to develop WebP differently from every other web image format and because they apparently "don't know there is any way to fully avoid" this problem.

WebP should not be introducing problems to web image formats that do not exist in JPEG and PNG, and there is absolutely no excuse for Google to do so either. The problems WebP has today create far more problems for web developers than they solve.

I'd be happier if Google could take a closer look at JPEG 2000 or JPEG XR. JPEG XR specifically is a good example: it's lightweight, offers good performance, and supports all the features WebP is offering to the web (except that its lossy compression performance is worse); it also offers progressive rendering and support for a wide range of pixel formats that may become valuable on the web in the future. There's also only a single version of it, it's fully standardised by ISO and ITU, and it's released freely under terms as liberal as the WebP usage terms. It does not have any of the issues I'm raising about WebP, and I don't know why WebP does not simply follow in its footsteps with regard to standardisation.
Actually, UA sniffing is still in use by a massive number of sites to provide the 'right version' of PNG images to IE6. 

It's ignorant to say that WebP will be more troublesome than PNG was during its first decade.

I would personally like to see progressive WebP support. But it may be hypocritical to request that when even progressive jpegs aren't seeing widespread use (and no wonder, as they're not supported by Mobile Safari/Safari).
(In reply to Stephen Konig from comment #130)
> (In reply to Dave Garrett from comment #122)
> > (In reply to Stephen Konig from comment #116)
> > > Re #108: There are no plans or intentions to change WebP based on VP9.
> > 
> > That's a nebulous answer I read as roughly equivalent to "maybe" and
> > definitely not a "no". 
> 
> Then let me be more clear -- "No."  We've evaluated whether adapting WebP to
> be based off of VP9 would significantly enhance the format and concluded it
> would not.

Thank you very much for that clarification. If WebP will settle on its current codec and VP9 doesn't offer an improvement here, that's a big plus. Standardization of WebP will still be wanted, however.

(In reply to Nathanael Jones from comment #132)
> I would personally like to see progressive WebP support. But it may be
> hypocritical to request that when even progressive jpegs aren't seeing
> widespread use (and no wonder, as they're not supported by Mobile
> Safari/Safari).

If we get progressive WebP from the "start", meaning at standardization, then they might get used if everyone actually supports it as a requirement.
As a new data point, it looks like Facebook is now using WebP for the images served to their Android app:
https://fbcdn-dragon-a.akamaihd.net/hphotos-ak-ash3/851560_196423357203561_929747697_n.pdf#page=37
and they "aim to roll out WebP to other platforms as well. When the images are converted to WebP, we will have saved over 20% of total network traffic, without loss of quality."
(In reply to myutwo33 from comment #131)
> The web-p benefit is now *entirely* eliminated.

WebP does offer a few other benefits, such as lossy transparency and higher-quality animation.
I'm the head developer and system admin of http://www.memecenter.com and I want to share our experience with WebP. It's a fairly big website with more than 150 million pageviews/month. We have a young audience, so Chrome usage is really high:

Last month's browser distribution: (taken from Google Analytics)
Chrome	59.53%	
Firefox 20.02%
Safari	6.81%
IE	4.55%

We implemented WebP back in February 2013. Currently there are more than 8 million images in both JPEG and WebP. We are still using UA sniffing to detect WebP support, but will soon convert it to Accept headers. As our user base heavily uses Chrome, we've seen huge bandwidth savings and pages are loading a lot faster. We even use separate CSS so we can use WebP images in CSS. The only downside so far is users' bad experience when they try to save images to their computer or share them with their friends.

As disk space is usually a magnitude cheaper than bandwidth, it's never been an issue for us. When progressive decoding lands, we will most probably encode in that format, as only 3-4% of Chrome users are behind the latest version. We'll always use a JPEG fallback for those who do not support the latest version of WebP.

If Mozilla implements WebP we'll have more than 80% of users supporting WebP format, and it will reduce bandwidth costs even more. (Not to mention page speed)

Ufuk
Memecenter

PS: Not a native speaker, sorry for grammar mistakes.
Flags: needinfo?(jmuizelaar)
(In reply to myutwo33 from comment #131)
> A page load improvement of 100ms for each user is not worth it if 1 in 1000 users ends up somehow not being able to view the image for some reason.

I guess a user-friendly browser would allow downloading a WebP image as PNG by converting it on the client side. That is, the "Save Image As..." dialog would suggest PNG format by default but allow changing it to WebP. A really smart browser would also check whether the system has an application registered for the webp extension.

See "Save as..." dialog in Firefox. You can choose "HTML Only", "Complete" or "text only" format. The same could be done for images.
We're very close to finished with results we can publish and discuss. Expect something in the next week or so. Apologies for the delay.
Worst-case scenario regarding chroma subsampling: 4:2:0 subsampling is mandatory when lossy compression is used in WebP.
Some of my experiments with animated GIFs, PNGs and WebP:
http://littlesvr.ca/apng/gif_apng_webp.html
I hope Mozilla paid special attention to chroma subsampling when comparing JPEGs to lossy WebPs. My concern here is that a poorly conducted comparison between JPEG and WebP would not reveal WebP's qualities but rather the generally poor level of optimization applied to images on the web.
Due to its video roots, 4:2:0 chroma subsampling cannot be turned off in lossy WebP, and it will always lose some color detail (the square.jpg has been specially crafted to clearly show chroma subsampling in action; of course this type of pattern should rather be compressed with a lossless format, so it should be seen as educational material). It also means that a fair comparison should consider reducing chroma subsampling (and therefore quality) in JPEGs to the same level: for instance, Photoshop does not apply chroma subsampling when Quality is over 50 (Save for Web), and the default chroma subsampling is 4:2:2 for tools based on IJG's libjpeg. Lossy WebP does not target high-end image quality but rather the middle of the pack (which is not bad per se). Unfortunately a lot of JPEGs on the web have not been properly optimized; as I wrote before, Photoshop doesn't let you fiddle with chroma subsampling settings independently, and I'm pretty sure a lot of people are not even familiar with chroma subsampling at all and therefore unable to benefit from this practice.
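To make the bandwidth side of the subsampling trade-off concrete: in 4:2:0, each chroma plane is sampled at half resolution in both dimensions, so the two chroma planes together carry only about half as many samples as the luma plane. A small illustrative sketch (the `yuv420_samples` helper name is mine):

```python
def yuv420_samples(width: int, height: int) -> dict:
    """Per-plane sample counts for a 4:2:0 image.

    Chroma planes are half-resolution in both dimensions
    (rounded up for odd sizes), so U and V together hold
    roughly half as many samples as Y.
    """
    luma = width * height
    cw, ch = (width + 1) // 2, (height + 1) // 2
    chroma = cw * ch  # per chroma plane (U or V)
    return {"Y": luma, "U": chroma, "V": chroma,
            "total": luma + 2 * chroma}

print(yuv420_samples(4, 4))  # {'Y': 16, 'U': 4, 'V': 4, 'total': 24}
```

For a 4x4 image, 4:4:4 would need 48 samples while 4:2:0 needs 24, which illustrates why comparing forced-4:2:0 WebP against JPEGs saved without subsampling skews the results.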
(Please don't tick boxes you don't understand, especially don't clear needinfo? if you do not provide the information requested! @wwaxpoetic@yahoo.com)

Also needinfo?ed :jsmith, he might have an opinion on what I said in Comment 126. 

I'd add (more control over) chroma-subsampling to the criteria list in comment 126. Thanks Frédéric for pointing that out!
Flags: needinfo?(jsmith)
Flags: needinfo?(jmuizelaar)
I'm probably not the right person to comment here - I don't work a lot on this area.
Flags: needinfo?(jsmith)
(In reply to Josh Aas (Mozilla Corporation) from comment #141)
> We're very close to finished with results we can publish and discuss. Expect
> something in the next week or so. Apologies for the delay.

https://blog.mozilla.org/research/2013/10/17/studying-lossy-image-compression-efficiency/

http://people.mozilla.org/~josh/lossy_compressed_image_study_october_2013/
Does the Mozilla staff plan to publish other comparisons covering the lossless part, e.g. against PNG and animated GIF, or even against JNG (a JPEG image + alpha mask in a PNG-like container)? http://en.wikipedia.org/wiki/JPEG_Network_Graphics
Here is the study, just to make sure it is here for people to see: http://people.mozilla.org/~josh/lossy_compressed_image_study_october_2013/
The study doesn't mention much about:

- actual availability of the implementation: WebP is sufficiently widespread, with some tools almost ready for production use by everybody interested
- actual availability of content

Having an HEVC-based image format might be nice, but you'd have to spend another year evangelizing it. Ideally, once the framework of content-type negotiation gets widespread, switching to something better would be easy. As of today, WebP *is* compelling to support.
Just thought of adding some information I think might be relevant.

I work for globo.com, the largest portal in South America and a leader in News, Sports and Entertainment, with nearly a billion page views a month. That is to say that reducing page size is very important to us.

At globo.com we use thumbor[1] to generate our images. We introduced in the last releases, a way for thumbor to automatically convert images to WebP if the browser has the "Accept: image/webp" header.

Yesterday we went live with it at globo.com. Now all the new images that get generated at globo.com will be served as WebP if the browser accepts it.

We were a little overzealous at first and are using WebP at 90% quality, which means we didn't get that much of a size improvement over JPEG images. We are changing it to 80% soon (maybe 70%, as WebP's overall quality at greater compression is way better than JPEG's).

Anyway, the transition was VERY smooth and we didn't have any issues with libwebp. Some examples can be seen at G1 [2] and GloboEsporte [3].

Cheers,
Bernardo Heynemann
Manager for the Home Page at globo.com

[1] https://github.com/globocom/thumbor/wiki
[2] http://g1.globo.com/
[3] http://globoesporte.globo.com/
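The Accept-header negotiation described above can be sketched in a few lines (a hypothetical helper for illustration, not thumbor's actual implementation; a production version would also honor q-values and send `Vary: Accept` so caches keep the two variants apart):

```python
def pick_image_format(accept_header: str, webp_url: str, fallback_url: str) -> str:
    """Serve the WebP variant only when the client's Accept header
    lists image/webp. Hypothetical helper, for illustration only."""
    # Split the header on commas and drop any ";q=..." parameters.
    accepted = {part.split(";")[0].strip() for part in accept_header.split(",")}
    return webp_url if "image/webp" in accepted else fallback_url

# A Chrome-style header opts in to WebP...
print(pick_image_format("image/webp,image/*,*/*;q=0.8", "banner.webp", "banner.jpg"))
# ...while a browser that doesn't advertise it falls back to JPEG.
print(pick_image_format("image/png,image/*;q=0.8,*/*;q=0.5", "banner.webp", "banner.jpg"))
```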
Mozilla: Perhaps it would be helpful to those of us who are baffled by these Bugzilla WebP tickets if you could explain what Mozilla has to lose were it to include WebP support in Firefox? An irreversible decision that would add some lines to the source code that could never be excised without breaking the Internet? Perhaps Mozilla sees itself as an image format kingmaker and doesn't want to tap the wrong candidate, so it should sit on it for over three years and test everything to death? 

I concede I am not in your shoes and may not see the big picture you do, but that's all I can come up with, and they don't strike me, given all the merits that have been revealed in these bug threads alone, as compelling reasons not to resolve this "important, high-profile" ticket (comment #110). 

Furthermore, I still don't understand the relevancy of how good other formats are in terms of resolving this ticket, which is about WebP, in any direction, and would truly appreciate clarification.

Regards,
Doug Simmons


(In reply to Nordware from comment #152)
> wow.

Yep.
What does the relative efficiency of the WebP compression algorithm have to do with the fact that we Firefox users cannot enjoy the sites that use WebP images?
There are only political-ish arguments here because people have a hard time understanding the admittedly messy issues. WebP is an experimental image format developed by Google based on WebM, which they pushed with even more ferocity and then more or less self-destructed by breaking their promise to phase out h.264, which they had said it would replace. It was said in prior discussion here that WebP will definitely stay VP8, which I trust is true, but that was also said for WebM. WebP is better than JPEG, but then again pretty much everything newer is. There was a study put out by Mozilla that showed the coming HEVC could be better still. (hell, I'd like to see Daala, but that's even more experimental) WebP is probably an improvement over the current state of affairs, but it's not a clean and simple decision. It's a mess; the decision will be messy, and not quick. (and these bugs really should be locked to prevent comments from newly ad hoc created accounts that make them an even messier din) I too sort of expected a hard "take 2" decision to come out by now, but honestly, it's nowhere near as big a deal as some of the ranting makes it out to be. (and in all likelihood there will be a "take 3", but to actually do an implementation)

We're talking about a whole new image format, to be used by the whole human species for our global communications, probably supported for the rest of "forever", replacing a format we've been using reliably for a couple of decades, that is currently only used in one major browser platform (and the other things using it), and that is still very much experimental, or at least experimental enough to keep the 0.* versioning and still have major feature-changing updates. Our views of the pace of technological development are often skewed because we have yet to settle into any consistent pace. This is still "new" and this general progression is a "big deal". People are going to have to be patient here.

As I noted on the whiteboard up top, there is a summary in comments 105-112. Also, quite frankly, just because there are 162 comments here plus another 187 in bug 600919 does not mean you shouldn't read every single one before commenting here. This is not a discussion forum. (and yeah, by that logic I probably should just let you argue amongst yourselves, but I'm saying this again anyway :/ )
> We're talking about a whole new image format, to be used by the whole human species for our global communications, probably supported for the rest of "forever"

Could be useful to be a bit more pragmatic here. Nothing is as eternal as you say. You know that any format will one day be obsolete, and we'll have to decide what to do with it.
I understand your arguments, truly, but I also think it's not a good idea to block the integration of WebP just because it might not be the ultimate picture format we will all use in the next century. Computing can't see that far ahead.
With each release, the Linux kernel adds and deprecates file systems, and it's still a flipping good kernel.

> very much experimental, or at least as experimental enough to keep up the 0.* versioning

Come on, gettext is at version 0.18, but it's considered stable and it's used in almost every Linux binary (https://www.gnu.org/software/gettext/)

That's the only comment I'll do. As you said, this is not a forum.
Blocks: 935466
Has this bug been delayed or influenced because of behind-the-scenes knowledge of Mozilla's mozjpeg project, announced in https://blog.mozilla.org/research/2014/03/05/introducing-the-mozjpeg-project/ (by Josh Aas who has commented on this bug)?
(In reply to Ralph Corderoy from comment #164)
> Has this bug been delayed or influenced because of behind-the-scenes
> knowledge of Mozilla's mozjpeg project, announced in
> https://blog.mozilla.org/research/2014/03/05/introducing-the-mozjpeg-project/
> (by Josh Aas who has commented on this bug)?

We don't have the data we'd like to see before moving forward on WebP, 'mozjpeg' or not. Furthermore, from that blog post:

"We (at Mozilla) don’t doubt that algorithmic improvements will make this worthwhile at some point, possibly soon. Even after a transition begins in earnest though, JPEG will continue to be used widely."
I hope you recognise that even if WebP weren't more efficient than JPEG (it is), having alpha transparency alone would make it worthwhile.

A few years ago I was involved in creating doctorwho.tv for BBCWW. This website currently has huge banner images that use alpha transparency, and I know for a fact that they'd like to have even more elaborate images. The problem is that these images are absolutely huge to download: one of the banners[1] is 850KB alone; using WebP, the same image could be served at roughly 150KB.

You just can't do that with JPEG, and the chances of getting two browser vendors to accept any other image format are laughable, let alone then getting the tooling into applications and onto web servers. So why can't Mozilla support a solution that at least has traction with another vendor?

As things stand the potential to make better websites has been held back for three years by Mozilla (and Microsoft but that's expected), and I'm left asking myself why, because I'm not getting any answers that make sense from the people I'd usually trust.
Hi!
Maybe this is a solution: http://webpjs.appspot.com/
It even works for IE.
Hi Josh,

> > Has this bug been delayed or influenced because of behind-the-scenes
> > knowledge of Mozilla's mozjpeg project, announced in
> > https://blog.mozilla.org/research/2014/03/05/introducing-the-mozjpeg-project/
> > (by Josh Aas who has commented on this bug)?
> 
> We don't have the data we'd like to see before moving forward on WebP,
> 'mozjpeg' or not.

Noting that didn't answer the question :-) can you bring us up to date on what
data it is Mozilla would like and what efforts are being made to obtain it?
The study results that were previously being waited on were released on
2013-10-17, five months ago tomorrow.  #c147
@Ralph Corderoy: That's exactly what I was thinking about. Mozilla should be a bit more transparent about that. Maybe we could even rally their opinion, if we can clearly understand why.
There isn't a "Mozilla opinion" - there are some people in the Mozilla community who really want WebP, and other people in the community who are skeptical. There is no hidden internal position - and there isn't an "us" vs "them" or anything like that. Gathering data about the utility and exhausting all avenues (e.g., squeezing more out of JPEG and PNG) is what we (Mozilla) are currently doing. It's expensive (not just in money terms) to add a new image format to the Web - so it makes sense to try to squeeze everything we can out of JPEG before adding a new format to the platform. 

As has been said many times, in adding a new format, the format needs to be significantly better than the current set of image formats - naturally, JPEG can't give us alpha transparency, and it might not be able to reach the 30% compression advantage claimed by WebP. I, as a person who works for Mozilla, am personally hopeful that we will get WebP (especially because of alpha transparency on photographs at decent compression). I also know a lot of my colleagues at Mozilla want it - but we don't seem to be quite there yet in having consensus to add this. Getting a few more drops out of JPEG/PNG seems like the right thing to do right now as WebP continues to evolve as a new format (as is supported in Blink-based browsers).
The format already exists on the web. It's embedded in many iOS apps of scale. I used to work for Netflix, where millions of viewers consume WebP images on their smart TVs and in Chrome browsers. Saying that Mozilla adding the format adds it to the web is a bit much, but it would make things significantly easier for implementers. So I ask: what specifically needs to be proven to get sign-off from the current naysayers at Mozilla?
@Marcos Caceres: Thanks for the clarifications.

So, to the skeptical ones at Mozilla, I'd say that an image format, open source, supported by a major firm in the industry, that supports alpha transparency, a pretty good compression and even animation... well... you don't find this a lot in the streets nowadays.
Until proven otherwise, PNG doesn't intend to be a lossy compression format, and JPEG didn't plan to add an alpha channel to their format. So you can squeeze whatever you want out of these formats; I don't think you'll find what web developers are asking for in either of them.

And I've read the Ars Technica article (http://arstechnica.com/information-technology/2013/04/chicken-meets-egg-with-facebook-chrome-webp-support/) and there doesn't seem to be any problem integrating this format now: WebP is mature enough, it offers significant enhancements over both JPEG and PNG (alpha transparency, good lossy compression and animation), and everything is now integrated into a single version: 0.4.0.

I'm not a Google fanboy, it could have been a format from Mozilla, IBM, Sun or even Microsoft, but these conditions are the closest I've seen for a new format to be accepted by both the industry and the developers.
(In reply to romain.failliot from comment #172)

> Until proven wrong, PNG doesn't intend to be a lossy compression format

There is lossy PNG already. I know of 4 methods to do that, 3 of which I've implemented:

http://pngmini.com/lossypng.html

It typically achieves file sizes 3-4 times smaller than a regular PNG, and of course is compatible with all browsers, including IE and iOS.

For example the 850KB image mentioned earlier in these comments:

http://dam.bbcchannels.com/i/2cnpy0000001000

can be compressed as PNG 144KB to 200KB, depending on quality and the lossy method used:

http://imgur.com/a/bjF39


> JPEG didn't plan to add alpha channel to their format. 

ISO committee SC29WG1 is looking into adding alpha channel to JPEG in a backwards-compatible way (I can't link to the relevant e-mail, because Google Groups seems to be missing half of mails from dev-mozjpeg mailing list :( )
I contribute my opinion to the matter of whether or not WebP support is added to Firefox, but I do not unilaterally determine what happens. I'm happy to be transparent. This might get a bit long.

Here’s why I don’t think the case for WebP is good enough, at least right now.

1) We lack data showing that WebP is significantly enough better than JPEG in terms of compression. What "significantly enough" means exactly is up for debate, but the case right now is not super compelling. A lot of the unqualified results people throw out (those without a clear methodology) have to do with re-encoding, where quality isn’t necessarily being maintained and re-encoding properly with a JPEG encoder would also have improved file sizes.

2) Last time I checked, it was not possible to create large WebP images. I couldn't encode a ~20 megapixel image. These images are already on the Web and they're only going to get more popular. Adding a new format that can’t handle these images would be unwise. This is probably fixable, but last time I checked it wasn't done.

3) I suspect it's unlikely that MS will agree to include WebP support in IE, maybe ever. Not having MS on board, given their market share, is problematic. It means lots more header/UA checks and double solutions for every use of WebP, possibly for a long time.

4) I haven't done extensive testing on this yet, but word is that WebP compression advantages fall off when an image gets larger than about 500x500 pixels. This might be why we see WebP perform a bit worse on the Tecnick image set (~1200x1200) than the Kodak set (~768x512) in my last study. This may also be impacting other peoples' tests. I'm curious to know more about this.

5) Users can't do much with WebP images today if they save them. As Facebook learned, this frustrates users. As this is a bit of a chicken-and-egg problem it's less important, but it is a consideration.

I also think that from a technology perspective we're already capable of doing much better than WebP, and I wonder if we can't do significantly better in a reasonable amount of time. We can't wait forever to ship something because the cutting edge is never quite ready, but WebP is pretty far from the cutting edge (e.g. HEVC). I would be a bit disappointed if after waiting 20+ years WebP is the best we can do for replacing JPEG. That said, moving the ball here is tough and I respect the WebP folks for trying to do so, getting as far as they have, and spurring action on the topic. I’m just not convinced WebP is enough yet.

I’ll mention Daala, since it sometimes comes up. The timeline for a potential Daala still-frame format is unclear. If this path were pursued, an encoder/decoder impl is probably significantly more than a year out. We’re not putting our eggs in this basket at this point, but it would be cool if it worked out. Working out means Daala has to perform as hoped, then we’d have to choose to pursue a still-frame format and spend engineering time on the necessary tools.

As for “alpha alone is worth it”… Alpha would be great, but I don’t agree, in part because anyone who used WebP for alpha would be creating more work for themselves because a non-WebP solution would still be needed. Firefox supporting WebP is not going to change the fact that a large number of users (most?) won’t be able to do anything with WebP (unless you want to use WebPJS, which you can use now).

I hope to do another image study some time in the next couple of months. It would include changes based on feedback from the last study, updated versions of encoders (including WebP 0.4), results for mozjpeg, and maybe some new metrics. As always, we’ll re-evaluate based on the new data. If WebP shows solid gains, and there is no hope for anything better in the near-to-mid term, I may change my position and I'll advocate that others do so as well.

For those who perceive that we aren’t paying enough attention to data people give us: We do pay as much attention as we can, but getting good data is complicated and doing a thorough analysis of reports takes a lot of time. We can’t afford to look at everything as deeply as we’d like to determine its validity. The best thing for us is to take notes on feedback and suggestions and integrate them into our own testing.
(In reply to porneL from comment #173)
> (In reply to romain.failliot from comment #172)
> 
> > Until proven wrong, PNG doesn't intend to be a lossy compression format
> 
> There is lossy PNG already. I know of 4 methods to do that, 3 of which I've
> implemented:
> 
> http://pngmini.com/lossypng.html
> 
> It typically achieves file sizes 3-4 smaller than a regular PNG, and of
> course is compatible with all browsers, including IE and iOS.
> 
> For example the 850KB image mentioned earlier in these comments:
> 
> http://dam.bbcchannels.com/i/2cnpy0000001000
> 
> can be compressed as PNG 144KB to 200KB, depending on quality and the lossy
> method used:
> 
> http://imgur.com/a/bjF39
> 

If I understand correctly, lossypng modifies the source before (lossless) PNG compression.
Therefore, it could _also_ be used to modify the source before feeding it to WebP compression.
This would reduce the size further to 129042 bytes: 
   https://drive.google.com/file/d/0BzaRBGNsqD0ycmJ2eDBiNFJ1eDg/edit?usp=sharing
using the same pre-processing technique.

> 
> > JPEG didn't plan to add alpha channel to their format. 
> 
> ISO committee SC29WG1 is looking into adding alpha channel to JPEG in a
> backwards-compatible way (I can't link to the relevant e-mail, because
> Google Groups seems to be missing half of mails from dev-mozjpeg mailing
> list :( )
I'm going to play devil's advocate a little here, because I completely agree with the stance in comment 174.

(In reply to Josh Aas (Mozilla Corporation) from comment #174)
> 1) We lack data showing that WebP is significantly enough better than JPEG
> in terms of compression. What "significantly enough" means exactly is up for
> debate, but the case right now is not super compelling. A lot of the
> unqualified results people throw out (those without a clear methodology)
> have to do with re-encoding, where quality isn’t necessarily being
> maintained and re-encoding properly with a JPEG encoder would also have
> improved file sizes.

The reality is that it's not WebP vs. JPEG, it's WebP vs. crappy JPEG in most cases. If WebP is easier to not screw up, that's actually a fair comparison. Much of the win comes from shuffling aside old JPEG encoders rather than just having a better codec. That is a real result (even if we'd rather it not be).

> 3) I suspect it's unlikely that MS will agree to include WebP support in IE,
> maybe ever.

I've been having a hard time predicting what they will actually do in recent years. Also, I don't really care. It's not our job to worry about creating another thing web developers will have to special case for IE.

> 4) I haven't done extensive testing on this yet, but word is that WebP
> compression advantages fall off when an image gets larger than about 500x500
> pixels.

WebP is designed for the web (at least so sayeth the name) so it may just not be as important for higher resolutions. It's conceivable that WebP could coexist with JPEG and be preferred in cases where it is better. A lot of developers would like a better codec for small images, probably the ones that really want alpha.

> I’ll mention Daala, since it sometimes comes up. The timeline for a
> potential Daala still-frame format is unclear. If this path were pursued, an
> encoder/decoder impl is probably significantly more than a year out.

Only a year or so? I was getting the impression that Daala was even further off than that, unfortunately. (though this is not to say that waiting for it isn't possibly a viable option)

> I hope to do another image study some time in the next couple of months.

I'd also like to see some comparisons of various common JPEG encoders alongside WebP and mozjpeg. I'm interested to know the current playing field better.

What I'd most like to see is if there's some way an HEVC based still image format could be slapped together with the features everyone wants instead of resorting to VP8 just to "get something done". Once HEVC video comes out we will be stuck with it for ages, so we might as well get a companion still image format to go with it. (yes, software patent lunacy comes into play, but it's always possible that something sane could be done here if we do it from the start)

> As always, we’ll re-evaluate based on the new data. If
> WebP shows solid gains, and there is no hope for anything better in the
> near-to-mid term, I may change my position and I'll advocate that others do
> so as well.

Obvious question: what is your ballpark for "near-to-mid" term?
@Pascal: You're right WebP is still smaller. It's just that the gap is now about 10% rather than 550%.

I'm not saying PNG is better than WebP, just that lossy PNG encoders narrow the gap to the point PNG is an option in scenarios where you'd otherwise need WebP.

The tools for lossy PNG are tuned to characteristics of PNG compression (e.g. specific PNG filters), so are suboptimal for WebP, but it should be possible to hack lossless WebP to be lossy in a similar way. However, I'd rather see proper lossy WebP fixed instead, e.g. support full chroma resolution, use HD and intra-frame improvements from VP9, etc. 

I would be disappointed if we adopted WebP based on an 8-year-old VP8 codec when Chrome ships with the brand new VP9 already. IMHO if WebP becomes mainstream it's going to outlive the VP8 WebM, and it's going to be silly to have WebP quality limited for compatibility with a bitstream of something that's already becoming a legacy codec :(
I see it mainly as WebP vs lossy PNG with alpha. Maybe the effort should be directed towards a drop-in replacement for libpng with a simple quality parameter.
I recently set up a small benchmark to compare well known PNG optimization tools (ImageOptim, ScriptPNG…) and added WebP 0.4.0 to the mix.
http://frdx.free.fr/png_vs_webp.htm

ScriptPNG is pretty interesting since it runs at least 4 different tools in a row (it used even more in the past, but its author chose speed over the absolute smallest file).
It means that nowadays to get top class PNG optimization you have to run tools that:
- clean the alpha channel in a more advanced way than simple blackening of fully transparent pixels (truepng or cryopng)  — PNG-32 case —
- sort its palette in a clever way (truepng) — PNG-8 case —
- find the best PNG filters (pngwolf(z), or possibly just zopflipng)
- produce a better compressed Deflate stream than zlib (pngout and zopfli)
and if you want some icing on the cake:
- optimize the Deflate block headers (deflopt or defluff)

Zopflipng tries to perform most of these tasks on its own but it's still not the silver bullet of PNG optimization.
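As a tiny synthetic illustration (my own, not a model of any of the tools above) of why the "find the best PNG filters" step matters: Deflate alone can miss structure that PNG's Sub filter exposes, so the same pixels compress far better once filtered.

```python
import zlib

def sub_filter(row: bytes) -> bytes:
    """PNG 'Sub' filter: replace each byte with its delta from the left neighbor."""
    return bytes((row[i] - (row[i - 1] if i else 0)) & 0xFF for i in range(len(row)))

# Synthetic 256x256 grayscale "image": a smooth quadratic ramp per scanline,
# brightness-shifted per row so no two scanlines are byte-identical.
rows = [bytes(((x * x) // 8 + y * 5) & 0xFF for x in range(256)) for y in range(256)]

raw = zlib.compress(b"".join(rows), 9)
filtered = zlib.compress(b"".join(sub_filter(r) for r in rows), 9)

# After filtering, every scanline carries the same small-delta pattern,
# so the Deflate stream shrinks dramatically compared to the raw pixels.
print(len(raw), len(filtered))
```

The real optimizers go further, picking a possibly different filter per scanline and trying better Deflate encoders, but the principle is the same.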

In my test a dumb (well… if WebP 0.4.0 hadn't shipped with broken alpha cleaning) cwebp run usually produces clearly smaller lossless WebP files compared to their PNG-32 counterparts.
On the PNG-8 front the results are much more mixed; overall WebP is a little bit better (not even a percent), but there are still some smaller PNG-8 files.

I did not run the lossy - VP8 - part yet.

I would say that WebP is promising (on the lossless side at least) and is the way to go; keep in mind that these compression tools do not have the same maturity (I could have beaten Usain Bolt over 200m when he was 2 years old, since I'm about 15 years older than he is).
Personally I'm more inclined to focus my efforts on improving WebP and taking advantage of its features (the 2048-color cache, for instance) rather than hacking old image formats (mozjpeg, seriously? What is it going to produce, a free version of JPEGmini? Instead, foster arithmetic-coded JPEG and there's your 8% improvement in a snap; turn arithmetic coding back into Huffman on the fly when users save their images if you're afraid they can't open them elsewhere).
(In reply to porneL from comment #173)

> ISO committee SC29WG1 is looking into adding alpha channel to JPEG in a
> backwards-compatible way (I can't link to the relevant e-mail, because
> Google Groups seems to be missing half of mails from dev-mozjpeg mailing
> list :( )

Ask Thomas Richter… but if you are talking about what was envisioned in the early days of the JPEG-XT project I believe that the alpha channel part was not approved. JPEG-XT is now just a simplified JPEG with a backward compatible lossless optional layer.
(In reply to porneL from comment #177)
> @Pascal: You're right WebP is still smaller. It's just that the gap is now
> about 10% rather than 550%.

^This.

If it isn't providing an improvement that is an order of magnitude better, then what is the point?  If you are looking for incremental improvement, you do what Mozilla is doing with mozjpeg.


> The tools for lossy PNG are tuned to characteristics of PNG compression
> (e.g. specific PNG filters), so are suboptimal for WebP, but it should be
> possible to hack lossless WebP to be lossy in a similar way. However, I'd
> rather see proper lossy WebP fixed instead, e.g. support full chroma
> resolution, use HD and intra-frame improvements from VP9, etc. 
> 
> I would be disappointed if we adopted WebP based on an 8-year-old VP8 codec
> when Chrome ships with the brand new VP9 already. IMHO if WebP becomes
> mainstream it's going to outlive the VP8 WebM, and it's going to be silly to
> have WebP quality limited for compatibility with a bitstream of something
> that's already becoming a legacy codec :(

^ Thank you.

The algorithms behind Daala are where the landscape meets the sky; rushing to adopt WebP as a standard is rushing to adopt legacy code which will only be used in closed systems (such as Netflix).  No thank you.

Just like Ogg vs MP3 or WebM vs H.264, the next opening for market acceptance will be for the next-generation codec, like Opus.

Wake us up when you are willing to throw financial resources at bringing Daala along and have a feature set that will replace GIF, PNG, JPEG and perform some new tricks as well, like a responsive image container.  Because even with all of those advantages, we will have a very hard time trying to get Apple and MS to adopt it ... just like we are having with Opus!
For me the big advantage of WebP over mozjpeg is that no matter what you do, .jpg doesn't support translucency. You guys keep mentioning another format: HEVC. Even if it's a better format, WebP is already in use. It's in use by Facebook. It's supported by most WebKit browsers. According to caniuse.com, it's supported by about 43% of installations. Personally I say if it's that big already, you should just add it and get it over with and stop whining about it.
I'll throw in my vote for ANY method of supporting lossy + transparency images. I'd prefer JPEG-XR over WebP for this as it's fully ISO-standardised and quite stable at this point. Better still might be adding transparency support to JPEG somehow, as many mobile devices have hardware support for decoding JPEG streams. Maybe a PNG stream could be added to the end of a JPEG filestream to be interpreted as the alpha channel?

How about this: a CSS property for the transparency channel, like so:
div.cutoutPhoto {
background-image: url("cutoutPhoto.jpg");
background-image-mask: url("cutoutPhoto-mask.png");
}
Where cutoutPhoto-mask.png is a grayscale PNG representing the alpha channel. This would be very useful for cut-out photos or for allowing noisy imagery with alpha in web designs, but it's far more important for games. It's a real problem with web-based games that all transparent images *must* be PNGs. It's pretty sad that this is 2014 and no one has made a move on this issue. I'd like WebP support, but it's not the only way to solve this problem. I'd like to think there are people at Mozilla thinking about this.
(In reply to Daniel Mota Leite from comment #185)
> >Maybe a png stream could be added to the end of a jpeg filestream to be interpreted as the alpha channel?
> 
> NO! Please no! we already have several transparent image formats, no need to
> reinvent one more, and even worst, a hugly hack on a existent standard!!
> https://xkcd.com/927/

It already exists anyway; it's called JPEG Network Graphics, or JNG:
http://en.wikipedia.org/wiki/JPEG_Network_Graphics
It works a bit differently, since it is a close relative of PNG with a JPEG image in a specific chunk, and it has already been rejected by Mozilla:
https://bugzilla.mozilla.org/show_bug.cgi?id=88020
Its main advantage was that it would not have required a lot of new code: there's no new compression scheme, it's more or less just a container to pack a JPEG image and an alpha/mask layer together.
(In reply to Zach Lym from comment #182)
> (In reply to porneL from comment #177)
> > @Pascal: You're right WebP is still smaller. It's just that the gap is now
> > about 10% rather than 550%.
> 
> ^This.
> 
> If it isn't providing an improvement that is an order of magnitude better,
> than what is the point?  If you are looking for incremental improvement, you
> do what Mozilla is doing with mozjpeg.

The problem is that the technique being discussed requires human curation in order to ensure the image quality is up to standard. When serving images through an automated service that rescales or crops the image, it isn't practical to use these tools.

So, speaking specifically for the use case of BBCWW, that 850KB PNG can only be solved by employing more humans to curate the image, or by adopting WebP or another format with similar features.

> > I would be disappointed if we adopted WebP based on an 8-year-old VP8 codec
> > when Chrome ships with the brand new VP9 already. IMHO if WebP becomes
> > mainstream it's going to outlive the VP8 WebM, and it's going to be silly to
> > have WebP quality limited for compatibility with a bitstream of something
> > that's already becoming a legacy codec :(
> 
> ^ Thank you.
> 
> The algorithms behind Daala is where the landscape meets the sky, rushing to
> adopt WebP as a standard is rushing to adopt legacy code which will only be
> used in closed systems (such as Netflix).  No thank you.

Daala is shaping up to be a great thing, but it isn't here yet and won't be for some time. Yes, in a couple of years it might be usable, but by then we will have wasted five years doing nothing while a perfectly reasonable solution is ignored.

As for your "closed systems" argument, I'm not at all sure what point you're trying to make.
Rowan Lewis said:

> The problem is that the technique being discussed requires human curation in order to ensure the image quality is up to standard. When serving images through an automated service that rescales or crops the image, it isn't practical to use these tools.
>
> So, speaking specifically for the use case of BBCWW, that 850KB PNG can only be solved by employing more humans to curate the image, or by adopting WebP or another format with similar features.

However, at Netflix we re-encoded hundreds of thousands, if not a million, images to WebP without hand curation, successfully. Our customers seem pretty happy.
(In reply to nyteshade from comment #190)
> Rowan Lewis said 
> 
> The problem is that the technique being discussed requires human curation

You're probably thinking about pngmini.com which is a GUI for manual conversion, but the lossy PNG techniques and tools don't require manual supervision. 

For example pngquant/libimagequant takes quality setting identical to libjpeg/cwebp, so you can easily batch lossy compression of millions of PNG images.
> Daala is shaping up to be a great thing, but is isn't here yet and won't be
> for some time. Yes, in a couple of years it might be usable, but by then we
> will have wasted five years doing nothing while a perfectly reasonable
> solution is ignored.
> 
> As for your "closed systems" argument, I'm not at all sure what point your
> trying to make.

The whole point is that WebP is only a reasonable solution for closed systems.  People keep bringing up Netflix while ignoring the fact that Netflix still relies on Silverlight.  Netflix can do whatever it wants because their material doesn't have to interoperate with any other website or software.  Facebook trialed WebP but their users rejected it because they could not share the images with friends using browsers incapable of dealing with WebP images.

Adding WebP would add yet another defacto-nonstandard-standard to the mix.  Daala being years away from being ready is why it has a real chance of gaining any acceptance.  Even VP9 and HEVC are both going to struggle for any adoption given the dominance of H.264, and the same is true for WebP vs JPEG and PNG.  It's like Beta vs VHS or Blu-Ray vs HD-DVD: you are battling over features and specs in a war of market forces.

To force Apple and IE to accept any new standard requires overwhelming force, something we have no hope of kindling until the next major standards/technological refresh. We might as well save our resources for when we have a chance at winning.

I'm tired of this filling up my inbox, can someone please restrict or otherwise lock this thread?
I'm going to restrict comments on this bug to users with editbugs permissions, at Josh's request, since the recent discussion is not constructive, and Bugzilla isn't meant as a discussion forum. mozilla.dev.media (http://www.mozilla.org/about/forums/#dev-media) is the relevant discussion forum for this topic.
Restrict Comments: true
Summary: Implement WebP image support, take 2 → (WebP) Implement WebP image support, take 2
Whiteboard: [SUMMARY in comments 105-112][parity-chrome][fuzzing:queue:cdiehl] → [SUMMARY in comments 105-112][parity-chrome][parity-opera][fuzzing:queue:cdiehl]
Blocks: 1193354
Web Compatibility issue with Zeit.de
https://webcompat.com/issues/1644
Another important German site: faz.net in https://webcompat.com/issues/1714
I don't know if FAZ and Zeit use the same backend or are developed by the same company. m.faz.net seems to serve markup that loads webp images if there is an "Android" string in the User-Agent.
Alias: WebP
Summary: (WebP) Implement WebP image support, take 2 → Implement WebP image support - take 2
Discussions aside for different image formats being "used in closed environments", I can see WebP support being important for Mozilla, even if just to have at least one truecolor lossy+alpha format available (which is currently not the case, since PNG=lossless and JPG=no-alpha).

I've personally worked with some people @Microsoft to try and figure out what the status is for JPEG-XR and its past work done for Firefox, but apparently that was a WIP that went into a vault as an undocumented project. This means we're looking at WebP as the only candidate to fill that gap right now. Adoption of the format is slow but steady, and 0.4 seems to be a stable implementation.

My question is: what exactly would it take to pull the current patch on this bug forward? It would probably take minimal work to just get the decoder working (I'm not sure saving out WebP should be considered a priority), but since image handling was moved to the Surface Cache, the patch will need a bit of adjustment. (The only real problem I see in the decoder seems to be the mImage.EnsureFrame line in ::WriteInternal?)
Adding a webcompat issue created by the lack of webp support
https://webcompat.com/issues/2634
FWIW, it is reported that Safari is going to support WebP in iOS 10 and macOS Sierra, which would leave us as the only browser on macOS that doesn't support WebP. [1]

According to my friend's test, that version of Safari still doesn't support animated WebP, nor WebM, though. [2]

Based on that, I suggest we should prioritize implementing this, otherwise we may see more compatibility issues coming.


[1] https://groups.google.com/a/webmproject.org/d/msg/webp-discuss/J8HLhTaklYE/RAtX14MEAQAJ
[2] https://twitter.com/bobtung/status/753395186911227904 (Chinese)
Whiteboard: [SUMMARY in comments 105-112][parity-chrome][parity-opera][fuzzing:queue:cdiehl] → [SUMMARY in comments 105-112][parity-chrome][parity-opera][fuzzing:queue:cdiehl][parity-safari]
> I suggest we should prioritize implementing this, otherwise we may see more compatibility issues coming

Agreed.
Blocks: 1222509
In addition to what Xidorn is saying

PageSpeed converts images on the fly from JPEG to WebP
https://developers.google.com/speed/pagespeed/module/filter-image-optimize

Bug 1222509 is a Web compatibility issue, where Firefox for Android with a version number is identified as a Blink UA (and thus as supporting WebP) and then gets sent WebP images. I'm trying to figure out whether that's a local PageSpeed configuration for the site or general PageSpeed behavior.
I thought PageSpeed only converted images to WebP if it detected support (dunno if via UA sniffing or accept headers).

> PageSpeed automatically detects whether the browser supports WebP, and if it does, which features it supports.
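The cleaner of the two detection mechanisms mentioned above is the Accept request header rather than UA sniffing: browsers that decode WebP advertise `image/webp` in Accept, and a server can key off that. Here's a minimal server-side sketch of that negotiation (the function name and fallback choice are illustrative, not PageSpeed's actual implementation):

```python
# Sketch: choose an image MIME type to serve based on the browser's
# Accept header, instead of sniffing the User-Agent string.
def pick_image_type(accept_header):
    """Return "image/webp" if the client advertises WebP support,
    otherwise fall back to a universally decodable type."""
    # Accept is a comma-separated list; drop any ";q=..." parameters.
    offered = [part.split(";")[0].strip()
               for part in accept_header.split(",")]
    if "image/webp" in offered:
        return "image/webp"
    return "image/jpeg"  # safe fallback every browser decodes
```

Firefox (at the time of this bug) does not send `image/webp`, so a correctly configured rewriter should never hand it WebP; the webcompat reports above suggest some deployments use UA sniffing instead.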
Seeing as we're up to comment 209 here, I highly suggest a "take 3" bug be spun off for actual implementation (which I do encourage to be prioritized, based on apparent adoption rates).
Another one related to pagespeed module. 
https://webcompat.com/issues/3050 sending WebP images to Firefox Android.
Work on take 3 has started. Duping this bug to the new one.
Alias: WebP
Status: NEW → RESOLVED
Closed: 8 years ago
Resolution: --- → DUPLICATE