Missing 'image/webp' in default navigation value of the 'Accept' header
Categories
(Core :: Networking: HTTP, defect, P2)
Tracking
()
People
(Reporter: obud, Assigned: CuveeHsu)
References
(Regression)
Details
(Keywords: regression, site-compat, Whiteboard: [necko-triaged])
Attachments
(2 files)
User Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:66.0) Gecko/20100101 Firefox/66.0
Actual results:
Since Firefox 66, 'image/webp' has been missing from the default value of the 'Accept' header.
Expected results:
Could 'image/webp' be restored to the default value of the 'Accept' header?
Comment 1•6 years ago
:baku I think this should be closed as Invalid, but double-checking with you because of bug 1417463, comment 22.
Hello,
removing "image/webp" in default Accept header when requesting one html page with embedded pictures to a "mod_pagespeed" or "nginx_pagespeed" web server prevent to serve webp.
webp were served with Firefox 65, jpeg are now served with Firefox 66.
This only concern html embedded pictures. Direct requests to picture are served with webp as accept header is "image/webp,/".
Chromium/Chrome have default accept header set with "image/webp" as it was in Firefox 65.
I think it could be considered as a regression.
Thanks,
Eric
see :
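A minimal sketch of the server-side detection Eric describes (hypothetical Python in a WSGI-style handler, not mod_pagespeed code): the HTML generator inspects the navigation request's Accept header and only emits WebP URLs when the client advertises support.

def pick_image_extension(environ):
    # The navigation request's Accept header decides the markup we emit.
    # Firefox 65 sent "...,image/webp,..."; Firefox 66 dropped it.
    accept = environ.get("HTTP_ACCEPT", "")
    return "webp" if "image/webp" in accept else "jpg"

def render_page(environ):
    ext = pick_image_extension(environ)
    # Image path is a placeholder for illustration.
    return '<img src="/img/photo1.%s">' % ext

With Firefox 65 this emits .webp URLs; with Firefox 66's header it falls back to .jpg, which is the behavior change reported here.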
Comment 4•6 years ago
Andrea, how should we proceed here?
Is this related to bug 1417463 comment 7?
By spec, imglib uses the "wrong" content type, but I want to talk with the image peers before changing it. Maybe it would be a follow-up.
I can confirm that PageSpeed does not send WebP with the FF67 request headers. While I cannot speak to what should really be in the default value, I wouldn't want FF to be perceived as less performant for sites with convert_jpeg_to_webp and/or convert_to_webp_lossless enabled.
Can this be tracked for a release version? Thanks!
Updated•5 years ago
Definitely a massive regression/bug. Tested against the ImageWrapper of the Futureweb SAAS CMS: since the WebP Accept header is gone, all (HTML-embedded) images are delivered as JPEG instead of WebP again, rendering WebP support in FF rather useless.
According to "https://hacks.mozilla.org/2019/01/firefox-65-webp-flexbox-inspector-new-tooling/", the "Accept: image/webp" header should be sent by FF for server-side WebP support detection.
Quote: You can also detect WebP support on the server-side and serve images as appropriate, as supported browsers send an Accept: image/webp header when requesting images.
This would indeed be nice to get fixed. Removing it from the default Accept header is basically a step backwards; it makes detection of WebP nearly impossible, or at least cumbersome, since we now have to check whether the specific Firefox version supports WebP or not.
Updated•5 years ago
Updated•5 years ago
7 comments hidden (off-topic)
Comment 16•5 years ago
Abandoning needinfo request from 6 months ago.
Comment 17•5 years ago
@Gingerbread Man
Hello,
What more info do you need?
Thanks,
Eric
Comment 18•5 years ago
status-firefox66: --- → affected
status-firefox65: --- → not affected
Comment 21•5 years ago
(In reply to eldk from comment #17)
What more info do you need?
The needinfo flag was for getting the attention of the patch author of the regressing bug.
Comment 22•5 years ago
Nhi, can you help find an owner for this, maybe for 72?
Comment 23•5 years ago
Looking for Andrea's recommendation per comment 4.
Updated•5 years ago
Comment 24•5 years ago
By Fetch spec: https://fetch.spec.whatwg.org/#fetching
4.1.3:
If request’s header list does not contain `Accept`, then:
Let value be `*/*`.
If request is a navigation request, a user agent should set value to `text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8`.
Otherwise, a user agent should set value to the first matching statement, if any, switching on request’s destination:
"image"
`image/png,image/svg+xml,image/*;q=0.8,*/*;q=0.5`
"style"
`text/css,*/*;q=0.1`
We are following the spec. If we want to change the default accept header, the correct way is to file a spec issue.
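For reference, the quoted step boils down to a simple switch on the request kind; a sketch of that logic (illustrative Python, not Gecko source):

DEFAULT_NAVIGATION = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"
DESTINATION_DEFAULTS = {
    "image": "image/png,image/svg+xml,image/*;q=0.8,*/*;q=0.5",
    "style": "text/css,*/*;q=0.1",
}

def default_accept(is_navigation, destination):
    # Fetch 4.1.3: start from */*, then specialize by request kind.
    if is_navigation:
        return DEFAULT_NAVIGATION
    return DESTINATION_DEFAULTS.get(destination, "*/*")

Note that image/webp appears in none of these spec defaults, which is exactly the point of contention in this bug.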
Comment 25•5 years ago
(In reply to Andrea Marchesini [:baku] from comment #24)
By Fetch spec: https://fetch.spec.whatwg.org/#fetching
4.1.3:
If request’s header list does not contain `Accept`, then:
Let value be `*/*`.
If request is a navigation request, a user agent should set value to `text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8`.
Otherwise, a user agent should set value to the first matching statement, if any, switching on request’s destination:
"image"
`image/png,image/svg+xml,image/*;q=0.8,*/*;q=0.5`
"style"
`text/css,*/*;q=0.1`
We are following the spec. If we want to change the default accept header, the correct way is to file a spec issue.
I'm all for following the spec in general. Now, I'm sure there are plenty of cases where Firefox doesn't follow the spec and implements its own nifty features, because no spec is available yet or because the spec doesn't fit real-world cases, but I guess that's another discussion ;)
However, not sending image/webp greatly hinders Firefox users from utilizing a web format that benefits the web (both end users and companies) by reducing the bytes of data transmitted over the link.
I'm very curious how Firefox expects sites to implement WebP delivery when the only place we can effectively do the detection is the initial request (for the HTML document). Doing the detection when each image is requested is both complicated and a waste of processing resources: you end up evaluating tens and sometimes hundreds of requests on a given page, checking for image/webp support on every single one.
One cannot do this check on the first request and then "mark" all other requests as WebP-capable (especially under binary protocols, where we're not really sure what comes first); systems are not built for that, and shouldn't be built for that either.
Your lack of image/webp moves logic from a logical place to somewhere illogical.
Since the browser requests images/myawesomecat.jpg, it is expected that a JPEG is delivered back, not WebP; it's illogical to return another image format when the extension is .jpg or .png, for example.
Another way to fix it would be a 302 redirect to the WebP image. You'd probably not want a 301 redirect, since browsers cache those heavily, and if an image disappeared you'd have an issue.
Therefore the detection of whether a browser supports WebP should happen where the source of the page is generated, and for that we need a reliable signal (Accept: image/webp).
Does implementing image/webp in the navigation request break the spec? Sure it does. But as a browser vendor you should know that specs are usually made based on how the world operates.
Certain browsers (Firefox being one of them) implemented SPDY support, which later turned into a standard that has to follow a spec. If browsers never implemented features before a spec was created, features would never actually get added.
So using the excuse "we simply follow the spec" is kind of silly: specs and standards adapt based on where the world is headed. Without "first movers", HTTP/2 and HTTP/3 would never have become a thing :)
But sure, if Firefox wants to hinder the adoption of WebP and prevent its users from benefiting from an image format that is in many cases superior to the "competition" (JPEG and PNG), then keep it as it is.
However, if Firefox wants to be good to its users, bring back image/webp in the Accept header for the navigation request.
Alternatively, please as a browser vendor, come up with a very good idea on how the detection can be done:
- Without using 302 redirects
- Without sending back a webp image when people request jpg or png extensions
- Without having to evaluate 10s or 100s of requests to do a logic condition
- Without having a silly implementation that prevents people from actually implementing it
I'm looking forward to hearing from Mozilla how they'd want site owners and the internet to adapt for serving WebP, in such a way that Firefox users can benefit from new technologies.
Your decisions are hurting your own users, simply because "we simply follow the spec".
I'm surprised that you don't see the logic in the issue and just shrug it off with "we are following the spec".
Comment 26•5 years ago
I do think we need to fix this issue, but I want a more generic fix than giving Firefox a different Accept header value.
Specs can be changed easily, the Fetch spec in particular, I hope. Anne, do you know if there is already a spec issue about introducing webp into the Accept header value?
Comment 27•5 years ago
https://github.com/whatwg/fetch/issues/274#issuecomment-545501458 is the latest there. There's not much agreement, it seems, on all sharing the same value. As far as I can tell we already send image/webp, but maybe that's only on Nightly. If there's anything specific you'd like to see, let me know.
Assignee
Comment 28•5 years ago
image/webp is not in the default value of Accept.
Comment 27 is about the default image value of Accept, I guess?
Comment 29•5 years ago
Thanks for pointing that out, Junior! Accept when navigating in browsers:
- Chrome: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9 (ouch!)
- Firefox: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
- Safari: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
And Fetch suggests text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8.
I'm not a big fan of the continued expansion of the Accept header, especially since there's another image format around the corner too, but if it makes more content work I'd suggest amending ours to
text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
and perhaps experimenting on Nightly with
text/html;q=0.9,image/webp,*/*;q=0.8
to try to reduce the overall length somewhat (and, if successful, suggest it to the others).
Comment 30•5 years ago
Generally speaking, we (and many others) detect WebP support based on the Accept header sent in the navigation request, simply looking for image/webp. This way we can be sure, on the server side, to generate the source containing WebP, and use Vary headers to serve a specific cache item (based on whether WebP is supported or not).
I agree that if there are tons of formats coming up, continuing to extend the Accept header is not ideal. But I'm genuinely curious about other (reliable) ways of easily figuring out support for WebP, for example. Sure, we can match on the user agent, but then we need really weird matching, such as targeting Firefox 65 but not 66, 67, 68, 69, 70, then maybe version 71 and above again; and we have to do this for every browser and hope user agents stay fairly consistent.
We're talking about the navigation request here; it's a single request, so the Accept header may be a bit long. (One could argue that maybe text/html,application/xhtml+xml,application/xml;q=0.9 shouldn't be there when it's sent by all browsers; I know it's more complicated than that, but anyway.)
If anyone has a better idea for detecting WebP reliably, one that can be implemented without having to "hack" stuff and return different content than the requested file type, and without maintaining a long list of user agents, then fine; otherwise I'd really prefer adding the extra bytes to the header (which will be compressed under HPACK anyway).
However, it would be nice to find a solution for browsers to advertise support for certain technologies that all (major) browsers would adopt.
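A sketch of the caching scheme described above (hypothetical Python; generate_html and the cache object are stand-ins, not from any named server): the cache is keyed on WebP support, and the response carries Vary: Accept so shared caches keep the variants apart.

def handle_navigation(headers, cache, generate_html):
    supports_webp = "image/webp" in headers.get("Accept", "")
    key = ("/page", supports_webp)  # one cache item per variant
    body = cache.get(key)
    if body is None:
        body = generate_html(webp=supports_webp)
        cache[key] = body
    # Vary tells intermediary caches that the body depends on Accept.
    return {"Vary": "Accept", "Content-Type": "text/html"}, body

This is why the navigation request's header matters here: the whole page variant, not each image, is selected and cached by that one signal.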
Comment 31•5 years ago
Does anyone have any background reading for someone curious to know why sites actually need this, rather than leaving it up to the browser to choose which variant of the image it wants via srcset and distinct URLs for each available variant? Is it simply because there is no pure-CSS equivalent of srcset that's broadly available? I doubt that anyone truly wants Accept to eventually go the way of User-Agent, after all. Maybe the Client Hints/User Agent Hints specs are more suited for this kind of thing, if there are reasons it is required instead of the srcset approach?
Comment 32•5 years ago
It’s already mentioned why it’s needed.
It’s used for detecting whether a browser supports WebP, so sites can do replacements server-side and generate the page source with either WebP or the normal image, thus also being able to fully cache it.
You can only use srcset by using the source element to specify the image format as well, and hope the browser picks the right image. Now, the issue with the source element is that it’s not super widely supported in browsers, so one would have to use a JavaScript polyfill, and we end up adding even more work for browsers that do not support the source element, further growing the client-side code that has to be executed.
You’ll also find it unlikely that the world is going to change from img tags to source tags where we can actually specify the image type (WebP or JPEG, for example), because it complicates things a whole lot from an integration perspective. To satisfy the majority of the world, integrators building tools to implement WebP on sites would have to find all image tags, replace them with source srcsets, and add a polyfill to the mix as well.
So a TL;DR:
to serve WebP versions of a site to WebP-supporting browsers, like we currently do for Chrome (and co.) users :)
Comment 33•5 years ago
Oh, and yes: because of the lack of source srcset in CSS, as you already pointed out.
Comment 34•5 years ago
Thanks, but I still don't really know why it's necessary to do this just based on those arguments. I'm not personally against it, mind you; it sounds like a fine way to save integrators time right this second. But we already see HEIF, AVIF, JPEG-XL and other formats on the horizon. And what about high DPI screens or HDR support? How complex will your logic have to be, instead of letting the UA decide? How will you know what the user's system handles best (hardware acceleration, temporarily broken codecs, etc)?
These are already problems (for instance, certain social media sites wanting UAs to send hints so they can choose whether to serve high-DPI images, rather than just letting the UA choose). I think now is the perfect time to not just seek a quick fix, but to find out what makes integrators not want to change their approach. If it's simply because the tools aren't yet available or mature (CSS support, quicker ways to manifest supported image formats than source tags, etc), then let's focus on that too, not just a quick fix. After all, trying to guess the best video codec has caused nothing but problems for video sites, and now it's poised to become a similar problem for images.
Of course, bear in mind that I'm only saying this as a concerned and curious webcompat engineer, not an integrator or decision maker. To me, a compelling enough argument for Accept: webp would be "there are lots of servers already relying on it, which will probably never be updated".
Comment 35•5 years ago
But we already see HEIF, AVIF, JPEG-XL and other formats on the horizon. And what about high DPI screens or HDR support?
We can already handle high-DPI screens using srcset on img tags and in CSS.
It's correct that with HEIF, AVIF, and JPEG-XL, adding every image type to the Accept header makes it long. (From a transport perspective this doesn't really matter much: the header is compressed anyway, and Accept is part of the static table definition in HPACK, so we won't even resend it for other requests unless it changes. So even though the header would be longer with those types added, they wouldn't actually add much to the transport.)
As I've mentioned earlier, as browsers catch up to support a specific type (and older browsers that don't support it go EOL), maybe it would be time to remove things from the "Accept" header again. One could argue text/html shouldn't be part of the Accept header because all browsers implement it, so why is it there?
How complex will your logic have to be, instead of letting the UA decide?
Right now the UA doesn't decide anything; it simply says what it is (and sometimes it's lying). If user agents added the image formats they support to the UA string, then sure, detection would become easy. But let's take a few scenarios:
We can use a whitelist
Keep a list of all user agents (and versions, and flavors of versions) that support an image type.
What sucks about this method is that we constantly have to update the logic whenever a new version is released or something changes. While this was fine back in the day, these days some browsers (Firefox included) release so many updates that it becomes annoying to keep track.
We can use a blacklist
Treat all browsers as supported and blacklist those that are not.
The issue with this is that when a new browser comes out, we may send WebP (or other formats) to browsers that don't support it.
We can use regex
Bound to break, but okay: we'd have to match a regex for Chrome (but not everything that says "Chrome", because it may be a fork that doesn't do WebP), for Firefox 65 specifically, then maybe 71, and then maybe remove 72 again because of a regression.
Matching on the UA isn't easy; it never has been, and until there's a standard for how UA strings should look, it probably never will be. (The sketch below makes this concrete.)
Thus plenty of people are against basing feature support on the user agent.
If you know a good library that advertises what a browser supports and doesn't support, I'm sure plenty of people could use it.
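To illustrate the fragility just described, here is what the regex approach tends to look like (a hypothetical Python sketch; the version ranges are examples of the bookkeeping involved, not an accurate support table):

import re

# Illustrative rules only: every new release or fork can invalidate them.
WEBP_UA_RULES = [
    # Firefox 65 sent image/webp on navigation; 66 and later dropped it,
    # so the rule needs a carve-out that changes again once this bug is fixed.
    (re.compile(r"Firefox/(\d+)"), lambda v: v == 65 or v >= 72),
    # "Chrome/" also matches forks that may not behave like Chrome.
    (re.compile(r"Chrome/(\d+)"), lambda v: v >= 32),
]

def ua_supports_webp(user_agent):
    for pattern, supported in WEBP_UA_RULES:
        match = pattern.search(user_agent)
        if match:
            return supported(int(match.group(1)))
    return False  # unknown browsers must be assumed unsupported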
How will you know what the user's system handles best (hardware acceleration, temporarily broken codecs, etc)?
You'll never really know from the server unless browsers expose it. You may be able to get some info via JavaScript, but image support is probably not something you should detect via JavaScript, because JavaScript will often be evaluated and executed after the browser has already started fetching other resources.
As you may know, the priority of JavaScript in HTTP/2 and HTTP/3 is rather low compared to other resource types, so it's expected to come in late; that's not where we should do such detection.
To me, a compelling enough argument for Accept:webp would be "there are lots of servers already relying on it, which will probably never be updated"
I wish I could use that argument, but I can't, because I don't believe it's the case.
I see the point that we should try to keep the Accept header decent, but on the other hand, the header basically says "I accept X".
Firefox currently sends image/webp for all image resources, so sorry to use your own argument against you:
But we already see HEIF, AVIF, JPEG-XL and other formats on the horizon. And what about high DPI screens or HDR support?
Shouldn't these be in the Accept header of image resources too, then?
Sending image/webp in the navigation request solves the problem where most plugins (e.g. for WordPress) and other libraries do a lot of this detection; if anything, we'd love to do it on the server side so we can deliver exactly what the client can process.
Could this be switched to user agents? Yes, but I think it would introduce plenty of problems maintaining those lists and preventing false positives, and we'd end up with a case where in 5 years we have to match across hundreds of user agents; that really sucks.
The main issue browser vendors, integrators (and webcompat engineers) have these days is that if we want to be front-runners for new technologies (not saying every technology should become popular), then, when it makes sense, the front-runners may have to bend some rules or integrate something temporarily until the rest of the market picks up.
I believe webp is one of them - it actually benefits everyone that has support for it.
We know Firefox (and other vendors) have done this for other technologies as well.
There's a phase where we need certain flags to see that a browser supports new tech, and where to put them depends on the tech being implemented. In this case, Chrome (and flavors of Chrome) decided to add it to the Accept header in the navigation request, and so did Firefox (for one version); for this particular case, I actually believe the Accept header is the place to put it.
I'd like a standard way to advertise support for these technologies, and I think that would be the ideal solution in the long run.
But until we have something like that in place (as well as source srcset support in CSS), I can't see why we shouldn't let the market share Firefox has benefit from a technology that greatly reduces page sizes.
A small example:
a site with no WebP support in Firefox takes up 5.15 megabytes;
in Chrome the same site consumes 3.3 megabytes.
That's a decrease of almost 36% in page size. While this may not mean a lot on a fast connection, imagine slower connections such as 3G or slow 4G: we're talking roughly 2 megabytes of data saved by sending an image/webp string in the Accept header of the navigation request.
There are literally already a million-plus sites out there that will send WebP images the second Firefox adds image/webp back to the navigation request.
Sure, it's only a million-plus sites, but still a decent count :)
Comment 36•5 years ago
Note that I'm not arguing against your sites using WebP, or another format. I'm just wondering why page devs feel they need to make the choice of what to serve based on Accept headers and the like, rather than the other way around. If your server had a reasonable way to just say "you can select either WebP or JPG", then your listed concerns basically go away and you have less work to do.
maybe it would be time to remove things from the "Accept" header again
That would only happen if everyone is committed to going back to change all of their old sites once the Accept header changes and removes bits and pieces. Unfortunately, we know that is not realistic. I don't want you to have to go back to a site you made 10 years ago, for some company that you don't work for anymore, and figure out that old code and fix it so it works again if something broke (let alone just so browsers can reduce their Accept header's footprint).
Now the UA doesn't decide anything, the UA simply tells what it is (and sometimes it's lying)
My argument is that the UA should be what makes the choice, not just say what it supports and hope the server gets it right. This is precisely what has led to UAs "lying" over time: old servers are still looking for "Mozilla" in the User-Agent header before they serve specific content, thus UAs "lie" that they are Mozilla browsers. That would be avoided if the server just served the content and the UA decided which format to fetch. I don't see why servers are the right place to make such choices instead of the user agent.
Matching on UA isn't easy,
That's exactly my point. Servers and web pages in general are not the right ones to make these decisions. The user agent should choose based on what formats you have available and what it knows the user's system supports best. I would rather work toward that goal than keep adding workarounds and over-complicating everyone's server and page logic.
Shouldn't these be in the accept header of image resources too then?
Why not just let the server tell the browser what it has available, and the browser picks instead? Why worry about what format to pick on the server-side? Again, what if hardware acceleration is available for some formats? How do you know what's best to pick in every situation? Browsers will just eventually have to lie to servers that are sending sub-optimal content. And servers will just need to know more and more information in headers to know which image is best to serve to a given device and browser, not just which image formats they might be able to accept.
so we can deliver exactly what the client needs to process.
But why do you need to make that decision on the server side? Why can't Wordpress simply say "pick either the WebP or JPEG of this image, each are available here" and let the browser pick the best one? So far I don't really know. It just seems to be the way we're used to doing things, not the way we should be doing them.
I can't see why we shouldn't let the marketshare FireFox has benefit from a technology that greatly reduces page sizes.
You can already do this, why not just improve srcset/etc and let Firefox choose which format is best for a given user, rather than guessing WebP because it reduces your page size?
webp actually benefits everyone that has support for it.
I am in no way arguing that you should not serve WebP, I just wonder why your web server cannot simply tell the browser it supports WebP, and where to get it, and let the browser do the rest. If it's because Firefox is currently making a bad choice, then why not just improve Firefox to make a better choice? If it's because the current tools (srcset and such) are not supported well enough or do not offer efficient enough ways to advertise which formats you support, then why not push for browsers to fix that instead of making web servers try to guess the best format?
For now, the answer seems to be that this is a quick fix that might help more than it harms. Which is of course fine. But it does not sound like the RIGHT fix, just an easy work around for right now.
At any rate, I apologize for hijacking this bug for this discussion. If folks want to continue it, let's find a better place than this work-tracker.
Comment 37•5 years ago
I'm just wondering why page devs feel they need to make the choice of what to serve based on Accept headers and the like, rather than the other way around
https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Accept
The Accept request HTTP header advertises which content types, expressed as MIME types, the client is able to understand
Because that's what the Accept header is for: it lets us return what the client accepts, and we should not return something the client does not accept.
If your server had a reasonable way to just say "you can select either WebP or JPG", then your listed concerns basically go away and you have less work to do.
How? The server has to figure it out somehow, but on the other hand, I also don't think that if a browser requests image.png, it should get back a .jpg, .webp, or .gif image.
That would only happen if everyone is committed to going back to change all of their old sites once the Accept header changes and removes bits and pieces. Unfortunately, we know that is not realistic.
Just like it's not realistic to maintain a list of all user agents and what features they support; that's why there's a header made for advertising which content types the client is able to understand.
Thus we have to find a middle ground.
I don't want you to have to go back to a site you made 10 years ago, for some company that you don't work for anymore, and figure out that old code and fix it so it works again if something broke (let alone just so browsers can reduce their Accept header's footprint).
The argument is simply there to say that sometimes things are done the way they are for a reason. Why should millions of sites that implemented looking for image/webp suddenly have to switch to a list of hundreds of user agents, just because Firefox feels it's not good to send a MIME type it supports in the navigation request's Accept header?
Again, Accept header is a part of the static HPACK table definition - sure it adds 10 bytes to the header, but it's sent once, and it's sent compressed.
My argument is that the UA should be what makes the choice, not just say what it supports and hope the server gets it right. This is precisely what has lead to UAs "lying" over time: old servers are still looking for "Mozilla" in the User-agent header before they serve specific content, thus UAs "lie" that they are Mozilla browsers. That would be avoided if the server just served the content, and the UA decided which format to fetch. I don't see why servers are the right place to make such choices instead of the user-agent.
Realistically, I don't see websites around the world implementing source srcsets, and we'd still need a CSS implementation for it as well.
And servers will just need to know more and more information in headers to know which image is best to serve to a given device and browser, not just which image formats they might be able to accept.
Let's take an example.
We have a website with 150 images on it; all images come in JPG, WebP, HEIF, and JPEG-XL formats.
The browser sends a navigation request to the server with the info image/webp,image/heif,image/jpeg-xl.
This request header is 42 bytes uncompressed (I'm not going to calculate the HPACK compression ratio in this case).
The server returns, let's say, WebP images, so you'll have image tags like:
<img src="img/image1.webp" />
<img src="img/image2.webp" />
<img src="img/image3.webp" />
<img src="img/image4.webp" />
OK, let's switch it around: let's make the server not make the decision, but instead inform the browser of what it has:
<picture>
<source type="image/webp" srcset="img/image1.webp">
<source type="image/jpeg-xl" srcset="img/image1.jpegxl">
<source type="image/heif" srcset="img/image1.heif">
<img src="img/image1.jpg" />
</picture>
<picture>
<source type="image/webp" srcset="img/image2.webp">
<source type="image/jpeg-xl" srcset="img/image2.jpegxl">
<source type="image/heif" srcset="img/image2.heif">
<img src="img/image2.jpg" />
</picture>
<picture>
<source type="image/webp" srcset="img/image3.webp">
<source type="image/jpeg-xl" srcset="img/image3.jpegxl">
<source type="image/heif" srcset="img/image3.heif">
<img src="img/image3.jpg" />
</picture>
<picture>
<source type="image/webp" srcset="img/image4.webp">
<source type="image/jpeg-xl" srcset="img/image4.jpegxl">
<source type="image/heif" srcset="img/image4.heif">
<img src="img/image4.jpg" />
</picture>
The browser sends 42 extra bytes in the navigation request (uncompressed) and gets 119 bytes of markup back for 4 images.
The browser doesn't send the 42 extra bytes and gets 875 bytes of markup back for 4 images.
Your HTML output grew by 635% because you wanted to save 42 bytes.
For me, that makes no sense, but OK.
Again, what if hardware acceleration is available for some formats?
Then maybe make it in such a way that you can add a priority for what you prefer to get:
Accept:image/webp;p=80,image/heif;p=100,image/jpeg-xl;p=30
Or simply order the formats as you'd like them received for a given resource; that way, whatever comes first in the list is your preference.
In the above case, you'd prefer HEIF if available; if not, then WebP; if not, JPEG-XL; and if that's not available, fall back to whatever is.
The server can then decide whether it wants to honor the priority, but it could be a nice way for browsers to inform servers what they prefer to receive (based on the info you have about a system).
You're adding a small amount of data to the navigation request and saving hundreds of bytes on modern websites, because the server does not simply output everything it has via source srcsets.
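HTTP already spells this idea with q-values rather than a p= flag (comment 39 picks this up later); a rough Python sketch of a server honoring such weights, with the parsing kept deliberately naive:

def preferred_format(accept_header, available):
    # Parse "type;q=0.8, type2" into {type: weight}; q defaults to 1.0.
    weights = {}
    for item in accept_header.split(","):
        parts = [p.strip() for p in item.split(";")]
        q = 1.0
        for param in parts[1:]:
            if param.startswith("q="):
                q = float(param[2:])
        weights[parts[0]] = q
    # Serve the available type the client weighted highest.
    best = max(available, key=lambda mime: weights.get(mime, 0.0))
    return best if weights.get(best, 0.0) > 0 else None

# Example (hypothetical values):
# preferred_format("image/webp;q=0.8,image/heif", ["image/webp", "image/heif"])
# returns "image/heif", since the implicit q=1.0 beats 0.8.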
But why do you need to make that decision on the server side? Why can't Wordpress simply say "pick either the WebP or JPEG of this image, each are available here" and let the browser pick the best one?
It can, but is it optimal to send kilobytes of data on every pageview because you won't send sub-50 bytes of data in a header?
It just seems to be the way we're used to doing things, not the way we should be doing them.
But the question is, should we really accept a 600% increase in our HTML output for images because we can't advertise what we support in an HTTP header?
You can already do this, why not just improve srcset/etc and let Firefox choose which format is best for a given user, rather than guessing WebP because it reduces your page size?
So increase the page size, to reduce the page size - logical.
If it's because Firefox is currently making a bad choice, then why not just improve Firefox to make a better choice?
I'd say, Firefox currently makes a bad choice by not sending image/webp in the accept header.
I just wonder why your web server cannot simply tell the browser it supports WebP
It's not the web server, but rather the website, that would have to tell this if we're going the srcset way.
If it's because the current tools (srcset and such) are not supported well enough or do not offer efficient enough ways to advertise which formats you support, then why not push for browsers to fix that instead of making web servers try to guess the best format?
Why should browsers be pushed if we should simply return everything? Then, in the end, it doesn't matter to us what the browser decides to do, right?
Why do browsers have to send Accept-Encoding and Content-Encoding headers? The server should just let the browser fetch whatever it likes, no?
We should go all the way then, for everything.
For now, the answer seems to be that this is a quick fix that might help more than it harms. Which is of course fine. But it does not sound like the RIGHT fix, just an easy work around for right now.
Possibly. If the world can agree on a way to keep communication between client and server minimal (from a transport perspective), while still giving both ends sufficient info to decide what they want to do, then I'm all for it; but that's currently not the case.
We have to remember not only what's possible within HTML, CSS, and JS, but also the other components of the browser and server. I can't see why it's really a big problem for the browser to advertise what it supports in the navigation request, especially given how efficiently the HTTP protocol transports this information in SPDY, HTTP/2, QUIC, and HTTP/3.
Then we could improve the logic of what to return based on preferences (either the order of the MIME types you accept, or a specific p= flag) so the site and/or server can try to satisfy your request. If people want to implement large chunks of picture elements with source srcset code for all the formats they support, they can do that too, and you can still fetch what you think should be fetched.
But I'd say a 50-byte header (uncompressed) plus a 3-kilobyte HTML output (and 100 DOM elements), versus a 22-kilobyte HTML output (and 500 DOM elements), increasing the HTML by 600% and the DOM elements by 500%, seems really overkill.
Comment 38•5 years ago
Alright, one last comment, then I'll let others have their final word.
I really hope you aren't mistaking me as saying that Firefox SHOULD NOT implement this header change. I'm simply asking why site developers seem to think they need it, as I'm getting the impression that they don't want to advance the state of the art here, just keep the status quo.
Because that's what the Accept header is for
It still doesn't follow that it's a good idea anymore. User-Agent headers seemed like a good idea too once, as did vendor prefixes like -webkit. Accept might be fine for now, but it has proven not good enough for videos already. I can see it becoming not good enough for images very soon.
How? The server has to figure it out somehow
No, it doesn't. Source tags already prove that the server can just let the browser choose and not have to figure anything out for it. Why not improve on those to save a few bytes and get CSS support? I'm not getting the vibe that site devs are interested in doing that, rather than just continuing to guess based on headers. And that worries me, as someone who has to live in the trenches and see users complaining about sites serving sub-par content when they don't have to, and browsers having to lie to get that better content.
Just like it's not realistic to maintain a list of all user-agents
I still don't see why you HAVE to do that. Right now, the lack of a CSS equivalent for source tags is a blocker, in my estimation. But saving a few bytes is not preferable to the problems that UA sniffing causes. Especially if you are already saving a lot more bytes just by gaining WebP support.
Thus we have to find a middle ground.
I'm fine with the middle ground being Accept headers for now, to buy us time to work on CSS support and other improvements to srcset to save some extra bytes (etc).
why should millions of sites that have implemented looking for image/webp suddenly have to go back and change to a list of hundreds of user-agents
I still don't understand why they have to make that change? I'm not arguing against adding Accept:webp, after all. I just think they should make changes eventually to support a better approach, so why not start considering it now?
Again, Accept header is a part of the static HPACK table definition
Compression and header sizes aren't the pain point here. HTML source tags also compress well. But fixing it when a site is serving sub-optimal content because it lacks the insight necessary to know what's best to serve? That is a real pain point, and I'd like to avoid it. We won't solve that just by adding more Accept headers or other hints. Not every server will go the extra mile to begin with. A better solution seems to be in order.
Realistically I don't see websites around the world will implement source srcsets
If we cannot convince sites to do better, then I guess this entire argument is moot :) But yes, we realistically need to improve the HTML and CSS side to get to a better place. Again, we can also add Accept headers for now, I just want to know why sites seem to think they need to decide what to serve, rather than letting the browser choose.
Let's take an example.
Even if markup size were our only concern, and gzip made the file larger, that's just how a solution would look today. There is nothing stopping us from instead having a <picture availableFormats="webp,jpeg"> attribute, or a meta tag, or even response headers. This is a problem that can be fixed, and it's not even the real pain point here (let's be real: I'm sure you'd rather spend 1 KB in your markup to save 25% on a megabyte-sized image, if bandwidth/bytes are your concern). Not that I want site devs to make that change just for Firefox, but I would at least ask them to consider it for all browsers, for the other user-facing benefits.
Comment 39•5 years ago
Compression and header sizes aren't the pain point here. HTML source tags also compress well.
Sure, but it's still a whole lot bigger than Accept, for example, and you're still not avoiding a larger DOM, a DOM that is already too big on most sites. You're thus adding a lot more client-side workload, with even more DOM elements to parse by the JavaScript that people (sadly) use these days.
But fixing it when a site is serving sub-optimal content because it lacks the insight necessary to know what's best to serve?
Thus sort the Accept header based on priority. Apparently it's already a thing: https://developer.mozilla.org/en-US/docs/Glossary/quality_values
That is a real pain point, and I'd like to avoid it. We won't solve that just by adding more Accept headers or other hints. Not every server will go the extra mile to begin with. A better solution seems to be in order.
We can assume that web servers that actually care about performance and standards would follow (or already do follow) the quality values listed in https://developer.mozilla.org/en-US/docs/Glossary/quality_values
There will always be servers, browsers, sites, and software that don't follow the spec 100% (either by choice or by mistake). Every implementation differs; performance (both speed and resource utilization) differs. If nothing differed, we would have a single web server and a single browser, and everyone would be happy.
If we cannot convince sites to do better, then I guess this entire argument is moot :)
A lot of standards are developed because people are doing things wrong, or using things in unintended ways; HTTP/2 and HTTP/3 are good examples, as both try to fix bad design in earlier versions of the protocol, and even more so to fix what went wrong with the web as a whole.
Sometimes we have to develop standards to fix what's broken, and sometimes that means rethinking stuff and making it less complicated to actually implement.
There is nothing stopping us from instead just having a <picture availableFormats="webp,jpeg">
This would be a sensible solution, but then some people may call their images .jpg, others .jpeg, .JPG, or .JPEG; for WebP it could be .jpg.webp or .JPG.webp (e.g. using the extension of the original image, with the server appending .webp for the WebP version), or just .webp, etc.
Standardizing this would be relatively easy, but you'd also want people to implement it, and if you suddenly impose such restrictions, you may end up asking a site to rename decades of content to comply with the new standard.
or a meta tag
Which wouldn't work: it would assume all images on the site are available in all formats, and that may not be the case.
There are examples where a certain file type (such as WebP) gives 25-60% savings, but other images where the size would be 20-30% bigger; in cases like that, you may not even want a WebP (or JPEG-XL, etc.) version of the given image.
or even response headers
I'd say, that's as bad as the Accept request header then :)
(let's be real: I'm sure you'd rather spend 1k in your markup to save 25% on a megabyte-ized image, if bandwidth/bytes are your concern)
Correct, if bandwidth were absolutely everything. But I actually measure real performance benefits from end users, using RUM data to gather insight into the optimizations we make, and adjust accordingly; if my 1 KB of extra markup slows things down through additional JavaScript processing, it doesn't help that data arrives earlier so rendering could start sooner.
I'd rather optimize for real-world traffic, and generally a massive DOM has a bigger impact than we want it to have.
Not that I want site devs to make that change just for Firefox, but I would at least ask them to consider it for all browsers for the other user-facing benefits.
Devs will often implement for the majority of browser support: we want to cover 80% of the people with 20% of the work. If it makes no sense to spend another 80% of our time to satisfy the rest (and maybe even dissatisfy part of our 80%), then it may not be worth it.
It's an interesting discussion, and I'm sure that eventually a solution that is actually nice (and semi-easy to implement) will be here; but in the meantime, we can take the second best.
Comment 40•5 years ago
(In reply to Thomas Wisniewski [:twisniewski] from comment #31)
Is it simply because there is no pure CSS variant of srcset that's broadly available?
Well, there are the image-set(…) (bug 1107646) and image(…) (bug 703217) CSS functions, but neither is supported by Firefox.
Assignee
Comment 41•5 years ago
I'm not a big fan of either making Accept: longer or ignoring the spec by reverse-engineering or retrofitting the field.
However, adding image/webp as a default value might be an acceptable action at this moment, given that we have failed to figure out a clean solution in all the time that so many developers have cared about this issue.
The WebP FAQ suggests checking Accept: or using <picture>, but the latter makes the developer experience painful, and we care about that. I don't like neglecting the spec, but we're genuinely in a dilemma.
Personally, I don't care much about the bytes, due to compression and the fact that 80% of bandwidth is multimedia. I care more about the experience and the performance in this case. I believe comment 40 will make the world better. In the meantime, this issue deserves more visibility, for both the Accept topic and a better, neutral solution.
Assignee
Comment 42•5 years ago
Comment 43•5 years ago
(In reply to lucas from comment #35)
Firefox currently send image/webp for all image resources
If we already do that, doesn't that provide an obvious fix for the problem? Have the HTML page always load <img src="/image_no_extension"> and have the image load return a type that's available and accepted, rather than trying to change the HTML to direct the user to either WebP or JPEG?
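A sketch of what this suggestion amounts to (hypothetical Python; the paths and fallback type are assumptions): the image response itself, not the page markup, is negotiated from the image request's own Accept header.

import os

def serve_image(base_path, accept_header):
    # Firefox already sends image/webp on image requests, so the server
    # could negotiate the representation per image instead of per page.
    if "image/webp" in accept_header and os.path.exists(base_path + ".webp"):
        return base_path + ".webp", "image/webp"
    return base_path + ".jpg", "image/jpeg"

The following comment argues why this per-image negotiation is costly in practice.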
Comment 44•5 years ago
(In reply to Valentin Gosu [:valentin] (he/him) from comment #43)
(In reply to lucas from comment #35)
Firefox currently send image/webp for all image resources
If we already do that, doesn't that provide an obvious fix for the problem? Have the HTML page always load <img src="/image_no_extension"> and have the image load return a type that's available and accepted, rather than trying to change the HTML to direct the user to either WebP or JPEG?
No, it does not fix the problem, it complicates it even more, for a few reasons.
No extension means handling it dynamically
You're not going to store myimage on the file system; what you're asking is that the web server software run stat for the following list:
- myimage.webp
- myimage.png
- myimage.jpg
- myimage.jpeg
- myimage.gif
Surely you can stat in the order given by the Accept header, in this case webp,*/*; but if there's no WebP, then you possibly have to stat all the other file extensions (in uppercase and lowercase variants, even).
You'll have to do this for every image on a website. You're thus possibly asking the web server to stat the file system 1000+ times for a single pageload. It simply doesn't scale. (A sketch of this probe loop follows below.)
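A sketch of the probing cost being described (hypothetical Python; the extension list and its order are assumptions): each extension-less image URL can cost several filesystem lookups, repeated for every image on the page.

import os

CANDIDATE_EXTENSIONS = ["webp", "png", "jpg", "jpeg", "gif"]

def resolve_extensionless(path, accepts_webp):
    # Probe extensions in preference order until one exists on disk.
    candidates = CANDIDATE_EXTENSIONS if accepts_webp else CANDIDATE_EXTENSIONS[1:]
    for ext in candidates:
        candidate = "%s.%s" % (path, ext)
        if os.path.exists(candidate):  # one stat() per probe
            return candidate
    return None  # a page with 1000 images repeats this loop 1000 times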
You put complex logic on things that should be dumb
Static file delivery systems should be dumb by default; that's what makes them as performant as they are. Adding additional logic increases complexity, decreases performance, and increases the cost of running a given service at scale.
Imagine Cloudflare having hundreds of servers, all having to communicate with the origin server to figure out which images exist on the origin (unless they suddenly take over image generation, but should they?).
Doing the logic at its source makes sense until there's a better solution that is sensible for everyone to implement, and at the same time one that doesn't increase the world's electricity footprint even further than it is today.
Let's not build something that makes the world worse than it already is :)
Comment 45•5 years ago
(In reply to lucas from comment #44)
If we already do that, doesn't that provide an obvious fix for the problem? Have the HTML page always load <img src="/image_no_extension"> and have the image load return a type that's available and accepted, rather than trying to change the HTML to direct the user to either WebP or JPEG?
No, it does not fix the problem, it complicates it even more, for a few reasons.
That's a fair point, I guess, but it still seems to me like we're trying to hack around exactly what the <picture> HTML element is for. I agree that if all your HTML is img tags this will increase the size of the HTML, but that's negligible compared to the actual size of the images.
I don't feel that strongly one way or another about this, but making an exception for webp sounds like a hack 🙂
Updated•5 years ago
Assignee
Updated•5 years ago
Comment 46•5 years ago
Comment 47•5 years ago
bugherder
Updated•5 years ago
Comment 48•5 years ago
Posted site compatibility note: https://www.fxsitecompat.dev/en-CA/docs/2019/image-webp-has-been-added-to-default-http-accept-header/
Comment 49•5 years ago
Junior, can we uplift this patch to 71 or do you think that it is too risky for the release channel and it should ride the 72 train? Thanks
Assignee
Comment 50•5 years ago
(In reply to Pascal Chevrel:pascalc from comment #49)
Junior, can we uplift this patch to 71 or do you think that it is too risky for the release channel and it should ride the 72 train? Thanks
IMO it should ride the 72 train, given that we've been without this for 5 versions and the next beta is coming soon.
Comment 51•5 years ago
WFM, one less potential uplift to worry about, thanks :)
Updated•3 years ago