Since bug 318793, Firefox (and other XUL apps) completely disregards any caching information sent along with an update.rdf file. This means that I, as the extension author and maintainer of my webhost, have absolutely no mechanism to control how often my update files are downloaded, beyond pulling my extension of course. Right now update files account for upwards of 80% of the data pulled from my site. It strikes me that we should be able to honour caching information in some way, so I'm opening this for a little bit of input. The main concern with caching, as far as I understand it, was that many authors use static files which would get cached. My first thought is that if, rather than bypassing the cache all the time, we instead forced validation all the time (VALIDATE_ALWAYS), then wouldn't that always check for new versions of static files, and allow those of us using databases to use appropriate logic to cancel the request before having to send the entire update.rdf? Admittedly this wouldn't completely get rid of the requests, but it would at least mean I could cut down the data transferred, and users still wouldn't lose the speed of update detection.
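The server-side cancellation I'm describing is just a conditional GET: answer 304 when the client's validator is still current instead of resending the whole file. A minimal sketch of the idea (the function name and last-update date are hypothetical, not my actual script, and a real one would query the database for the timestamp):

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

# Hypothetical timestamp of the last change to update.rdf; a real
# database-backed script would look this up per extension.
LAST_UPDATE = datetime(2007, 3, 1, tzinfo=timezone.utc)

def handle_update_request(headers):
    """Return (status, body) for one GET of update.rdf.

    If the client revalidates with If-Modified-Since and nothing has
    changed, send an empty 304 instead of the full manifest.
    """
    ims = headers.get("If-Modified-Since")
    if ims:
        try:
            if parsedate_to_datetime(ims) >= LAST_UPDATE:
                return 304, b""
        except (TypeError, ValueError):
            pass  # malformed date: fall through to a full response
    return 200, b"<RDF>...full update manifest...</RDF>"
```

With VALIDATE_ALWAYS on the client side, every check after the first would hit the 304 branch, so only the headers cross the wire.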
This is one of those bugs where, regretfully, there is no way to satisfy everyone. I personally prefer to allow caching and let the author control the behavior by sending Cache-Control: no-cache in the response header if they want to disable caching, but many authors either don't know how or don't want to be bothered. Whatever is decided to fix this, I would prefer that it doesn't cause another bug to fix the new behavior. One important note is that this new behavior hasn't been reported as a bug as often as the old behavior was.
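The opt-out being suggested is a single response header. A sketch of what that looks like, written here as a hypothetical WSGI handler purely for illustration (real deployments would more likely set it in PHP or Apache config):

```python
def serve_update_rdf(environ, start_response):
    """Serve update.rdf with caching explicitly disabled.

    Authors who omit the Cache-Control header would get the default
    (cacheable) behavior; sending no-cache restores an always-fetch
    check without any client-side change.
    """
    headers = [
        ("Content-Type", "text/rdf"),
        ("Cache-Control", "no-cache"),  # force revalidation on every check
    ]
    start_response("200 OK", headers)
    return [b"<RDF>...update manifest...</RDF>"]
```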
I agree that we can't really flip-flop on this; I'm just hopeful we can find a slightly better compromise than we have now. With the current situation it's possible for Firefox to cost me money without any real user interaction. Anyway, I've performed a couple of quick tests.

The first case was a static file served from a standard Apache install, first testing the behaviour pre-bug 318793. In this situation the file is retrieved in full on the first request. For subsequent requests Firefox re-contacts the server and validates its cache, because Apache sends ETag and Last-Modified headers with the file. If the file has been modified then the new file is downloaded; if not, the cached version is used. Post-bug 318793 the file is of course always downloaded. If the request flags are changed to VALIDATE_ALWAYS then the file is again correctly validated each time.

The second case is a PHP script serving the update file with pretty much no headers. Here the file is never cached by Firefox, so no matter what flags are used the file is always redownloaded.

The final case is a PHP script serving the update in a pretty broken manner: it sends an Expires header set in the future to make sure that Firefox caches it. Pre-bug 318793 Firefox caches the file and never checks for an updated version. Currently it of course redownloads it each time. However, changing the request flags to VALIDATE_ALWAYS makes Firefox redownload the file each time as well.

Basically, in all three cases using VALIDATE_ALWAYS instead of BYPASS_CACHE makes things work in what I believe is the best manner: caching the file when possible but always going back to the server to check for a new copy.
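The three cases above can be summarised by modelling what each flag transfers per request. This is a toy model of the behaviour I observed, not Necko's actual cache logic:

```python
def fetch(mode, cached_etag, server_etag):
    """Toy model of one update check under the flags discussed above.

    Returns what the client ends up transferring.  cached_etag is None
    when the previous response was uncacheable (case two: a PHP script
    sending no validators or freshness headers).
    """
    if mode == "BYPASS_CACHE" or cached_etag is None:
        # Current post-318793 behaviour, or nothing in the cache:
        # the full file comes down every time.
        return "full download"
    if mode == "VALIDATE_ALWAYS":
        # Always re-contact the server; a matching validator means a
        # cheap 304 instead of the whole update.rdf (case one).
        return "304 Not Modified" if cached_etag == server_etag else "full download"
    # Default freshness-based caching: a broken future Expires header
    # means the server is never consulted at all (case three, pre-318793).
    return "served from cache, no request"
```

Case one with VALIDATE_ALWAYS gets the cheap 304; case two still downloads in full either way; case three stops silently serving a stale file.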