Bug 708276 (Open), opened 12 years ago, updated 6 months ago

Dijit files are over-aggressively cached causing errors when a Dijit-based webapp is upgraded

Product/Component: Firefox :: General (defect)
Version: 8 Branch
Platform: x86_64 Linux
Status: UNCONFIRMED
Reporter: adamw
Assignee: Unassigned
Attachments: 1 file

User Agent: Mozilla/5.0 (X11; Linux x86_64; rv:8.0) Gecko/20100101 Firefox/8.0
Build ID: 20111108090055

Steps to reproduce:

I run tt-rss, an RSS feed-reading webapp, on my server. It uses the Dijit UI library - http://dojotoolkit.org/reference-guide/dijit/index.html .

Several times, after upgrading tt-rss on the server end, I have started getting errors from the Dijit components when accessing tt-rss via Firefox. The last one was quite similar to http://tt-rss.org/redmine/issues/378#change-1264 .

As explained in that bug by the reporter and the tt-rss author, the problem appears to be that Firefox caches dijit components too aggressively and keeps using the cached versions even if the newer tt-rss comes with changes to dijit.

Removing all cache files related to dijit resolves the issue: I ran grep -Rl dijit * | xargs rm -f in Firefox's Cache directory and restarted Firefox.
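
For reference, a rough Python equivalent of that cleanup (a sketch, not part of the original report; CACHE_DIR is a placeholder and should point at the actual profile's Cache directory):

#!/usr/bin/env python3
# Sketch: delete Firefox cache files whose contents mention "dijit".
import os

CACHE_DIR = os.path.expanduser("~/.mozilla/firefox/PROFILE/Cache")  # placeholder path

for root, _dirs, files in os.walk(CACHE_DIR):
    for name in files:
        path = os.path.join(root, name)
        try:
            with open(path, "rb") as f:
                stale = b"dijit" in f.read()
        except OSError:
            continue  # unreadable entry; skip it
        if stale:
            os.remove(path)  # drop the cache entry that mentions dijit
            print("removed", path)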


Actual results:

After the server-side tt-rss upgrade, Firefox kept serving the stale cached copies of the Dijit files, producing the errors described above. Clearing the dijit-related entries from the cache and restarting Firefox resolved it.


Expected results:

Firefox should fix its caching so updating a Dijit-using webapp on the server does not cause errors at the client end.

Firefox caches based on the HTTP specification.
What HTTP headers are sent by the server for those URLs?
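
A minimal sketch of one way to check which cache-related headers the server sends for a Dijit file (the host name is a placeholder; the path is the one from the request quoted further down):

import urllib.request

URL = "http://example.org/tt-rss/lib/dijit/dijit.js"  # placeholder host

req = urllib.request.Request(URL, method="HEAD")
with urllib.request.urlopen(req) as resp:
    # Print the headers that govern caching and revalidation
    for name in ("Cache-Control", "Expires", "ETag", "Last-Modified", "Date"):
        print(name + ":", resp.headers.get(name))
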
Attaching the headers captured by the 'Live HTTP Headers' extension for Firefox when loading the main tt-rss page.

There may be some 'sensitive' data in here, but tt-rss is just an RSS reader. All you're going to discover if you use this to access my tt-rss server is what web sites I read. It's not worth the effort :)
...sigh. And I just hastily killed my WP and Roundcube sessions, so don't bother trying to break into those either. Darn cookies.

You could edit out the cookies and the actual host....
Now the question: which exact URL is cached too long?

For example, this URL:
>GET /tt-rss/lib/dijit/dijit.js HTTP/1.1
Gecko revalidates this URL with 
>If-Modified-Since: Tue, 22 Nov 2011 10:30:55 GMT
>If-None-Match: "23249-455-4b2504c5af5c0"

The server's answer is
"HTTP/1.1 304 Not Modified"

It would have to be a 200 OK if the file had changed, and then Gecko wouldn't use the cached copy.
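
A minimal sketch of that revalidation exchange, assuming a placeholder host and reusing the validator values quoted above; a 304 means the cached copy is still valid, while a 200 means the file changed and a fresh copy is returned:

import urllib.error
import urllib.request

URL = "http://example.org/tt-rss/lib/dijit/dijit.js"  # placeholder host

req = urllib.request.Request(URL, headers={
    "If-Modified-Since": "Tue, 22 Nov 2011 10:30:55 GMT",
    "If-None-Match": '"23249-455-4b2504c5af5c0"',
})
try:
    with urllib.request.urlopen(req) as resp:
        # 2xx answer: the file changed, so the cached copy would be replaced
        print(resp.status, resp.reason)
except urllib.error.HTTPError as err:
    if err.code == 304:
        print("304 Not Modified: the cached copy is still valid")
    else:
        raise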

The problem with Live HTTP Headers is that you see only the actual network requests, not the responses that are served from the cache.

The problem is that I've 'fixed' the issue now and there are no further updates of tt-rss available, so I can't give you an example of the headers in the 'problematic' case. What I'm showing you now is a working case.

I guess I'll have to try and remember to capture the headers next time it breaks, before fixing it?

A Mozilla HTTP log would be great if you hit the problem again.
- https://developer.mozilla.org/en/HTTP_Logging
nsHttp:5 is enough in this case.
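
For what it's worth, a small sketch of launching Firefox with the NSPR logging variables described on that page (assuming a firefox binary on PATH; the log file path is arbitrary):

import os
import subprocess

env = dict(os.environ,
           NSPR_LOG_MODULES="nsHttp:5",
           NSPR_LOG_FILE="/tmp/firefox-http.log")

# Reproduce the stale-dijit errors in this instance, then attach the log file.
subprocess.run(["firefox"], env=env)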

This is very likely not a bug in Gecko (from my experience).
There are explicit rules for caching in the HTTP RFC.
- http://www.w3.org/Protocols/rfc2616/rfc2616-sec13.html
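
As a rough illustration of the freshness rule from that section (not code from this bug): a cached response is reused without revalidation while its age is below its freshness lifetime, taken from Cache-Control: max-age or, failing that, Expires minus Date.

from email.utils import parsedate_to_datetime

def freshness_lifetime(headers):
    # Cache-Control: max-age wins; otherwise fall back to Expires - Date.
    for part in headers.get("Cache-Control", "").split(","):
        part = part.strip()
        if part.startswith("max-age="):
            return int(part.split("=", 1)[1])
    if "Expires" in headers and "Date" in headers:
        expires = parsedate_to_datetime(headers["Expires"])
        date = parsedate_to_datetime(headers["Date"])
        return int((expires - date).total_seconds())
    return 0  # no explicit lifetime; the cache should revalidate

# A one-week max-age would keep an old dijit.js in use for up to a week
# without asking the server, unless the entry is evicted or a reload forces it.
print(freshness_lifetime({"Cache-Control": "max-age=604800"}))  # 604800
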
Severity: normal → S3