Closed Bug 73490 Opened 24 years ago Closed 24 years ago

HTTP URLs with '?' characters should not be cached unless an expiration time is explicitly provided by the server

Categories

(Core :: Networking: HTTP, defect)

Priority: Not set
Severity: normal

Tracking

Status: RESOLVED FIXED
Target Milestone: mozilla0.9
People

(Reporter: darin.moz, Assigned: darin.moz)

References

Details

Attachments

(3 files)

HTTP URLs with '?' characters should not be cached unless an expiration time is explicitly provided by the server. See section 13.9 of RFC2616 for details.
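The rule from RFC 2616 section 13.9 can be sketched as a small predicate. This is a hypothetical helper for illustration, not Mozilla's actual code; note it uses the naive whole-URL scan for '?' that the original patch started with (a later comment refines this to the parsed query portion).

```cpp
#include <cassert>
#include <string>

// RFC 2616 sec. 13.9: a response for a URL containing '?' should not be
// served from cache unless the origin server supplied an explicit
// expiration time (e.g. an Expires header or Cache-Control: max-age).
bool MayServeFromCache(const std::string& urlSpec, bool hasExplicitExpiration) {
    // Naive check: scan the whole spec for a '?' byte.
    bool hasQueryMarker = urlSpec.find('?') != std::string::npos;
    if (!hasQueryMarker)
        return true;               // ordinary URL: normal caching rules apply
    return hasExplicitExpiration;  // '?' URL: cache only with explicit expiry
}
```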
Status: NEW → ASSIGNED
Keywords: nsbeta1
Target Milestone: --- → mozilla0.9
sans the printfs, r=gagan
you may want to add a comment to logs instead of the printfs
Attached patch: cleaned up patch (Splinter Review)
dougt says "sr=dougt"
the fact that we're searching the whole spec for a '?' worries me - could a multibyte-char hostname (now that they are allowed) contain an 8-bit character which happens to match '?'? it would suck for that entire host or domain to have their pages uncached.
good point. i'll have to be more careful. perhaps it is enough to check whether or not the URL has a non-empty "query" portion (via nsIURI::query). would this be sufficient?
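The query-portion check proposed above can be sketched like this. The `ParsedUrl` struct is a hypothetical stand-in; the real patch would consult nsIURI::query on the already-parsed URL rather than re-scanning the spec.

```cpp
#include <cassert>
#include <string>

// Hypothetical parsed-URL structure for illustration only.
struct ParsedUrl {
    std::string host;
    std::string path;
    std::string query;  // empty when the URL has no query portion
};

// Check only the parsed query portion, so a stray '?' byte inside a
// multibyte hostname can no longer disable caching for that whole host.
bool HasNonEmptyQuery(const ParsedUrl& url) {
    return !url.query.empty();
}
```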
Keywords: patch
that seems sufficient to me. if it's not, that would imply url parsing problems. r=valeski on the 3/26/01 21:28 patch.
*** Bug 73600 has been marked as a duplicate of this bug. ***
*** Bug 73572 has been marked as a duplicate of this bug. ***
sr=alecf (though you might run into platform-specific bustage with that "if (str...)" because I'm not sure all platforms will know to cast that nsXPIDLCString to a const char *)
alecf: not a problem... that convention is used all over the place in HTTP.
yea, nsXPIDLCString does the right thing cross-platform now.
Spam? (if yes, sorry): The cached URLs in recent builds (2001-03-27) have become a major annoyance when reporting bugs. Going to a bug page previously visited and recently changed (bug reported) *shows the old state*. Pressing Shift-Reload is a pain, and most users will not know how to, or will not always remember to, do this. suggest keyword: *nscatfood*
fix checked in.
Status: ASSIGNED → RESOLVED
Closed: 24 years ago
Resolution: --- → FIXED
This seems to have broken going back to forms. If I fill out a form at say http://bugzilla.mozilla.org/show_bug.cgi?id=73490, then click on "Vote for this bug" and then hit "back", the form data should not be cleared.... Currently it is (linux build 2001-04-02-05). Should I open a new bug on that?
yes please do.. that is probably a side effect of the data not being cached.
I think it's a bad idea not to cache all URLs with '?'. I think that all pages with '?' should be cached except the pages which send the "no-cache" header.
deniande: please read section 13.9 of RFC 2616 (http://www.w3.org/Protocols) and let me know how you interpret it... it seems very clear about this point.
I think we need to cache them through session history, but not in the case of it being typed in a url bar, or clicked on through a link..
sounds right to me too. i've talked to radha and it looks like we can use the same solution for POST urls to accomplish this. namely, HTTP will add a unique id in the cache key. only history will know this cache key, and so only history will be able to retrieve data for these urls from the cache. see bug 56346 for details.
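The session-history approach described above can be sketched as follows. All names here are hypothetical illustrations of the idea, not the actual bug 56346 patch: HTTP stores the response under a cache key extended with a unique id, and since only session history holds the id, only history can reconstruct the key and retrieve the entry.

```cpp
#include <atomic>
#include <cassert>
#include <string>

// Monotonic counter for generating per-load unique ids (sketch only).
static std::atomic<unsigned> gNextId{1};

// HTTP side: store the response under a key that embeds a fresh id,
// and hand the id back to session history.
std::string MakeHistoryCacheKey(const std::string& urlSpec, unsigned& outId) {
    outId = gNextId++;
    return "id=" + std::to_string(outId) + "&uri=" + urlSpec;
}

// History side: rebuild the same key from the remembered id, so only
// history-driven loads (e.g. Back) can find the cached entry.
std::string RebuildHistoryCacheKey(const std::string& urlSpec, unsigned id) {
    return "id=" + std::to_string(id) + "&uri=" + urlSpec;
}
```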
Darin: I know that RFC 2616 says that URLs with '?' characters should not be cached, but let me describe what I want. I would like previously visited pages to be available when I am browsing offline, for example these "bug-pages". Before this bug was fixed, I could browse these pages offline, but now I can't because there is a '?' character in the URL. Internet Explorer does this, so I'm currently using IE, because I use this feature a lot. Before this bug was fixed I used Mozilla, and I want to use it again, but not before I can browse visited pages offline.
We could do so that the page isn't loaded from cache when you are working online, but when you are working offline (File => Work Offline), then the page is loaded from the cache... What do you think about that?
Dennis: see bug 56346... i think you'll agree that the solution described there will meet your concerns as well.
Darin: From bug 56346: "ok, i've discussed this with gagan and gordon, and we've come to the conclusion that HTTP should just put everything in the cache." Does that mean that I would be able to browse all visited webpages offline (including URLs with '?')?
If everything is in the cache, how do we deal with the original problem: Going back to a bug after making a change doesn't show the changes?
Peter: "put everything in the cache" does not mean "load everything from the cache". Verified fixed in the 2001-04-03-11 build, by the way.
er, that was 2001-04-03-11 Linux build. :)
Can anyone easily provide example URLs, one with and one without an explicit expiration time?
Keywords: verifyme
perhaps a bugzilla query?
or rather, perhaps a bugzilla bug link such as: http://bugzilla.mozilla.org/show_bug.cgi?id=73490
benc: you can use the testserver to set something up. seek help from bbaetz.
mass remove verifyme requests greater than 4 months old
Keywords: verifyme
