Bug 213963: Cookies get lost after a huge number of cookies is reached
Status: RESOLVED FIXED (opened 21 years ago, closed 21 years ago)
Categories: Core :: Networking: Cookies, defect
People: Reporter: wfor, Assigned: dwitte
Keywords: fixed-aviary1.0, fixed1.7.5
Attachments (1 file): 6.73 KB, patch (mconnor: review+, darin.moz: superreview+, mkaply: approval1.7.5+)
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.3) Gecko/20030312
Build Identifier: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.3) Gecko/20030312
Hard to explain ... There is a very popular site in Germany, http://www.heise.de, which sets a lot of cookies. After a longer session the number of cookies grows large and suddenly some are lost or get garbled.
The output below was created by running 'strings -a tcpdump | grep Cookie | uniq'
[[The X, Y & Z are just used to prevent any misuse]]
Here is one example of a garbled cookie, see the end of the line:
personality=md5&XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX&kind&p&un&YYYYY&uid&ZZZZZ;
collaps_expand=cea&c; forum_view=v&t; hide_show=a&s;
kill_list=kl&148684%2C170476%2C175404%2C175404%2C172937; show_preview=sp&y;
u2u_fora_list=u2ufl&; accept_cookie=ac&1; 45406=m&&cef&e; 7262=m&&cef&e;
35634=m&&cef&e; 45400=m&&cef&e;+#"?F
After accessing some pages it seems to be sane again ...
Set-Cookie: 44545=m&&cef&c; path=/
Cookie:
personality=md5&XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX&kind&p&un&YYYYY&uid&ZZZZZ;
collaps_expand=cea&c; forum_view=v&t; hide_show=a&s;
kill_list=kl&148684%2C170476%2C175404%2C175404%2C172937; show_preview=sp&y;
u2u_fora_list=u2ufl&; accept_cookie=ac&1; 45406=m&&cef&e; 7262=m&&cef&e;
35634=m&&cef&e; 45400=m&&cef&e; 45385=m&&cef&c; 45382=m&&cef&e; 45392=m&&cef&e;
45396=m&&cef&e; 45405=m&&cef&e; 44548=m&&cef&e; 44547=m&&cef&e; 44545=m&&cef&c
This one is the next request, just one click away from the one above. As you can see, the 'personality' cookie is gone:
Set-Cookie: 44545=m&&cef&e; path=/
Cookie: collaps_expand=cea&c; forum_view=v&t; hide_show=a&s;
kill_list=kl&148684%2C170476%2C175404%2C175404%2C172937; show_preview=sp&y;
u2u_fora_list=u2ufl&; accept_cookie=ac&1; 45406=m&&cef&e; 7262=m&&
cef&e; 35634=m&&cef&e; 45400=m&&cef&e; 45385=m&&cef&c; 45382=m&&cef&e;
45392=m&&cef&e; 45396=m&&cef&e; 45405=m&&cef&e; 44548=m&&cef&e; 44547=m&&cef&e;
44545=m&&cef&e
The full tcpdump is available, so feel free to ask questions.
Maybe related ... Heise transfers pages using chunked encoding, as you can see here: Transfer-Encoding: chunked
Reproducible: Always
Steps to Reproduce:
0. You need an account at www.heise.de; it is a German page, but you can create an account here: http://www.heise.de/registration/ There might be a delay before you get write access; I do not know of any delay for read access ...
1. Start with this page: http://www.heise.de/foren/go.shtml
2. In the list of discussions do
loop:
3. Click onto 'Programmieren:Sonstige' --> This sets an additional Cookie
4. Click onto 'Aufklappen' --> This changes a Cookie
5. Go Back in history
6. Go back again in history, now you reach the list again
7. Back to point 3 above, but now click onto the link below 'Programmieren:Sonstige'
end loop
Actual Results:
After a while (~10 loops), you can see that some cookie gets scrambled or lost.
I just checked: the cookie cannot be seen in the cookie manager, although it was a permanent cookie with an expiry set as: expires=Sun, 24-Aug-2003 18:48:02 GMT
Expected Results:
Of course, no cookies should be lost :-)
The tcpdump file is saved; if any additional information is needed, feel free to ask.
Comment 1•21 years ago
We actually have a hard cap on the number of cookies that can be stored to
prevent some sort of security attack, iirc...
Whiteboard: DUPEME
Comment 2•21 years ago
See
http://lxr.mozilla.org/seamonkey/source/extensions/cookie/nsCookieService.cpp#92 :
static const PRInt32 kMaxNumberOfCookies = 300;
static const PRInt32 kMaxCookiesPerHost = 20; <-- is it this one ?
static const PRUint32 kMaxBytesPerCookie = 4096;
When the limit is reached, we're supposed to start removing older cookies, based on last access. Maybe we remove cookies that are still in use?
Reporter, we had a big cookie rewrite between Mozilla 1.3 and 1.4, so maybe this is already fixed. Can you upgrade and test again?
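As a rough illustration of that eviction idea, here is a schematic C++ sketch of an "evict the least-recently-accessed cookie when the global limit is hit" policy. It is not the actual nsCookieService code; the types and names below are invented for illustration.

#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <list>
#include <string>

struct Cookie {
  std::string host;
  std::string name;
  std::string value;
  int64_t lastAccessed;  // when the cookie was last sent in a request
};

// Add a cookie, first evicting the least-recently-accessed one if the
// global limit has already been reached (schematic policy only).
void AddCookieWithEviction(std::list<Cookie>& cookies, const Cookie& newCookie,
                           std::size_t maxTotal) {
  if (cookies.size() >= maxTotal) {
    auto oldest = std::min_element(
        cookies.begin(), cookies.end(),
        [](const Cookie& a, const Cookie& b) {
          return a.lastAccessed < b.lastAccessed;
        });
    cookies.erase(oldest);
  }
  cookies.push_back(newCookie);
}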
Reporter
Comment 3•21 years ago
Upgraded as you suggested to 1.4; very similar behavior. I counted exactly: to reproduce, you need to open 15 of those links in the list as described in the 'Steps to Reproduce'.
Again, tcpdump and then ...
strings -10 tcpout1.4| grep Cookie | uniq
This is just before:
Cookie: show_preview=sp&y;
personality=md5&XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX&kind&p&un&YYYY&uid&ZZZZZ;
collaps_expand=cea&c; forum_view=v&t; hide_show=a&s;
kill_list=kl&148684%2C170476%2C175404%2C175404%2C172937; u2u_fora_list=u2ufl&;
44548=m&&cef&e; 44547=m&&cef&e; 44546=cef&e&m&; 44545=m&&cef&e; 44438=cef&e&m&;
44342=m&&cef&e; 43922=m&&cef&e; 43282=m&&cef&e; 43281=m&&cef&e; 42177=m&&cef&e;
39795=m&&cef&e; 39787=m&&cef&e; 36485=m&&cef&c
Set-Cookie: collaps_expand=cea&c; path=/
Set-Cookie: forum_view=v&t; path=/
Set-Cookie: hide_show=a&s; path=/
Set-Cookie: kill_list=kl&148684%2C170476%2C175404%2C175404%2C172937; path=/
Set-Cookie: show_preview=sp&y; path=/
Set-Cookie: u2u_fora_list=u2ufl&; path=/
Set-Cookie: accept_cookie=ac&1; path=/
Set-Cookie: 36485=cef&e&m&; path=/
Cookie:
personality=md5&XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX&kind&p&un&YYYY&uid&ZZZZZ;
collaps_expand=cea&c; forum_view=v&t; hide_show=a&s;
kill_list=kl&148684%2C170476%2C175404%2C175404%2C172937; u2u_fora_list=u2ufl&;
44548=m&&cef&e; 44547=m&&cef&e; 44546=cef&e&m&; 44545=m&&cef&e; 44438=cef&e&m&;
44342=m&&cef&e; 43922=m&&cef&e; 43282=m&&cef&e; 43281=m&&cef&e; 42177=m&&cef&e;
39795=m&&cef&e; 39787=m&&cef&e; 36485=cef&e&m&; accept_cookie=ac&1
Set-Cookie: 36478=m&&cef&c; path=/
Set-Cookie: show_preview=sp&n; path=/
And here it is gone :-(
Cookie: forum_view=v&t; hide_show=a&s;
kill_list=kl&148684%2C170476%2C175404%2C175404%2C172937; u2u_fora_list=u2ufl&;
44548=m&&cef&e; 44547=m&&cef&e; 44546=cef&e&m&; 44545=m&&cef&e; 44438=cef&e&m&;
44342=m&&cef&e; 43922=m&&cef&e; 43282=m&&cef&e; 43281=m&&cef&e; 42177=m&&cef&e;
39795=m&&cef&e; 39787=m&&cef&e; 36485=cef&e&m&; accept_cookie=ac&1;
36478=m&&cef&c; show_preview=sp&n
Set-Cookie: 36478=m&&cef&e; path=/
Cookie: forum_view=v&t; hide_show=a&s;
kill_list=kl&148684%2C170476%2C175404%2C175404%2C172937; u2u_fora_list=u2ufl&;
44548=m&&cef&e; 44547=m&&cef&e; 44546=cef&e&m&; 44545=m&&cef&e; 44438=cef&e&m&;
44342=m&&cef&e; 43922=m&&cef&e; 43282=m&&cef&e; 43281=m&&cef&e; 42177=m&&cef&e;
39795=m&&cef&e; 39787=m&&cef&e; 36485=cef&e&m&; accept_cookie=ac&1;
36478=m&&cef&e; show_preview=sp&n
My new user Agent string: User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US;
rv:1.4) Gecko/20030624
If you allow me to make some suggestions: I think it is okay to store only a limited number of cookies and to delete older ones when this number would be exceeded. Maybe you can change the logic which decides which cookie gets deleted a little bit. For this page, the most important cookie is 'personality', which *I* usually use as a permanent cookie that should be valid for one month. All the other cookies are temporary ones which do not hurt. So maybe you can first delete, by whatever logic, all those temporary cookies, identified by having *no* expiration date.
However, maybe it would still make sense to increase the number of cookies per site?
Thank you for your quick response
Assignee
Comment 5•21 years ago
Hmm... the cookie spec (rfc2109) specifies these 20/site and 300/total limits,
so the site is violating the spec here - I'd say this one is evang.
I'm not really sure about the idea of expiring session cookies before cookies
with expiry times set, when the limit is reached - although whatever we do will
certainly be affecting an edge case. I don't like adding random hacks like that...
Reporter
Comment 6•21 years ago
Well, I can understand that you guys do not want to violate any RFC; this is okay.
What about making those limits customizable? There is already a huge number of parameters which can be changed from the outside; two more or less would not hurt?
[[However, in parallel I will contact www.heise.de for a different solution,
like resetting those *important* cookies every time]]
Assignee
Comment 7•21 years ago
I'd recommend just getting the site to use cookies properly. It shouldn't be
setting so many cookies - it can just append its data to an existing cookie or
something.
Reporter
Comment 8•21 years ago
Just quoting from RFC 2109.
-------------8<-----------------------
6.3 Implementation Limits
Practical user agent implementations have limits on the number and
size of cookies that they can store. In general, user agents' cookie
support should have no fixed limits. They should strive to store as
many frequently-used cookies as possible. Furthermore, general-use
user agents should provide each of the following minimum capabilities
individually, although not necessarily simultaneously:
* at least 300 cookies
* at least 4096 bytes per cookie (as measured by the size of the
characters that comprise the cookie non-terminal in the syntax
description of the Set-Cookie header)
* at least 20 cookies per unique host or domain name
User agents created for specific purposes or for limited-capacity
devices should provide at least 20 cookies of 4096 bytes, to ensure
that the user can interact with a session-based origin server.
-------------8<-----------------------
I always see the wording 'at least' or 'minimum'; I do not see a maximum number anywhere. Well, I agree that Mozilla does not violate the RFC, because the minimum requirements on cookies are satisfied.
Now, what speaks against introducing a new set of tunable parameters like
network.cookie.cookiesPerHost
network.cookie.maxNumberOfOverallCookies
network.cookie.maxBytesPerCookie
to initialize
static const PRInt32 kMaxNumberOfCookies = 300;
static const PRInt32 kMaxCookiesPerHost = 20;
static const PRUint32 kMaxBytesPerCookie = 4096;
to different values tunable by the user?
These variables are only used in this single file, so changing them should not cause any harm to the rest of the code?
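To make the proposal concrete, here is a small, self-contained C++ sketch of replacing the hardcoded constants with members initialized from such prefs. The pref names are the ones proposed above, and the class, member, and callback names are invented; this is not taken from any actual patch.

#include <cstdint>

// Sketch only: in the real service the lookup would go through
// nsIPrefBranch::GetIntPref rather than a generic callback.
class CookieLimits {
public:
  // defaults mirror the current hardcoded values
  int32_t  maxNumberOfCookies = 300;
  int32_t  maxCookiesPerHost  = 20;
  uint32_t maxBytesPerCookie  = 4096;

  // getIntPref stands in for a pref lookup; it returns true and fills
  // val when the pref exists and holds an integer.
  template <typename GetIntPref>
  void ReadPrefs(GetIntPref getIntPref) {
    int32_t val;
    if (getIntPref("network.cookie.maxNumberOfOverallCookies", &val))
      maxNumberOfCookies = val;
    if (getIntPref("network.cookie.cookiesPerHost", &val))
      maxCookiesPerHost = val;
    if (getIntPref("network.cookie.maxBytesPerCookie", &val))
      maxBytesPerCookie = static_cast<uint32_t>(val);
  }
};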
Assignee
Comment 9•21 years ago
Sure, we could, but why? If your goal is to allow users to avoid random problems
like the one this site exhibits, that's not a sensible solution. We can't expect
users to go fiddling with these prefs, so I don't really see the point in making
them customizable. There's no good reason why a site should be exceeding these
limits, unless it's broken. (Making it a pref would also introduce other
complications - what if the user changes the pref during a mozilla session? We
need extra code to traverse the cookie database and expunge all cookies over
that limit, then.)
There may be other reasons for doing it, however. Now that the cookie code is going into necko, would embedders want to tune these things? e.g. if we increase these limits to 20/1000, would minimo applications perhaps want less?
i wonder if darin has an opinion :)
Comment 10•21 years ago
Well, I would like to come back to the limited cookie count. From my point of view there is one reason to overcome this limitation: all the other browsers seem to work without these constraints. From the user's point of view, all the other browsers are working except Mozilla/Firebird. Therefore I would like to ask you to reconsider the current limitation.
Many thanks
Ulf
Assignee
Comment 11•21 years ago
define "work"? can you give me a specific example of something that breaks, or
is otherwise suboptimal, with the current limits?
i'm not averse to increasing them, but we should have a good reason to do it.
Assignee: darin → dwitte
Comment 12•21 years ago
Many thanks for your feedback
Well, I guess the original explanation was somewhat too complicated. We have a technical magazine publisher over here who offers a lot of services, like IT news, virus alerts etc., to the public free of charge by means of a single login procedure.
Please visit
http://www.heise.de/mediadaten/online/zugriff.shtml
just to get an idea of the different services they are offering and the access figures. Even if the language might look somewhat strange to your eyes, the different services are listed in the lower half of the table. As they have a huge amount of traffic, they use one cookie for login purposes and feed back individual cookies for setting up the different kinds of services and individual setups. I guess the idea is to keep the traffic high by storing the setup cookies on the client computer. If you are accessing different services in parallel, chances are that the client computer will run up against the current upper limit of 20 cookies. And because of the FIFO handling of cookies, the very first entry, which is in fact the user id, will get lost. For the average user the browser seems to be the cause of some strange misbehaviour. In fact the site is addressing this in its FAQ section. As I understand it, the RFC limit of 20 cookies per server is a lower limit; therefore it should be no violation of the RFC to increase it.
I ran across this problem when I wanted to report a different behaviour of Firebird while surfing on that site: it seems that the browser forgets about the visited links. This seems to be related to the lost login cookie; however, I cannot exactly isolate and reproduce this behaviour. When visiting Bugzilla I stumbled over this bug report.
As the publisher also adds its own Linux edition, free of charge, to its magazines to promote open source software, this would be another (minor) reason for exceeding the cookie limit ;-) I am pretty sure that they will appreciate this modification as well.
Comment 13•21 years ago
Hello & thanks for dealing with this subject...
I seem to gather that the site in question (heise.de) *does* use cookies 'properly' and does *not* violate the RFC, while on the other hand Mozilla fails to support the site's cookie management just because it implements only the RFC's absolute minimum requirements?
By the way, this is from the site's FAQ (translation from German by me): "While
the changes within Mozilla would be done by changing one line of code, the
necessary changes in cookie management within our server software would mean a
complete revision of the whole source code, for which we have no resources at
this time. That means, as a Mozilla user using our message boards, you will be
confronted with the described problem for the time being."
Comment 14•21 years ago
OS->All
Confirming based on number of confirming comments and the entry in the FAQ at
www.heise.de.
Status: UNCONFIRMED → NEW
Ever confirmed: true
OS: Linux → All
Comment 15•21 years ago
*** Bug 228695 has been marked as a duplicate of this bug. ***
Comment 16•21 years ago
All current Mozilla versions have a limit of 300 cookies that can be stored. I currently have over 250 cookies that I need to keep around, and I've gotten to the point of rejecting almost all new cookies that come in so they don't overwrite the ones I need. Given that almost every site these days wants to write at least one, and often several, cookies, the old limit of 300 is way too low.
Comment 17•21 years ago
I have also begun to be bitten by this problem lately. I don't think I'm visiting any sites that are misusing cookies; it's just that I have collected so many different ones that the ones I really need to keep are getting overwritten.
For instance, the cookies used to keep track of what messages I've read on various forums I read regularly keep getting overwritten, almost on a daily basis lately, which is extremely annoying; annoying to the point that I have been forced to switch back to IE for keeping up with some sites.
This extremely low limit on cookies is ridiculous and will only grow more so as time goes on. Please at least let users configure a higher number if they desire.
Comment 18•21 years ago
If the ones you are using keep getting overwritten due to the 300 limit, that would suggest that your daily browsing involves over 300 unique cookies getting set in a day. I would suggest that perhaps something else is to blame for the sites not working; but then again, it only takes 15 sites that set 20 cookies apiece, although that might not be the issue.
Comment 19•21 years ago
When you surf a lot of pages in one day (which many of us do) they all want to
set many cookies and every damn popup or online ad also wants to set some kind
of cookie to track users. Blowing past the arbitrary limit of 300 is almost
trivial. All we are asking for is a user preference to override the hardcoded limit.
Comment 20•21 years ago
I would be against adding a pref to raise the limit.
I would be against removing the limits. i don't want 10GB of cookies on my system.
I wouldn't have a problem with just raising it. Today's computers can handle it :)
Comment 21•21 years ago
Why are you against adding a pref so the user can do what they want? Making arbitrary decisions about what a user wants isn't going to make this the best browser out there.
Comment 22•21 years ago
Requiring the user to find the pref and figure out a working value isn't good. A user doesn't want to play with prefs to get his browser working.
Making it so that it just works(tm) is good.
Assignee
Comment 23•21 years ago
I agree with mvl, we are not going to have a pref for this. If we need to raise
the limit, so be it. Adding a pref is both user unfriendly and pointless, for
such a backend detail as this.
Comment 24•21 years ago
Well the current 'solution' isn't working and the user can't fix it since there
is no way to change the limit.
So why do we have prefs for History size and cache size? Most people are never
going to change those either, but at least we give them the option.
Assignee
Comment 25•21 years ago
I've already told you. read my comment again. if we need to raise the limit, we
can do that. ok?
history and cache are not the same as cookies, because they have more
flexibility wrt browser implementation. lack of history and/or cache (due to the
user switching them off or setting their size to low values) doesn't have the
potential to stop a site from working - an insufficient limit on cookies does,
as you have discovered.
Comment 26•21 years ago
But we already allow the user to block all cookies, which would have the same
effect as someone setting the cookie limit to zero.
I guess I just see more user options as a good thing, but others obviously feel
differently.
Comment 27•21 years ago
I guess making the limits accessible from the outside doesn't necessarily mean making the data available through some preferences pane. It would be sufficient if the limits could be edited from the outside by the 'advanced' user. That's the approach Apple is using in Safari. However, if you want to keep this hardcoded, why don't you increase the number of cookies to 1,000 as a start? I would be glad to report back whether this solves my current problems.
Comment 28•21 years ago
Suggested limits:
max 50 cookies per domain.
max 3000 cookies in total.
Yes, this is still arbitrary, but at least somewhat higher.
Maybe we should #ifdef stuff for minimo, so it doesn't have a huge table in memory. (thanks to mconnor for the suggestion)
dwitte, darin: thoughts on the numbers?
Comment 29•21 years ago
thought I'd poke my head in here.
3000 cookies is probably excessive: given the figure dwitte gave of the 300-cookie table taking ~30k (roughly 100 bytes per cookie), this works out to something like 300k of memory if we hit that maximum. I would be against raising the limit beyond 1000 unless we also implement some sort of auto-pruning for unused cookies (i.e. bug 202690).
just simply increasing the memory footprint of a feature tenfold without a control mechanism is bad :)
Assignee
Comment 30•21 years ago
50/1000 sounds reasonable, i agree with mconnor's points about overall size.
the per-domain and total numbers aren't necessarily related though; i remember
seeing one complaint that the per-domain numbers were too low, but otherwise it
seems like most of the complaint is about the total number. so it may not be
necessary to increase the per-domain number as much...
Comment 31•21 years ago
dwitte: This bug is about a site that needs > 20 cookies. You are being ironic :)
My 3000 number is more like: we do 300 now, let's just multiply it by 10. But actually, I think 300 isn't the problem for most users. 1000 will do fine.
Comment 32•21 years ago
I'll speak up again. Regarding my recent complaint, I examined all the cookies I
had, and determined I was running up against the 300 limit. I went through and
removed a bunch of ad-related ones and other unimportant ones in order to get it
back down around 200. No site had more than about 10... the average was around 3
or 4.
So... I don't think the 20 limit per site is so bad - the 300 max is what hit me.
Comment 33•21 years ago
raising the cookie limits sounds like a good idea to me.
preferences might make sense in the context of an application embedding gecko.
consider minimo for example. it wants cookie support, but maybe it wants to
reduce footprint by restricting the cookie limits as much as the spec allows.
not having preferences might encourage someone to #ifdef the code.
i might not bother setting default values in all.js for these prefs. having
them be hidden prefs is probably sufficient (i.e., they don't need to be listed
in about:config).
at some point we need to provide better documentation to embedders that explains
the set of prefs like this that can be configured.
Comment 34•21 years ago
darin: if there's prefs that aren't documented in
http://www.mozilla.org/projects/netlib/cookies/cookie-prefs.html please let me know.
Reporter
Comment 35•21 years ago
For me, as the original bug reporter, the 20 cookies per site limit DOES hurt; it still hurts. On the other hand, I never have a problem with the overall limit. I do not have a problem with that limit because Mozilla asks me before setting a cookie, and I never agree to a cookie which smells like advertising or comes from sites Google leads me to. I am sure that limit is very easily reached when one allows cookies for every site.
I would be happy with being able to set a different value in the preferences file; there is no need to be able to set/change it in a running Mozilla.
Whatever value is hardcoded in the source, there will be people arguing for a smaller value (memory) and people arguing for a larger value.
Comment 36•21 years ago
Am I wrong, or do both RFC 2109 and RFC 2965 (which supersedes 2109) read as follows:
"In general, user agents' cookie support should have no fixed limits. They
should strive to store as many frequently-used cookies as possible."
(See section 6.3 in 2109, and section 5.3 in 2965.)
There *must* be a better way to deal with the potential of a large cookies.txt file than to delete all but the 300 most recent cookies. (Not to mention that we're talking about a file that, in its current incarnation, is around 20-30K in size, as compared to a browser that's anywhere from 10 to 27 MB in size.)
And just to give a real-world example of how the 300-cookie limit impacts someone: if I don't log into my online banking site at least once every four to five days, I have to hunt down my ATM card so that I can retype my card number. Likewise, if I don't log into my mutual fund tracking site every four to five days, I have to grab an account statement to find my account number.
Neither is reasonable, given that there could be a single, persistent line in my
cookies.txt file to keep those pieces of information for me!
(And the platform for this bug should be changed to All, so far as I know; it's
a problem on Mac and Unix, as well.)
Comment 37•21 years ago
most banking sites I've seen have 5-15 minute timeouts, 4-5 days is a little
crazy. And are you saying they disable saving passwords, but persist cookies
for login sessions? That's a crazy security model, given default browser
settings for major browsers.
Regarding the RFCs, short of imposing an arbitrary method for determining when to expire cookies (i.e. expiring cookies unused for x days), this basically would amount to "the longer you go between clearing cookies, the more memory you're stuck with".
also, any other option based on "commonly used" instead of "most recently set 300" would involve tracking frequency of access/updates, with all of the privacy concerns that would raise (i.e. you could see that I query bugzilla dozens of times a day, based on the BUGLIST cookie).
Comment 38•21 years ago
No, the banking site login is based on my bank card number as my login name, and
that number can be saved in a persistent cookie. But if I see 300 cookies
between logins, then that cookie goes away, and I have to dig out my card again
in order to use their site. And it was one example; there are tons of others,
all of which demonstrate users who chose an option to have a piece of information stored in a persistent cookie, and all of which involve that piece of information being deleted based on a totally arbitrary, unmodifiable limit.
And we're seriously talking about the memory use of 300 vs. 1000 vs. 3000
cookies? I can't imagine that it *touches* the memory use of, say, caret
browsing or stored form data.
Comment 39•21 years ago
going from 30k to 300k of memory used is the difference between 300 and 3000
cookies. That's a big deal on a PC with 32 MB of RAM.
in any case, it'll be a pref at some point
Comment 40•21 years ago
Just a data point -- on my machine, Firefox's memory footprint is 95 MB (iBook, OS X), so the idea of running it on a 32 MB machine already seems to be moot.
And if the notion truly is that adequate cookie support is not worth the memory it takes, then it's hard to see how Mozilla aims to be anything but fringe. Selling users on a browser that will, *by design*, forget supposedly persistent data most likely depends on those users not knowing that the behavior exists.
Assignee
Updated•21 years ago
Status: NEW → ASSIGNED
Whiteboard: DUPEME
Assignee
Comment 41•21 years ago
Assignee
Comment 42•21 years ago
Comment on attachment 146737 [details] [diff] [review]
increase limits and add prefs
this ups the limits to 50/1000 respectively, and adds hidden prefs for them.
(note that if a user decreases the limit, say from 1000 down to 300, this won't have an immediate effect on the cookie list size, since the cookie service is designed with a fixed list size in mind - the list will gradually pare down as cookies are set. limit increases are no problem.)
i figured we don't need kMaxBytesPerCookie to be tunable - darin, any opinion?
Attachment #146737 - Flags: superreview?(darin)
Attachment #146737 - Flags: review?(mconnor)
Assignee
Comment 43•21 years ago
actually, for limit decreases, the list size won't pare down at all, except by
cookie expirations. if we care, i can fix this somewhat shabbily...
Comment 44•21 years ago
Comment on attachment 146737 [details] [diff] [review]
increase limits and add prefs
r=me, although I think we still need to find a better way of expiring unused
cookies, but that's beyond the scope of this bug really
Attachment #146737 - Flags: review?(mconnor) → review+
Comment 45•21 years ago
Comment on attachment 146737 [details] [diff] [review]
increase limits and add prefs
>Index: netwerk/cookie/src/nsCookieService.cpp
>@@ -1226,23 +1240,23 @@ nsCookieService::AddInternal(nsCookie
>- if (CountCookiesFromHost(aCookie, data) >= kMaxCookiesPerHost) {
>+ if (CountCookiesFromHost(aCookie, data) >= mMaxCookiesPerHost) {
>- if (mCookieCount >= kMaxNumberOfCookies) {
>+ if (mCookieCount >= mMaxNumberOfCookies) {
If you make those two a |while|, won't that fix the problem with reducing the limits? Granted, it would only take effect the next time a cookie is set, but better than not at all.
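For illustration, a standalone C++ sketch of the |while| idea (not the patch itself; the container and the pop_front() stand-in for "remove the oldest cookie" are invented): once the check loops, a list that is already over a freshly lowered limit is pared back down the next time a cookie is set.

#include <cstddef>
#include <list>

// Evict until the list is under the (possibly just-lowered) total limit,
// then append the new cookie.
template <typename Cookie>
void AddWithEvictionLoop(std::list<Cookie>& cookies, const Cookie& newCookie,
                         std::size_t maxTotal) {
  while (!cookies.empty() && cookies.size() >= maxTotal) {
    cookies.pop_front();  // stand-in for removing the oldest cookie
  }
  cookies.push_back(newCookie);
}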
Comment 46•21 years ago
Comment on attachment 146737 [details] [diff] [review]
increase limits and add prefs
> PRInt32 val;
...
>+ if (NS_SUCCEEDED(aPrefBranch->GetIntPref(kPrefMaxNumberOfCookies, &val)))
>+ mMaxNumberOfCookies = val;
>+
>+ if (NS_SUCCEEDED(aPrefBranch->GetIntPref(kPrefMaxCookiesPerHost, &val)))
>+ mMaxCookiesPerHost = val;
so the type of val is PRInt32, but the type of mMaxNumberOfCookies and mMaxCookiesPerHost is PRUint16. should you range check val at all?
what if someone sets maxNumber to 999999 in an attempt to allow an unlimited number of cookies? maybe the result will not be what they expected.
you could write:
mMaxNumberOfCookies = (val >= 0 && val <= 0xFFFF) ? val : 0xFFFF;
or maybe that's overkill. i use the CLAMP macro defined in nsHttp.h for
this purpose in nsHttpHandler.cpp, which you might want to duplicate here.
otherwise, looks fine
sr=darin
Attachment #146737 - Flags: superreview?(darin) → superreview+
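A minimal standalone sketch of the suggested range check, assuming (as the comment does) a signed 32-bit pref value being stored into an unsigned 16-bit member. The function name is invented; the CLAMP macro darin refers to lives in nsHttp.h and may differ in detail.

#include <cstdint>

// Clamp a pref value into the range a 16-bit member can hold; out-of-range
// (including negative) values fall back to the maximum, matching the ternary
// expression suggested in the review comment above.
static inline uint16_t ClampPrefToUint16(int32_t val) {
  return (val >= 0 && val <= 0xFFFF) ? static_cast<uint16_t>(val) : 0xFFFF;
}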
Assignee
Comment 47•21 years ago
checked in. i added a variation on your CLAMP macro, and made all the pref
getters use it. thx!
Status: ASSIGNED → RESOLVED
Closed: 21 years ago
Resolution: --- → FIXED
Comment 48•20 years ago
Can we port this to the aviary branch, please?
Whiteboard: needed-aviary1.0
Assignee
Comment 49•20 years ago
this fix was checked in to trunk 2004-04-23, and the aviary branch date was
20040515, so by my reckoning... it already is? ;)
Comment 50•20 years ago
No, it's not yet on the aviary branch. The aviary branch branched from the
1.7-branch, which is still unpatched.
Trunk:
http://lxr.mozilla.org/seamonkey/source/netwerk/cookie/src/nsCookieService.cpp#88
Aviary branch (what would I do without this lxr module?):
http://lxr.mozilla.org/aviarybranch/source/netwerk/cookie/src/nsCookieService.cpp#83
Graph (this is rev. 1.25):
http://bonsai.mozilla.org/cvsgraph.cgi?file=mozilla/netwerk/cookie/src/nsCookieService.cpp
Comment 51•20 years ago
yup, right. dwitte must have been confused (or something worse) :)
mconnor, care to port & check in? i guess this should apply without much trouble.
Comment 52•20 years ago
So, I'm assuming by the graph that this also isn't part of Firefox 0.9? That's
a bummer...
Comment 53•20 years ago
it'll go into 1.0beta
Updated•20 years ago
Flags: blocking-aviary1.0?
Comment 54•20 years ago
Comment on attachment 146737 [details] [diff] [review]
increase limits and add prefs
Requesting approval1.7.2.
This is a low-risk patch which fixes a major German IT news site.
Attachment #146737 - Flags: approval1.7.2?
Updated•20 years ago
Flags: blocking-aviary1.0RC1?
Comment 55•20 years ago
Comment on attachment 146737 [details] [diff] [review]
increase limits and add prefs
a=mkaply for 1.7.2
Attachment #146737 - Flags: approval1.7.2? → approval1.7.2+
Assignee
Comment 56•20 years ago
i'll check this into the 1.7 branch and jump on mconnor for an aviary checkin.
Assignee
Comment 57•20 years ago
fixed on aviary1.0 and 1.7 branches.
Keywords: fixed-aviary1.0, fixed1.7.2
Whiteboard: needed-aviary1.0
Assignee
Updated•20 years ago
Flags: blocking-aviary1.0RC1?
Flags: blocking-aviary1.0?
Reporter
Comment 58•20 years ago
Just a final comment. heise.de (the site where this bug caused trouble) wants to thank you for fixing this problem. If you can read German, read their statement here:
http://www.heise.de/foren/faq/go.shtml?forum_id=7262#mozilla
[[otherwise use Babelfish or Google or the like]]
Many thanks from me too!