Closed Bug 219157 Opened 17 years ago Closed 4 years ago
Cookie not sent when site is disabled from setting cookies (including "for originating site")
16.51 KB, text/plain
718 bytes, text/html
5.98 KB, patch
2.20 KB, patch
6.37 KB, patch
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.5b) Gecko/20030913 Galeon/1.3.8
Build Identifier: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.5b) Gecko/20030913 Galeon/1.3.8

In previous versions of Mozilla, if a site set a cookie and you then blocked it from setting any more, the first cookie it had set was still sent to the server. As of 1.5b this no longer seems to be the case.

Reproducible: Always

Steps to Reproduce:
1. Go to http://news.bbc.co.uk
2. Select one of the 'domestic' or 'international' options. (This pref is stored in a cookie.)
3. Block cookies for news.bbc.co.uk
4. Go back to http://news.bbc.co.uk

Actual Results:
When you go back to the site, the cookie is not sent, so you have to select which part of the site to view again.

Expected Results:
Mozilla should send the cookies it has currently set, but disallow setting of further cookies, as it did in all previous versions.
i think our current behavior is more privacy-oriented, which is a good thing for most users. i implemented that for a reason. however, i agree that we should have a setting that allows currently set cookies to be sent... it's a nice usability feature. not sure how we could do that nicely just yet.
The main problem is that the UI and the behaviour are completely inconsistent at the moment. The UI talks about blocking sites from _setting_ cookies, but the implementation blocks both setting and sending cookies; there is a significant difference there. IMO the old behaviour, where you blocked a site from setting cookies but any already set (which, if you cared about privacy, you could delete) were still sent, made much more sense.
> The main problem is that the UI and the behaviour are completely inconsistent
> at the moment. The UI talks about blocking sites from _setting_ cookies, but
> the implementation blocks setting and sending cookies; there is a significant
> difference there.

okay, that's a different issue... we can fix that at least :)

> IMO the old behaviour where you blocked a site from setting cookies, but any
> already set (which, if you cared about privacy, you could delete) made much
> more sense

sure, but I'm arguing that there exist users for whom the opposite is true. as i said, i'm not sure what the best solution is yet
The behaviour is quite weird (using Firebird 0.7 Gecko/20030925). According to the log (attached above), the browser sends cookies to newsimg.bbc.co.uk -- which is not even in cookperm.txt -- but not to news.bbc.co.uk. The relevant lines from my system are:

cookperm.txt:
www.bbc.co.uk 0F
news.bbc.co.uk 0F

cookies.txt:
.bbc.co.uk TRUE / FALSE 1091826396 BBCNewsAudience Domestic

This worked fine until a few weeks ago. linuxtoday.com doesn't log in now, either.
um, why are you complaining? you've blocked cookies for news.bbc.co.uk, and mozilla isn't sending or accepting cookies from it. you haven't blocked newsimg.bbc.co.uk, so mozilla sends cookies to and accepts cookies from it. i fail to see the problem.
The problem is that the behaviour has changed. We have had the old behaviour for many years, and then suddenly it changes. I haven't blocked cookies from news.bbc.co.uk, I have blocked news.bbc.co.uk from "setting" cookies. There is a difference. From a user's PoV (i.e. mine) the old behaviour makes a lot of sense: you can accept some cookies and then block any more from being set, but the old ones still go through. If you wanted to remove the already-set cookies you could remove them from the cookie manager.
i wasn't asking you, i was asking alistair. you've already made your case ;)
My case is the same... bbc.co.uk tries to set several cookies, including one called UID, which I don't want set; the only one I want set is BBCNewsAudience. The old behaviour was fine -- you could delete any cookies you didn't want and stop any more from being set; the current behaviour is a regression which means I keep getting dumped at http://news.bbc.co.uk/shared/hi/interstitial-news.stm . Let me put it another way: if a cookie shows up in the cookies panel, it's counter-intuitive that it isn't actually sent now.
*** Bug 223035 has been marked as a duplicate of this bug. ***
so either we need to undo this behaviour, or delete the cookies affected by a perms change. The latter makes more sense, but entails risk if people don't realize how domain walking works :) however, if you want to block a site, you probably don't want it using the saved cookie to track you either, so maybe deletion is the best answer?
I can recreate the described behavior when the "for originating site" option is selected. Cookies that were previously set by site 1 are not sent when site 2 references site 1 (with an img/iframe/etc). This is slightly different from the situation argued above, where disabling sites should possibly clear the cookies as well: here the user still wants the cookies for site 1 as well as those from site 2, but because the "originating site" is site 2, site 1's cookies are not sent with the request. You would expect the cookies to be sent while they cannot be updated, because the option only restricts setting cookies.
Status: UNCONFIRMED → NEW
Ever confirmed: true
Summary: Cookie not sent when site is disabled from setting cookies → Cookie not sent when site is disabled from setting cookies (including "for originating site")
In nsCookieService.cpp, both GetCookieStringFromHttp (send) and SetCookieStringFromHttp (receive) use CheckPrefs to determine whether a cookie should be rejected/accepted. They can tell whether it's a set or a get by checking aCookieHeader. Perhaps instead of rejecting both setting and getting (for both specific-site cookie blocking and the preference-based originating-site check), let the cookie be sent (get) anyway:

< return nsICookie::STATUS_REJECTED;
> return aCookieHeader ? nsICookie::STATUS_REJECTED : nsICookie::STATUS_ACCEPTED;
-> default owner
Assignee: darin → nobody
I'd like to work on this bug :)
(In reply to mitchdevel from comment #16)
> I'd like to work on this bug :)

It is yours. Ask if you need help.
Assignee: nobody → mitchdevel
Status: NEW → ASSIGNED
Sites can now load currently saved cookies when the user has prevented sites from saving new cookies
Attachment #8754330 - Flags: review?(mcmanus)
Comment on attachment 8754330 [details] [diff] [review]
Removed preferences check when getting cookie

thanks for the patch - valentin will take a look at it
Attachment #8754330 - Flags: review?(mcmanus) → review?(valentin.gosu)
thanks, i wasn't sure who to ask
How does this affect Private Browsing mode? A cookie gets accepted in normal mode. User then prevents the site from saving new cookies. In normal browsing mode with this patch applied, the website can still read the original cookie, but not set a new one. Is that first cookie readable in Private Browsing mode?
In the case of private browsing mode the code creates an empty database; see line 2988 of nsCookieService.cpp.
Comment on attachment 8754330 [details] [diff] [review]
Removed preferences check when getting cookie

Review of attachment 8754330 [details] [diff] [review]:
-----------------------------------------------------------------

Thanks for the patch. It felt like this needed a test. I hope you don't mind that I wrote it myself.
Attachment #8754330 - Flags: review?(valentin.gosu) → review+
Attachment #8754599 - Flags: review?(mcmanus)
Setting the assignee to the actual owner. Again, thanks for implementing this.
Assignee: valentin.gosu → mitchdevel
I have a couple of questions that I'd love to get an answer to if you have any time:

1. What made you decide it would need a new permanent test? In case somebody unknowingly reverts the change?

2. I noticed you added the test in "netwerk/test/mochitests/" rather than creating a new folder at "netwerk/cookie/test/mochitests/". Would there have been a downside to creating a new mochitests folder and putting the test in there?

Sorry if these questions are silly, I am learning. Thanks for the help with the bug!
(In reply to mitchdevel from comment #26)
> I have a couple questions that I'd love to get an answer for if you have any
> time

These are very good questions.

> 1. What made you decide it would need a new permanent test? Incase somebody
> unknowingly reverts the change?

That is one reason. It is also common to add a test whenever changing a behaviour to ensure it works as expected.

> 2. I noticed you added the test in "netwerk/test/mochitests/" rather than
> creating a new folder at "netwerk/cookie/test/mochitests/". Would there have
> been a downside to creating a new mochitests folder and putting the test in
> there?

There's no strong reason for this choice. The cookie test folder is probably the better choice.

> Sorry if these questions are silly I am learning and thanks for the help
> with the bug!

No questions are silly. Good work on your first patch! Also, in case you haven't found it, http://codefirefox.com/ is an excellent resource for new contributors.
Thank you Valentin, that's helped a lot. :)
Attachment #8754599 - Flags: review?(mcmanus) → review+
Mitch, it seems we have a test failure in test_pluginstream_3rdparty.html. Would you like to investigate?
I'll have a look
- renamed CheckPrefs to CheckSafe to better reflect its usage
- updated all references to CheckPrefs to CheckSafe
- added a new parameter to allow different behaviour while getting and setting cookies
- most previous tests that failed on the try server should now work correctly (I ran all tests that failed previously, plus all tests in netwerk/test/ and netwerk/cookie/test/)
- one test did still fail, although the message it gave relates to UI and seems very minor, so I'm not sure whether that is an issue
Attachment #8755061 - Flags: review?(valentin.gosu)
Comment on attachment 8755061 [details] [diff] [review]
Fix after try server test fail

Review of attachment 8755061 [details] [diff] [review]:
-----------------------------------------------------------------

This is much better. r=valentin

We'll wait for the tests to complete, then I think we can land this. If you want to make these changes you're welcome to upload the fixed patch; otherwise I'll land the patch I pushed to try where I fixed them. Thanks a lot for your contribution!

::: netwerk/cookie/nsCookieService.cpp
@@ +2026,5 @@
> nsCookieKey key(baseDomain, aOriginAttrs);
>
> // check default prefs
> + CookieStatus cookieStatus = CheckSafe(aHostURI, aIsForeign, requireHostMatch,
> + aCookieHeader.get(), true);

nit: indent one char to the left.

@@ +3005,5 @@
> }
>
> + // make sure we are sending the cookie to the correct place and are allowed to
> + CookieStatus cookieStatus = CheckSafe(aHostURI, aIsForeign, requireHostMatch,
> + nullptr, false);

nit: indent

@@ +3013,5 @@
> + case STATUS_REJECTED_WITH_ERROR:
> + return;
> + default:
> + break;
> + }

nit: These lines were deleted in attachment 8754330 [details] [diff] [review], and added back. That patch should be obsoleted, and these lines left unchanged.

@@ +3791,4 @@
> bool aIsForeign,
> bool aRequireHostMatch,
> + const char *aCookieHeader,
> + bool aCheckPrefs)

nit: indent all params one char to the left.
Attachment #8755061 - Flags: review?(valentin.gosu) → review+
Fixed the indentation errors and combined the previous patches
Thanks for the help :)
https://hg.mozilla.org/integration/mozilla-inbound/rev/c18023d517ed93d0e506f410c28ddfa60a2e6eee
Bug 219157 - Test that cookies are set only when domain has permission. r=mcmanus

https://hg.mozilla.org/integration/mozilla-inbound/rev/6647b13a6ad92410d6a71298eded13ca655edbbe
Bug 219157 - Fix to prevent 3rd party sites from getting cookies r=valentin
These patches change behavior when a user grants cookie permission with exception rules and the default behavior is to reject. nsCookieService::CheckSafe, called in nsCookieService::GetCookieStringInternal, bypasses pref checks. So exception rules are completely ignored: even if the user sets "accept" for a domain, access to cookies is rejected by the default setting. Is this behavior intended?
That was unintentional, thanks for pointing it out.
Status: RESOLVED → REOPENED
Resolution: FIXED → ---
This should be a simple fix, but I have a couple of questions now that I hope can be answered:

1. Should the test added in the previous patches be updated to cover this new circumstance?

2. Should we also allow sending already saved cookies when the global default behaviour is set to prevent accepting new cookies, to be consistent with this change?

Sorry for making more work.
This makes it so we can ignore per-site preferences only if they would normally deny us sending the cookie. I'll wait to hear your opinion before changing anything extra.
(In reply to mitchdevel from comment #41)
> This should be a simple fix, but I have a couple questions now that I hope
> can be answered
>
> 1. Should the test added in the previous patches be updated to test this new
> circumstance?

Yes, please!

> 2. Should we also allow sending already saved cookies when the global
> default behaviour is set to prevent accepting new cookies to be consistent
> with this change?

I think that we do want to be consistent about this issue. Not sending cookies could be achieved by deleting the cookies for all (or each individual) site. However, this means changing behaviour we've had for a long time, and handling this in a completely different way from Chrome. Patrick, do you have some input on this? I'd say this change warrants at least a release note.
This assumes the current behaviour for default preferences (sites cannot get cookies if setting them is disabled)
Patrick, I forgot to ni? you on comment 43. I have backed out the patches to fix bug 1276462. The bottom line is that this change makes sense to me for the case where you blacklist each website. However, if we generalize this to the case where we globally don't allow cookies, it makes for a completely different behaviour than the one I tested in Chrome. Do you have some thoughts on how to proceed?
Flags: needinfo?(valentin.gosu) → needinfo?(mcmanus)
Dan do you have any thoughts on this?
Flags: needinfo?(sworkman) → needinfo?(dveditz)
Attachment #8755794 - Attachment is obsolete: true
It's dangerous to resurrect 13-year-old bugs. The entire first half of this bug does not describe the current product and leads to incorrect conclusions.

If we want to change cookie behavior around this, we need to start in a clean bug, with a clear proposal of what will change and how it interacts with the current UI and currently available cookie prefs, plus a problem statement of what it's trying to solve and some evidence that this is, in fact, a problem for enough people to be worth fixing.

At the time this bug was written it was a complaint about a deliberate change in behavior. 13 years later that changed behavior is now "normal" and mirrors that of other products. We also have other features, like the global "allow from visited" setting, that (for 3rd-party cookies) disallow setting but send any existing cookie -- exactly what this bug is asking for, at least in the 3rd-party context.
Status: REOPENED → RESOLVED
Closed: 4 years ago → 4 years ago
Resolution: --- → WONTFIX
actually INVALID is better. you can make a case for a new setting in a new bug, but this bug, as described, no longer applies to modern Firefox.
Resolution: WONTFIX → INVALID
I want to note that the "door hanger" permission UI still reads "Set Cookies" - Allow, Allow for Session, Block. I also filed bug 1278011 to add tests for "getting cookies" as well, to avoid regressions such as bug 1276462.