Ensure LocalStorage databases aren't created for blocked sub-domains which are potentially reached via a redirect from their root domain (ex: blocking www.reddit.com and navigating to reddit.com)
Categories
(Core :: Storage: localStorage & sessionStorage, defect, P2)
Tracking
| | Tracking | Status |
|---|---|---|
| firefox-esr78 | --- | wontfix |
| firefox74 | --- | wontfix |
| firefox75 | --- | wontfix |
| firefox84 | --- | wontfix |
| firefox85 | --- | fix-optional |
| firefox86 | --- | fix-optional |
People
(Reporter: cbaica, Unassigned)
Details
(Keywords: regression, Whiteboard: dom-lws-bugdash-triage)
Attachments
(2 files)
Affected versions
- Fx74.0b9
- Fx75.0a1
Affected platforms
- Windows 7
Steps to reproduce
- Launch Firefox.
- Go to about:preferences#privacy, scroll down to the 'Cookies and Site Data' section and click 'Manage Permissions'.
- In the address field, enter 'https://www.reddit.com', click the 'Block' button and save the changes.
- In a new tab, navigate to the blocked website (reddit.com).
- After the website loads, switch back to the about:preferences#privacy page and refresh it.
- Scroll down to the 'Cookies and Site Data' section and click 'Manage Data'.
Expected result
- Cookies from reddit.com should not be displayed.
Actual result
- Cookies from reddit.com are displayed.
Regression range
- Will come back with a regression range ASAP.
Additional notes
- Issue can't be reproduced on Windows 10.
Reporter
Updated•5 years ago
Comment 1•5 years ago
I can't reproduce on OSX, but I guess you said that this only reproduces on Windows 7 anyway, which seems really weird to me. If this can be consistently reproduced, getting a regression range would be nice.
Updated•5 years ago
Reporter
Comment 2•5 years ago
As a side note, I've investigated the issue further on Ubuntu and macOS 10.13. I've managed to reproduce the issue there as well.
I've run the regression search and here is the result:
- Last good revision: e86c59fab68eb8c9139526aaa2480132b5fc1452
- First bad revision: faa3b669a3cddb7e7d5002d51dd403b6dea91da3
- Pushlog: https://hg.mozilla.org/integration/autoland/pushloghtml?fromchange=e86c59fab68eb8c9139526aaa2480132b5fc1452&tochange=faa3b669a3cddb7e7d5002d51dd403b6dea91da3
Comment 3•5 years ago
Thanks. It's interesting that you can reproduce on OSX, but the regression range seems very unlikely.
Just to clarify, you're using a fresh profile, right? And without having been to reddit.com on that profile before?
Reporter
Comment 4•5 years ago
Yes, I'm using a fresh profile every time, so there wouldn't be any 'residual' navigation data.
As for the regression range, I'm not sure whether it matters, but I got it on an Ubuntu 18.04 machine.
Comment 5•5 years ago
Steven, could you set the priority flag for this bug?
Updated•5 years ago
Comment 6•5 years ago
I can reproduce this on Nightly build 20200305212712 on Ubuntu 18.04.
However, I'm wondering if this is the expected functionality. The address entered in the video is https://www.reddit.com, and we see that no www.reddit.com cookies are set, only reddit.com cookies. If you instead enter https://reddit.com, then no cookies are set, so I'm guessing we either use full hostname matching against the domain attribute of the cookie, or eTLD+1.
Johann, can you confirm that's the expected functionality?
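(For context, not part of the original comment: the cookie Domain attribute controls which hosts a cookie is sent to, which is one reason cookies can end up listed under reddit.com even when only www.reddit.com was visited. A hypothetical illustration, with made-up cookie names:)

```js
// Hypothetical snippet, run in the devtools console on https://www.reddit.com:
document.cookie = "host_only=1";                    // host-only cookie, scoped to www.reddit.com only
document.cookie = "site_wide=1; Domain=reddit.com"; // scoped to reddit.com and every subdomain
```

A cookie set with Domain=reddit.com by www.reddit.com is scoped to the whole registrable domain, which could explain why the dialog shows cookies under reddit.com.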
Updated•5 years ago
Comment 7•5 years ago
In the video storage is set, though. So maybe we have a place where storage code is not respecting the cookie permissions correctly? I don't think the site data manager is showing regular cache in the storage section, so it would have to be something like localStorage.
Comment 8•5 years ago
Something weird is definitely happening if I navigate to "reddit.com" like in the video. When I opened the network panel in devtools I was able to reproduce locally, and the profile is showing that we are storing data in QuotaManager for the on-disk encoded origin of https+++www.reddit.com for both LSNG and the Cache API, and there is a ServiceWorker registration for https://www.reddit.com/. The only thing in Cache API storage is the ServiceWorker's https://www.reddit.com/sw.js script.
My naive presumption would be that we're sending the non-existent permissions for "reddit.com" or for a pre-STS upgrade "http" origin down to the process, not "https://www.reddit.com". If the permission isn't making it into the process then it would make sense that StorageAllowedForWindow would return StorageAccess::eAllow when it shouldn't. This same check is used both for LocalStorage and for ServiceWorkers as called by ServiceWorkerContainer::Register.
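A minimal sketch of how that shared check surfaces to page script, assuming the permission does make it into the content process (the probe key name is illustrative; the /sw.js path is the one observed above):

```js
// Run in the page context of https://www.reddit.com with cookies blocked for that origin.
// If the block reaches the process, both probes should fail with a SecurityError;
// if storage is still being created (this bug), they succeed instead.
try {
  window.localStorage.setItem("probe", "1");
  console.log("localStorage write succeeded (storage NOT blocked)");
} catch (e) {
  console.log("localStorage blocked:", e.name);
}

navigator.serviceWorker
  .register("/sw.js")
  .then(() => console.log("ServiceWorker registration succeeded (storage NOT blocked)"))
  .catch(e => console.log("ServiceWorker registration blocked:", e.name));
```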
I'm going to try to get a pernosco reproduction up now, without involving devtools.
Updated•5 years ago
Comment 9•5 years ago
Hi Andrew, any luck in getting the pernosco session?
Comment 10•4 years ago
I have managed to reproduce this issue on macOS 11 while running these steps:
- Open Firefox and go to about:preferences -> "Privacy & Security" section. Firefox opens and the "Privacy & Security" section is displayed.
- Make sure that the "Standard" option is set in the Enhanced Tracking Protection section. "Standard" is set by default.
- Go to Cookies and Site Data -> click on "Manage Data...". You can see a list with many sites displayed.
- In "Cookies and Site Data", click on the "Manage Permissions..." button. The "Exceptions - Cookies and Site Data" dialog opens.
- Add https://www.reddit.com/ in the field manually and hit the "Block" button. The website in question is added to the list with the "Block" status.
- Go to Reddit. The website will not load completely, as all cookies are automatically blocked.
- Go to Cookies and Site Data -> click on "Manage Data...". You can see a list of websites, but https://www.reddit.com/ is NOT displayed on that list. Reddit-related cookies are shown in the list.
Considering previous comments, this appears not to be Windows 7 specific and to be somewhat intermittent.
Updated•4 years ago
Updated•4 years ago
Comment 11•4 years ago
I have managed to reproduce this issue on Ubuntu 20.04 using the steps from comment 0. The cookies are still shown in the list.
Comment 12•3 years ago
Hi Jan, can you try to reproduce this, please?
Comment 13•3 years ago
I tested on macOS with Firefox Nightly 99.0a1 (2022-02-22) (64-bit) using the instructions in comment 0. Between every page load, I deleted the corresponding cookies. I also made sure to refresh about:preferences#privacy every time, because otherwise new cookies will not show up.
Doing multiple tests shows that it depends on how you navigate to the website, for example typing https vs. http, or using www or not. This could be the reason why some could reproduce it while others could not, or why it may appear intermittent. To give some more insight, I varied the URL in the blocklist. Also note that the cookie list in the "Manage Data..." dialog does not show the protocol of the host name.
The following overview outlines how the blocking URL affects whether cookies are saved or not, depending on the URL that is used to visit the website. [-] means no cookies for reddit.com were saved; [x] means there are cookies for reddit.com after going to the URL that follows the marker.
If I see cookies, they are always from reddit.com and not from www.reddit.com. Even with an empty blocklist, it's always the host without www that is listed in the "Manage Data..." dialog.
Blocking https://www.reddit.com
- [-] https://www.reddit.com
- [-] http://www.reddit.com
- [-] www.reddit.com
- [x] https://reddit.com
- [x] http://reddit.com
- [x] reddit.com
Blocking http://www.reddit.com
- [x] https://www.reddit.com
- [x] http://www.reddit.com
- [x] www.reddit.com
- [x] https://reddit.com
- [x] http://reddit.com
- [x] reddit.com
Blocking https://reddit.com
- [-] https://www.reddit.com
- [-] http://www.reddit.com
- [-] www.reddit.com
- [-] https://reddit.com
- [-] http://reddit.com
- [-] reddit.com
Blocking http://reddit.com
- [x] https://www.reddit.com
- [x] http://www.reddit.com
- [x] www.reddit.com
- [x] https://reddit.com
- [x] http://reddit.com
- [x] reddit.com
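(A hedged privileged-JS sketch, not part of the results above: one way to sanity-check which origins a cookie exception actually matches is to query the permission manager directly. This assumes the block was added for https://www.reddit.com, as in the first table, and uses the browser console.)

```js
// Browser console (privileged) sketch: query the permission manager for the "cookie" exception.
const origins = [
  "https://www.reddit.com",
  "https://reddit.com",
  "http://www.reddit.com",
];
for (const origin of origins) {
  const principal =
    Services.scriptSecurityManager.createContentPrincipalFromOrigin(origin);
  const action = Services.perms.testPermissionFromPrincipal(principal, "cookie");
  // ACCESS_DENY (from nsICookiePermission.idl) means the block applies to that principal.
  console.log(origin, action === Ci.nsICookiePermission.ACCESS_DENY ? "blocked" : "not blocked");
}
```

The output should make it clearer which of the visited origins the exception actually covers, which is the variable the tables above are probing.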
Comment 14•3 years ago
(asuth to try and identify existing test coverage that could form the basis of an automated test, will set needinfo on :jkrause when posting it)
Comment 15•3 years ago
I think the scope of this bug was probably a bit ambiguous, so I've re-titled it and I'll go into more detail here.
Bug scope: With cookies blocked for "https://www.reddit.com" and no QuotaManager storage for that origin on disk at PROFILE/storage/default/https+++www.reddit.com, navigating to "reddit.com" in the URL bar should still leave no storage for that origin on disk.
Extra context:
- The about:preferences#privacy page was changed since this bug was filed so that it only displays data at a granularity of eTLD+1. So if there is data for "www.reddit.com" it will be displayed as "reddit.com". If there's data for "reddit.com" it will also be displayed as "reddit.com". Same for "foo.reddit.com".
- reddit.com is in our strict-transport-security preload list, and that includes its subdomains. We will never attempt to talk to "http://reddit.com" or "http://www.reddit.com", only their https variants.
- https://reddit.com/ serves a 301 redirect to https://www.reddit.com and sets no cookies. https://www.reddit.com serves cookies with a domain of both reddit.com and .reddit.com. So it seems quite possibly reasonable that cookies might show up on "reddit.com" if we're only blocking "www.reddit.com". Arguably it's nonsensical for someone to block just a subdomain because of how cookies work, and maybe that's something that the "manage exceptions" UI should handle. But that's not this bug; that would be an anti-tracking/privacy bug.
- A bunch of improvements / bug fixes have been made to permission transmission and in general with the fission process model.
So I think we probably want to re-run these tests with a slightly altered procedure where we:
- Only test blocking for "https://www.reddit.com" and "https://reddit.com", since it seems like the http block, as expected, does not block https!
- Make sure there's no PROFILE/storage/default/https+++www.reddit.com before testing the navigation.
- Our check becomes only checking if that directory showed up or not. We don't care about the preferences privacy UI, and in fact we do expect that cookies will probably get set against "reddit.com" because of how cookies work and the specificity of the block.
Based on the existing results in comment 13 I would expect that we should probably pass this test.
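A minimal sketch of that on-disk check from privileged JS, assuming the PathUtils/IOUtils helpers that are available in the browser console and in tests:

```js
// Check whether QuotaManager created on-disk storage for the blocked origin.
const dir = PathUtils.join(
  PathUtils.profileDir,
  "storage",
  "default",
  "https+++www.reddit.com"
);
const exists = await IOUtils.exists(dir);
console.log(exists ? "FAIL: origin directory was created" : "PASS: no storage on disk");
```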
Test Investigation Notes
- Using searchfox to find the place where the cookie permissions come from: I sometimes have trouble remembering the exact cookie permissions and the interface name, so I actually did sf SESSION (where sf is my keyword bookmark for searchfox that I added by right-clicking in the searchfox.org search field and choosing "add a keyword for this search") and then used ctrl-f to search for "cookie" within the results.
- That got me to https://searchfox.org/mozilla-central/source/netwerk/cookie/nsICookiePermission.idl somewhat indirectly: I actually saw a line for netwerk/cookie/nsICookieManager.idl, clicked on the directory name to go to that directory, scanned the list of files for nsI prefixes, and then realized nsICookiePermission.idl is what I wanted.
- I clicked on ACCESS_DENY and chose the substring search option. (There should be a semantic option, but there's a regression in searchfox right now. The text search is actually preferable in this case anyway, because the semantic options aren't as guaranteed to figure out the JS semantic stuff, so the text search should find everything.)
- That gives me https://searchfox.org/mozilla-central/search?q=ACCESS_DENY&redirect=false
- Because searchfox chunks things up so that "Test files" are separate, I scrolled down to the test files section and saw that there is a fairly limited set of them there.
- I did notice a test file that mentions ACCESS_DENY without the constant actually being used, which is suspicious, so I clicked on it: https://searchfox.org/mozilla-central/source/dom/tests/mochitest/localstorage/test_cookieBlock.html#17. At https://searchfox.org/mozilla-central/source/dom/tests/mochitest/localstorage/test_cookieBlock.html#44 there's SpecialPowers.pushPermissions([{'type': 'cookie', 'allow': false, 'context': document}], startTest); which suggests there's probably a bunch of other tests that my search above missed.
- Searchfox has a regexp mode that's enabled by checking a little checkbox, so I can check at least for same-line variants of the above... that gets me https://searchfox.org/mozilla-central/search?q=SpecialPowers.pushPermissions.*cookie.*allow&path=&case=false&regexp=true which shows that there don't seem to be other same-line variants, at least. Unfortunately, same-line is a pretty bad requirement.
- So I used the secret searchfox fulltext-search-only context:N mechanism to search for pushPermissions usages and get extra lines of context. I force everything to be a fulltext search by putting "." in the path filter, which disables semantic searching. This gets me https://searchfox.org/mozilla-central/search?q=context%3A5+SpecialPowers.pushPermissions&path=.&case=false&regexp=false and then I can ctrl-f for "cookie" in there.
- Probably the most interesting thing I found in that initial ACCESS_DENY search was the link to https://searchfox.org/mozilla-central/source/netwerk/cookie/test/browser/head.js#10, which creates its own "PERM_DENY" constant and has a CookiePolicyHelper that runs a given test under a number of permutations: https://searchfox.org/mozilla-central/rev/73a8000b0c0eb527faef01ea17c6d2398622a386/netwerk/cookie/test/browser/head.js#16-24,77
- Ah, but that turns out to only be used for our storage-related APIs, mainly, whoops! https://searchfox.org/mozilla-central/search?q=symbol:%23CookiePolicyHelper&redirect=false

The general takeaway would probably be that CookiePolicyHelper is a good basis for any tests operating in this area, as it lets us confirm that we're really checking what we think we're checking by also making sure our check detects data when it's allowed, etc. But I don't know that we need to add an automated test for this at this time.
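For illustration only (not something we necessarily need to land, per the takeaway above), a rough mochitest-style sketch following the test_cookieBlock.html pattern quoted earlier, assuming the usual SimpleTest/SpecialPowers helpers:

```js
// Deny the cookie permission for the current document, then verify that LocalStorage
// is blocked (same SpecialPowers.pushPermissions pattern as test_cookieBlock.html).
SimpleTest.waitForExplicitFinish();

SpecialPowers.pushPermissions(
  [{ type: "cookie", allow: false, context: document }],
  startTest
);

function startTest() {
  let blocked = false;
  try {
    window.localStorage.setItem("probe", "1");
  } catch (e) {
    blocked = true;
  }
  ok(blocked, "LocalStorage access should throw when the cookie permission is denied");
  SimpleTest.finish();
}
```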
Comment 16•3 years ago
Thanks!
> Arguably it's nonsensical for someone to block just a subdomain because of how cookies work, and maybe that's something that the "manage exceptions" UI should handle. But that's not this bug, that would be an anti-tracking bug/privacy bug.
So we seem to confuse our users quite a bit here, to the point of confusing ourselves, too. I think this is worth filing a bug (if it does not exist already). Jan-Rio, apart from re-testing following :asuth's instructions, can you please check and file it if needed?
Comment 17•3 years ago
I did another test as suggested by Andrew in comment 15. I was using Firefox Nightly 100.0a1 (2022-03-09) (64-bit) on macOS, checking for files in PROFILE/storage/default/https+++www.reddit.com but not examining or clearing cookies in between.
Blocking https://www.reddit.com, navigating to ...
- https://www.reddit.com/: no files are created
- https://www.reddit.com: no files are created
- https://reddit.com/: files are created
- https://reddit.com: files are created
There is one exception or intermittent result when navigating to the non-www host, depending in particular on whether or not you use a trailing slash. Sometimes no files are created when using one of the last two URLs.
I think this is because the search bar handles the redirect itself and takes you directly to the https://www version of the website, skipping the server-side redirect of the website. See the attachment for clarification. I just pressed enter after typing the URL and did not click on any suggestion. I guess this also depends on whether or not you have already visited the website.
Blocking https://reddit.com
- no files are created at all using the 4 URLs from above
Updated•2 years ago
Updated•8 months ago