Firefox receives cookies for a website that was set as blocked in the permission section
Categories
(Core :: Storage: localStorage & sessionStorage, defect, P2)
Tracking
| Flag | Tracking | Status |
|---|---|---|
| firefox-esr78 | --- | wontfix |
| firefox74 | --- | wontfix |
| firefox75 | --- | wontfix |
| firefox84 | --- | wontfix |
| firefox85 | --- | fix-optional |
| firefox86 | --- | fix-optional |
People
(Reporter: cbaica, Unassigned, NeedInfo)
Details
(Keywords: regression)
Attachments
(1 file)
2.00 MB, video/mp4
Affected versions
- Fx74.0b9
- Fx75.0a1
Affected platforms
- Windows 7
Steps to reproduce
- Launch Firefox.
- Go to about:preferences#privacy, scroll down to the 'Cookies and Site Data' section and click 'Manage Permissions'.
- In the address field, enter 'https://www.reddit.com', click the 'Block' button and save the changes.
- In a new tab, navigate to the blocked website (reddit.com).
- After the website loads, switch back to the about:preferences#privacy page and refresh it.
- Scroll down to the 'Cookies and Site Data' section and click 'Manage Data'.
Expected result
- Cookies from reddit should not be displayed.
Actual result
- Cookies from reddit.com are displayed.
Regression range
- Will come back with a regression range ASAP.
Additional notes
- Issue can't be reproduced on Windows 10.
Updated•1 year ago
Comment 1•1 year ago
I can't reproduce on OSX, but I guess you said that this only reproduces on Windows 7 anyway, which seems really weird to me. If this can be consistently reproduced, getting a regression range would be nice.
Updated•1 year ago
Comment 2•1 year ago
As a side note, I've investigated the issue further on Ubuntu and macOS 10.13 and managed to reproduce it there as well.
I ran the regression search and here is the result:
- Last good revision: e86c59fab68eb8c9139526aaa2480132b5fc1452
- First bad revision: faa3b669a3cddb7e7d5002d51dd403b6dea91da3
- Pushlog: https://hg.mozilla.org/integration/autoland/pushloghtml?fromchange=e86c59fab68eb8c9139526aaa2480132b5fc1452&tochange=faa3b669a3cddb7e7d5002d51dd403b6dea91da3
Comment 3•1 year ago
Thanks. It's interesting that you can reproduce on OSX, but the regression range seems very unlikely.
Just to clarify, you're using a fresh profile, right? And without having been to reddit.com on that profile before?
Comment 4•1 year ago
Yes, I'm using a fresh profile every time, so there wouldn't be any 'residual' navigation data.
As for the regression range, I'm not sure whether it matters, but I got it on an Ubuntu 18.04 machine.
Comment 5•1 year ago
Steven, could you set the priority flag for this bug?
Updated•1 year ago
Comment 6•1 year ago
I can reproduce this on Nightly build 20200305212712 on Ubuntu 18.04.
However, I'm wondering if this is the expected functionality. The address entered in the video is https://www.reddit.com, and we see that no www.reddit.com cookies are set, only reddit.com ones. If I instead enter https://reddit.com, then no cookies are set, so I'm guessing we match either the full hostname against the cookie's domain attribute or the eTLD+1.
Johann, can you confirm that's the expected functionality?
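The two candidate behaviors can be sketched as follows. This is a hedged illustration only: the function names and the naive "last two labels" base-domain logic are mine, not Firefox's actual implementation, which would consult the Public Suffix List.

```javascript
// Exact-host matching: the permission only covers the literal host.
function matchesExactHost(permissionHost, cookieHost) {
  return permissionHost === cookieHost;
}

// eTLD+1 matching: naively take the last two DNS labels as the base
// domain (a real implementation would use the Public Suffix List).
function baseDomain(host) {
  return host.split(".").slice(-2).join(".");
}

function matchesBaseDomain(permissionHost, cookieHost) {
  return baseDomain(permissionHost) === baseDomain(cookieHost);
}

// A block entered for "www.reddit.com" misses a cookie whose domain
// is "reddit.com" under exact-host matching, but catches it under
// eTLD+1 matching:
console.log(matchesExactHost("www.reddit.com", "reddit.com"));  // false
console.log(matchesBaseDomain("www.reddit.com", "reddit.com")); // true
```

This would explain the observation above: a block on https://www.reddit.com under exact-host matching leaves cookies scoped to the reddit.com base domain untouched.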
Updated•1 year ago
Comment 7•1 year ago
In the video storage is set, though. So maybe we have a place where storage code is not respecting the cookie permissions correctly? I don't think the site data manager is showing regular cache in the storage section, so it would have to be something like localStorage.
Comment 8•1 year ago
Something weird is definitely happening if I navigate to "reddit.com" like in the video. With the network panel open in devtools I was able to reproduce locally, and the profile shows that we are storing data in QuotaManager for the on-disk encoded origin of https+++www.reddit.com for both LSNG and the Cache API, and that there is a ServiceWorker registration for https://www.reddit.com/. The only thing in Cache API storage is the ServiceWorker's https://www.reddit.com/sw.js script.
My naive presumption would be that we're sending the non-existent permissions for "reddit.com" or for a pre-STS upgrade "http" origin down to the process, not "https://www.reddit.com". If the permission isn't making it into the process then it would make sense that StorageAllowedForWindow would return StorageAccess::eAllow when it shouldn't. This same check is used both for LocalStorage and for ServiceWorkers as called by ServiceWorkerContainer::Register.
I'm going to try to get a pernosco reproduction up now, without involving devtools.
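The permission-propagation hypothesis above can be illustrated with a minimal sketch. The map and lookup here are hypothetical stand-ins for the permission manager, not actual Gecko code: if the block is stored under the origin the user entered, but the content process checks a different origin string, the permission is never found and the check falls through to allow.

```javascript
// Hypothetical stand-in for the permission store; the only "block"
// entry is keyed on the exact origin string the user entered.
const cookiePermissions = new Map([
  ["https://www.reddit.com", "block"],
]);

// Sketch of a StorageAllowedForWindow-style check: allow unless a
// "block" permission is found for the origin we were handed.
function storageAllowedFor(origin) {
  return cookiePermissions.get(origin) !== "block";
}

// Looking up the origin the permission was stored under works...
console.log(storageAllowedFor("https://www.reddit.com")); // false (blocked)

// ...but a pre-STS-upgrade http origin, or the bare domain, misses
// the entry entirely and storage is incorrectly allowed.
console.log(storageAllowedFor("http://www.reddit.com")); // true
console.log(storageAllowedFor("https://reddit.com"));    // true
```

Under this reading, both LocalStorage and the ServiceWorker registration slip through the same mismatched lookup, which matches the QuotaManager evidence above.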
Updated•1 year ago
Comment 9•1 year ago
Hi Andrew, any luck in getting the pernosco session?
Comment 10•5 months ago
I have managed to reproduce this issue on macOS 11 while running these steps:
- Open Firefox and go to about:preferences -> "Privacy & Security" section. (Firefox opens and the "Privacy & Security" section is displayed.)
- Make sure that the "Standard" option is set in the Enhanced Tracking Protection section. (Standard is set by default.)
- Go to Cookies and Site Data -> click on "Manage Data...". (A list with many sites is displayed.)
- In the "Cookies and Site Data" section, click on the "Manage Permissions..." button. (The "Exceptions - Cookies and Site Data" dialog opens.)
- Manually add https://www.reddit.com/ in the field and hit the "Block" button. (The website in question is added to the list with the "Block" status.)
- Go to Reddit. (The website does not load completely, as all cookies are automatically blocked.)
- Go to Cookies and Site Data -> click on "Manage Data...". (A list of websites is shown; https://www.reddit.com/ is NOT displayed on that list, but Reddit-related cookies are shown.)

Considering the previous comments, this appears not to be Windows 7-specific and is somewhat intermittent.
Updated•5 months ago