CORS fetch is not always shown as blocked in Firefox
Categories
(DevTools :: Netmonitor, defect, P2)
Tracking
(Not tracked)
People
(Reporter: doug.hs, Unassigned)
References
(Blocks 1 open bug)
Details
Attachments
(4 files)
User Agent: Mozilla/5.0 (X11; Linux x86_64; rv:84.0) Gecko/20100101 Firefox/84.0
Steps to reproduce:
From a page on SITE_B, perform a fetch() request to SITE_A. SITE_A returns an "Access-Control-Allow-Origin" header that does NOT match SITE_B:

Access-Control-Allow-Origin: SITE_X
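For context, the check a browser applies to such a response can be sketched roughly like this (a simplified, hypothetical helper — not Firefox's actual implementation, which follows the Fetch specification's CORS check and also handles credentials, null origins, etc.):

```javascript
// Simplified sketch of a CORS check: the response's
// Access-Control-Allow-Origin must be "*" or exactly equal to the
// requesting page's origin.
function corsCheckPasses(requestOrigin, allowOriginHeader) {
  if (allowOriginHeader === '*') return true;
  return allowOriginHeader === requestOrigin;
}

// SITE_B fetching from SITE_A, which answers with SITE_X:
console.log(corsCheckPasses('https://siteb.test', 'https://sitex.test')); // false -> must be blocked
console.log(corsCheckPasses('https://siteb.test', 'https://siteb.test')); // true
```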
Actual results:
Sometimes the fetch request from SITE_B to SITE_A will be blocked as expected, while other times it succeeds with code 200 and leaks the confidential JSON data to SITE_B.
The Console tab indicates the request was blocked, but the Network tab shows otherwise (see screenshot). The .catch() part of the .fetch() Promise chain appears to be working as expected (the webpage displays an error message).
I cannot reproduce this on Chromium and GNOME Web, only on Firefox.
Expected results:
The fetch request should always be blocked, because it violates the CORS configuration.
Comment 2•4 years ago
Bugbug thinks this bug should belong to this component, but please revert this change in case of error.
Comment 3•4 years ago
Could you provide some runnable code to reproduce, or a publicly accessible endpoint?
Comment 4•4 years ago
Also, can you clarify how you're establishing CORS is broken here?
Based on:
The .catch() part of the .fetch() Promise chain appears to be working as expected (the webpage displays an error message).
I suspect that what's really happening is that the devtools Network tab is not clear in how it labels these items. (There are downsides to indicating the request was "blocked": people seem prone to mistakenly assume this means the request was not made and the response not received, which of course is not the case in any browser - we need the response to the request in order to even see the Access-Control-Allow-Origin header.)
(In reply to Valentin Gosu [:valentin] (he/him) from comment #3)
> Could you provide some runnable code to reproduce, or a publicly accessible endpoint?

Sure, I'm working on it.
(In reply to :Gijs (he/him) from comment #4)
> Also, can you clarify how you're establishing CORS is broken here?

If the purpose of CORS is merely to prevent fetch from working with the response body, then there's nothing wrong with it. I just thought CORS was also meant to prevent obtaining the body of a response that violates it. This, combined with the observable behavior of other browsers, led me to that conclusion.

(In reply to :Gijs (he/him) from comment #4)
> Also, can you clarify how you're establishing CORS is broken here?

Now I understand that CORS is only meant to prevent scripts from using a cross-site resource, but nothing prevents us from doing it manually through the browser. If that's so, then this is not a bug. I apologize for the confusion.
Comment 7•4 years ago
(In reply to doug.hs from comment #5)
> (In reply to :Gijs (he/him) from comment #4)
> > Also, can you clarify how you're establishing CORS is broken here?
>
> If the purpose of CORS is merely to prevent fetch from working with the response body, then there's nothing wrong with it. I just thought CORS was also meant to prevent obtaining the body of a response that violates it.

It's meant to prevent the site making the request from working with the response body, i.e. to prevent data leaks from site A to site B if site A does not allow site B access. If site B does not get access to the response body, no data leak can happen.

> This, combined with the observable behavior of other browsers, led me to that conclusion.

Unfortunately, I think the other browser is misleading you. :-)
I'm on a Mac, so conveniently turning on Apache means that my localhost http root serves a page whose entire content is: <html><body><h1>It works!</h1></body></html>\n.

Loading any website in Chrome, and then running fetch("http://localhost/").then(r => r.text()).then(console.log) in its developer tools, indeed shows "Failed to load response data" in Chrome's network console, and shows the request was blocked. However, if you set up Wireshark to listen on the loopback interface and filter for HTTP traffic, it's easy to see that the HTTP request to the local Apache server is made, succeeds, and serves the entire content of the page to Chrome (which, I assume, throws it away, as nothing on the web will be able to access it).
(In reply to doug.hs from comment #6)
> (In reply to :Gijs (he/him) from comment #4)
> > Also, can you clarify how you're establishing CORS is broken here?
>
> Now I understand that CORS is only meant to prevent scripts from using a cross-site resource, but nothing prevents us from doing it manually through the browser. If that's so, then this is not a bug. I apologize for the confusion.

No worries at all. There's still a bug here, but fortunately it's not a security issue right before Christmas. ;-)

Firefox should still be consistent about how we display such requests in the network monitor; it's a display bug if we aren't, one that we should definitely fix, and for which the testcase Valentin asked for would definitely still be useful.
(In reply to :Gijs (he/him) from comment #7)
> Firefox should still be consistent about how we display such requests in the network monitor; it's a display bug if we aren't, one that we should definitely fix, and for which the testcase Valentin asked for would definitely still be useful.

Nice. I'm almost finished with a simple test project using Docker. If I manage to reproduce it there, I'll provide the code.
This is the Docker setup you can use to reproduce the glitches. The source code is included and it will be copied into the document root and managed by a volume.
Start it:

docker-compose up

Then add these lines to your /etc/hosts file:

127.0.0.1 sitea.test
127.0.0.1 siteb.test

Then in your browser, open http://siteb.test:8080/.

With the Network tab open in the inspector, start clicking the "fetch!" button on the page to trigger a fetch to http://sitea.test:8080/index.php.
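The button handler on siteb.test presumably looks something like the sketch below (a hypothetical reconstruction — the real code ships in the attached Docker project; the network call is injectable here so both outcomes can be exercised without a server):

```javascript
// Hypothetical reconstruction of the "fetch!" handler on siteb.test.
// When the CORS check fails, fetch() rejects with a TypeError, which is
// why the page's .catch() branch shows an error message.
async function triggerFetch(doFetch = () => fetch('http://sitea.test:8080/index.php')) {
  try {
    const response = await doFetch();
    const data = await response.json();
    // Reaching this point means the body leaked past the CORS check.
    return { blocked: false, data };
  } catch (err) {
    return { blocked: true, message: err.message };
  }
}

// Exercise the expected (blocked) path without any network:
triggerFetch(() => Promise.reject(new TypeError('NetworkError when attempting to fetch resource.')))
  .then((result) => console.log(result.blocked)); // true
```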
Observations
You'll notice that most of the blocked requests result in a 0-byte response. They show up as blocked and reveal no data in the Response tab, as expected. But sometimes you'll see a response size larger than 0 bytes; select it and navigate to the Response tab. The JSON data is there, unlike with the other blocked requests. This happens a lot; it's by far the more common glitch.

And, as I initially described, you'll also see responses with status 200 plus the JSON data available in the Response tab. These seem less common; I had to click many, many times to make one happen.
Reporter
Comment 10•4 years ago
Comment 11•4 years ago
Honza, are you able to look at the testcase here and see why the network monitor isn't consistently displaying CORS blockage?
Comment 12•4 years ago
@bomsy - I am forwarding the NI to you (per our offline discussion), thank you for looking into this.
Honza
Comment 13•4 years ago
(In reply to :Gijs (he/him) from comment #11)
> Honza, are you able to look at the testcase here and see why the network monitor isn't consistently displaying CORS blockage?

I was able to reproduce this on Firefox 84, but I see something slightly different: I see both blocked and unblocked (200) requests. All the requests, blocked and unblocked alike, show the data in the Response tab, and the Headers tab also always shows the 200 OK response code. See the attachment.

In Firefox Nightly (86), though, I only see blocked requests, so it seems that part of the issue is fixed.

But there seem to be a couple of issues left to fix:
- There should be no size shown for blocked requests
- In the details panel on the right, response information should not be shown

This is likely a devtools issue.
Thanks
Updated•4 years ago
Comment 14•4 years ago
(In reply to Hubert Boma Manilla (:bomsy) from comment #13)
> - In the details panel on the right, response information should not be shown

I think you know this, but just to be explicit: I think it's a good idea for that panel to make it clear that the JS on the page won't have access to the info, but we should avoid giving the impression Chrome gives here, that the request never made it to the server and was never processed there (because it did!). :-)
Comment 15•4 years ago
(In reply to :Gijs (he/him) from comment #14)
> I think you know this, but just to be explicit: I think it's a good idea for that panel to make it clear that the JS on the page won't have access to the info, but we should avoid giving the impression Chrome gives here, that the request never made it to the server and was never processed there (because it did!). :-)

That's right. There were discussions about showing some sort of notification in the Headers panel detailing that. Thanks for the note.