Closed Bug 1404744 Opened 7 years ago Closed 6 years ago

Block ftp subresource requests inside http(s) pages

Categories

(Core :: DOM: Security, defect, P3)


Tracking


RESOLVED FIXED
mozilla61
Tracking Status
relnote-firefox --- 61+
firefox58 --- wontfix
firefox59 --- wontfix
firefox60 - wontfix
firefox61 --- fixed

People

(Reporter: jan, Assigned: evilpie)

References

(Blocks 1 open bug)

Details

(4 keywords, Whiteboard: [parity-Chrome][domsecurity-backlog2][wptsync upstream][adv-main61-])

Attachments

(3 files, 3 obsolete files)

This would perfectly fix bug 1361848.

https://groups.google.com/a/chromium.org/forum/#!topic/blink-dev/bIJdwwoQ98U
> Intent to Remove: Support for legacy protocols (`ftp:`) in subresource requests.

https://bugs.chromium.org/p/chromium/issues/detail?id=435547#c44
merged. August 2017

https://chromium.googlesource.com/chromium/src.git/+/b43863ad9ea99cc432eb95f99c15375bd73bee5e
> Chrome blocks 'ftp:' subresource request inside 'http'/'https' pages.
See Also: → 1374114
This wouldn't be fixed in the FTP code itself but more likely in the broader security checks (maybe this should be CAPS instead of DOM: Security). We probably can't do it with existing protocol flags, since any CheckMayLoad() check would block navigating to or downloading an FTP resource as well, and we'd want to allow those.
Component: Networking: FTP → DOM: Security
Keywords: sec-want
Priority: -- → P3
Whiteboard: [parity-Chrome] → [parity-Chrome][domsecurity-backlog2]
Blocks: 1361848
OS: Linux → All
Hardware: x86_64 → All
See Also: → 1361848, 1281434
Firefox already blocks ftp being loaded into https as part of the mixed content blocker. I'm not 100% sure where we can place the checks to prevent subresource loads at this point in time.
(In reply to Jonathan Kingston [:jkt] from comment #4)
> Firefox already blocks ftp being loaded into https as part of the mixed
> content blocker. I'm not 100% sure where we can place the checks to prevent
> subresource loads at this point in time.

Honza, ultimately I think we should try to eliminate FTP from Firefox altogether. Probably we are not quite there yet, but this bug could be a step towards that goal. Anyway, I read your comment over in Bug 1361848 [1], where you suggested the change (blocking) should happen within NS_SecurityCompareURIs. That was a few months back; is this still the place where the blocking should happen, or did things change in the meantime and we should block somewhere else?

I am not sure if we can block right away, but we could at least add some telemetry to see how often ftp is loaded inside top-level http(s) pages.

What do you think?

[1] https://bugzilla.mozilla.org/show_bug.cgi?id=1361848#c6
Flags: needinfo?(honzab.moz)
(In reply to Christoph Kerschbaumer [:ckerschb] from comment #5)
> (In reply to Jonathan Kingston [:jkt] from comment #4)
> > Firefox already blocks ftp being loaded into https as part of the mixed
> > content blocker. I'm not 100% sure where we can place the checks to prevent
> > subresource loads at this point in time.
> 
> Honza, ultimately I think we should try to eliminate FTP from Firefox
> altogether. Probably we are not quite there yet, but this bug could be a
> step towards that goal. Anyway, I read your comment over in Bug 1361848 [1],
> where you suggested the change (blocking) should happen within
> NS_SecurityCompareURIs. That was a few months back; is this still the place
> where the blocking should happen, or did things change in the meantime and
> we should block somewhere else?
> 
> I am not sure if we can block right away, but we could at least add some
> telemetry to see how often ftp is loaded inside top-level http(s) pages.
> 
> What do you think?
> 
> [1] https://bugzilla.mozilla.org/show_bug.cgi?id=1361848#c6

Except that when I reread the comment now, it's probably wrong (how could http://foo.com/index.html ever be compared to ftp://foo.com/my-friendly-authenticating-subresource/ and expect the same-origin check to pass?  I was probably missing some coffees that day...); that comment is about blocking authentication dialogs, not blocking the whole load.

If we want to block subresource loads, CSP is probably a better place for that, similarly as we do block mixed content.
Flags: needinfo?(honzab.moz)
(In reply to Honza Bambas (:mayhemer) from comment #6)
> If we want to block subresource loads, CSP is probably a better place for
> that, similarly as we do block mixed content.

Ok, got it, but the CSP code and the mixed content code are probably also not quite the right fit. However, we do block top-level data: URI navigations within the contentSecurityManager; probably somewhere around those lines we could also block FTP subresource loads.
(In reply to Christoph Kerschbaumer [:ckerschb] from comment #7)
> (In reply to Honza Bambas (:mayhemer) from comment #6)
> > If we want to block subresource loads, CSP is probably a better place for
> > that, similarly as we do block mixed content.
> 
> Ok, got it, but the CSP code and the mixed content code are probably also
> not quite the right fit. However, we do block top-level data: URI
> navigations within the contentSecurityManager; probably somewhere around
> those lines we could also block FTP subresource loads.

Another possibility, yes.
Keywords: site-compat
The current beta will be the next ESR on 2018-05-09. Could you imagine fixing and uplifting this, perhaps allowing the fix to be disabled by a pref, and adding a release note? The influential reporter of bug 1361848 posted a reminder on Twitter.
[Tracking Requested - why for this release]:
See comment 9 and https://twitter.com/hanno/status/976773604779184128.
This is in the team's backlog, I don't think I need to revisit their triage decisions and track this for 60.
I am interested in working on this bug. I tried the approach of blocking FTP subresources with a content policy, which seems to work.
For testing purposes I added the code to nsDataDocumentContentPolicy.cpp. This is not the right place, but I wasn't sure where else to put it. There are various ContentPolicy implementations, but those all seem to serve a specific purpose. Would adding a new ContentPolicy implementation be the right way to go here?
Assignee: nobody → evilpies
Attachment #8962327 - Flags: feedback?(ckerschb)
Comment on attachment 8962327 [details] [diff] [review]
POC Use content policy to block FTP iframes

Review of attachment 8962327 [details] [diff] [review]:
-----------------------------------------------------------------

Thanks for picking this one up!

(In reply to Tom Schuster [:evilpie] from comment #12)
> This is not the right place, but I wasn't sure where else to put it. There
> are various ContentPolicy implementations, but those all seem to serve a
> specific purpose. Would adding a new ContentPolicy implementation be the
> right way to go here?

Adding a new content policy is an option, but in my opinion not worth the trouble since we only want to block ftp subresource loads. The better option, in my opinion, is to add a function similar to AllowInsecureRedirectToDataURI [1] within the contentSecurityManager.

Something like BlockFTPSubResourceLoad(nsIChannel* aChannel). From the loadinfo we can query the document's URI; if that is http(s) and the URI of the channel is FTP, then we block the load and log something to the console.

The entry point into that function could be within ::CheckChannel() which would then also allow redirects to be guarded.

[1] https://dxr.mozilla.org/mozilla-central/source/dom/security/nsContentSecurityManager.cpp#114
Attachment #8962327 - Flags: feedback?(ckerschb)
Thanks for the hint, this is a new patch based on CheckChannel and AsyncOnChannelRedirect. 
I wasn't quite sure how to get the loading URI, I assume that is the TriggeringPrincipal?
Attachment #8962327 - Attachment is obsolete: true
Attachment #8962488 - Flags: feedback?(ckerschb)
Comment on attachment 8962488 [details] [diff] [review]
WIP Check for FTP resource in ContentSecurityManager

Review of attachment 8962488 [details] [diff] [review]:
-----------------------------------------------------------------

Overall that looks pretty good. I think we also need an automated test for that.

::: dom/locales/en-US/chrome/security/security.properties
@@ +86,4 @@
>  BlockTopLevelDataURINavigation=Navigation to toplevel data: URI not allowed (Blocked loading of: “%1$S”)
>  BlockSubresourceRedirectToData=Redirecting to insecure data: URI not allowed (Blocked loading of: “%1$S”)
>  
> +BlockSubresourceFTP=Loading FTP resource at “%1$S”

I think we should rename to something like:
Loading FTP subresource within http(s) page not allowed. Blocking “%1$S” within “%2$S”

Please also add a LOCALIZATION note.

::: dom/security/nsContentSecurityManager.cpp
@@ +159,4 @@
>    return false;
>  }
>  
> +/* static */ bool

Please add a small comment explaining what we are doing, either here or in the *.h file.

@@ +160,5 @@
>  }
>  
> +/* static */ bool
> +nsContentSecurityManager::AllowRedirectToFtpURI(nsIChannel* aNewChannel)
> +{

I suppose we want all ftp subresource loads to be blocked within an http(s) page, not only redirects, right?
If so, then please rename that function to AllowFTPSubresourceLoad(nsIChannel* aChannel);

@@ +171,5 @@
> +  if (type != nsIContentPolicy::TYPE_OBJECT &&
> +      type != nsIContentPolicy::TYPE_DOCUMENT &&
> +      type != nsIContentPolicy::TYPE_SUBDOCUMENT) {
> +    return true;
> +  }

To capture all subresource loads, I would just return early for loads of TYPE_DOCUMENT.

@@ +173,5 @@
> +      type != nsIContentPolicy::TYPE_SUBDOCUMENT) {
> +    return true;
> +  }
> +
> +  nsCOMPtr<nsIURI> newURI;

nit: maybe just 'uri' instead of newURI.

@@ +186,5 @@
> +  }
> +
> +  if (nsContentUtils::IsSystemPrincipal(loadInfo->TriggeringPrincipal())) {
> +    return true;
> +  }

Do we need that check? I suppose that would only happen if our own chrome code would load an ftp resource, right? And I hope we never do that. I would prefer to have that check removed. If we need it, please add a comment and potentially move that check further up before querying the uri from the channel.

@@ +194,5 @@
> +  nsCOMPtr<nsIURI> uri;
> +  loadInfo->TriggeringPrincipal()->GetURI(getter_AddRefs(uri));
> +
> +  if (!nsContentUtils::SchemeIs(uri, "http") &&
> +      !nsContentUtils::SchemeIs(uri, "htts")) {

https not htts

@@ +196,5 @@
> +
> +  if (!nsContentUtils::SchemeIs(uri, "http") &&
> +      !nsContentUtils::SchemeIs(uri, "htts")) {
> +    return true;
> +  }

I think it's better to query the loadingNode from the loadInfo and then query the ownerDoc from that. From the ownerDoc you can query the document URI via |doc->GetDocumentURI()|; that is the URI of the document the subresource is about to be loaded into.

One scenario to think about (not sure if we have to consider it):
*) top-level http page
*) loads a frame
*) that frame loads an ftp subresource.

In that case the owner document would be the frame. Not sure if we need to walk the ancestor chain in that case all the way to the root document. Depending on how we implement that function, I would imagine that we already block the frame if needed. But worth thinking about.

@@ +199,5 @@
> +    return true;
> +  }
> +
> +  nsAutoCString dataSpec;
> +  newURI->GetSpec(dataSpec);

that part we only did because data: URIs can be incredibly long. I guess if we end up at this point in the function, then we already know it's an ftp subresource, right? In that case I think we don't need to truncate.

@@ +730,5 @@
> +  // Do not allow loading ftp: resources into http(s)
> +  if (!AllowRedirectToFtpURI(aNewChannel)) {
> +    aOldChannel->Cancel(NS_ERROR_CONTENT_BLOCKED);
> +    return NS_ERROR_CONTENT_BLOCKED;
> +  }

I think CheckChannel is called in the regular load case as well as after a redirect. In case of a redirect we would run through that function twice, right? Probably we can place it only within ::CheckChannel. If CheckChannel fails, then AsyncOnChannelRedirect() cancels the old channel for you.
Attachment #8962488 - Flags: feedback?(ckerschb) → feedback+
While talking on IRC we decided that disallowing the loading of FTP content as a subresource everywhere is the most straightforward solution. This does not yet contain tests.
Attachment #8962488 - Attachment is obsolete: true
Attached patch tests (obsolete) — Splinter Review
So I added a test for loading iframes/images from FTP. The biggest problem is that I am not sure how to tell when all the console messages we expect have actually arrived. A one-second timeout seems to work, but that is rather brittle of course.

I also tried adding a redirect from HTTP to FTP test, but for those cases the console message is not delivered to the tab's webconsole and just the browser console.
Attachment #8964187 - Flags: feedback?(ckerschb)
Comment on attachment 8964187 [details] [diff] [review]
tests

Review of attachment 8964187 [details] [diff] [review]:
-----------------------------------------------------------------

(In reply to Tom Schuster [:evilpie] from comment #17)
> So I added a test for loading iframes/images from FTP. The biggest problem
> is that I am not sure when we can be sure all console messages that we
> expect actually arrived. A one second timeout seems to work, but that is
> rather brittle of course.

I browsed a bunch of DXR code. Maybe we can use an approach similar to the following [1], which would allow us to avoid relying on timeouts.
[1] https://dxr.mozilla.org/mozilla-central/source/devtools/client/webconsole/new-console-output/test/mochitest/browser_webconsole_csp_violation.js
 
> I also tried adding a redirect from HTTP to FTP test, but for those cases
> the console message is not delivered to the tab's webconsole and just the
> browser console.

Hm, yeah, that is tricky. I suppose our *.sjs files can't handle FTP right?

I don't know how to query the browser console, but given the API we have in [1] we might also have an API that allows you to query the browser console.

Clearing the feedback for now. If my suggestion doesn't work, then I can take another look at your provided test. Might be worth pinging Brian Grinstead, he knows a lot about writing such tests.
Attachment #8964187 - Flags: feedback?(ckerschb)
I think I found a neat solution based on your tip about using BrowserTestUtils.waitForCondition.

Can I work on the redirect test as a follow-up?
Attachment #8964187 - Attachment is obsolete: true
Attachment #8965003 - Flags: review?(ckerschb)
Attachment #8963105 - Flags: review?(ckerschb)
Comment on attachment 8963105 [details] [diff] [review]
v1 - Block loading FTP as a subresource everywhere

Review of attachment 8963105 [details] [diff] [review]:
-----------------------------------------------------------------

r=me with nits addressed - thanks!

::: dom/locales/en-US/chrome/security/security.properties
@@ +86,4 @@
>  BlockTopLevelDataURINavigation=Navigation to toplevel data: URI not allowed (Blocked loading of: “%1$S”)
>  BlockSubresourceRedirectToData=Redirecting to insecure data: URI not allowed (Blocked loading of: “%1$S”)
>  
> +BlockSubresourceFTP=Blocked loading FTP content at “%1$S” as a subresource. Please use HTTPS for web-accessible resources.

I don't have a strong opinion but you might consider updating the message to something like the following to be consistent with what we have for blocking top level data: URIs. Something like:

Loading FTP subresource within http(s) page not allowed. (Blocked loading of: “%1$S”)

::: dom/security/nsContentSecurityManager.cpp
@@ +159,5 @@
>    return false;
>  }
>  
> +/* static */ bool
> +nsContentSecurityManager::AllowFTPSubresourceLoad(nsIChannel* aNewChannel)

same nit here; just rename to aChannel.

::: dom/security/nsContentSecurityManager.h
@@ +36,4 @@
>    static bool AllowTopLevelNavigationToDataURI(nsIChannel* aChannel);
>    static bool AllowInsecureRedirectToDataURI(nsIChannel* aNewChannel);
>  
> +  static bool AllowFTPSubresourceLoad(nsIChannel* aNewChannel);

nit: please use aChannel; We only used aNewChannel to be more explicit which one to use in the redirect case.
Attachment #8963105 - Flags: review?(ckerschb) → review+
Comment on attachment 8965003 [details] [diff] [review]
v1 - Simple sub-resource only test

Review of attachment 8965003 [details] [diff] [review]:
-----------------------------------------------------------------

r=me. Adding a redirect test as a follow up is fine with me.

::: dom/security/test/general/browser_test_FTP_console_warning.js
@@ +1,3 @@
> +/*
> + * Description of the test:
> + *   Ensure that CORS warnings are printed to the web console.

CORS warnings? I guess that's copy/paste error, right? Please fix that entire comment to match what we are actually testing.

@@ +59,5 @@
> +  ok(false, message);
> +};
> +
> +var webconsole = null;
> +var seen_files = ["a.html", "b.html", "c.html", "d.png"];

please add a comment somewhere that these files don't actually exist and we are only looking for the console message to make sure they are blocked.

@@ +87,5 @@
> +  // due to opening a new tab with the web console.
> +  requestLongerTimeout(4);
> +  registerCleanupFunction(do_cleanup);
> +
> +  let test_uri = "http://mochi.test:8888/browser/dom/security/test/general/file_FTP_console_warning.html";

Please don't hardcode the URI here. Search for |getRootDirectory| and use that, which makes it easier in case the test needs to be moved to a different directory.
Attachment #8965003 - Flags: review?(ckerschb) → review+
Pushed by evilpies@gmail.com:
https://hg.mozilla.org/integration/mozilla-inbound/rev/11ff93811572
Block loading FTP as a subresource everywhere. r=ckerschb
https://hg.mozilla.org/integration/mozilla-inbound/rev/1035ca43655e
Simple sub-resource only test. r=ckerschb
Pushed by evilpies@gmail.com:
https://hg.mozilla.org/integration/mozilla-inbound/rev/42cda784d2c5
Change more web-platform tests to avoid using FTP. r=me CLOSED TREE
Tom, Anne just pointed out that we are now blocking FTP before CSP gets a chance. Since that behavior is observable, e.g. through policy violation events, we need to make sure we are spec compliant here. Can you check, please? What's Chrome doing? Since we updated a web-platform test I would imagine Chrome's and Firefox's behavior differs. Please verify.
Flags: needinfo?(evilpies)
Oh, I hadn't thought about the difference here. The difference being that we no longer send a CSP report for this case. But Chrome does indeed seem to apply CSP before the FTP blocking.

With CSP:

Refused to load the image 'ftp://ftp.mirror.nl/pub/Museum/hyperwrt/icons/back.gif' because it violates the following Content Security Policy directive: "img-src http://*".

Without:

[Deprecation] Subresource requests using legacy protocols (like `ftp:`) are blocked. Please deliver web-accessible resources over modern protocols like HTTPS. See https://www.chromestatus.com/feature/5709390967472128 for details.

The current implementation of nsContentSecurityManager::doContentSecurityCheck calls CheckChannel before DoContentSecurityChecks. So we probably have to move AllowFTPSubresourceLoad out of CheckChannel and into doContentSecurityCheck and AsyncOnChannelRedirect.
Flags: needinfo?(evilpies)
Keywords: leave-open
I implemented the change I outlined above. I wasn't quite sure how we want to handle errors in two cases, so I left a comment there. This means we can also revert those test changes and get test coverage for this behavior at the same time.
Attachment #8965496 - Flags: review?(ckerschb)
Blocks: 1451938
Comment on attachment 8965496 [details] [diff] [review]
Check for FTP subresource after applying CSP

Review of attachment 8965496 [details] [diff] [review]:
-----------------------------------------------------------------

thanks tom!
Attachment #8965496 - Flags: review?(ckerschb) → review+
Keywords: leave-open
Pushed by evilpies@gmail.com:
https://hg.mozilla.org/integration/mozilla-inbound/rev/b2c0699b9bd4
Check for FTP subresource after applying CSP. r=ckerschb
https://hg.mozilla.org/mozilla-central/rev/b2c0699b9bd4
Status: NEW → RESOLVED
Closed: 6 years ago
Resolution: --- → FIXED
Target Milestone: --- → mozilla61
Whiteboard: [parity-Chrome][domsecurity-backlog2] → [parity-Chrome][domsecurity-backlog2][wptsync upstream error]
Depends on: 1452701
Created web-platform-tests PR https://github.com/w3c/web-platform-tests/pull/10413 for changes under testing/web-platform/tests
Whiteboard: [parity-Chrome][domsecurity-backlog2][wptsync upstream error] → [parity-Chrome][domsecurity-backlog2][wptsync upstream]
Thank you for fixing this! :)
Would you consider requesting an uplift to Beta (the next ESR)?
(In reply to Jan Andre Ikenmeyer [:darkspirit] from comment #33)
> Thank you for fixing this! :)
> Would you consider requesting an uplift to Beta (the next ESR)?

Dan, what do you think? Worth uplifting?
Flags: needinfo?(dveditz)
Maybe earlier in the cycle, but we don't have too many more betas left and we don't know what this is going to break.
Flags: needinfo?(dveditz)
Tom, would you mind requesting a release note addition for this bug, please? We probably want to add it at least to our Nightly notes. Here are instructions about release notes if you don't know the process: https://wiki.mozilla.org/Release_Management/Release_Notes#How_to_nominate_a_bug_for_release_notes_addition.3F

Thanks!
Flags: needinfo?(evilpies)
We are going to block FTP subresources on all pages. Navigating to top-level FTP URLs is still possible. We are working on a blogpost that is not ready yet.
relnote-firefox: --- → ?
Flags: needinfo?(evilpies)
Depends on: 1458449
Whiteboard: [parity-Chrome][domsecurity-backlog2][wptsync upstream] → [parity-Chrome][domsecurity-backlog2][wptsync upstream][adv-main61-]
I'm using 61.0beta14, now Firefox can't save images directly on FTP sites.

Reproduce steps:
1. Visit ftp://ftp.mirror.nl/pub/Museum/hyperwrt/icons/back.gif
2. Right click on page and choose "Save Page As"
3. Failed to download; messages in the browser console said the FTP subresource was blocked.

It's not a "subresource", and should not be blocked.
(In reply to Xu Zhen from comment #39)
> I'm using 61.0beta14, now Firefox can't save images directly on FTP sites.
> 
> Reproduce steps:
> 1. Visit ftp://ftp.mirror.nl/pub/Museum/hyperwrt/icons/back.gif
> 2. Right click on page and choose "Save Page As"
> 3. Failed to download, messages in browser console said FTP subresource was
> blocked.
> 
> It's not a "subresource", and should not be blocked.

Yeah, I guess that's a valid case. Tom, any chance you could file a new bug and investigate? I guess the problem is that we only return early for TYPE_DOCUMENT within CheckFTPSubresourceLoad() and I guess this case doesn't use TYPE_DOCUMENT. Maybe TYPE_OTHER? I am not sure, but most likely the problem is around the content type - thanks!
Flags: needinfo?(evilpies)
Depends on: 1469536
See bug 1469536.
Flags: needinfo?(evilpies)
Depends on: 1469891
Depends on: 1470295