Closed Bug 1420296 (CVE-2020-6809) Opened 3 years ago Closed 1 year ago

Webextensions can access local files

Categories

(WebExtensions :: General, defect, P1)

57 Branch
defect

Tracking

(firefox-esr68 wontfix, firefox72 wontfix, firefox73 wontfix, firefox74 verified)

VERIFIED FIXED
mozilla74
Tracking Status
firefox-esr68 --- wontfix
firefox72 --- wontfix
firefox73 --- wontfix
firefox74 --- verified

People

(Reporter: jan, Assigned: robwu, Mentored)

References

Details

(Keywords: sec-moderate, Whiteboard: webext-sec [post-critsmash-triage][adv-main74+])

Attachments

(6 files)

User Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/62.0.3202.89 Chrome/62.0.3202.89 Safari/537.36

Steps to reproduce:

Try the code from this Stack Overflow answer in a webextension:

https://stackoverflow.com/questions/42108782/firefox-webextensions-get-local-files-content-by-path/44516256#44516256

And you'll see that it is possible to read local files without user interaction.

More information:

As far as I know this is currently used by Stylus and a beta version of Tampermonkey (my extension). In Tampermonkey the user must _explicitly_ enable file access in order to use it, just as is required in Chrome.

However, since the Stack Overflow answer was posted on 2017-05-13, I guess there are more extensions that use it. That, plus the fact that AMO currently uses a faster/automatic review, makes me a little concerned that other extensions might use it in a malicious way, because the review process is unable to detect it (as opposed to Firefox's legacy API add-ons, where the reviewers took account of this).

Finally, please consider implementing user-permitted file _read_ access. It's one of the most requested features for Tampermonkey.
https://bugzilla.mozilla.org/show_bug.cgi?id=1246236
https://bugzilla.mozilla.org/show_bug.cgi?id=1266960


Actual results:

It's possible to read local files without any user interaction.


Expected results:

It should (not/only after user confirmation) be possible to access local files.
Only files under $HOME, or also elsewhere? (background: when e10s is enabled, I would expect sandboxing to prohibit accessing files outside of $HOME)
Group: firefox-core-security → toolkit-core-security
Component: Untriaged → WebExtensions: General
Flags: needinfo?(jan)
Product: Firefox → Toolkit
Looks like it can be anywhere in my testing of the function in Stylus. 

https://addons.mozilla.org/en-US/firefox/addon/styl-us/

To reproduce
1) install stylus
2) go to about:debugging
3) click debug and in the browser console type something like:
 
  download('file:///etc/passwd').then(e => console.log(e));

Output:

 ##
 # User Database
 # 
 # Note that this file is consulted directly only when the system is running
 # in single-user mode.  At other times this information is provided by
 # Open Directory.
 #
 # See the opendirectoryd(8) man page for additional information about
 # Open Directory.
 ##
 nobody...

Also pinging Jim to let the sandboxing team know.
Flags: needinfo?(kmaglione+bmo)
Flags: needinfo?(jmathies)
Priority: -- → P1
This is a bug in fetch in a web extension context when the {mode:'same-origin'} flag is set. I wonder if it affects any other contexts.

MDN says: "same-origin — If a request is made to another origin with this mode set, the result is simply an error. You could use this to ensure that a request is always being made to your origin." 

But somehow, moz-extension://{UUID} and file:// are considered same-origin when you set this flag.

STR:
Run this script in the background context of any web extension.

var url = 'file:///SOME/LOCAL/FILE/PATH';
fetch(url, {mode: 'same-origin'})
  .then(f => console.log('read file', f.url))
  .catch(e => console.log('Error trying to read file'));

Expected:
Console: Error trying to read file

Actual
Console: read file file:///SOME/LOCAL/FILE/PATH
This isn't exactly unexpected. If the extension has the <all_urls> permission, it has same origin permissions for file: URLs, so this is pretty much the way it should work.
Flags: needinfo?(kmaglione+bmo)
(In reply to Kris Maglione [:kmag] (long backlog; ping on IRC if you're blocked) from comment #4)
> This isn't exactly unexpected. If the extension has the <all_urls>
> permission, it has same origin permissions for file: URLs, so this is pretty
> much the way it should work.

Why does this ONLY work when you set the 'same-origin' flag?

Also note that this request completely bypasses the sandbox.
(In reply to Paul Theriault [:pauljt] from comment #5)
> (In reply to Kris Maglione [:kmag] (long backlog; ping on IRC if you're
> blocked) from comment #4)
> > This isn't exactly unexpected. If the extension has the <all_urls>
> > permission, it has same origin permissions for file: URLs, so this is pretty
> > much the way it should work.
> 
> Why does this ONLY work when you set the 'same-origin' flag.

Because our fetch support kind of sucks, and if you don't specify same-origin, it tries to treat it as a CORS request.

> Also note that this request completely bypasses the sandbox.

As far as I recall, extension processes don't have filesystem sandboxing yet, so as long as it doesn't work from content scripts, that's not too surprising.
(In reply to Kris Maglione [:kmag] (long backlog; ping on IRC if you're blocked) from comment #4)
> This isn't exactly unexpected. If the extension has the <all_urls>
> permission, it has same origin permissions for file: URLs,

Maybe we should change that? :-)

> so this is pretty much the way it should work.

I mean, that wfm, but then we need to be clearer about expectations here, and this doesn't really explain to me why the add-on has to (apparently?) jump through hoops to read files...
(In reply to :Gijs from comment #7)
> (In reply to Kris Maglione [:kmag] (long backlog; ping on IRC if you're
> blocked) from comment #4)
> > This isn't exactly unexpected. If the extension has the <all_urls>
> > permission, it has same origin permissions for file: URLs,
> 
> Maybe we should change that? :-)

We should, but it's basically blocked on bug 1246236, and figuring out how we want to deal with filesystem access. I suppose we could add an exception, and ignore file: URL permissions for requests but still allow them for content scripts and such, but I'm not super happy about the idea...

> > so this is pretty much the way it should work.
> 
> I mean, that wfm, but then we need to be clearer about expectations here,
> and this doesn't really explain to me why the add-on has to (apparently?)
> jump through hoops to read files...

They have to jump through hoops to make any cross-origin requests that don't support CORS. Like I said, our fetch() support kind of sucks.
Allowing the ability to read arbitrary files by default is surprising and hard to justify from a security-model perspective. Am I missing something here? Why does fetch() fail, while fetch(url, {mode:'same-origin'}) succeeds? I was taken completely by surprise by this, and I have been looking at web extensions for months. On one hand we say file:// is forbidden [1], but we do clearly state that match patterns will match on file:/// [2].

Regardless, default access to file:/// that bypasses the sandbox is risky and means extensions undermine the sandbox. At a minimum, I think we need to have users opt in to extensions being allowed to read file: URIs, with something like a checkbox or menu option in about:addons. (Actually I thought there was a bug on file for that, but I can't find it.)

[1] https://wiki.mozilla.org/WebExtensions/Filesystem
[2] https://developer.mozilla.org/en-US/Add-ons/WebExtensions/Match_patterns
Status: UNCONFIRMED → NEW
Ever confirmed: true
There's no default access to file: URLs. Extensions can access file: URLs if they have permissions for those URLs listed in their manifests, which works the same way for file: URLs as it does for other URLs.

For the third time: fetch() can only fetch those URLs in same-origin mode because our fetch() support sucks, and by default it tries to make a CORS request, which file: URLs don't support, rather than a same-origin request that leverages the extension permissions. Requests to http: URLs work the same way.
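The behavior Kris describes can be summarized in a toy model (all function and value names here are illustrative, not real Gecko or WebExtension APIs): without an explicit mode, fetch() from an extension page attempts a CORS request, which the file: scheme cannot answer; only {mode: 'same-origin'} takes the path that consults the extension's host permissions.

```javascript
// Toy model of the observed fetch() behavior from an extension page.
// Names are illustrative; this is not the actual Gecko logic.
function extensionFetchOutcome(mode, scheme, hasHostPermission) {
  if (mode !== 'same-origin') {
    // Default/cors path: file: URLs never carry CORS headers, so they fail.
    return scheme === 'file:' ? 'network error' : 'depends on CORS headers';
  }
  // Same-origin path: succeeds when the manifest grants a matching host
  // permission (e.g. <all_urls>), which is the surprising part of this bug.
  return hasHostPermission ? 'success' : 'network error';
}

console.log(extensionFetchOutcome('cors', 'file:', true));        // 'network error'
console.log(extensionFetchOutcome('same-origin', 'file:', true)); // 'success'
```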
(In reply to Kris Maglione [:kmag] (long backlog; ping on IRC if you're blocked) from comment #10)
> There's no default access to file: URLs. Extensions can access file: URLs if
> they have permissions for those URLs listed in their manifests, which works
> the same way for file: URLs as it does for other URLs.

Right, but we don't tell the user that in the prompt that is shown for <all_urls>. 

> 
> For the third time: fetch() can only fetch those URLs in same-origin mode
> because our fetch() support sucks, and by default it tries to make a CORS
> request, which file: URLs don't support, rather than a same-origin request
> that leverages the extension permissions. Requests to http: URLs work the
> same way.

Sorry, I missed comment 6 and thus didn't understand comment 8.

(In reply to Kris Maglione [:kmag] (long backlog; ping on IRC if you're blocked) from comment #6)
> (In reply to Paul Theriault [:pauljt] from comment #5)
> > (In reply to Kris Maglione [:kmag] (long backlog; ping on IRC if you're
> > blocked) from comment #4)
> > > This isn't exactly unexpected. If the extension has the <all_urls>
> > > permission, it has same origin permissions for file: URLs, so this is pretty
> > > much the way it should work.
> > 
> > Why does this ONLY work when you set the 'same-origin' flag.
> 
> Because our fetch support kind of sucks, and if you don't specify
> same-origin, it tries to treat it as a CORS request.
> 
> > Also note that this request completely bypasses the sandbox.
> 
> As far as I recall, extension processes don't have filesystem sandboxing
> yet, so as long as it doesn't work from content scripts, that's not too
> surprising.

Ok, I didn't know that - I thought that all content processes had the same rules, except for the one for loading file:// URIs.
I guess Jim can clarify there.

In any case, I think we need to do _something_ here. My preference would be to deny file:// access by default and allow the user to opt in. Not sure how complex that is, though, so in the meantime perhaps we should improve the permission prompt to at least advise the user that the extension can read local files.
Changing the permission prompt is a bit of a rabbit hole with quite a few problems (how do you message people who installed this already, for example). I think we should deny this case, uplift it to 58 (if possible) and work with the ACE team to message affected add-ons.

Then we keep on the course of figuring out, with all the relevant teams, what use cases for file access to allow and how to do that in the future.
Duplicate of this bug: 1420360
Flags: needinfo?(jmathies)
(In reply to Paul Theriault [:pauljt] from comment #11)
> (In reply to Kris Maglione [:kmag] (long backlog; ping on IRC if you're
> blocked) from comment #10)
> > There's no default access to file: URLs. Extensions can access file: URLs if
> > they have permissions for those URLs listed in their manifests, which works
> > the same way for file: URLs as it does for other URLs.
> 
> Right, but we don't tell the user that in the prompt that is shown for
> <all_urls>. 

I think that's the crux of the problem. Not that extensions shouldn't be able to read files (if given permission it's a valid thing to do in some cases) but that many/most users will be thinking "Web" when they see that prompt and will not expect "rummage around on my disk" to be part of it. That is, do we need to add a "<file_urls>" permission?
Keywords: sec-moderate
(In reply to Daniel Veditz [:dveditz] from comment #14)
> I think that's the crux of the problem. Not that extensions shouldn't be
> able to read files (if given permission it's a valid thing to do in some
> cases) but that many/most users will be thinking "Web" when they see that
> prompt and will not expect "rummage around on my disk" to be part of it.
> That is, do we need to add a "<file_urls>" permission?

I think we should probably make the <all_urls> permission not apply to files, and require extensions to explicitly specify file://*/. And we should probably require that to be an optional permission.
How we expose local files for reading is a question for a few groups (security, UX, product), and a security-restricted bug is probably not the best place to get to a solution. There's some good stuff here, but I think we should stick to the focus of shutting off local file access to fetch in this bug and move the rest of the discussion to a public bug or forum.
I probably have some spare cycles to shut off the file access here if no one else gets to it before me.

I had some ideas about how we can handle files but as :andym mentioned we can discuss elsewhere after this is resolved.
I think baku and catalinb would have thoughts here, too.
Flags: needinfo?(amarchesini)
Assignee: nobody → amarchesini
Flags: needinfo?(amarchesini)
The spec says that: 
"To perform a scheme fetch using request, switch on request’s current url’s scheme, and run the associated steps: [...] file: For now, unfortunate as it is, file and ftp URLs are left as an exercise for the reader."

Let's force no-cors for file URLs.
Attachment #8936952 - Flags: review?(bkelly)
Why does no-cors mode help here if we treat the principals as same-origin? A no-cors request can return a basic response if it's same-origin.
Comment on attachment 8936952 [details] [diff] [review]
file_same_origin.patch

I think this is a bug in our security checks and we probably should not try to paper over them here.  This patch relies on this particular bug to work.  I think I understand what is going wrong and will post a follow-up comment with details.
Attachment #8936952 - Flags: review?(bkelly) → review-
So, both 'cors' and 'no-cors' mode reject because we end up calling DoCheckLoadURIChecks() here:

https://searchfox.org/mozilla-central/rev/110706c3c09d457dc70293b213d7bccb4f6f5643/dom/security/nsContentSecurityManager.cpp#724

This in turn ends up calling CheckLoadURIFlags() eventually which checks URI_IS_LOCAL_FILE:

https://searchfox.org/mozilla-central/rev/110706c3c09d457dc70293b213d7bccb4f6f5643/caps/nsScriptSecurityManager.cpp#938

This block ends up failing and the file URL is blocked as intended.  Note, this is done *before* checking for URI_LOADABLE_BY_ANYONE.

For 'same-origin' mode, though, we do not call DoCheckLoadURIChecks().  Instead we call DoSOPChecks() which basically calls NS_HasBeenCrossOrigin():

https://searchfox.org/mozilla-central/rev/110706c3c09d457dc70293b213d7bccb4f6f5643/dom/security/nsContentSecurityManager.cpp#711

Which then ends up calling BasePrincipal::CheckMayLoad().  This happily accepts any URL if URI_FETCHABLE_BY_ANYONE is set:

https://searchfox.org/mozilla-central/rev/110706c3c09d457dc70293b213d7bccb4f6f5643/caps/BasePrincipal.cpp#152

As Kris mentioned, web extensions set this flag based on manifest data:

https://searchfox.org/mozilla-central/rev/110706c3c09d457dc70293b213d7bccb4f6f5643/netwerk/protocol/res/ExtensionProtocolHandler.cpp#382

This seems like a policy or security check problem.  I don't think trying to fix this in fetch() is the right place.
To clarify what I think is at least one problem:

In the cross-origin security checks we explicitly check for and disallow file URLs.  In same-origin security checks we don't do this because we assume they will be cross-origin and fail anyway.  The flag breaks that assumption, though.
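The mismatch between the two check paths described above can be sketched as a toy model (all names are simplified stand-ins for the C++ functions linked in comment 22, not the actual Gecko code):

```javascript
// Simplified model of the two security-check paths; names mirror the
// C++ flow described above but are NOT the real implementation.
const URI_IS_LOCAL_FILE = 1 << 0;
const URI_FETCHABLE_BY_ANYONE = 1 << 1;

// Cross-origin path ('cors'/'no-cors'): CheckLoadURIFlags rejects local
// files up front, before URI_LOADABLE_BY_ANYONE is even considered.
function doCheckLoadURIChecks(flags) {
  if (flags & URI_IS_LOCAL_FILE) return 'blocked';
  return 'allowed';
}

// Same-origin path: CheckMayLoad happily accepts anything marked
// fetchable-by-anyone, with no explicit file: exclusion -- the assumption
// being that file: URLs would already have failed as cross-origin.
function doSOPChecks(flags) {
  if (flags & URI_FETCHABLE_BY_ANYONE) return 'allowed';
  return 'blocked';
}

function fetchCheck(mode, flags) {
  return mode === 'same-origin' ? doSOPChecks(flags) : doCheckLoadURIChecks(flags);
}

// A file: URL requested by an extension whose load carries the
// fetchable-by-anyone flag from manifest permissions:
const flags = URI_IS_LOCAL_FILE | URI_FETCHABLE_BY_ANYONE;
console.log(fetchCheck('cors', flags));        // 'blocked'
console.log(fetchCheck('same-origin', flags)); // 'allowed' -- the hole
```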
Other same-origin requests I see in the tree which would be able to touch local files:

* worker scripts
* HTMLTrackElement
* Some FontFace loading code
* nsManifestCheck
* etc

These probably don't have the simple attack vector of fetch(), but I think we should fix them in a way that addressed them all if we can.
Flags: needinfo?(ckerschb)
Boris, I think Ben's assessment in comment 22 and comment 23 is correct. I think it wouldn't hurt to also explicitly add a check that disallows file: URL access to the same-origin checks. I assume the problem was just the assumption that SOP checks would fail file: loads because they are considered cross-origin anyway, right?
Flags: needinfo?(ckerschb) → needinfo?(bzbarsky)
Can you also add a test, at least for high level local file access through WebExtensions, and maybe unit tests for the content security checks, please? :)
I am not actually following comment 22.  That "FETCHABLE_BY_ANYONE" flag would be on the _uri_, not the _principal_.  Webextensions set that on _webextension_ URIs, which doesn't obviously have anything to do with attempts to load file:// URIs.

I have a slightly different question.  Why is file:// special?  Can extensions use this same mechanism to get at chrome:// URIs or whatnot?

Or put another way, should <all_urls> basically mean "all URLs that are URI_LOADABLE_BY_ANYONE", as opposed to "all URLs"?  That will presumably restrict it to the things users expect....
Flags: needinfo?(bzbarsky)
(In reply to Kris Maglione [:kmag] (long backlog; ping on IRC if you're blocked) from comment #15)
> (In reply to Daniel Veditz [:dveditz] from comment #14)
> > I think that's the crux of the problem. Not that extensions shouldn't be
> > able to read files (if given permission it's a valid thing to do in some
> > cases) but that many/most users will be thinking "Web" when they see that
> > prompt and will not expect "rummage around on my disk" to be part of it.
> > That is, do we need to add a "<file_urls>" permission?
> 
> I think we should probably make the <all_urls> permission not apply to
> files, and require extensions to explicitly specify file://*/. And we should
> probably require that to be an optional permission.

FWIW I like this proposal, even though it differs from chrome.

I'm here though now, to bring up bug 1427448 which complicates the fix here slightly i think (at least i think this bug will depend on 1427448).
Depends on: 1427448
This seems to have stalled. It got listed as sec-moderate but as a P1 by the WebExtensions team, so it keeps popping up in triages for us. Should we drop the priority down to reflect the sec- status? Is there anything we can do to help move this forward, baku?
I think the fix is probably more in Christoph's wheelhouse in the security manager.  Christoph, can you take this?
Flags: needinfo?(ckerschb)
(In reply to Ben Kelly [:bkelly] from comment #30)
> I think the fix is probably more in Christoph's wheelhouse in the security
> manager.  Christoph, can you take this?

Unfortunately I don't have the cycles to tackle this one at the moment. I guess we have to lower the priority until we get more engineering power back in content security.
Flags: needinfo?(ckerschb)
Flags: needinfo?(jan)
(In reply to Andy McKay [:andym] from comment #29)
> This seems to have stalled, it got listed as sec-moderate but a P1 by the
> WebExtensions team, so it keeps popping up in triages for us. Should we drop
> the priority down to reflect the sec- status? Is there anything we can do to
> help move this forward baku?

No, keep it P1 - this is a fundamental inconsistency in the security model. Either we need to fix this, or we need to make it clear that web extensions are allowed to read files from the file system. WebExtension bugs tend to get a lower rating because exploitation is predicated on installing a malicious extension - not a huge barrier, but more than if the issue were web-exploitable. So while only moderate in the scheme of Firefox, I'd say this should be a P1 in the context of web extensions.
(In reply to Paul Theriault [:pauljt] from comment #28)
> (In reply to Kris Maglione [:kmag] (long backlog; ping on IRC if you're
> blocked) from comment #15)
> > (In reply to Daniel Veditz [:dveditz] from comment #14)
> > > I think that's the crux of the problem. Not that extensions shouldn't be
> > > able to read files (if given permission it's a valid thing to do in some
> > > cases) but that many/most users will be thinking "Web" when they see that
> > > prompt and will not expect "rummage around on my disk" to be part of it.
> > > That is, do we need to add a "<file_urls>" permission?
> > 
> > I think we should probably make the <all_urls> permission not apply to
> > files, and require extensions to explicitly specify file://*/. And we should
> > probably require that to be an optional permission.
> 
> FWIW I like this proposal, even though it differs from chrome.
> 
> I'm here though now, to bring up bug 1427448 which complicates the fix here
> slightly i think (at least i think this bug will depend on 1427448).

What's needed to move ahead with this proposal? I don't think the fix for bug 1427448 will necessarily make much of a difference here. AIUI to fix this type of thing, we'd need to:

1) add a new permission for <file_urls>
2) require it to be optional
3) in CAPS, when the triggering/subject principal is an add-on and the target URI chain has the local file flag, check if the add-on has the relevant permission

Does that sound right? Do we need further buy-in on this plan to move forward here?
Flags: needinfo?(kmaglione+bmo)
Flags: needinfo?(ddurst)
(In reply to :Gijs from comment #33)
> What's needed to move ahead with this proposal? I don't think the fix for
> bug 1427448 will necessarily make much of a difference here. AIUI to fix
> this type of thing, we'd need to:
> 
> 1) add a new permission for <file_urls>
> 2) require it to be optional
> 3) in CAPS, when the triggering/subject principal is an add-on and the
> target URI chain has the local file flag, check if the add-on has the
> relevant permission
> 
> Does that sound right? Do we need further buy-in on this plan to move
> forward here?

No. <all_urls> should still match file URLs, and so should file://*

What we need is a separate setting that allows users to opt into giving an extension access to local files, similar to what Chrome does. I don't particularly care when/where we prompt users to opt into this, but the status of that permission should at least be displayed on the add-on's about:addons detail page, and be togglable there.
Flags: needinfo?(kmaglione+bmo)
(In reply to :Gijs from comment #33)
> 3) in CAPS, when the triggering/subject principal is an add-on and the
> target URI chain has the local file flag, check if the add-on has the
> relevant permission

Which means adding a new boolean flag to WebExtensionPolicy[1] and checking it in MatchPattern[2]. Which probably means adding an optional policy property to MatchPattern instances.

[1]: https://searchfox.org/mozilla-central/source/dom/webidl/WebExtensionPolicy.webidl
[2]: https://searchfox.org/mozilla-central/rev/d03ad8843e3bf2e856126bc53b0475c595e5183b/toolkit/components/extensions/MatchPattern.cpp#405
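The flag-plus-MatchPattern idea above could look roughly like this (hypothetical sketch: `allowFileAccess` and this simplified matcher are illustrative only, not the real WebExtensionPolicy/MatchPattern code):

```javascript
// Hypothetical gating of file: matches behind a per-extension opt-in flag.
// `policy.allowFileAccess` is an invented name for illustration.
function matchesURL(pattern, url, policy) {
  const u = new URL(url);
  if (u.protocol === 'file:' && !(policy && policy.allowFileAccess)) {
    return false; // file: only matches when the user has opted in
  }
  if (pattern === '<all_urls>') return true;
  // Minimal scheme check for patterns like "file://*/*"; the real
  // MatchPattern also matches host and path.
  return pattern.startsWith(u.protocol);
}

console.log(matchesURL('<all_urls>', 'file:///etc/passwd', {allowFileAccess: false})); // false
console.log(matchesURL('<all_urls>', 'file:///etc/passwd', {allowFileAccess: true}));  // true
console.log(matchesURL('<all_urls>', 'https://example.com/', null));                   // true
```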
(In reply to Kris Maglione [:kmag] (long backlog; ping on IRC if you're blocked) from comment #34)
> No. <all_urls> should still match file URLs, and so should file://*

To me this seems like the opposite of what comment #15 said:

(In reply to Kris Maglione [:kmag] (long backlog; ping on IRC if you're blocked) from comment #15)
> I think we should probably make the <all_urls> permission not apply to
> files, and require extensions to explicitly specify file://*/. And we should
> probably require that to be an optional permission.

Can you clarify what you mean more precisely?
Flags: needinfo?(kmaglione+bmo)
Assignee: amarchesini → nobody
(In reply to Kris Maglione [:kmag] from comment #34)
> (In reply to :Gijs from comment #33)
...
> 
> What we need is a separate setting that allows users to opt into giving an
> extension access to local files, similar to what Chrome does. I don't
> particularly care when/where we prompt users to opt into this, but the
> status of that permission should at least be displayed on the add-on's
> about:addons detail page, and be togglable there.

We just spoke about this at the all-hands, and as asked I'm adding my strong support for the approach suggested by Kris above (i.e. a toggle where users can OPT IN to allowing extensions to have file access, with access prevented in the regular case).
With a side note that, in addition to preventing direct loading of file:// URIs, we also need to prevent extensions from opening tabs with file:// URIs (and then injecting a content script into them). From a discussion here it sounds like we don't want to allow extensions without the "file access" grant to inject content scripts even into tabs with file:// URIs that the USER opened. (I'm not super pumped about that, but it also sounds reasonable that you shouldn't have to grant a random extension access to your whole file system just so that it can inject a content script into the one specific document you opened from the file system.)
Product: Toolkit → WebExtensions
Assignee: nobody → kmaglione+bmo
Flags: needinfo?(ddurst)
I'm going to handle the back-end portion of this. I might punt the front-end portion to Mark, since I have a lot of things on my plate at the moment.
Flags: needinfo?(kmaglione+bmo)
> I'm going to handle the back-end portion of this.

Does "this" mean to prevent content script file:// access by default _AND_ to implement "a toggle where users can OPT IN to allowing extensions to have file access"?

I'm asking because the first "fix" will break Tampermonkey's ability to install scripts from local file URIs, unless content scripts that are defined in the manifest are exempted from this change.

Thanks, Jan

{
    "manifest_version": 2,
...
    "content_scripts":
    [
        {...}, {
            "js": [
                 "rea/content.js"
            ],
            "matches": [ "file:///*.user.js" ],
            "run_at": "document_start"
        }
    ],
...
}
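For context, a manifest match pattern like "file:///*.user.js" can be approximated with a simple glob-to-regex conversion (hypothetical helper, not the real MatchPattern implementation), which shows how narrowly such a content script targets files:

```javascript
// Hypothetical glob-to-regex helper illustrating how a pattern like
// "file:///*.user.js" selects URLs; not the real MatchPattern code.
function patternToRegExp(pattern) {
  const escaped = pattern
    .replace(/[.+^${}()|[\]\\]/g, '\\$&') // escape regex metacharacters
    .replace(/\*/g, '.*');                // glob '*' becomes '.*'
  return new RegExp('^' + escaped + '$');
}

const re = patternToRegExp('file:///*.user.js');
console.log(re.test('file:///home/jan/scripts/foo.user.js')); // true
console.log(re.test('file:///etc/passwd'));                   // false
```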
Assignee: kmaglione+bmo → tomica
Mentor: kmaglione+bmo
I'm looking into this, but there are several linked bugs that I can't access. Can someone please go through bug 1427448, bug 1420360, and bug 1316343, and CC me on those pertinent to this issue?



(In reply to Kris Maglione [:kmag] from comment #34)
> What we need is a separate setting that allows users to opt into giving an
> extension access to local files, similar to what Chrome does. I don't
> particularly care when/where we prompt users to opt into this, but the
> status of that permission should at least be displayed on the add-on's
> about:addons detail page, and be togglable there.

Do we want this as a new "public" permission, or just an internal flag that extensions can never request and that can only be manually toggled through the UI?

And what should we do with already installed extensions in either case?



(In reply to Paul Theriault [:pauljt] from comment #38)
> With a side note that, in addition to preventing direct loading of file://
> uris, we also need to prevent extensions open in tabs with file:// URIs (and
> then injecting a content script into it). From a discussion here it sounds
> like we don't want to still allow extensions even without the "file access"
> granted to inject content scripts into tabs with File:// uris that the USER
> opened. (im not super pumped about that, but it sounds also you shouldn't
> have to grant a random extension access to your file system, just so that it
> can inject some content script into the one specific document you opened
> from the file system.

I don't understand this the way it's worded. And it seems all of the discussion in this bug is about direct access to reading local files; content scripts are a separate issue and can be a separate bug.



(In reply to Kris Maglione [:kmag] from comment #39)
> I might punt the front-end portion to Mark


This can be done in a separate public bug, if it needs to involve UX and/or others, right?
(In reply to Tomislav Jovanovic :zombie from comment #41)
> I'm looking into this, but there are several linked bugs that I can't
> access.  Can someone please go through bug 1427448, bug 1420360, and bug
> 1316343, and CC me to those pertinent to this issue?

Done this for you, can't comment on the other stuff.
(In reply to Tomislav Jovanovic :zombie from comment #41)
> I'm looking into this, but there are several linked bugs that I can't
> access.  Can someone please go through bug 1427448, bug 1420360, and bug
> 1316343, and CC me to those pertinent to this issue?
> 
> 
> 
> (In reply to Kris Maglione [:kmag] from comment #34)
> > What we need is a separate setting that allows users to opt into giving an
> > extension access to local files, similar to what Chrome does. I don't
> > particularly care when/where we prompt users to opt into this, but the
> > status of that permission should at least be displayed on the add-on's
> > about:addons detail page, and be togglable there.
> 
> Do we want this as a new "public" permission, or just an internal flag that
> extension can never request, and can only be manually toggled though the UI?
> 
> And what should we do with already installed extensions in either case?
> 
> 
> 
> (In reply to Paul Theriault [:pauljt] from comment #38)
> > With a side note that, in addition to preventing direct loading of file://
> > uris, we also need to prevent extensions open in tabs with file:// URIs (and
> > then injecting a content script into it). From a discussion here it sounds
> > like we don't want to still allow extensions even without the "file access"
> > granted to inject content scripts into tabs with File:// uris that the USER
> > opened. (im not super pumped about that, but it sounds also you shouldn't
> > have to grant a random extension access to your file system, just so that it
> > can inject some content script into the one specific document you opened
> > from the file system.
> 
> I don't understand this the way it's worded.  And it seems all of the
> discussion in this bug is about direct access to reading local files,
> content scripts are a separate issue, and can be a separate bug.

Sorry, my comment is barely readable. I meant that we should take a holistic approach to restricting file system access. If we close this fetch hole but still allow extensions to open and interact with tabs that have a file:// URL open, then extensions will still have read access to the file system (since file: URIs are treated as same-origin with files in the same directory). But you are right, that should perhaps be a separate bug.
(In reply to Paul Theriault [:pauljt] from comment #43)
> (since file: uris are treated as same-origin as files in the same
> directory)

We should just finally fix this - bug 803143.

ni? to Kris and Mike for what to do with already installed extensions here (comment 41), similar to the situation about private browsing being solved right now.

Flags: needinfo?(mconca)
Flags: needinfo?(kmaglione+bmo)
Group: toolkit-core-security → firefox-core-security

(In reply to Tomislav Jovanovic :zombie from comment #45)

ni? to Kris and Mike for what to do with already installed extensions here (comment 41), similar to the situation about private browsing being solved right now.

This particular question is being rolled up into a larger discussion around exactly how/when extensions can access local files.

Flags: needinfo?(mconca)

I'm confused by this bug; the docs describe a function to check this capability: https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/WebExtensions/API/extension/isAllowedFileSchemeAccess and describe "the user-controlled 'Allow access to File URLs' checkbox" (on https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/WebExtensions/API/extension#getURL() )

Are the docs incorrect or has much of what has been discussed in this bug been implemented?

(In reply to Tom Ritter [:tjr] from comment #47)

I'm confused by this bug; the docs describe a function to check this capability: https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/WebExtensions/API/extension/isAllowedFileSchemeAccess and describe "the user-controlled 'Allow access to File URLs' checkbox" (on https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/WebExtensions/API/extension#getURL() )

Are the docs incorrect or has much of what has been discussed in this bug been implemented?

That currently only applies to Chrome, who we inherited this API from.

Flags: needinfo?(kmaglione+bmo)
Depends on: 803143
Whiteboard: webext-sec
Assignee: tomica → nobody

I'm currently unable to reproduce this with either the stack overflow example from comment 0 or the simplified version in comment 3.

I can still reproduce on Linux with Nightly 72.0a1, buildID 20191021215155 .

  1. manifest.json with
{
    "name": "Extension with file access",
    "version": "1",
    "manifest_version": 2,
    "background": {"scripts": ["bg.js"]},
    "permissions": ["file:///*"]
}
  2. bg.js
async function test() {
    let r = await fetch('file:///etc/passwd', {mode:'same-origin'});
    let t = await r.text();
    console.log(t)
}
test();

Result: The content of /etc/passwd.

If I add a catch on the fetch, I get "TypeError: NetworkError when attempting to fetch resource.".

Nightly OSX 10.14

So, I am still unable to reproduce this on macOS regardless of the privacy.file_unique_origin value. However, on a fresh profile with the same extension, I can reproduce this on Linux. Either it is unrelated to file_unique_origin or it is Linux-specific. Given that, I'll set up my Windows VM for the rest of the afternoon.

Attached file fileaccess.xpi

This xpi does two fetches: one normal fetch (to make sure I didn't have some basic problem) and one that fetches a file. This same xpi fails to fetch the file on macOS, but works fine on Linux.

How do I check whether I'm reproducing with that extension? I installed the extension, started the browser, loaded example.com, but I don't know which console it's logging to and hence whether it's managing to do the fetches...

Ah, nevermind. If I add some dump() calls I can see it running at startup. Poking at things...

This reproduces on Windows 10 and latest Nightly 72 from today.

So the fetch('file:///etc/passwd') call happens in a content process. The triggering principal (and loading principal) for the fetch is a ContentPrincipal with the URI "moz-extension://01cf4af3-f9f4-4f63-8222-1f2c1db5bffb/_generated_background_page.html".

We end up in ContentPrincipal::MayLoadInternal, and the AddonAllowsLoad call there returns true because WebExtensionPolicy::CanAccessURI returns true. Given that the manifest claims "file:///*" in the permissions, this doesn't seem surprising, right? That means BasePrincipal::CheckMayLoad returns true, and that's our general "Should this principal be considered same-origin with this URI?" method. So we consider this a same-origin load (NS_HasBeenCrossOrigin returns false, so DoSOPChecks returns success, so nsContentSecurityManager::CheckChannel returns success), and the load is allowed.

After that, things Just Work: the fetch successfully opens the FileChannelChild channel, which remotes the actual filesystem access to the parent process (which is why sandboxing is not relevant), etc.

Is there some expectation that given our current code this should not work?

fetch('file:///etc/passwd'); does not load the resource (and neither does XMLHttpRequest); this is desired, as explained below.
fetch('file:///etc/passwd', {mode:'same-origin'}); does load the resource.

They should behave consistently.

file:///* is automatically granted when the <all_urls> permission is requested, which is very common.
Most of the extensions that request <all_urls> don't really need file access.
I'd like to block file access by default unless the user has explicitly consented, to protect the general population, whose add-ons don't need to access files.

(FWIW, in Chrome <all_urls> also includes file:///*, but access is withheld until the user manually grants access at the extension management page.)
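For illustration, an extension that wants to cope with Chrome's model can probe this capability at runtime via extension.isAllowedFileSchemeAccess() and fall back gracefully. A minimal sketch with the `browser` object stubbed out so the snippet is self-contained (in a real extension the platform provides it, and the resolved value reflects the user's checkbox state):

```javascript
// Self-contained sketch: `browser` is stubbed here so the snippet runs
// standalone; in a real extension the platform provides this object.
const browser = {
  extension: {
    // In Chrome this resolves to the state of the per-extension
    // "Allow access to file URLs" checkbox; Firefox had no equivalent
    // checkbox at the time of this bug (comment 48).
    isAllowedFileSchemeAccess: async () => false,
  },
};

// Check before attempting any file: fetch, and degrade gracefully.
async function readLocalFileIfAllowed(url) {
  const allowed = await browser.extension.isAllowedFileSchemeAccess();
  if (!allowed) {
    return null; // e.g. fall back to <input type="file"> instead
  }
  const response = await fetch(url);
  return response.text();
}

readLocalFileIfAllowed('file:///etc/hostname').then((text) => {
  console.log(text === null ? 'no file access' : text);
});
```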

Assignee: nobody → rob
Status: NEW → ASSIGNED

Next week (or maybe the week after, at the start of the 73 release cycle), I intend to land a patch that closes this hole.
The hole already does not work on macOS (comment 49), and starting with my patch, it won't work on Linux, Windows, or Android either.

Content scripts are allowed to run on file:-URLs, but other forms of
file access from extensions are not allowed. This has been the behavior
for a while, but there were no tests. This commit adds tests to ensure
that the functionality continues to work as intended.
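As a sketch of the still-supported path described above, a content script can be declared to match file: URLs in the manifest (the extension name and script file here are placeholders; the "file:///*" pattern is the same one used in the repro in comment 49):

```json
{
  "name": "File content script example",
  "version": "1.0",
  "manifest_version": 2,
  "content_scripts": [
    {
      "matches": ["file:///*"],
      "js": ["cs.js"]
    }
  ]
}
```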

  • comment 60 is a fix for this bug and bug 1487353 (a unit test is attached in that other bug).
  • comment 61 is a set of unit tests that already pass, independently of the bugfix. These tests were added to make sure that content scripts for files do not regress.
  • comment 61 is a regression test for this bug, and should land in the future, when this bug becomes public.
Blocks: 1487353

The patch in comment 60 seems fine, but I want to understand answers to a few questions beforehand:

  1. This proposed fix is different than the agreed consensus in most of this bug, up to and including your second-to-last comment 58, which was putting this access behind a special permission prompt/checkbox. Why did you change your approach? Given recent urgency in bug 1598946, should we bundle the two bugs together, behind the same optional permission?

  2. Why doesn't this work on MacOS? Is it a recent "regression", or some configuration/platform difference?

  3. I'm looking up code related to comment 23; perhaps this cross-origin check should be lower in the chain to cover more than just the add-ons use case.

(In reply to :Tomislav Jovanovic :zombie from comment #64)

The patch in comment 60 seems fine, but I want to understand answers to a few questions beforehand:

  1. This proposed fix is different than the agreed consensus in most of this bug, up to and including your second-to-last comment 58, which was putting this access behind a special permission prompt/checkbox. Why did you change your approach? Given recent urgency in bug 1598946, should we bundle the two bugs together, behind the same optional permission?

I implemented it like this because it fixes file:-access related bugs without breaking known supported cases.
We probably still want to support a prompt/checkbox, but that requires more effort and review/feedback cycles. That fits better in bug 1598946 or a follow-up to that bug (and some of the ideas expressed here already align with the tentative plan).

  2. Why doesn't this work on MacOS? Is it a recent "regression", or some configuration/platform difference?

The paths that we tested were blocked by the sandbox on macOS. If I choose a whitelisted location, e.g. file:///usr/share/ , then the load is still accepted.

  3. I'm looking up code related to comment 23; perhaps this cross-origin check should be lower in the chain to cover more than just the add-ons use case.

I'm not aware of any generic file:-whitelisting mechanism other than extensions. Judging by extension permissions alone, file:-access should have been allowed (and is, according to WebExtensionPolicy::CanAccessURI). But we don't have a separate permission warning or any other obvious indication that an extension is allowed to access local files, so it was a nice "feature" that file:-URLs were already blocked (through fetch / XMLHttpRequest) despite being allowed by CanAccessURI. The patch changes the implementation of CanAccessURI to match the current default behavior for CORS fetches, and does it at that place to cover most scenarios (without affecting content scripts, for which we want to continue supporting access). If we ever decide to open up file:-access (rather than restrict it), we will probably need to revisit the implementation, but until then I'd rather be on the safe side.
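To make the shape of the change concrete, here is a hypothetical JavaScript model of the described behavior; it is not Firefox's actual C++ implementation, and the pattern matching is deliberately simplified. The point it illustrates: after the fix, file: loads are denied for ordinary extension requests even when the manifest grants file:///*, while content-script matching is decided separately.

```javascript
// Hypothetical model of the CanAccessURI change; the real code lives in
// WebExtensionPolicy (C++). Pattern matching is simplified to prefixes.
function matchesPattern(pattern, url) {
  if (pattern === '<all_urls>') return true;
  // Treat a trailing '*' as a prefix wildcard (a simplification).
  if (pattern.endsWith('*')) return url.startsWith(pattern.slice(0, -1));
  return pattern === url;
}

function canAccessURI(policy, url, { forContentScript = false } = {}) {
  const isFile = new URL(url).protocol === 'file:';
  // After the fix: plain requests (fetch/XHR) from extension pages are
  // denied for file: URLs regardless of host permissions...
  if (isFile && !forContentScript) return false;
  // ...but permission matching still decides everything else, including
  // whether content scripts may run on the file.
  return policy.allowedOrigins.some((p) => matchesPattern(p, url));
}

const policy = { allowedOrigins: ['<all_urls>'] };
console.log(canAccessURI(policy, 'file:///etc/passwd'));   // false
console.log(canAccessURI(policy, 'https://example.com/')); // true
console.log(canAccessURI(policy, 'file:///etc/passwd',
                         { forContentScript: true }));     // true
```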

I think that file:-access to extensions can be a useful feature in some cases, but it should by no means be the silent default behavior.

What's your advice on landing the patches?

I want to land two patches: the fix itself, and unit tests that verify that the supported behaviors still work as intended.

Do you think that it is fine to land the following at the same time:

(and the regression test in the future - https://phabricator.services.mozilla.com/D54543 ).

... or should I only land the fix now, and land the tests later?
One interesting point to consider is that the bug itself has been public for over two years by now:
https://stackoverflow.com/questions/42108782/firefox-webextensions-get-local-files-content-by-path/44516256#44516256

Flags: needinfo?(dveditz)

Given the public stackoverflow item I'm fine with you landing all of it at once, including D54543.

Flags: needinfo?(dveditz)
Group: firefox-core-security → core-security-release
Status: ASSIGNED → RESOLVED
Closed: 1 year ago
Resolution: --- → FIXED
Target Milestone: --- → mozilla73

I've only landed the already-passing tests; I'll land the other parts once I've verified that the impact on extensions is minimal.

Status: RESOLVED → REOPENED
Resolution: FIXED → ---

I checked all public add-ons on AMO, and the following add-ons are relying on this bug / "feature". I've listed the number of users and the use cases:

The total number of users is about 500k. Since the functionality already does not work on macOS, and the loss of these use cases is not critical, I'll land the patch that disables local file access next week.

Many of those affected add-ons would be better served with <input type=file>, it seems.
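For instance, reading a user-selected file instead of fetching a file: URL could look like this sketch (the input element and its id are hypothetical wiring, shown in comments):

```javascript
// Sketch: read a user-picked file via the standard File/Blob API rather
// than fetching a file: URL. This works in any extension page (options
// page, popup, etc.) where the user explicitly selects the file.
async function readUserSelectedFile(fileOrBlob) {
  // Blob.prototype.text() resolves with the contents as a string;
  // File objects from <input type="file"> inherit it from Blob.
  return fileOrBlob.text();
}

// In an extension page you would wire it to an <input type="file">
// (element id is hypothetical):
//   document.querySelector('#picker').addEventListener('change', async (e) => {
//     const text = await readUserSelectedFile(e.target.files[0]);
//     console.log(text);
//   });
```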

Status: REOPENED → RESOLVED
Closed: 1 year ago1 year ago
Resolution: --- → FIXED
Target Milestone: mozilla73 → mozilla74

Doesn't feel like the kind of change we'd want to land on an ESR branch, but should we consider uplifting this to Beta for Fx73?

Flags: needinfo?(rob)

This vulnerability has been public for multiple years, and consequently some extensions rely on it for legitimate reasons. I want to let it ride the trains to give add-on developers the opportunity to adapt to the new limitations.

Flags: needinfo?(rob)
Flags: qe-verify+
Whiteboard: webext-sec → webext-sec [post-critsmash-triage]

Verified fixed using the fileaccess.xpi extension provided in comment 53 on Windows 10 Pro 64-bit, macOS Catalina 10.15, and Ubuntu 16.04 LTS with FF 74.0a1 (20200121215203).
Checked by loading the extension from about:debugging and inspecting it on a fixed version, and also by installing it (after setting xpinstall.signatures.required to false) via about:addons on a Nightly build from before the push and then upgrading the browser.
After the fix, a security error stating that data may not be loaded from "file:///etc/passwd" is logged in the extension console.

Status: RESOLVED → VERIFIED
Whiteboard: webext-sec [post-critsmash-triage] → webext-sec [post-critsmash-triage][adv-main74+]
Regressions: 1621935
Group: core-security-release
You need to log in before you can comment on or make changes to this bug.