Closed
Bug 230606
Opened 21 years ago
Closed 17 years ago
Tighten the same-origin policy for local files (file: URLs, trusted, security)
Categories
(Core :: Security: CAPS, defect)
Core
Security: CAPS
Tracking
RESOLVED
FIXED
mozilla1.9alpha8
People
(Reporter: jruderman, Assigned: dveditz)
References
(Depends on 4 open bugs, Blocks 2 open bugs)
Details
(5 keywords, Whiteboard: [Saved pages and other local HTML files can read user's files])
Attachments
(3 files, 1 obsolete file)
3.34 KB, application/zip | Details
4.11 KB, text/html | Details
17.54 KB, patch | dveditz: review+, superreview+ | Details | Diff | Splinter Review
Loading an HTML file from a file:/// URL allows it to read any other HTML or plain-text file on the HD (as long as it knows the filename, mod bug 209234). This is a problem because users save web pages, receive zipped HTML files in e-mails, etc. The same-origin policy should consider each file or each directory to be a separate origin.

Btw, fixing this security hole might make it unnecessary to restrict links to file:/// URLs. See bug 84128 comment 190.

Suggested exploits:
* Malicious site containing useful information (or porn) says it's going away due to bandwidth costs, getting users to save the page.
* Send someone a zipped HTML file.
Jesse: i can think of two:
- a web developer testing his site locally
- a html system in HTML
in both cases, it's only a problem if the site uses frames and uses scripts for the frames to communicate
oops, i was quoting jesse from an AIM conversation, not responding to him Personally, I vote WONTFIX precisely because of those exceptions.
Assignee
Comment 3•21 years ago
I'm not sure what "a html system in HTML" means, but for a developer testing a site locally we can posit some cluefulness. Education about the security.checkloaduri pref (perhaps even exposing it on a debug pref panel?) or some other mechanism might suffice. If the other item means some sort of disk-based kiosk or something, the same workaround could work. There are definitely trade-offs and maybe we'll ultimately reject this, but I wouldn't jump to WONTFIX so fast.
Reporter
Comment 4•21 years ago
"A html system in HTML" was supposed to be "a help system in HTML" (on a CD or on a hard drive). I don't think help systems are very likely to use frames and cross-frame scripting.
Comment 5•21 years ago
Security is great, but is it Mozilla's job to enforce it across the whole filesystem(s)? Respectfully, that is a huge bite to chew, and there is already substantial support at the OS level (at least on *NIX).

My concern in this comment is for uses of HTML/XUL that are locally contrived, without any web at all. Such uses shouldn't be knobbled by the assumption that content must have once originated from the Web. In the special case noted here, local development of HTML or XUL systems might be hampered by the restriction that they couldn't reach "up" out of their own sub-directory (say, to a library of common files). That is very restrictive for developers.

In the general case, the special case is an example of maintaining a security meta-layer for local hyperlinks across the whole filesystem(s). It is a grand statement to propose that Mozilla have any of that functionality. At a lower level, symbolic links/shortcuts are not perceived to be security problems. Why should hyperlinks be?

Finally, in the dumb case, PC developers will likely scream if their C: HTML files can't link to D:, E:, or F: HTML files. Imagine an intranet based on publicly mounted and well-known network drives, with no Web server. That stumbles directly into this issue.

Is it rather the case that the File|Save dialog is a permission portal to the local filesystem, and the user should be duly advised when using it?

Another vote for WONTFIX. - N.
Reporter
Comment 6•21 years ago
> Security is great, but is it Mozilla's job to enforce it across the whole
> filesystem(s).

Yes. Users often save web pages and then open them, expecting that the saved page will not be able to read their other files.

> PC developers will likely scream if their C: HTML files can't link to D: E: or
> F: HTML files.

I'm talking about the same-origin policy, not the linking policy. The same-origin policy only affects cross-frame and cross-window communication by JavaScript, both of which are somewhat rare. The same-origin policy does not affect linking or loading js libraries.

> Is it the case (rather) that the File|Save dialog is a permission
> portal to the local filesystem and the user should be duly advised
> when using it.

That would look silly, wouldn't be secure (people would ignore the warning because it would look silly), and would only cover one way of getting users to open an untrusted HTML file on their hard drives.
Comment 7•21 years ago
msdn help is framed, nethelp was framed, mozilla/5 help has frames which happen to be chrome. Framed help, at least for a searchable index/toc/..., is rather common. I'm not sure how often you need to script across; msdn help does for sync TOC functionality. I haven't used NetHelp in ages and I don't consider mozilla/5 help up to par.

I agree, //afs/evil.edu/e/v/evil/fun.html should not have access to ~/.ssh/identity.

One approach would be to restrict what files can be linked to based on whether the files are "public". For that definition, (posix) g+r or (ntacl) Builtin\Users:R is a public file, and ~/.ssh/identity fails this test (g-r), as should an equivalent nt file. It's not great, and I /might/ be able to drive a truck through it (I suppose jesse could drive a truck through it, but i'd like watching him do it).

Alternatively, can we instead play on file ownership? This doesn't help the case where the user saves a file and therefore establishes the user as the owner, but at least it's a way to address the preexisting-file case (ok, it fails miserably in poorly configured nfs systems where uids can be made to match the user).
Comment 8•21 years ago
> Users often save web pages and then open them, expecting that the saved
> page will not be able to read their other files.

There's a clear conflict here between what end-users might want (your use case) and what app. and web developers might want (flexibility). It is not persuasive to roundly state that because one constituency (or use-case) needs a certain kind of censorship, all other constituencies must silently bow to that need. There are obvious counter cases.

> [File Open] That would look silly, wouldn't be secure ...

I'm not suggesting File|Save is secure. I'm saying it is a clear portal to insecure actions on the part of the user, and likely to remain so. Your use-case shows how ill-advised use of that portal can be problematic for users. You can't fudge security into something that isn't intended to be secure in the first place. Once files are exported from the Mozilla environment by the user to the HD, security responsibility has passed to the user and O/S. Not your job to tell that user that his locally hacked copies won't be viewable because his behaviour is a security threat. He might know what he's doing.

> I'm talking about the same-origin policy, not the linking policy.

My bad expression. Substitute "cross-script" for "link". My point still stands.

> //afs/evil.edu/e/v/evil/fun.html ... ~/.ssh/identity

It's not your place to make policy here. You can cat(1) a binary file, can you not? Are you also proposing that the Linux kernel stop you from doing that? That's a very slippery slope indeed.

I propose a counter-subject for this bug. Because spectral testers exist, and because such testers can crash just about any software that reads their output, Mozilla should not load any file that has a URL with "spectral" in it, since it might affect the user's browsing experience. A silly example, but not far from what is proposed here.

You can't put a condom on the user's use of their own computer. If they choose to save and view files, that's their choice. A "super secure centrally controlled and state authorised" browser app might special-case that libertarian rule away, but surely not a popular consumer product. - N.
Reporter
Comment 9•21 years ago
> I'm not suggesting File|Save is secure. I'm saying it is a clear portal to
> insecure actions on the part of the user, and likely to remain so.

Saving a file is dangerous, but only because of this bug.

> You can't fudge security into something that isn't intended to be secure in the
> first place. Once files are exported from the Mozilla environment by the user
> to the HD, security responsibility has passed to the user and O/S.

What makes you think that opening an HTML file from your hard drive "wasn't intended to be secure"? HTML files should be as safe to open as text files because almost all users treat them the same way.

> Substitute "cross-script" for "link". My point still stands.

No it doesn't. Nobody would expect a file on C: to be able to do cross-site scripting with a file on D:.

> > //afs/evil.edu/e/v/evil/fun.html ... ~/.ssh/identity
> It's not your place to make policy here. You can cat(1) a binary file,
> can you not? Are you also proposing that the Linux kernel stop you from
> doing that? That's a very slippery slope indeed.

Sure, users should be able to run |cat| on any file they have read access to. But users' *documents* shouldn't be able to run |cat| on any file the user has read access to. I would say the same even if the document weren't a common web format like HTML.
Comment 10•21 years ago
A perl script is a document that can run cat(1) from inside its application engine. The Perl engine does not prevent that when loading such a document. That is a flexible arrangement. Until the question of local development use cases is deconstructed, this enhancement is at best part-complete. - N.
Reporter
Comment 11•21 years ago
A perl script is not a document. A perl script is a program that generally has whatever permissions you do.
Comment 12•21 years ago
The line between "document" and "program" is far too blurry for
anyone to stake their life on the difference as you have.
Even so, the argument against this enhancement is just a
utility argument, not a semantic one.
> Nobody would expect a file on C: to be able to do cross-site
> scripting with a file on D:.
No-one expected Jackson Pollock to do dribble pictures either.
Should we have stopped that too? Fiddling with local files
is an act of user volition. It's the user's Desk Top,
not Mozilla's. Users can bookmark interesting pages and keep
attachments in email folders if they want to stay within the
trusted Mozilla environment.
HomeBase is/was a more comprehensive "managed" user environment
where an enhancement such as this might be appropriate. There,
the application has a clear mandate to be a complete proxy between
user and local computing resources. Neither the Mozilla Suite nor
Firebird appears aimed that way. They are residents of the desktop,
not desktops themselves.
- N.
Updated•20 years ago
QA Contact: ian
Summary: Tighten the same-origin policy for local files (file: URLs) → Tighten the same-origin policy for local files (file: URLs, trusted, security)
Assignee
Updated•20 years ago
Assignee: security-bugs → dveditz
Flags: blocking1.8b?
Flags: blocking-aviary1.1?
Whiteboard: [sg:fix]
Updated•20 years ago
Flags: blocking1.8b?
Flags: blocking1.8b+
Flags: blocking-aviary1.1?
Flags: blocking-aviary1.1+
Comment 13•20 years ago
- for 1.0.1 needs to go on the trunk first if we do something
Flags: blocking-aviary1.0.1-
Comment 15•19 years ago
Dan, is this something we should still be shooting for in b2 or can we move it out to 1.8b3?
Updated•19 years ago
Flags: blocking1.8b2+ → blocking1.8b3+
Comment 16•19 years ago
This is bumped to 1.9 unless a miracle happens.
Flags: blocking1.8b3-
Flags: blocking1.8b3+
Flags: blocking-aviary1.1-
Flags: blocking-aviary1.1+
*** Bug 273419 has been marked as a duplicate of this bug. ***
Blocks: 289662
Comment 18•19 years ago
My only possible solution to "both sides" is perhaps an "Info-Bar" warning (or error) that [xyz] local page is attempting to access [bar] local page, with options to temporarily allow, always allow, always allow for all local files, etc. This "allowance" could be stored keyed on the "last-modified" state of the files in question, and be per-file... or some such. This would let web developers know somewhat what's going on (most web developers who need this to be allowed will be able to allow it), and most end-users would get the "Something was wrong, but is ok" feeling they get from the pop-up notification bar. Otherwise, I too feel the consequences of "fixing" this bug far outweigh the benefits of leaving it as-is (unless someone comes up with a better solution).
Reporter
Comment 19•19 years ago
*** Bug 319801 has been marked as a duplicate of this bug. ***
Reporter
Updated•19 years ago
Flags: blocking-aviary2?
Comment 20•19 years ago
This needs fixing soon: the attached code can find the user's profile dir and read any file in it. Works on Windows and Linux. The testcase uses XMLHttpRequest, but you can also do window.open().document.body.innerHTML in a small iframe to achieve the same.
Reporter
Comment 21•19 years ago
Microsoft Internet Explorer's solution (in the IE6 in XPSP2) is to disable scripts in local files completely, with an information bar for turning them back on temporarily. That annoys me all the time and usually isn't necessary for avoiding the kind of security problem described by this bug, since most scripts don't access other iframes with other URLs. If we need UI to help web developers figure out how to test web apps locally, Callek's idea makes more sense.
Comment 22•19 years ago
(In reply to comment #21)
> Microsoft Internet Explorer's solution (in the IE6 in XPSP2) is to disable
> scripts in local files completely, with an information bar for turning them
> back on temporarily.

So the MS-IE solution won't help. It blocks safe JavaScript too, so users will turn JS back on to see other JavaScript stuff and become vulnerable again. If the user turns on JS in MS-IE, I know how to read many sensitive files through IE.

What we want is the following:
* restrict access for JavaScript on a page with the "file:" protocol to its directory and subdirectories
* directory access should NOT be given for a page in:
  - the root of a drive
  - Windows folders and subfolders
  - My Documents, My Pictures, My ***
  - Desktop (it is ok to access if it is in a subfolder of Desktop, My Documents, My Pictures, etc.)
  - User Profile folders
  - on Linux: the root folder, the home folder, anything under any .folder
  - any path containing a hidden folder
  - on *nix, all symbolic links should be resolved to check whether the file is really under the current folder
Comment 23•19 years ago
It is really pathetic to know that mozilla.org for two years did not listen to Jesse Ruderman, an authority on JavaScript ( http://www.squarefree.com/bookmarklets/ ) and a person who really uses its power.
Comment 24•19 years ago
Biju, Jesse works for Mozilla -- we take him seriously. This bug was triaged out of 1.8 for reasons that are not clear to me. Dan, was the severity not understood or is this considered enough of a luring attack that higher priority security bugs took precedence? /be
Updated•19 years ago
Flags: blocking1.8.1?
Comment 25•19 years ago
remove script bookmarklet

Can somebody check whether the attached bookmarklet will remove all the possible script from a web page? I tried it on a normal page, a FRAME, an IFRAME, on-event attributes, href javascript:, and href data:. Let me know if I missed anything.

I would also like to know whether mozilla.org has any plans to fix this. If not, please close this bug, let the public know the danger, and suggest a workaround.
Reporter
Updated•19 years ago
Whiteboard: [sg:fix] → [sg:high] Saved pages and other local HTML files can read user's files
Updated•19 years ago
Flags: blocking-aviary2?
Reporter
Comment 26•18 years ago
> Suggested exploits:
> * Malicious site containing useful information (or porn) says it's going away
> due to bandwidth costs, getting users to save the page.
> * Send someone a zipped HTML file.
Some more:
* Email someone an HTML file, and make the message preview look broken enough that the user is encouraged to double-click the safe-looking HTML attachment.
* Include malicious code in the HTML file you distribute with your pirated music torrent. (Many pirated music torrents already include HTML files, which makes me think this would work well.)
Reporter
Comment 27•18 years ago
See also bug 332676.
Comment 29•18 years ago
This is ridiculous and it needs to be fixed. I discovered this bug independently and wrote my own files here: http://www.zabbey.com/labs/firefox/ I shouldn't have to worry about opening an HTML file, ever. A simple solution would be to block the scripts as IE does. To develop on a local machine, just require a different file extension (e.g. file.ehtml) and warn users when opening ehtml files that scripting is enabled for these file types.
Comment 30•18 years ago
*** Bug 343047 has been marked as a duplicate of this bug. ***
Reporter
Updated•18 years ago
Flags: blocking1.9a1?
OS: Windows XP → All
Hardware: PC → All
Comment 31•18 years ago
What about this: make the "file" protocol not execute JS (like "mailbox"), and introduce a new protocol "file-unsafe" for developers who need JavaScript locally.
Updated•18 years ago
Flags: blocking1.9a1? → blocking1.9+
Comment 32•18 years ago
Another demo has the user first save HTML (containing JS) locally, then open it (either immediately, in %temp%/cache(?), or in a chosen path): http www heise-security.co.uk/services/browsercheck/demos/moz/mozdemo1.shtml

To get the user to say 'ok', the server sends the HTML as an attachment:

content-disposition: attachment; filename=cttest.html
Content-Type: text/html; charset=iso-8859-1

And if a user is not aware of the privileges a locally saved html file has...
Reporter
Comment 33•18 years ago
*** Bug 354059 has been marked as a duplicate of this bug. ***
Comment 34•18 years ago
another exploit vector is: https://doctor.mozilla.org/doctor.cgi?file=mozilla-org/html/roadmap.html&action=download

after clicking on the link the user has 3 options:
1. open the html file in firefox - when it is opened, it is opened from file:///.../tmp/filename.html
2. save the file locally, then open it locally
3. cancel

if the user wants to view the html file, both 1 and 2 allow at least reading directories/local files
Comment 35•18 years ago
Hello, this is my first post. I think this is a very serious security problem: the attacker can set the trap really, really easily. May I upload the exploit code? Reading local files is useful for developers, but a lot of people will not need it. At the very least, a dangerous setting should not be the default.
Comment 36•18 years ago
This will get fixed for 1.9 (Firefox 3). /be
Comment 39•17 years ago
Bug 369462 shows at least one easy exploit vector for this.
Comment 40•17 years ago
An easy, low-cost fix for this is the NoScript extension. It can probably even allow whitelisting of directories for developers.
Comment 41•17 years ago
not exactly an "arch" bug but using the keyword to get it on the list for some upcoming discussions
Keywords: arch
Comment 42•17 years ago
(In reply to comment #36)
> This will get fixed for 1.9 (Firefox 3).

doesn't seem to be in 3.0pre-latest
Comment 43•17 years ago
one more reason to disallow chrome uris in the location bar:
http://lists.grok.org.uk/pipermail/full-disclosure/2007-April/053878.html

[Full-disclosure] Firefox 2.0.0.3 DoS crash
carl hardwick hardwick.carl at gmail.com
Thu Apr 19 19:22:06 BST 2007

Firefox 2.0.0.3 DoS crash PoC:
chrome://pippki/content/editcacert.xul
chrome://pippki/content/editemailcert.xul
chrome://pippki/content/editsslcert.xul
Comment 44•17 years ago
There are three options:
(1) Consider all file: URLs to come from the same origin (current behavior)
(2) Consider no two file: URLs to come from the same origin, or equivalently disable XMLHttpRequest and other such features in file: URLs
(3) Make some intermediate choice, probably according to directory hierarchy

(1) is undesirable because of the cases described in Comment #0, whereas (2) is undesirable because of Comment #1. In short, both behaviors are needed in some situations. On the other hand, I prefer simple rules. Complex rules often leave some vulnerabilities, and I do not think hard-coding (3) in Mozilla is good.

I prefer (2), because developers testing their applications can avoid the problem by setting up an HTTP server locally. If this is inconvenient, I guess it is possible (not for me but for wizards) to write an extension to establish a new URL scheme which refers to local files, in a similar way to chrome: URLs but without the chrome privilege. Such an extension would set up one URL scheme (say pseudo-http:), and it should be configurable so that pseudo-http://test1/ refers to one local directory and pseudo-http://test2/ refers to another directory. This way we keep the core simple, people use applications safely, and fine-grained control will be available to those who want it.
Assignee
Updated•17 years ago
Target Milestone: --- → mozilla1.9alpha5
Assignee
Updated•17 years ago
Target Milestone: mozilla1.9alpha5 → mozilla1.9alpha6
Comment 46•17 years ago
Note that some Unix machines have a world-readable locate database in some standard place, e.g. "/var/cache/locate/locatedb". So, the script can read it and get filenames from it.
Comment 47•17 years ago
It seems an OK compromise for a solution to allow the page to access everything that is at the same level or below itself in the directory hierarchy.
Comment 48•17 years ago
This is still dangerous if the user stores the file in his home directory, and maybe even in /tmp. So, there should be a list of base directories (which the powerful user can modify in his preferences) for which access to any local file is forbidden.
Comment 49•17 years ago
punting remaining a6 bugs to b1, all of these shipped in a5, so we're at least no worse off by doing so.
Target Milestone: mozilla1.9alpha6 → mozilla1.9beta1
Assignee
Comment 50•17 years ago
There were basically two options: either don't let scripts run in file: uris (the IE7 approach) or limit which other files scripts could access. Blocking scripts entirely meant you needed to have an override (as in IE, because some locally-saved pages won't make any sense without scripts, not to mention people developing webpages locally), and then you still need to implement access limits or you haven't really solved the problem.

This first patch implements the most strict access controls: file URIs are only same-origin with themselves. More lenient options (including the traditional "access any file") are available as pref settings for future developers, and so we can easily play with them in case the FILEURI_SOP_SELF setting turns out to be too restrictive.

Also includes a fix for related bug 209234. At the reviewers' discretion I could do that separately in the other bug, or add another pref setting so beta testers could easily switch back to the old behavior if necessary.
Attachment #279562 - Flags: superreview?(bzbarsky), review?(bzbarsky)
Comment 51•17 years ago
does the subdirectory policy treat directory traversal - '../../' correctly?
Comment 52•17 years ago
The localfile implementations are such (the Linux version of GetParent just chops off a component, Contains is a starts-with comparison, and definitely similar things on Windows the last time I looked) that I think you probably will need to rely on the given path being resolved such that '..' never occurs in it. The implementations are too inconsistent, in my opinion, to safely rely on any of the weak API guarantees nsIFile or nsILocalFile provide. I looked at the GetFile implementation, and I don't see any special logic to handle '.' or '..', so I suspect if you get past there things might get wacky.

Note also that normalizing the files probably isn't an option, because on Linux normalization is realpath(3), which resolves symbolic links and might work against user expectations.

Some tests for this new set of policies would be nice. The Mochitest server probably has the privileges to use an iframe to a local file, and we should be able to figure out a way to get the necessary file path into the test without too much effort, even if it's just running the test through the preprocessor and copying that file into the tests directory.
Comment 53•17 years ago
does "same-origin" mean "don't access the content, but be able to load another file in iframe" - as in the web sense?
Assignee
Comment 54•17 years ago
"same-origin" in the web sense. Currently all file: URIs are treated as a single host, which means any file: document can read the content of any other file: document (see comment 0). This is particularly a problem with saved web pages, HTML mail attachments, and malicious files served Content-Disposition: attachment ("huh, that's funny. Oh well, I'll just open it locally to see the cool stuff this guy promised me").

The original behavior is preserved as FILEURI_SOP_ANYFILE so we can easily switch back if this causes too many problems, or so an individual developer can do so while working on a local project intended to move to the web. The intermediate two modes are intended for experimentation should the default be too restrictive.

Nothing in this patch restricts a file: document from loading other files, just as the same-origin policy does not prevent a web site from loading a page from another site. Local file reference documentation or help systems implemented as multiple pages should still work just fine as long as one page doesn't need to inspect or modify the content of another.

As to normalization of "../", that should be done when the URI is normalized, before the file is gotten from it. There's currently a problem there (bug 380994, got worse on trunk) but I'm signed up to fix that one, too.
Assignee
Comment 55•17 years ago
Comment on attachment 279562 [details] [diff] [review] patch v1

bz is out, switching review requests to jst
Attachment #279562 - Flags: superreview?(bzbarsky) → superreview?(jst)
Attachment #279562 - Flags: review?(bzbarsky) → review?(jst)
Assignee
Comment 56•17 years ago
The SUBDIR "contains" test is not fooled by ../ (or %2e%2e/). It does suffer from bug 380994 on trunk (and Linux on branch), but that will be fixed with that bug. It is, however, fooled by ..\..\ on Windows. I guess I'd need to fix that, though I'd be just as happy to have only the original and new default behavior and change the pref to security.fileuri.enable_unsafe_access.
Comment 57•17 years ago
(In reply to comment #54)
> "same-origin" in the web sense. Currently all file: URIs are treated as a

opening /dev/mouse on linux used to be dos

imo files should be allowed to even open other files. developers may easily set up a light httpd server. what may break are html-based offline resources (cd, dvd); don't know if they are common.
Comment 58•17 years ago
...files should NOT be allowed to open other files
Comment 59•17 years ago
Comment on attachment 279562 [details] [diff] [review] patch v1

- In nsScriptSecurityManager::SecurityCompareURIs():

     // Compare schemes
     nsCAutoString targetScheme;
-    nsresult rv = targetBaseURI->GetScheme(targetScheme);
     nsCAutoString sourceScheme;
-    if (NS_SUCCEEDED(rv))
-        rv = sourceBaseURI->GetScheme(sourceScheme);
-    if (NS_FAILED(rv) || !targetScheme.Equals(sourceScheme)) {
+
+    if (NS_FAILED( targetBaseURI->GetScheme(targetScheme) ) ||
+        NS_FAILED( sourceBaseURI->GetScheme(sourceScheme) ) ||
+        !targetScheme.Equals(sourceScheme) )

You could optimize this a bit while you're here by eliminating the sourceScheme string and doing something like:

     PRBool sameScheme;
+    if (NS_FAILED( targetBaseURI->GetScheme(targetScheme) ) ||
+        NS_FAILED( sourceBaseURI->SchemeIs(targetScheme, &sameScheme) ) ||
+        !sameScheme )

- In nsScriptSecurityManager::SecurityCompareFileURIs():

+    if (!targetFileURL ||
+        NS_FAILED( targetFileURL->GetFile(getter_AddRefs(targetFile)) ) ||
+        NS_FAILED( targetFile->IsDirectory(&targetIsDir) ) ||
+        targetIsDir)

Does this properly deal with a case where the target file points to a symbolic link that then points to a directory?

+        return sameParent;
+    }
+    else if (mFileURIOriginPolicy == FILEURI_SOP_SUBDIR)

else after return.

r+sr=jst with that fixed or appropriate follow-up bugs filed.
Attachment #279562 - Flags: superreview?(jst) → superreview+
Attachment #279562 - Flags: review?(jst) → review+
Assignee
Comment 60•17 years ago
Made the changes jst requested. Also as discussed IRL added a pref setting to turn off the directory-blocking bit added for bug 209234 so we really can go all the way back to current behavior should we need to. This is what I'm landing tonight, will file follow-on bug(s) about the escaping issues in the non-default modes.
Attachment #279873 - Flags: superreview+
Attachment #279873 - Flags: review+
Assignee
Updated•17 years ago
Attachment #279562 - Attachment is obsolete: true
Assignee
Comment 61•17 years ago
checking this in broke reftests:

* mozilla/layout/reftests/bugs/322461-1.xml
  the URIs differ by a trailing "#", shouldn't matter
* mozilla/layout/reftests/ib-split/insert-into-split-inline-5.html
  -moz-binding is same-origin (for non-chrome bindings)
* mozilla/layout/reftests/xul-document-load/test001.xul
  mozilla/layout/reftests/xul-document-load/test002.xul
  mozilla/layout/reftests/xul-document-load/test007.xul
  mozilla/layout/reftests/xul-document-load/test011.xul
  mozilla/layout/reftests/xul-document-load/test014.xul
  mozilla/layout/reftests/xul-document-load/test015.xul
  mozilla/layout/reftests/xul-document-load/test016.xul
  mozilla/layout/reftests/xul-document-load/test017.xul
  mozilla/layout/reftests/xul-document-load/test018.xul
  mozilla/layout/reftests/xul-document-load/test019.xul
  mozilla/layout/reftests/xul-document-load/test020.xul
  mozilla/layout/reftests/xul-document-load/test021.xul
  xul overlays. tests 11, 14 and 15 have them in subdirectories.

To completely fix the builds I had to change the default value of security.fileuri.origin_policy to 2, the asymmetric subdirectory option. We need to figure out whether the build tests ought to make a local pref change (and then we revert this back in the tree), or whether this is a sign that blocking access to all other files is too draconian.
Comment 62•17 years ago
on some distros iframe src="file:///dev/console" completely blocks the keyboard. some are safe, depending on owner/permissions of /dev/console
Comment 63•17 years ago
We can't really have an asymmetric option here without a lot more work. This code is called from symmetric comparison functions (nsIPrincipal::Equals most notably), and there's no guarantee that they will be ordering things the right way. Up until this checkin SecurityCompareURIs was actually symmetric on trunk (though not on branch), and we now have trunk code that depends on that... If we're reintroducing asymmetry here we need to audit all the callers of Equals and SecurityCompareURIs in the tree, and likely change some of them. In at least some cases we're really doing a "foo and bar can access _each other_" check, and we'd need to replace that with two separate checks (which we used to have some places, and removed on trunk when asymmetry went away) if we reintroduce asymmetry.

Also the asymmetric subdir thing _does_ break "save as, complete" as things stand. The toplevel page would be able to access subframes, but not the other way around. Could we only allow foo to access foo_files (and anything inside it, stepping down through *_files subdirs as needed), and vice versa (so that "save page complete" works) and make reftests do that as needed for subresources?

Also note that the "access anything in same dir" thing we have here means that any downloaded web page can access any other downloaded web page. I'm not entirely sure we can fix that easily without breaking things like simple local versions of stuff to be uploaded to an HTTP server. :( Of course non-simple ones would break anyway...
Comment 64•17 years ago
(In reply to comment #63)
> Also the asymmetric subdir thing _does_ break "save as, complete" as things
> stand. The toplevel page would be able to access subframes, but not the other
> way around.

Change "save complete" to save in a jar with a .zip extension, probably adding a simple entry .html file? This may need some work for local jars, but seems like less of a PITA than complicated directory checks.
Comment 65•17 years ago
If HTML files and saved web sites can be dealt with, the troubling case is developers. Developers are expected to deal with content they trust. Possible solutions are:

1. Whitelist a directory "my web programs": everything there has full access in all directories under it (subdirectories may access the parent, probably up to "my web programs"), or
2. Advise developers to use a lightweight httpd server; unless developers use ONLY relative URIs, JS development from disk is unworkable.
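For option 2, any static server works. One easy choice, assuming Python 3 is installed, is its stdlib server; pages served this way get a normal http: origin, so none of the file: restrictions apply:

```shell
# Serve the current (project) directory on http://127.0.0.1:8000/ for local
# development. The port is arbitrary; bind to localhost only.
# "timeout 2" is only here so this example terminates when run non-interactively;
# drop it for real use.
timeout 2 python3 -m http.server 8000 --bind 127.0.0.1 || true
```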
Comment 66•17 years ago
(In reply to comment #65)
> if html files and saved web sites can be dealt with, the troubling case are
> developers. developers are expected to deal with trusted by them content.

Not just developers. What about content from other media such as USB keys or CD-ROMs?

> possible solutions are:
> 1. whitelist a directory "my web programs" - everything there have full access
> in all directories under it (subdirectories may access parent, probably up to
> "my web programs")

I think this would be the best solution, but the user should be able to declare various permission settings. For instance, a directory (including its subdirectories) could be declared a closed domain, with no access to the outside: a page could probably have access to other, unrelated pages in this directory (filenames can be guessed, e.g. index.html), but could not send any information about them to remote hosts; so I think that's safe enough. A developer may want to set up a directory with full access to the outside. However, an httpd server (as you suggested) could also be used for this purpose.
Comment 67•17 years ago
(In reply to comment #66)
> Not just developers. What about contents from other media such as USB keys or
> CD-ROM?

Yes, but a CD-ROM can be malicious.

> a page could probably have access to other, unrelated pages in this directory
> (filenames can be guessed, e.g. index.html), but cannot send any information
> about it to remote hosts; so, I think that's safe enough.

I am ready to bet that 'no access to remote hosts' can't be enforced in the current situation; it would mean breaking img.src="" and setting a lot of other attributes.
Comment 68•17 years ago
What about blocking other sources and showing one of those yellow info bars that says "FF has blocked some elements for your protection. Click to show them", or something like this?
Comment 69•17 years ago
(In reply to comment #67)
> yes. but cdrom can be malicious.

Yes, that's why I think the user should declare the corresponding directory as a closed domain.

> i am ready to bet 'no access to remote hosts' can't be enforced in the current
> situation - this means breaking img.src="" and setting a lot of other
> attributes.

Access to remote images from a local file without user confirmation is already bad. For instance, a spammer could send such files to validate the e-mail address (and the fact that the user opened the message) from the logs of the web server. BTW, that's why I never open HTML messages in Firefox without looking at them with lynx first.

(In reply to comment #68)
> What about blocking other sources and showing one of those yellow infp-bars
> that says "FF has blocked some elements for your protection. click to show
> them" or something like this?

I like this idea. And of course, there could also be whitelists...
Comment 70•17 years ago
(In reply to comment #68)
> What about blocking other sources and showing one of those yellow infp-bars
> that says "FF has blocked some elements for your protection. click to show
> them" or something like this?

IMHO users are less than clever, and the fox tells them to click, click, click ;)
Comment 71•17 years ago
(In reply to comment #70)
> (In reply to comment #68)
> > What about blocking other sources and showing one of those yellow infp-bars
> > that says "FF has blocked some elements for your protection. click to show
> > them" or something like this?
>
> imho users are less than clever. and the fox tells them to click, click, click
> ;)

From my experience, users just ignore the bar. They don't even notice it. But some kind of notification should be done. Why not re-use the info element from popup blocking, add-ons blocking, ...?
Assignee
Comment 73•17 years ago
(In reply to comment #63)
> We can't really have an asymmetric option here without a lot more work.

I almost didn't even create the subdir option; I only put it in there as a fallback in case something broke. Unfortunately the initial checkin broke a bunch of reftests and the "samedir" option was not able to fix all of them. I'd be happy to change the tests and go back to a "samedir" approach, letting developers who need more flip the pref themselves and rely on herd immunity.

> Also the asymmetric subdir thing _does_ break "save as, complete" as things
> stand. The toplevel page would be able to access subframes, but not the other
> way around.

I'm OK with that -- is that a case that comes up often? Or ever?

> Could we only allow foo to access foo_files (and anything inside it, stepping
> down through *_files subdirs as needed), and vice versa (so that "save page
> complete" works) and make reftests do that as needed for subresources?

Thought of that, but the "_files" part is a localized resource that lives in browser, which makes it hard-ish for caps to deal with gracefully.

> Also note that the "access anything in same dir" thing we have here means that
> any downloaded web page can access any other downloaded web page.

True, but they have to guess at the names now. This is in part why my initial check-in had the strictest "self" setting.
Updated•17 years ago
QA Contact: ian → caps
Comment 75•17 years ago
dveditz: note that this bug is related to offline JS 'apps' and the offline cache in Firefox 3.
Comment 76•17 years ago
Looks like I never got cced...

> I'm OK with that -- is that a case that comes up often? or ever?

I don't know.

> thought of that, but the "_files" part is a localized resource

Urgh. :(

I thought about the asymmetry thing some more, and we _could_ probably do it with changes to nsPrincipal::Equals to make both calls, auditing of all other SecurityCompareURIs callers, and maybe changes to make nsPrincipal::Subsumes not be the same as nsPrincipal::Equals (though it may not matter). But it would still break a lot of subframe communication cases.

How about a slightly different approach? If we're just doing this on trunk, we could implement the "checkConnect(nsIPrincipal, nsIURI)" function sicking and I have been talking about, which will be used for things like XMLHttpRequest and XBL loads. There it will be clear that one thing is the subject and the other is the object, and we make things asymmetric in that check. For everything else, where we're actually doing a comparison of two principals, have symmetric checks that fail for file:/// principals that are not part of the same "group", and propagate the group to the subframe or new window any time a page does a load of something that passes the "checkConnect" check above. Or something along those lines.

In other words, instead of worrying about which file:// URIs are same-origin with which other file URIs, worry about which principals are same-origin with which other principals, and store more state in the principals.
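The shape of that proposal can be sketched in miniature. This is a model only, not the real nsIPrincipal interface: the class, method names, and the "group" token are placeholders for the idea that a symmetric equals() can be built from two asymmetric subsumes() calls, while checkConnect() keeps the asymmetry where subject/object ordering is known:

```javascript
// Toy model of asymmetric subsumption plus group-based symmetric equality.
class FilePrincipal {
  constructor(path, group) {
    this.path = path;
    this.group = group; // propagated "same document group" token, or null
  }
  subsumes(other) {
    // Asymmetric: this principal's directory must contain the other's path.
    const dir = this.path.slice(0, this.path.lastIndexOf("/") + 1);
    return other.path.startsWith(dir);
  }
  equals(other) {
    // Symmetric: both asymmetric directions, or membership in the same group.
    return (this.subsumes(other) && other.subsumes(this)) ||
           (this.group !== null && this.group === other.group);
  }
}

function checkConnect(subjectPrincipal, objectPath) {
  // Subject/object ordering is explicit here, so asymmetry is safe.
  return subjectPrincipal.subsumes(new FilePrincipal(objectPath, null));
}
```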
Comment 77•17 years ago
Note that pretty much any sane fix to this bug will regress bug 204140 unless the XBL is placed in the profile's "chrome" dir. I _think_ that's OK, at least as a first cut. But eventually maybe we want to flag some file:// dirs as being able to link to anything they want on disk by default... Or maybe we just want to give the user stylesheet the system principal.
Comment 78•17 years ago
(In reply to comment #64)
> change "save complete" to save in a jar with .zip extension, probably adding
> simple entry .html file?

The standard for that need is MHTML (RFC 2557); please do not consider inventing another method. Real MHTML support in Fx would be a great thing in every respect.
Comment 79•17 years ago
Note that bug 402781 is running into the asymmetry issues here quite directly. We really need to get that addressed.
This bug has been fixed. For our sanity, please open new bugs on things that broke over this. And mark them as blocking this bug as we usually do for regressions.
Status: NEW → RESOLVED
Closed: 17 years ago
Resolution: --- → FIXED
Assignee
Comment 83•17 years ago
This change is most likely too incompatible to land on the 1.8 branch but nominating nonetheless.
Flags: wanted1.8.1.x?
Depends on: 428815
Keywords: dev-doc-needed
Assignee
Updated•16 years ago
Flags: wanted1.8.1.x? → wanted1.8.1.x-
Updated•16 years ago
Flags: in-testsuite?
Comment 84•16 years ago
Ouch, guys, can you please tell me a possible exploit of an xml-stylesheet processing instruction linking an XSLT stylesheet in a parent directory (<?xml-stylesheet type="text/xsl" href="../../descriptor.xsl"?> in a locally saved XML file)? And what is wrong with a stylesheet including another one to use common functionality?

After reading this discussion I see that fixing/wont-fixing this bug is viewed as a users vs. developers question. But that is not the case. In a typical use case in our company a user reads documentation from the file system (locally saved XML files on a shared file system). The documentation is structured in directories; the XML files reference a common stylesheet library, which is located in a parent directory; and the resulting HTML links JS, which is in the XML's parent directory...

This bug is, unfortunately, "solved" in RC1. I think the issue we are dealing with is much more complex. Please revert to wont-fix or provide a solid solution which also takes into consideration use cases beyond the e-mail one.
Comment 85•16 years ago
Maros, This bug is fixed, and correctly labeled as such. Your issue is bug 397894. See bug 397894 comment 7 for your question about a possible exploit.
Comment 86•16 years ago
*Please* forgive my infinite ignorance, but what was done here? Does my local file have access to local subdirectories? I've read this whole thread and think I grok the issues, but I NO UNERSTAND WHAT U DID :-(

What I have is an XMLHttpRequest to just a subdirectory name (one plain alpha word), looking to get a directory listing in return. This still works great in http:, and worked great in file: in FF2. Now in FF3 in file: I get "Access to restricted URI denied". Could I have coded it wrong somehow? Or is this turned off now? What does my local file now have access to?

I kowtow profusely. I will not presume to offer any opinion here, but will be happy to explain myself if anyone is curious. Please smack me upside the cranium with an aluminum clue stick.
Comment 87•16 years ago

As far as I know that should work; yes, you are allowed to access subdirectories. The only thing I can think of is if getting directory listings has also been disabled for file://. I don't know if we do this or not, but it wouldn't be related to this bug.

I suggest you bring your problem up in the mozilla.dev.security or mozilla.dev.tech.dom newsgroups, as that is a better place to have general discussions.
Comment 88•16 years ago
Yes, you can access subdirectories as long as they're not symlinked (at least on a file system that supports such things). Trying to traverse symlinked subdirectories causes an error. I was going to report this as a separate bug. Anyone have any input here, or should I just file it? Cheers, - Bill
Comment 89•16 years ago
Uh... Accessing subdirectories should not in fact work, per the patches checked in for this bug. It was disabled as part of this patch to fix bug 209234, as far as I can tell. You can access files in subdirectories, but not directory listings themselves.
Comment 90•16 years ago
Boris - Sorry I was being unclear before. You are correct. You cannot access the subdirectories themselves (i.e. to get directory listings, etc.). What I was referring to was a file that's in a symlinked subdirectory. I am accessing a file, except I'm accessing it through a subdirectory that is a symlink and that is failing. Cheers, - Bill
Comment 91•16 years ago
William, I was responding to comment 86 and 87. What you're seeing sounds like a bug, and you should definitely file it. Please cc me on the bug.
Comment 92•16 years ago
OK, I see now, thanks for the info. It kind of seems a shame to turn off local directory listings; does anything else use the application/http-index-format format? http://www.mozilla.org/projects/netlib/dirindexformat.html But I guess I can see the point. I don't want to beat a dead horse anyway. Thanks again, people.

BTW, the security.fileuri.strict_origin_policy switch made this work, which helps me with testing for the web, but still blocks it from also being a local app :-(
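For anyone finding this later: the switch mentioned above can be flipped in about:config, or persisted in a user.js file in the profile directory. This relaxes the file:-URI origin checks browser-wide, so it belongs in a dedicated testing profile, not in day-to-day browsing (sketch below assumes the Firefox 3 pref name quoted in this comment):

```javascript
// user.js in a testing profile -- restores the old permissive file: behavior.
user_pref("security.fileuri.strict_origin_policy", false);
```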
Comment 93•15 years ago
Can someone summarize for me the end result of this bug? There are a lot of comments here and I want to be sure I don't misinterpret anything when I write up this change.
Comment 94•15 years ago
Added https://developer.mozilla.org/en/Same-origin_policy_for_file%3a_URIs to cover this.
Keywords: dev-doc-needed → dev-doc-complete
Reporter
Updated•11 years ago
Whiteboard: [sg:high] Saved pages and other local HTML files can read user's files → [Saved pages and other local HTML files can read user's files]
Comment 95•11 years ago
This doesn't work at all. I have /dir1/index.html, and in it is an iframe going to dir2/index.html (a relative path, so it goes to /dir1/dir2/index.html). JS in the former fails to access the iframe's contentDocument/Window. security.fileuri.strict_origin_policy=false does nothing. Wasted an hour on this **** for nothing. How the **** do I make it work?
Comment 96•11 years ago
flying sheep, I just tried the case you describe and it works fine, as expected. Whatever problem you're seeing, your description in comment 95 is leaving out some crucial aspect of the steps to reproduce. Please file a new bug with the _exact_ code you're using attached and cc me on that bug.
Comment 97•11 years ago
Seems to be a caching issue in Firefox: I had a file:// main page with a http://localhost:port/ iframe. Then I changed the iframe's src attribute to point to another file:// URL. When I right-clicked the iframe and selected "only show current frame", the page with the localhost URL (which it previously pointed to) came up, while accessing the iframe DOM element's "src" attribute faithfully showed its new value. The server running on localhost wasn't even running anymore, and Ctrl+R on the main page did nothing. Where shall I report that?
Comment 98•11 years ago
That sounds like the expected behavior if you were doing non-forced reloads, for what it's worth.
Comment 99•11 years ago
I forced the reload on the main page. It worked only after I accessed the (cached) frame and force(!)-reloaded *that*, which resulted in a 404, then switched back to the main page and reloaded that. Shouldn't force-reloading a page also ensure that all iframes in the page are loaded from where the iframes' "src" points? But this is the wrong place to discuss this: which component should I file a bug for?
Comment 100•11 years ago
> shouldn't force-reloading a page also ensure that all iframes in the page are
> loaded from where the iframes' "src" points?

I believe no. It force-reloads their actual history entries.

> which component should i file a bug for?

Document Navigation, if a bug is needed.
Assignee
Updated•5 years ago
Flags: in-testsuite?