null (%00) in filename fakes extension (ftp, file)
Categories
(Core :: Networking: File, defect, P3)
People
(Reporter: mindwarper, Assigned: jesup)
References
Details
(4 keywords, Whiteboard: [necko-triaged])
Attachments
(5 files)
1.31 KB, patch | Details | Diff | Splinter Review
1.43 KB, patch | Details | Diff | Splinter Review
3.61 KB, patch | bzbarsky: review+, darin.moz: superreview+, asa: approval-aviary+, asa: approval1.7.5+ | Details | Diff | Splinter Review
2.56 KB, patch | caillon: approval1.4.3+ | Details | Diff | Splinter Review
6.12 KB, patch | jduell.mcbugs: feedback+ | Details | Diff | Splinter Review
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US; rv:1.7) Gecko/20040707 Firefox/0.9.2
Build Identifier: Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US; rv:1.7) Gecko/20040707 Firefox/0.9.2

The first bug is that Firefox stores its cache data in a known location. The location depends partly on the OS, but on Windows 2000 the path looks like this: C:\Documents and Settings\Administrator\Application Data\Mozilla\Firefox\Profiles\default.nop\Cache. Three files in this directory have known names: _CACHE_001_, _CACHE_002_, and _CACHE_003_.

The second bug is the famous NULL bug. Submitting the URI "file://C:\Documents and Settings\Administrator\Application Data\Mozilla\Firefox\Profiles\default.nop\Cache\_CACHE_001_" causes Firefox to pop up a download window. But since we want Firefox to execute the JavaScript code inside one of the cache files, we can instead request: "file://C:\Documents and Settings\Administrator\Application Data\Mozilla\Firefox\Profiles\default.nop\Cache\_CACHE_001_%00.txt" or "file://C:\Documents and Settings\Administrator\Application Data\Mozilla\Firefox\Profiles\default.nop\Cache\_CACHE_001_%00.html". Firefox thinks we are requesting an HTML/text file, yet it still opens only the cache file without the .html.

The combination of these two bugs could lead to the following situations: if someone finds a way to redirect a user in the Internet zone to a file:// location, it is possible to execute code on a victim in the local zone and thus compromise the victim's computer. If the attacker can make a user visit his website (to store the malicious code in one of the cache files) and then make him go to one of the two URLs shown above, the attacker can take over the victim's computer. I hope you take this seriously; it's not as bad as the Internet Explorer vulnerabilities, but it's close.

- Mindwarper - mindwarper@mlsecurity.com - http://www.mlsecurity.com

Reproducible: Always
Steps to Reproduce: Clearly explained in the Details section.
Actual Results: I was able to execute JavaScript in the local zone.
Expected Results: The software should create an unknown path to the cache directory, and ignore %00 when reading local files.
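For readers unfamiliar with the mechanics: the reported confusion arises because the extension is taken from the full URL string, while a NUL-terminated C string passed to the filesystem API ends at the first \0. A minimal Python sketch of that mismatch (the path here is illustrative, not the real cache location):

```python
from urllib.parse import unquote

# Hypothetical URL path; %00 is the escaped NUL byte from the report.
url_path = "/Cache/_CACHE_001_%00.html"

# Content-type selection that looks at the raw spec sees ".html"...
claimed_ext = url_path.rsplit(".", 1)[-1]

# ...but after unescaping, a NUL-terminated C string stops at \x00,
# so the file actually opened is the bare cache file.
unescaped = unquote(url_path)
opened_path = unescaped.split("\x00", 1)[0]

print(claimed_ext)   # extension the browser believes it is serving
print(opened_path)   # file the OS actually opens
```

The two values disagree, which is exactly the "fakes extension" behaviour in the summary.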
Opening bug since it was disclosed by the reporter.
Comment 2•20 years ago
Deleting useless URL (it's not required; please only fill in if it's useful as a testcase or similar).

The profile directory name contains three random characters; guessing them would not be as trivial as you suggest. Web content is not allowed to open file:/// urls. "Local" files don't have any special powers that web content doesn't have, except the ability to open other file:/// urls. The fixed-name cache files contain random bits of different files; it would be extremely hard to get your specific attack file to come up first against a random victim.

The null-in-filename bug definitely needs to be fixed, but I don't see how it's exploitable on its own. It's also not Firefox-specific: reassigning.
Reporter
Comment 3•20 years ago
This vulnerability also appears in other protocols that are not HTTP. For example, you can easily tell the difference between the following two results: ftp://ftp.gnu.org/welcome.msg and ftp://ftp.gnu.org/welcome.msg%00.html

Also, while testing this I found out that when requesting user:pass@domain through Firefox, Firefox does not hash or hide the pass field; instead it leaves it in plain text, allowing anyone with access to the computer to view other users' FTP user/pass.
Comment 4•20 years ago
This patch makes unix versions of Mozilla refuse file URLs generating suspicious filenames:
- including a null character (from %00)
- including /../ or trailing /.. (from /..%2f, /.%2e etc.; URL parsing does not grok these encoded sequences)
- not starting with a slash (including empty filenames)

Cases #2 and #3 are not known to have any dangerous consequences (and #3 should never happen), but we all know the rule: better safe than sorry. I do not dare to write a similar piece of code for Windows because their interpretation of filenames is too magical for me (see canonical_filename() in Apache). Other platforms may need other specific checks.
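The three rules above can be modeled in a few lines. This is a Python paraphrase of the checks the patch describes, not the actual C++ from the attachment:

```python
def is_suspicious_unix_filename(path: str) -> bool:
    """Model of the patch's checks: no leading /, embedded NUL, /../ traversal."""
    if not path.startswith("/"):
        return True                     # empty or relative filename
    if "\x00" in path:
        return True                     # NUL smuggled in via %00
    if "/../" in path or path.endswith("/.."):
        return True                     # encoded ../ that survived URL parsing
    return False
```

Note that this mirrors only the patch's three cases; it deliberately says nothing about /./ or Windows device names, which are discussed later in the bug.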
Comment 5•20 years ago
I've just observed a really odd side-effect of %00 in a file URL: I've got a text file called /blah/blah (no extension). file:///blah/blah is displayed as text/plain but file:///blah/blah%00 (still no extension) is displayed as text/html!
That patch isn't enough, and I'd argue it's the wrong approach. We should fix the problem at or before the call to NS_UnescapeURL, not after it's munged the string.
Windows goes through net_GetFileFromURLSpec (nsURLHelperWin.cpp). For people playing along at home:

> necko.dll!net_GetFileFromURLSpec(const nsACString & aURL={...}, nsIFile * * result=0x00ed24cc) Line 133 C++
  necko.dll!nsStandardURL::GetFile(nsIFile * * result=0x0012fe40) Line 2223 C++
  necko.dll!nsFileChannel::GetClonedFile(nsIFile * * result=0x0012fe5c) Line 84 C++
  necko.dll!nsFileChannel::EnsureStream() Line 99 C++
  necko.dll!nsFileChannel::AsyncOpen(nsIStreamListener * listener=0x00eb7870, nsISupports * ctx=0x00ed2930) Line 455 C++
  TestProtocols.exe!StartLoadingURL(const char * aUrlString=0x00364928) Line 685 + 0xf C++
  TestProtocols.exe!main(int argc=0x00000003, char * * argv=0x003648d8) Line 835 + 0x7 C++
  TestProtocols.exe!mainCRTStartup() Line 400 + 0xe C
  kernel32.dll!GetCurrentDirectoryW() + 0x44

TestProtocols.exe -verbose "file:///R:/testxpc.html%00.foo"
Comment 7•20 years ago
There's also code that checks for \n and \r when creating FTP URL objects (in nsFtpProtocolHandler::NewURI), but it does that w/o unescaping them. Should we unescape there too, and check for embedded nulls there as well?
Comment 8•20 years ago
> + // test for bad stuff: missing leading /, nulls, /../

Why does / or /../ matter in a file url?
Comment 9•20 years ago
> We should fix the problem at/before the call
> to NS_UnescapeURL not after it's munged the string.

Checking after NS_UnescapeURL (or after any transformation in general) is good because no bad things can reappear after the transformation. (Look at the infamous IIS "Unicode bug": they checked the URI before the last transformation from UTF-8 to ASCII, and that transformation (due to sloppy coding accepting illegal UTF-8 sequences, indeed) was able to introduce bad things that should have already been dealt with (/../).)

It might make some sense to do these checks at the end of NS_UnescapeURL. Unfortunately, there are two problems:
- there are different policies for different protocols/platforms (the POSIX filesystem API cannot handle \0; the Win32 filesystem API should be protected from \0 as well as magical device names (*) or god knows what else; FTP cannot handle \r and \n and \0),
- the interface of (most variants of) NS_UnescapeURL cannot report any errors to the caller.

(*) Try this on a Windows machine: <img src="con.png">

BTW: A comment in nsEscape.h reads "Expands URL escape sequences... beware embedded null bytes!" Very funny...

> Prevent loading of ftp:// URI's with %00 in the path...

The code in netwerk/protocol/ftp is quite messy IMHO. The check in nsFtpProtocolHandler::NewURI is (almost) pointless, as it checks the URI before it is unescaped (see above). Moreover, usernames and passwords should be checked for nulls too (to make things worse, they are checked for \r and \n in UCS-2 form before they are converted to ASCII with AppendWithConversion()... it appears untranslatable chars are converted to a rather benign '.' but it is still ugly).

> Why does / or /../ matter in a file url?

It is "a *missing* leading / or ...". Anyway, such things should never appear in the resulting filename. This is a proactive measure. E.g. one day, someone might decide to divide the file: URL namespace into multiple mutually untrusting domains or something similar, and such a check will stop attempts to fool Mozilla with things like /.%2e/. (Hmm... I guess it might make sense to forbid /./ as well.)
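The ordering argument in this comment (validate after the last transformation, never before) can be sketched as follows. The per-protocol forbidden byte sets are taken from the lists in the comment; the function is a Python stand-in, not Necko code:

```python
from urllib.parse import unquote

# Illustrative per-profile forbidden bytes, per the comment above:
# the POSIX file API chokes on NUL; FTP additionally cannot carry CR/LF.
FORBIDDEN = {
    "file": "\x00",
    "ftp": "\x00\r\n",
}

def unescape_then_validate(spec: str, profile: str) -> str:
    out = unquote(spec)  # the transformation happens FIRST...
    # ...and the check runs on its output, so nothing bad can
    # reappear after the check (the IIS mistake, inverted).
    if any(ch in out for ch in FORBIDDEN[profile]):
        raise ValueError("malformed URI")
    return out
```

Checking `spec` before `unquote` would miss `%00` entirely, which is exactly the objection raised here against the NewURI-time check.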
Comment 10•20 years ago
jst: what about other control characters? maybe we want to pass esc_SkipControl to NS_UnescapeURL? or, if we know that everything needs to be unescaped for FTP, then maybe we want to verify that the resulting string does not contain any control characters?
Comment 11•20 years ago
(In reply to comment #10)
> jst: what about other control characters. maybe we want to pass esc_SkipControl
> to NS_UnescapeURL? or if we know that everything needs to be unescape for FTP,
> then maybe we want to verify that the resulting string does not contain any
> control characters?

Darin, I guess the question is: do we want to skip control characters, or flag URIs with such characters as invalid and refuse to do anything with them? I'd vote for the latter.
Comment 12•20 years ago
Yeah, you make a good point. Rejecting these URLs is probably best.
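The consensus reached in these three comments (reject such URIs outright rather than silently skipping the characters) might look like the following. `esc_SkipControl` and `NS_ERROR_MALFORMED_URI` are the real Necko names; the Python is only a model of the policy:

```python
def validate_no_controls(unescaped: str) -> str:
    # Reject, rather than silently skip, any C0 control character
    # (0x00-0x1F) remaining in the unescaped string.
    if any(ord(ch) < 0x20 for ch in unescaped):
        raise ValueError("NS_ERROR_MALFORMED_URI")
    return unescaped
```

This is strictly stronger than a NUL-only check: it also catches the \r and \n bytes that matter for FTP command injection.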
Comment 13•20 years ago
Any update on this bug? It would be nice if we could get a fix in soonish here...
Comment 14•20 years ago
Comment 15•20 years ago
Comment on attachment 154673 [details] [diff] [review]
Prevent creation of ftp: URI's with nulls in them

>Index: netwerk/protocol/ftp/src/nsFtpProtocolHandler.cpp
> nsFtpProtocolHandler::NewURI(const nsACString &aSpec, ...
>+ char* fwdPtr = spec.BeginWriting();

nit: |char *fwdPtr|

sr=darin
Comment 16•20 years ago
Aren't we unescaping twice now? Also, why just ftp? Isn't this a problem in general?
Comment 17•20 years ago
For the record, IE cuts an FTP URL at the %00 mark, and we'll refuse to recognize it as a valid URL (i.e. a link with a bad FTP URL in it won't show up as a link, and won't be clickable). *I* doubt that would affect any real usage.
Comment 18•20 years ago
Comment on attachment 154673 [details] [diff] [review]
Prevent creation of ftp: URI's with nulls in them

r=bzbarsky

(For the curious, the answers are: 1) Yes, but we unescape on a copy the first time and don't use it thereafter. 2) For other protocols (e.g. HTTP), %00 is valid in a URI.)
Comment 19•20 years ago
I do think that file: has the same problem (that's why I put it into the summary :) )
Comment 20•20 years ago
> Win32 filesystem API should be protected from
> \0 as well as magical device names (*)
>
> (*) Try this on a Windows machine: <img src="con.png">

If anyone wants to do the magical device name protection, there is some code for it at: http://lxr.mozilla.org/mozilla/source/widget/src/windows/nsDataObj.cpp#507

At bug 103468 comment #36 I did some testing for control characters in filenames and found that Windows seems to protect itself from them.
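The "magical device name" protection referred to here amounts to a case-insensitive match on the base name before the first dot, since e.g. "con.png" still resolves to the CON device. A sketch of the idea (assumed semantics; the authoritative list and logic live in the nsDataObj.cpp code linked above):

```python
# DOS device names reserved by Windows regardless of extension.
RESERVED = {"CON", "PRN", "AUX", "NUL",
            *(f"COM{i}" for i in range(1, 10)),
            *(f"LPT{i}" for i in range(1, 10))}

def is_reserved_device_name(filename: str) -> bool:
    # Only the part before the first dot matters, case-insensitively:
    # "con.png" hits the CON device, "console.png" does not.
    return filename.split(".", 1)[0].upper() in RESERVED
```

This is why a NUL-only filter is not sufficient on Windows: the filename can be perfectly printable and still name a device.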
Comment 21•20 years ago
ftp: fix checked in on the trunk.
Comment 22•20 years ago
Comment 23•20 years ago
Comment on attachment 154673 [details] [diff] [review] Prevent creation of ftp: URI's with nulls in them a=asa for the 1.7.2 mini-branch and the aviary branch.
Comment 24•20 years ago
Comment on attachment 154726 [details] [diff] [review] FTP patch for 1.4 branch a=blizzard for 1.4
Comment 26•20 years ago
Can someone supply a testcase with details of expected behavior. I just ran this: "file://C:\Documents and Settings\Administrator\Application Data\Mozilla\Firefox\Profiles\default.nop\Cache\_CACHE_001_%00.html" It launched a testcase from another security bug that I had just verified. Should that have happened?
Comment 27•20 years ago
Note: The Common Vulnerabilities and Exposures project (cve.mitre.org) has assigned the name CAN-2004-0760 to this issue.
Comment 28•20 years ago
(In reply to comment #26)
> It launched a testcase from another security bug that I had just verified.
> Should that have happened?

It should have. Sort of. This is the "expected" wrong behaviour: it interprets the cache file as HTML and runs whatever JS it finds there.

A simpler and more deterministic test is as follows:
1. create evil.txt containing the following text: <html><body onload="alert('Gotcha!')"></body></html>
2. open file://.../evil.txt — it will display the text, ok

Wait, no em-dashes allowed. Replacing:

2. open file://.../evil.txt (it will display the text, ok)
3. open file://.../evil.txt%00.html (if a "Gotcha!" alert pops up then you have a problem)
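The manual steps above can be scripted. This Python sketch just writes the file and prints the two URLs to try by hand; the target location (current directory) is an assumption, put the file wherever is convenient:

```python
import pathlib

# Step 1: create the probe file from comment 28.
payload = "<html><body onload=\"alert('Gotcha!')\"></body></html>"
target = pathlib.Path("evil.txt")
target.write_text(payload)

base_url = "file://" + str(target.resolve())
print(base_url)                # step 2: should render as plain text
print(base_url + "%00.html")   # step 3: must NOT pop the alert
```

The second URL is the actual test: a fixed browser should fail to load it (or at least not sniff it as HTML), never execute the payload.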
Comment 29•20 years ago
(In reply to comment #28)
> 3. open file://.../evil.txt%00.html
> if a "Gotcha!" alert pops up then you have a problem

No alert, only display of file://.../evil.txt. But that is the error. If comment 18 states that a file URL with %00 in it is valid, I EXPECT to get a file-not-found error for file://.../evil.txt%00.html, because that file doesn't exist (or is not allowed due to file system restrictions), and I don't want Mozilla to fix up that URL internally to file://.../evil.txt and display the content without changing the (valid) URL in the address bar.
Comment 30•20 years ago
(In reply to comment #28)
> It should have. Sort of. This is the "expected" wrong behaviour: it interprets
> the cache file as HTML and runs whatever JS it finds there.

Note that the patch here does not fix file:.

> 3. open file://.../evil.txt%00.html
> if a "Gotcha!" alert pops up then you have a problem

For some definition of "problem", since the only issue is that the url shown in the urlbar does not quite match the content.

(In reply to comment #29)
> I don't want Mozilla to
> fix up that url internally to file://.../evil.txt and display the content
> without changing the (valid) URL in the address bar.

That "fixup" is not done intentionally; it's a side-effect of how this code stores strings and does unescaping, I'm sure.
Comment 31•20 years ago
From the following:

> 1. create evil.txt containing the following text
>    <html><body onload="alert('Gotcha!')"></body></html>
> 2. open file://.../evil.txt
>    it will display the text, ok
--- yes, opens the text file as is ---
> 3. open file://.../evil.txt%00.html
>    if a "Gotcha!" alert pops up then you have a problem
--- a file not found error message, no file type conversion ---
Comment 32•20 years ago
On Linux, I am seeing this on Firefox 0.9+ from today(0804) and the latest Firefox 0.9.3 bits. Mac and Windows 0.9+ builds looked good.
Comment 33•20 years ago
Just tested with the ftp test in comment #3, and this passed on Linux FF 0.9.3.
Comment 34•20 years ago
(In reply to comment #32)
Note: your test is for file URIs (how did this bug morph?). Anyway, the word from jst is (transcribed from #developers on IRC):

Jul 29 22:14:20 <caillon> jst, yt?
Jul 29 22:17:18 <jst> caillon: y
Jul 29 22:18:19 <caillon> jst, is 250906 ready for landing?
Jul 29 22:18:41 <caillon> jst, (comment 19 and 20 seem to say no...)
Jul 29 22:21:06 <jst> caillon: Yeah, the ftp: part of that bug is ready... we kinda don't care about file: for now, since you can't link to that from web content anyways
Jul 29 22:21:21 <jst> caillon: that part should of course be fixed too, at some point
Jul 29 22:21:38 <caillon> jst, ok that's what i thought.
Jul 29 22:21:46 <caillon> jst, i just wanted to make sure before i started backporting it
Comment 35•20 years ago
The NULL bug is still present in Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.5) Gecko/20041107 Firefox/1.0. This is a security bug, so why hasn't it been fixed, you guys? I think %00 should be rejected in every URL, unless there's a legitimate use for it. Just my 2 euro cents.
Comment 37•11 years ago
Mavericks, are you interested in taking a crack at this bug? It's not clear to me what part was fixed and what's left.
Comment 38•11 years ago
Tested as per comment #28 on FF 18.0.2 on Win Vista Basic (x86):
file://<path>.txt shows the text file as is
file://<path>.txt%00.html does nothing
Comment 39•11 years ago
Builds on top of Pavel's patch 153078. Compiles; testing pending. Some testcases are at http://www-archive.mozilla.org/quality/networking/testing/filetests.html and a few more are mentioned in comment #20.
Comment 40•11 years ago
Comment on attachment 725268 [details] [diff] [review]
Patch that prevents null (%00) and addresses windows-specific issues in file url

Review of attachment 725268 [details] [diff] [review]:
-----------------------------------------------------------------

Mostly seems good, though it seems we're only fixing checking for "." and ".." here, not %00? I'm also not the right reviewer for the forbidden-name stuff. Given that this bug already has patches that landed in 2004 in Firefox 0.9 :) we should definitely move this new patch into a new bug (please CC me on it). You also need a test: an xpcshell test shouldn't be too hard to write, and it looks like netwerk/test/unit/test_bug396389.js has the basic infrastructure.

::: netwerk/base/src/nsURLHelperUnix.cpp
@@ +82,5 @@
> + // test for bad stuff: missing leading /, /../, /./
> + nsACString::const_iterator iter, end;
> + path.BeginReading(iter), path.EndReading(end);
> + int slashdotdot = 0;
> + if (iter == end || *iter != '/') return NS_ERROR_MALFORMED_URI;

Put return statements on a new line.

@@ +85,5 @@
> + int slashdotdot = 0;
> + if (iter == end || *iter != '/') return NS_ERROR_MALFORMED_URI;
> + for (; iter != end; ++iter) {
> + if (*iter == '/') {
> + if (slashdotdot == 3 || slashdotdot == 2) return NS_ERROR_MALFORMED_URI;

Why do we allow slashdotdot == 1 here, too? Is it important to allow "//"?

@@ +89,5 @@
> + if (slashdotdot == 3 || slashdotdot == 2) return NS_ERROR_MALFORMED_URI;
> + slashdotdot = 1;
> + }
> + else if (*iter == '.') {
> + if (slashdotdot > 0 && slashdotdot <= 3) ++slashdotdot;

I assume that clamping slashdotdot to 4 here and checking only for 2 or 3 above is so that you only fail for "." and "..", but allow "..." and "....", etc., to be valid filenames? If true, leave a comment saying that.

@@ +93,5 @@
> + if (slashdotdot > 0 && slashdotdot <= 3) ++slashdotdot;
> + }
> + else slashdotdot = 0;
> + }
> + // check for /., /.., etc., as filenames consisting of periods only are not allowed

Your code appears to allow names of 3 periods or more, so is the comment wrong?

::: netwerk/base/src/nsURLHelperWin.cpp
@@ +94,5 @@
> }
>
> NS_UnescapeURL(path);
> +
> + // check for %00

Does this patch in fact check for %00? It seems to be checking mostly for "." and "..", the forbiddenChars checks are only on Windows, and even there I don't see '\00' listed as a forbidden char.

@@ +114,5 @@
> + else if (*iter == '.') {
> + if (slashdotdot > 0 && slashdotdot <= 3) ++slashdotdot;
> + }
> + else {
> + if ( forbiddenChars.FindChar(*iter) != -1 ) return NS_ERROR_MALFORMED_URI;

So the 20 lines or so of code here are common across unix/windows except for this check of forbiddenChars. It seems like you could just always check forbiddenChars (the list evaluates to empty on Unix, but it ought to work fine?) and stick this code in some common central location.

@@ +126,5 @@
> + // check for \con or \con.xyz
> + // CLOCK$ as filename or CLOCK$.txt is allowed on Win Vista
> + // XXX move it to proper location for reuse
> + static const char* forbiddenNames[] = {
> + "COM1", "COM2", "COM3", "COM4", "COM5", "COM6", "COM7", "COM8", "COM9", "LPT1", "LPT2", "LPT3", "LPT4", "LPT5", "LPT6", "LPT7", "LPT8", "LPT9", "CON", "PRN", "AUX", "NUL" }; //"CLOCK$"

I'm not the right person to know if this magic list is correct. Not sure who the right reviewer is, either; sorry. Ask bsmedberg?
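To make the reviewed state machine easier to follow, here is a Python model of the slashdotdot logic as I read it from the diff hunks (a reconstruction for illustration, not the patch itself). States: 0 = ordinary character, 1 = just saw '/', 2 = saw '/.', 3 = saw '/..', 4 = three or more dots (a legal filename):

```python
def rejects_dot_segments(path: str) -> bool:
    """Model of the patch's check: reject missing leading /, '/./', '/../'."""
    if not path.startswith("/"):
        return True                      # missing leading slash
    state = 1                            # just consumed the leading '/'
    for ch in path[1:]:
        if ch == "/":
            if state in (2, 3):          # a '.' or '..' component just closed
                return True
            state = 1
        elif ch == ".":
            if 1 <= state <= 3:
                state += 1               # clamps at 4, so '...' survives
        else:
            state = 0
    return state in (2, 3)               # trailing '/.' or '/..'
```

Note that, per the reviewer's question, state 1 followed by another '/' is allowed here, so "//" passes; and the clamp at 4 is exactly what makes "..." a legal name while "." and ".." are rejected.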
Comment 41•2 years ago
In the process of migrating remaining bugs to the new severity system, the severity for this bug cannot be automatically determined. Please retriage this bug using the new severity system.
Comment 42•1 year ago
We don't support FTP anymore, so moving this to Networking: File - not sure if this is still an issue or not.
Assignee
Comment 43•3 months ago
The %00 issue appears to have been fixed; testing per comment 28 does not generate an alert.

That said, we could land an updated version of this patch. Question: should it guard against trying to create a file: URL that ends in /. or /..? If you try to open those, you'll fail anyway.

Ditto for the checks for restricted characters and special names on Windows (CON, COM0, LPT1, etc.): do we want to restrict the ability to create a file: URL naming those? Does it actually gain anything?
Assignee
Comment 44•3 months ago
%00 was fixed in bug 383478